Ask HN: Anyone want to collaborate on a local-first AI-based research assistant (news.ycombinator.com)
Hi HN Community, I'm Venkatram, a sophomore who's on a mission to build a local alternative to proprietary third-party AI-based research assistants.
The idea is to turn documents into researchable assets that retain as much information as the originals while being far more reusable.
Quite frankly, this is still very much a WORK IN PROGRESS, and I'm still figuring out how it can best be used. To be honest, I definitely need help building this, so if you'd like to contribute, you're welcome!
TL;DR: NotebookLM, but running locally with your OWN AI model
Github: https://github.com/venkatram-s/gigabook-lm
AnythingLLM requires about 2 GB of RAM just to sit idle, and with RAM as scarce as it is today, I want to do much better on optimization.
AnythingLLM also takes a "UI-first, logic-second" approach. In contrast, I'm taking a "logic-first, UI-second" approach.
Not to mention, a system like AnythingLLM isn't easy to run on lower-end computers, and I, for one, care about that, since I'm building this project on a low-end machine myself.
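To illustrate what I mean by "logic-first": the core should be a tiny, dependency-free module that turns a document into reusable chunks and answers queries, with any UI layered on top later. This is just a hypothetical sketch (the function names `chunk` and `retrieve` are illustrative, not from the actual repo), showing how the retrieval core can stay lightweight enough for low-end machines:

```python
def chunk(text: str, size: int = 200) -> list[str]:
    """Split a document into word-based chunks of roughly `size` words each."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(query: str, chunks: list[str], top_k: int = 3) -> list[str]:
    """Rank chunks by keyword overlap with the query.

    No model, no index, near-zero idle memory; a real version could swap
    in local embeddings without the UI ever knowing.
    """
    q = set(query.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q & set(c.lower().split())),
        reverse=True,
    )
    return scored[:top_k]
```

The point of the sketch is the boundary, not the scoring: because the logic has no UI dependencies, it can be tested, profiled, and optimized on its own before any interface exists.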