NewsLab
Apr 28 23:38 UTC

Show HN: LocalLLM – Recipes for Running the Local LLM (Need Contributors) (locallllm.fly.dev)

15 points | by Igor_Wiwi | 2 comments
I built LocalLLM: a small community project for running local models.

Live: https://locallllm.fly.dev

The goal is simple: if someone has a model + OS + GPU + RAM, they should get steps that actually work (ideally a one-liner).
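For illustration, a recipe keyed on that hardware combo might look something like this. This is a hypothetical sketch, not taken from the site; the model name, quantization, hardware specs, and the Ollama one-liner are all assumptions:

```markdown
# Recipe: Llama 3.1 8B (Q4) on Linux

- OS: Ubuntu 22.04
- GPU: NVIDIA RTX 3060 (12 GB VRAM)
- RAM: 16 GB

One-liner (assumes Ollama is already installed):

    ollama run llama3.1:8b
```

A flat file per (model, OS, GPU) combination like this would also fit the GitHub-PR workflow suggested in the comments.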

I need help populating and validating the guides.

If you run local models, please submit one working recipe (or report what failed). Would love to hear general feedback as well!

Comments (2)

  1. modinfo
    This is a nice idea! But open-source it, and instead of saving data to a DB, just save to markdown on GitHub; then everyone can send PRs to edit/add instructions. More freedom, and simpler.
  2. doc_ick
    Agreed, otherwise it sounds like OP is crowdsourcing just to avoid doing the manual effort.

    *Edit: OP has been vibe coding all over; couldn't they just vibe code the guides themselves without human input?