Hi Hackers,
Excited to share a macOS app I’ve been working on: https://recurse.chat/ for chatting with local AI. While it’s amazing that you can run AI models locally quite easily these days (through llama.cpp / llamafile / ollama / the llm CLI, etc.), I missed a feature-complete chat interface. Tools like LM Studio are super powerful, but there’s a learning curve to them. I’d like to hit a middle ground of simplicity and customizability for advanced users.
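To give a sense of how low the barrier is these days, here’s a minimal sketch of loading a local model through the llama-cpp-python bindings (the model filename and prompt are placeholders; any gguf file you’ve downloaded works):

    from llama_cpp import Llama  # pip install llama-cpp-python

    # model_path can point at any gguf file on disk; the name below is just
    # an example of a quantized instruct model.
    llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

    out = llm("Q: Name three planets. A:", max_tokens=48, stop=["Q:"])
    print(out["choices"][0]["text"].strip())

RecurseChat builds on the same llama.cpp foundation; the point of the app is that you never have to write this glue yourself.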
Here’s what sets RecurseChat apart from similar apps:
– UX designed for you to use local AI as a daily driver. Zero-config setup, multi-modal chat, chatting with multiple models in the same session, and linking your own gguf file.
– Import ChatGPT history. This is probably my favorite feature. Import your hundreds of messages, search them, and even continue previous chats using local AI, offline (see the sketch after this list).
– Full text search. Search across hundreds of messages and see results instantly.
– Private and capable of working completely offline.
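On the ChatGPT import: the data export you can download from OpenAI contains a conversations.json file, and reading it looks roughly like the snippet below (field names are my reading of the current export format, not RecurseChat’s actual importer):

    import json

    # conversations.json sits at the top level of a ChatGPT data export.
    with open("conversations.json", encoding="utf-8") as f:
        conversations = json.load(f)

    for convo in conversations:
        print(f"# {convo.get('title') or 'Untitled'}")
        # Messages are stored as a graph under "mapping"; sorting the nodes
        # by create_time gives a readable transcript.
        nodes = [n.get("message") for n in convo["mapping"].values()]
        messages = [m for m in nodes if m and m.get("create_time")]
        for msg in sorted(messages, key=lambda m: m["create_time"]):
            role = msg["author"]["role"]
            parts = msg.get("content", {}).get("parts") or []
            text = " ".join(p for p in parts if isinstance(p, str)).strip()
            if text:
                print(f"{role}: {text}")

The real importer presumably handles more than this (branching chats, attachments), but that’s the shape of the data; once it’s imported, full text search and offline continuation work on it like any other chat.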
Thanks to the amazing work of @ggerganov on llama.cpp, which made this possible. If there’s anything you wish existed in an ideal local AI app, I’d love to hear about it.
Comments URL: https://news.ycombinator.com/item?id=39532367
Points: 49
# Comments: 11