Reor is an open-source AI note-taking app that runs models locally.
The four main things to know are:
1. Notes are connected automatically. Vector search powers both semantic search over your notes and the automatic linking of related notes.
2. You can do RAG Q&A on your notes using the local LLM of your choice.
3. The embedding model, LLM, vector database, and your files all run or are stored locally.
4. Point it to a directory of markdown files (like an Obsidian vault) and it works seamlessly alongside Obsidian.
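The "related notes" idea above can be sketched in a few lines: each note is embedded into a vector, and the notes whose vectors are nearest are surfaced as related. The tiny hand-made embeddings below are stand-ins for illustration only; in Reor a local embedding model would produce them.

```typescript
type Note = { title: string; embedding: number[] };

// Cosine similarity: how closely two embedding vectors point the same way.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank all other notes by similarity to the query note, keep the top k.
function relatedNotes(query: Note, notes: Note[], k: number): Note[] {
  return notes
    .filter((n) => n.title !== query.title)
    .map((n) => ({ n, score: cosineSimilarity(query.embedding, n.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((x) => x.n);
}

const notes: Note[] = [
  { title: "Transformers", embedding: [0.9, 0.1, 0.0] },
  { title: "Attention", embedding: [0.8, 0.2, 0.1] },
  { title: "Gardening", embedding: [0.0, 0.1, 0.9] },
];

console.log(relatedNotes(notes[0], notes, 1)[0].title); // "Attention"
```

In practice the nearest-neighbor search runs inside the vector database rather than in application code, but the ranking principle is the same.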
Under the hood, Reor uses llama.cpp (via node-llama-cpp), Transformers.js, and LanceDB to power the local AI features.
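The RAG Q&A flow mentioned above boils down to: retrieve the note chunks most relevant to a question, then hand them to the local LLM as context. A hedged sketch follows; the keyword-overlap retriever and the prompt template are simplified stand-ins (real retrieval would be a vector search, and generation would go through the local LLM), not Reor's actual implementation.

```typescript
type Chunk = { source: string; text: string };

// Stub retriever: scores chunks by keyword overlap with the question.
// A real RAG pipeline would use embedding similarity instead.
function retrieve(question: string, chunks: Chunk[], k: number): Chunk[] {
  const words = question.toLowerCase().split(/\W+/).filter((w) => w.length > 0);
  return chunks
    .map((c) => ({
      c,
      hits: words.filter((w) => c.text.toLowerCase().includes(w)).length,
    }))
    .sort((a, b) => b.hits - a.hits)
    .slice(0, k)
    .map((x) => x.c);
}

// Assemble retrieved chunks into a grounded prompt for the LLM.
function buildPrompt(question: string, context: Chunk[]): string {
  const ctx = context.map((c) => `[${c.source}] ${c.text}`).join("\n");
  return `Answer using only the notes below.\n\nNotes:\n${ctx}\n\nQuestion: ${question}\nAnswer:`;
}

const chunks: Chunk[] = [
  { source: "llms.md", text: "Llama models can run locally via llama.cpp." },
  { source: "recipes.md", text: "Sourdough needs a long fermentation." },
];

const question = "How can Llama models run locally?";
const prompt = buildPrompt(question, retrieve(question, chunks, 1));
console.log(prompt.includes("llms.md")); // true
```

The prompt string would then be sent to whichever local model the user has configured, which answers using only the retrieved notes as context.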
Reor was built from the start to support local models. The future of knowledge management involves applying AI to organize pieces of knowledge - but crucially, that AI should run as privately and locally as possible.
It's available for Mac, Windows, and Linux on the project's GitHub: https://github.com/reorproject/reor