I work on Ollama. It's a good question since there are quite a few tools emerging in this space.
The focus for Ollama is to make downloading and serving a model easy – there's an included `ollama` CLI but it's all powered by a REST API. Hopefully, it's a way to support really cool applications of LLMs like OP's onprem tool.
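For a sense of what that looks like, here's a minimal sketch of hitting the generate endpoint with curl (it assumes a local Ollama server on the default port and that you've already pulled a model — the model name here is just an example):

```shell
# Assumes `ollama serve` is running locally on the default port (11434)
# and that the "llama2" model has already been pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```

The response streams back as newline-delimited JSON objects, so it's easy to build interactive UIs on top of it.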
OP's tool is more focused on ingesting and analyzing data. There seems to be quite a bit of interesting opportunity as an application of LLMs – e.g. analyzing not only local docs but data in a remote data store.