Typesense's performance is really impressive; it seems on par with the (also) impressive Meilisearch.

I'm going to try Typesense's free tier right now; it's perfect for my current use case of site search.
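
For anyone else wiring up site search, a query with the official typesense-python client looks roughly like this. This is a minimal sketch: the host, API key, collection name, and fields are placeholders I made up, not anything from this thread.

    import typesense

    # Hypothetical cluster endpoint and search-only API key.
    client = typesense.Client({
        'nodes': [{'host': 'xxx.a1.typesense.net', 'port': '443', 'protocol': 'https'}],
        'api_key': 'SEARCH_ONLY_API_KEY',
        'connection_timeout_seconds': 2,
    })

    # Query a hypothetical 'pages' collection across title and body text;
    # typo tolerance is on by default.
    results = client.collections['pages'].documents.search({
        'q': 'getting started',
        'query_by': 'title,body',
        'per_page': 10,
    })
    for hit in results['hits']:
        print(hit['document']['title'])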

How large is the dataset for the books? How large do the nodes need to be?


Thank you!

The OpenLibrary dataset has ~28M records, takes up 6.8GB on disk, and uses 14.3GB of RAM when indexed in Typesense. Each node has 4 vCPUs. It took me ~3 hours to index these 28M records; it could probably have been done in ~1.5 hours, since most of the indexing time was due to cross-region latency between the three geo-distributed nodes in Oregon, Mumbai, and Frankfurt.
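
For anyone curious what the import itself looks like: here's a rough sketch with the typesense-python client. The schema, file name, and batch size are my own guesses for illustration, not the ones actually used for this demo. Typesense's bulk import endpoint takes JSONL, so streaming the file in batches keeps client memory bounded:

    import typesense

    client = typesense.Client({
        'nodes': [{'host': 'localhost', 'port': '8108', 'protocol': 'http'}],
        'api_key': 'ADMIN_API_KEY',
        'connection_timeout_seconds': 300,  # generous timeout for bulk imports
    })

    # Hypothetical schema; the real OpenLibrary demo schema may differ.
    client.collections.create({
        'name': 'books',
        'fields': [
            {'name': 'title', 'type': 'string'},
            {'name': 'authors', 'type': 'string[]', 'facet': True},
            {'name': 'publish_year', 'type': 'int32', 'facet': True},
        ],
        'default_sorting_field': 'publish_year',
    })

    # Stream the JSONL export in batches of 10k docs per import call.
    BATCH = 10_000
    with open('openlibrary_books.jsonl') as f:
        batch = []
        for line in f:
            batch.append(line)
            if len(batch) == BATCH:
                client.collections['books'].documents.import_(''.join(batch), {'action': 'create'})
                batch = []
        if batch:
            client.collections['books'].documents.import_(''.join(batch), {'action': 'create'})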


That's a lot of text to index, impressive!

Is the indexed size 6.8GB or 14.3GB?


Thank you! The indexed size is 14.3GB (that's the in-RAM figure; the 6.8GB is the raw dataset on disk).


I'm curious: what do you feel Meilisearch offers that's better than Elasticsearch or Solr?