
I did something similar recently. The goal was to take a huge Postgres database and make it searchable and usable.

It turned out that running an offline batch job to boil the much bigger dataset down into a single pre-optimized table was the best approach for us. Once optimized and tsvectored, it was fairly performant, and Elastic wasn't a huge gain over it. We're still keeping the Elastic code around “in case”, but yeah, Postgres search can be “good enough” when you aren't serving a ton of clients.
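A minimal sketch of that pattern, with hypothetical table and column names (the post doesn't give a schema): a batch job denormalizes the big dataset into one search-ready table with a precomputed tsvector column, indexed with GIN.

```sql
-- Batch job output: one denormalized, search-ready table.
-- "items", "title", and "body" are illustrative names, not from the post.
CREATE TABLE search_items AS
SELECT i.id,
       i.title,
       to_tsvector('english', i.title || ' ' || coalesce(i.body, '')) AS tsv
FROM items i;

-- A GIN index makes the tsvector matching fast.
CREATE INDEX search_items_tsv_idx ON search_items USING gin (tsv);

-- Query with ranking:
SELECT id, title
FROM search_items
WHERE tsv @@ plainto_tsquery('english', 'example search terms')
ORDER BY ts_rank(tsv, plainto_tsquery('english', 'example search terms')) DESC
LIMIT 20;
```

Because the table is rebuilt offline, the tsvector can be precomputed once per batch rather than maintained with triggers on every write.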
