The challenge is that there are several ways of doing this depending on your larger architecture. So the definition of "just work" depends on the context of your particular architecture.
For example, depending on that context: you could hook into your ORM and send change events to Typesense, like another commenter pointed out; if you have an event bus, you could hook into those messages and push to Typesense from a listener; you could use something like Maxwell to get a JSON stream from the DB binlogs and write the data you need into Typesense; or you could write a batch script that runs a set of SQL queries on a schedule and does a batch export + import into Typesense.
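The last option (batch script on a schedule) is easy to sketch. Everything here is illustrative: the table shape, the `updated_at` watermark field, and the idea of POSTing JSONL to Typesense's bulk import endpoint.

```python
import json

# Hypothetical rows as a SQL query might return them; in a real batch
# job you'd run something like
#   SELECT id, title, updated_at FROM documents WHERE updated_at > :watermark
def changed_since(rows, watermark):
    """Rows modified after the previous run's watermark."""
    return [r for r in rows if r["updated_at"] > watermark]

def to_import_body(rows):
    """Typesense bulk import takes JSONL: one JSON document per line."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in rows)

rows = [
    {"id": "1", "title": "old", "updated_at": 100},
    {"id": "2", "title": "new", "updated_at": 250},
]
batch = changed_since(rows, 200)
body = to_import_body(batch)
# You'd POST `body` to /collections/<name>/documents/import?action=upsert,
# then persist max(updated_at) from the batch as the next watermark.
```

Note the watermark must be persisted atomically with a successful import, or a crashed run will skip rows on the next pass.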
Also, typically you'd want to curate the fields that you put in a search index, vs what you store in your transactional database, so there is some level of data transformation that's needed anyway.
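A tiny sketch of that curation/transformation step. The table and field names are made up; the point is that the search document is denormalized (joined fields inlined) and stripped of internal-only columns.

```python
# Hypothetical normalized rows: a product row plus its joined category row.
product = {"id": 42, "name": "Mechanical Keyboard", "price_cents": 9900,
           "category_id": 7, "internal_cost_cents": 4000}
category = {"id": 7, "name": "Peripherals"}

def to_search_doc(product, category):
    """Curate + denormalize: keep only searchable/displayable fields,
    inline the joined category name, and drop internal-only columns."""
    return {
        "id": str(product["id"]),   # Typesense document ids are strings
        "name": product["name"],
        "category": category["name"],
        "price": product["price_cents"] / 100,
    }
```

The transformation is where most of the ongoing maintenance lives: every schema change forces you to decide whether the search index cares.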
So long story short, it depends on your architecture!
For our setup with Typesense, we created a mapping, did an initial bulk import, and then set up ORM lifecycle events to sync inserts/updates/deletes from the database to Typesense. We are using Symfony with Doctrine as the ORM.
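The lifecycle-event pattern looks roughly like this. The real setup is a Doctrine listener in PHP, so this Python sketch with a fake client is illustrative only:

```python
class FakeTypesenseClient:
    """Stands in for a real Typesense client in this sketch."""
    def __init__(self):
        self.docs = {}

    def upsert(self, doc):
        self.docs[doc["id"]] = doc

    def delete(self, doc_id):
        self.docs.pop(doc_id, None)

class SearchSyncListener:
    """Mirrors entity lifecycle events into the search index."""
    def __init__(self, client):
        self.client = client

    def post_persist(self, doc):    # fires after an INSERT commits
        self.client.upsert(doc)

    def post_update(self, doc):     # fires after an UPDATE commits
        self.client.upsert(doc)

    def post_remove(self, doc_id):  # fires after a DELETE commits
        self.client.delete(doc_id)
```

One caveat with this approach: anything that bypasses the ORM (bulk SQL, migrations) won't fire the events, so a periodic full reindex is still a good safety net.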
Keeping an external search system in sync with the primary DB is certainly a pain point. The biggest problem is that the records in the DB will likely be normalized, while the records in a search store will not be.
You can "poll" for new records to index by using a query, but handling updates and deletes is the real deal breaker. You will then have to use both the binlog (at least for MySQL) and querying, at which point it becomes quite complicated to reason about.
That would totally be the dream though. No reindexing. pg_search is definitely fast enough for a large swath of use cases.
What I've done is build a really barebones search controller that connects to Postgres. My controller only lets you search on one attribute, and I use it for autocompletes. Anything more involved, I use Typesense.
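A single-attribute autocomplete like that boils down to one parameterized prefix query. Here sqlite stands in for Postgres so the sketch is self-contained; in Postgres you'd write `name ILIKE %s` with `term + '%'` and back it with a `text_pattern_ops` or trigram index so the prefix scan stays fast. Table and column names are made up.

```python
import sqlite3

def autocomplete(conn, term, limit=10):
    """Prefix-match a single attribute; parameterized to avoid injection."""
    q = "SELECT name FROM items WHERE name LIKE ? ORDER BY name LIMIT ?"
    return [row[0] for row in conn.execute(q, (term + "%", limit))]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT)")
conn.executemany("INSERT INTO items VALUES (?)",
                 [("apple",), ("apricot",), ("banana",)])
```

That's the whole controller, more or less: no sync pipeline, no second datastore, which is exactly the appeal for simple cases.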
As I posted in another comment [1], the biggest problem is handling updates and deletes. It requires some kind of logical deletion plus a `last_updated_at` timestamp, so you can fetch changes from the primary table.
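Sketching that soft-delete + watermark idea: rows are never hard-deleted, a `deleted_at` column is set instead, and every write bumps `updated_at`, so a single polling query sees inserts, updates, and deletes alike. The SQL and table name are hypothetical; a dict stands in for the search index.

```python
# What the polling query would look like (psycopg-style placeholder):
POLL_SQL = """
SELECT id, title, deleted_at
FROM documents
WHERE updated_at > %(watermark)s
ORDER BY updated_at
"""

def apply_changes(index, changed_rows):
    """Mirror one polled batch into the search index: tombstoned rows
    are removed, everything else is upserted."""
    for row in changed_rows:
        if row["deleted_at"] is not None:
            index.pop(row["id"], None)   # soft-deleted -> drop from index
        else:
            index[row["id"]] = row["title"]
    return index
```

The cost is that the primary table accumulates tombstones, which you then have to garbage-collect separately.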
I don't quite follow. For a system reliant entirely on Postgres, you can just fetch the records and join them in real time, no? It would be much slower than Elasticsearch, but still _good enough_ for a lot of use cases.
I was referring to an integration between Postgres and an external search system such that the data sync is automated. That was what the parent comment was asking about (I think).
What I really want is an integration between Instantsearch.js and Postgres.
Typesense has written a custom adapter between Instantsearch and Typesense. I would love to just hook Instantsearch up directly to my Rails app. You could use pg_search to get 90% of the value, with next to no complexity on the backend.