
Seems like this would've been a completely ideal place to rock some mongo_fdw, which would give postgres the ability to query and extract data directly from mongo. https://github.com/citusdata/mongo_fdw
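Roughly what the setup looks like (the server options follow the mongo_fdw README; the table and columns here are made up for illustration):

  -- load the extension and point it at the mongod instance
  CREATE EXTENSION mongo_fdw;

  CREATE SERVER mongo_server
    FOREIGN DATA WRAPPER mongo_fdw
    OPTIONS (address '127.0.0.1', port '27017');

  CREATE USER MAPPING FOR postgres SERVER mongo_server;

  -- expose a Mongo collection as a regular-looking table
  CREATE FOREIGN TABLE customer_reviews (
      customer_id   TEXT,
      review_date   DATE,
      review_rating INTEGER
  )
  SERVER mongo_server
  OPTIONS (database 'test', collection 'customer_reviews');

After that it's plain SQL against the collection: SELECT, JOIN with local tables, and so on.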



That adds quite a bit of latency to your database, which is already frequently the bottleneck.

In reality, most of what Mongo does can be done just as well if not better in PostgreSQL 9.3 (and I hear good things about 9.4). Then you can simultaneously have structured data in regular tables and unstructured data via the json type, all in the same database.
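For the unstructured side, the json type (and 9.4's jsonb, which is binary and GIN-indexable) covers a lot of the document use case. A quick sketch with made-up table and field names:

  CREATE TABLE events (
      id      serial PRIMARY KEY,
      created timestamptz DEFAULT now(),
      payload jsonb          -- use json on 9.3
  );

  -- pull a field out of the document
  SELECT payload->>'user_id'
  FROM events
  WHERE created > now() - interval '1 day';

  -- 9.4 jsonb containment, can be backed by a GIN index
  SELECT count(*) FROM events WHERE payload @> '{"type": "signup"}';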


I don't think the parent is talking about doing all operations via the FDW, but rather using the foreign data wrapper to fill the table: instead of pulling all the data into a Python script and then pumping it out into the PostgreSQL database, you use PostgreSQL itself to do the fill. And since it's a foreign data wrapper, you could also create a materialized view based on the results if you still want most activity going through the MongoDB instances for whatever reason.
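Something along these lines, reusing the hypothetical foreign table from above:

  -- one-shot fill of a local table straight through the FDW
  INSERT INTO reviews_local
  SELECT customer_id, review_date, review_rating
  FROM customer_reviews;

  -- or keep a refreshable snapshot instead of copying by hand
  CREATE MATERIALIZED VIEW reviews_snapshot AS
  SELECT customer_id, review_date, review_rating
  FROM customer_reviews;

  REFRESH MATERIALIZED VIEW reviews_snapshot;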


And then you could use mongres when you eventually decided to replace that Mongo server without confusing the original PG database. It's elephants all the way down! :)


Typically a \COPY is going to be much faster than INSERTs in general, so a SELECT/INSERT may end up a bit slower. I haven't benchmarked it, but I'm pretty sure that's right.
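For reference, the two approaches being compared (file and table names are made up):

  -- bulk load a dump that the ETL script wrote out
  \copy reviews_local FROM 'reviews.csv' CSV

  -- vs. pulling rows through the FDW in one statement
  INSERT INTO reviews_local
  SELECT customer_id, review_date, review_rating
  FROM customer_reviews;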



