
The bigger issue is this idea that everything needs to live in one place. For an application handling a billion requests/day, I'd wager that most of that traffic is isolated to certain types of data.

I'd wager that because, in almost every case I've ever seen, it's true: you just don't see every table in a normalized dataset bearing the traffic load equally.

If that's the case, moving that particular piece of data out to a more easily scalable store will largely fix the problem, assuming caching, async writes, and buffered writes didn't already.

Everything else can very easily sit in PostgreSQL: avoid race conditions, maintain data integrity, control permissions, and allow direct access from multiple languages without requiring an API layer. Then you can use a foreign data wrapper to let PG query that other data source (mongo, couchbase, redis, whatever) and join the results with the rest of the data in the database just like it's all one big happy dataset.
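
For example, with something like redis_fdw it can look roughly like this (a sketch only; the hot_counters table, the users table, and the key naming are illustrative assumptions, not from any real schema):

  -- Install the wrapper and point it at the Redis instance
  CREATE EXTENSION redis_fdw;

  CREATE SERVER redis_server
    FOREIGN DATA WRAPPER redis_fdw
    OPTIONS (address '127.0.0.1', port '6379');

  CREATE USER MAPPING FOR PUBLIC SERVER redis_server;

  -- Expose Redis keys/values as a foreign table in PG
  CREATE FOREIGN TABLE hot_counters (key text, value text)
    SERVER redis_server
    OPTIONS (database '0');

  -- Join the hot Redis data against normalized data in PG
  SELECT u.email, c.value AS page_views
  FROM users u
  JOIN hot_counters c ON c.key = 'views:' || u.id;

From the application's point of view it's all just one query, even though the counters never touch Postgres storage.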

As another poster said, a mess is a mess. Honestly, I don't know why he takes a shot at Rails, since Rails has some of the best first-class support for leveraging PostgreSQL features these days.

I wrote an entire post about it: http://www.brightball.com/ruby-postgresql/rails-gems-to-unlo...
