
I think the OP is referring to the time before Rails natively supported foreign keys.

While I agree there's a place for business rules (the app), referential integrity and uniqueness can now easily be enforced properly in both places - and I trust PostgreSQL more.
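
As a rough sketch of what "both places" can look like (the model and table names here are made up, not from the thread): the application layer validates, and PostgreSQL enforces the same rules with a foreign key constraint and a unique index.

  # Hypothetical migration: PostgreSQL enforces the rules no matter which
  # code path writes to the table.
  class CreateMemberships < ActiveRecord::Migration[7.1]
    def change
      create_table :memberships do |t|
        t.references :user, null: false, foreign_key: true  # referential integrity
        t.string :email, null: false
        t.timestamps
      end
      add_index :memberships, :email, unique: true          # uniqueness
    end
  end

  # The model repeats the same rules as application-level validations,
  # which give friendlier errors but can be bypassed or race.
  class Membership < ApplicationRecord
    belongs_to :user
    validates :email, presence: true, uniqueness: true
  end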




From the description:

> In order to load just 100 paginated items on his account history page, over 50,000 queries were triggered using so much RAM that it crashed the server.

Issuing 50k queries to get 100 items is just wrong, no matter how you look at it. I can see the attractiveness of writing all your logic in one language, and in a way it's unfortunate there's not a simple way to write your filter in Ruby, pass it to the database, and have it executed there. But you have to face reality and deal with the fact that such complicated logic needs to be close to the data (roughly as sketched below).

Edit: I initially misread this as 50k items rather than 50k queries, which is even worse.
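
For context, the usual way an ActiveRecord page of 100 records balloons into thousands of queries is unscoped association access inside a loop; the Order/LineItem/Product models below are hypothetical, just to illustrate the pattern and the eager-loaded alternative.

  # N+1 (really N*M+1) pattern: one query for the page, then one query per
  # order for its line items and one more per line item for its product.
  Order.order(:created_at).limit(100).each do |order|
    order.line_items.each do |item|
      puts item.product.name
    end
  end

  # Letting the database do the work: the associations are eager-loaded,
  # so the whole page comes back in a handful of queries.
  Order.includes(line_items: :product).order(:created_at).limit(100).each do |order|
    order.line_items.each do |item|
      puts item.product.name  # no additional queries fired here
    end
  end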


Since this was written, Rails has become much better about this kind of thing.

I think Arel and some of the other work that really helped were already happening more than 10 years ago now (how time flies!), but it was a non-trivial upgrade, so a lot of production sites didn't switch over for a few more years.


Makes sense, and current Rails definitely does encourage you to add foreign key constraints for your foreign keys, and makes it as easy as anything else - see the guides and the sketch below.

https://guides.rubyonrails.org/active_record_migrations.html...

https://guides.rubyonrails.org/active_record_migrations.html...
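
Both forms the guides describe amount to a one-liner in a migration (illustrative table names):

  # When creating the table: a reference column plus the constraint.
  create_table :comments do |t|
    t.references :article, null: false, foreign_key: true
  end

  # Or, for an existing table, add the constraint on its own.
  add_foreign_key :comments, :articles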



