Coming from an area of programming where dataframes/tables are first-class citizens, this is mind-blowing to me. Is there a reason for doing things this way by default? I guess it puts off having to think about complex queries, but it also means performance degrades multiplicatively the more tables you're pulling in, right? Is the tradeoff really just "it's easier to set up any old garbage and fix performance later"?
Lazy loading is usually the default ORM setting, so you wouldn't pull in any extra tables by default. You can set up eager loading on a per-query basis (i.e. "I know I'll need it here") or just set certain relationships to always eager-load, depending on your needs.
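In Django, for instance, it's a one-method change on the queryset; a minimal sketch, with made-up model names:

    from django.db import models

    # Hypothetical models, purely for illustration.
    class Author(models.Model):
        name = models.CharField(max_length=100)

    class Book(models.Model):
        title = models.CharField(max_length=200)
        author = models.ForeignKey(Author, on_delete=models.CASCADE)

    # Lazy (the default): one query for the books, plus one per author access.
    for book in Book.objects.all():
        print(book.author.name)

    # Eager, opted into per-query: a single JOINed query via select_related;
    # prefetch_related is the analogue for to-many relationships.
    for book in Book.objects.select_related("author"):
        print(book.author.name)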
You've hit the nail on the head with that tradeoff. ORM-based code is dead simple to write since everything just looks like an object. Anyone who knows the programming language can start querying the DB without really knowing anything about SQL. This lets you write the code quickly and in a domain that's really easy to reason about.
I've seen many cases where an endpoint really only requires 1 SQL query but it turns into a handful because of the ORM. Very rarely does it matter, though, because each query takes < 5 milliseconds and you barely notice the slowdown.
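To make that concrete, Django's test utilities can count the queries an innocent-looking loop actually issues (this sketch reuses the hypothetical Book/Author models from above):

    from django.db import connection
    from django.test.utils import CaptureQueriesContext

    # One logical "list books with their authors" fetch...
    with CaptureQueriesContext(connection) as ctx:
        titles = [(b.title, b.author.name) for b in Book.objects.all()]

    # ...turns into 1 + N queries under lazy loading.
    print(len(ctx.captured_queries))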
If you know your ORM library well, you can get the best of both worlds. Mature ORMs give you a ton of control (including letting you write SQL directly) when you need it; you just have to be OK with the learning curve.
Columns, though, you do have to handle yourself: restrict the column set explicitly if you don't want/need them all.
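In Django terms (again with the hypothetical models above), that control looks roughly like this:

    # Restrict the column set: only() defers the rest, values() skips model
    # instances entirely and returns dicts.
    Book.objects.only("title")
    Book.objects.values("title", "author__name")

    # Drop to raw SQL when the ORM gets in the way (the table name here is
    # whatever your app's table is actually called).
    Book.objects.raw("SELECT id, title FROM myapp_book WHERE title LIKE %s", ["%SQL%"])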
My experience with the Django ORM is that it's dead simple to do the dead-simple things, and then it quickly gets hard to impossible to do the complex things you need.
My personal favourite model is a very lightweight mapping into a model in the querying language, like Diesel or something even thinner. Django unnecessarily changes established terminology and introduces abstractions that are ultimately limiting.
IME, it’s more because most (citation needed) web devs are familiar with JSON and its nested format, and so it’s much easier to keep everything in that mental model, and let something else do the conversion.
SQL is of course not a difficult language, but I will give ORMs this: it can be somewhat difficult to context-switch between the two mental models.
Django, IME, does an extremely good job of producing decent SQL most of the time. There are times when hand-written SQL would be faster, and there are certainly times where the ORM is massively slower, but the average is higher than if you just had everyone writing raw queries.
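It also helps that the SQL Django generates is easy to inspect; roughly:

    qs = Book.objects.select_related("author").filter(title__icontains="sql")
    print(qs.query)      # the SQL Django will run (approximately; params aren't quoted)
    print(qs.explain())  # the database's query plan (Django 2.1+)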
My counter to that is “have RDBMS SMEs create VIEWs for anything you want, and let devs just call those,” but then I get told I’m old-fashioned, and that no one wants to store any kind of logic in a DB, version controlled or not.
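FWIW, that approach maps onto Django reasonably well: one migration creates the view (so the SQL is version-controlled), and an unmanaged model reads from it. A sketch, with an invented view name and columns:

    from django.db import migrations, models

    # In a migration file: the view lives in plain SQL, under version control.
    class Migration(migrations.Migration):
        dependencies = [("myapp", "0001_initial")]
        operations = [
            migrations.RunSQL(
                "CREATE VIEW myapp_active_users AS "
                "SELECT id, name, last_login FROM myapp_user WHERE is_active;",
                reverse_sql="DROP VIEW myapp_active_users;",
            )
        ]

    # In models.py: devs query the view like any other model.
    class ActiveUser(models.Model):
        name = models.CharField(max_length=100)
        last_login = models.DateTimeField()

        class Meta:
            managed = False          # Django won't try to create or alter it
            db_table = "myapp_active_users"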
The steelman of their argument (coming at things as a database person) is that thinking about the problem in a different way for a different domain can beget a "simpler" approach.
Something like a document-DB approach (one big JSON entry to rule them all) gets significant benefits from making certain levels of abstraction possible at all.
Hey - we just get the user details, then we update some JSON, and that's all we can do. It kinda sucks, but it means we're going to put most of our computation in this website and really make something that looks like a modern web app.
It has a bunch of downsides, but you don't have to teach anyone SQL; it's just a simple repository pattern + microservices + limiting the scope of any one thing.
Type checking is strong, tooling support is strong, and you can usually stick to one language and go ham.
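Roughly what that can look like, in Python for the sake of example (the store and type names are invented, and no particular document DB is implied):

    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class UserDoc:
        user_id: str
        name: str
        settings: dict  # the "one big JSON entry": nested, schemaless past this point

    class UserRepository:
        """Repository pattern: the rest of the app never touches the storage layer."""

        def __init__(self) -> None:
            self._store: dict[str, str] = {}  # stand-in for the real document DB

        def get(self, user_id: str) -> UserDoc:
            return UserDoc(**json.loads(self._store[user_id]))

        def save(self, doc: UserDoc) -> None:
            self._store[doc.user_id] = json.dumps(asdict(doc))

    # Usage: get the user details, update some JSON, put it back.
    repo = UserRepository()
    repo.save(UserDoc("u1", "Ada", {"theme": "dark"}))
    user = repo.get("u1")
    user.settings["theme"] = "light"
    repo.save(user)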