Hacker News

> Abstractions lead to coupling and complexity.

So you write everything in assembly and rewrite it completely for every new target system? Seems a bit tedious to me /s

Just kidding, abstraction serves its function, and the right amount of abstraction makes things better: more transferable, easier to maintain, etc. I think, for example, of hardware abstraction layers (HALs) in embedded programming. But abstraction in programming is never a value in itself, and its use needs to be weighed carefully.

I think it is best to teach people to stick to the data and to think about transformations between said data. If they don't know about abstraction (the absence of which is beginner's spaghetti code), it is worth telling them about it.




I think ORMs fit that bill. For some reason they were (are?) incredibly popular, but I don't think they stood the test of time. In the case of SQL, it's already an abstraction. No need for elaborate wizardry to hide the fact you're about to JOIN a table or two. Just write the damn join.

Every. Single. Project. I've seen using ORMs ended up with a baseline of 20+ queries being fired under the hood for every screen. ORMs are not bad in and of themselves; I think they attract, or are amenable to, a certain type of development mindset that ends in bad results.
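To make the failure mode concrete, here's a minimal sketch (Python + sqlite3; the authors/books schema is invented for illustration) of the lazy-loading "N+1" pattern behind those 20+ queries, next to the single JOIN that replaces it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO books VALUES (1, 1, 'Notes'), (2, 2, 'Compilers');
""")

# N+1 style: one query for the parent rows, then one more query per row.
# This is the "20+ queries per screen" shape a lazy-loading ORM emits.
def titles_n_plus_one(conn):
    out = []
    for author_id, name in conn.execute("SELECT id, name FROM authors ORDER BY id"):
        for (title,) in conn.execute(
                "SELECT title FROM books WHERE author_id = ? ORDER BY id",
                (author_id,)):
            out.append((name, title))
    return out

# The same result in a single JOIN.
def titles_join(conn):
    return list(conn.execute(
        "SELECT a.name, b.title FROM authors a "
        "JOIN books b ON b.author_id = a.id ORDER BY a.id, b.id"))
```

Both return the same rows; the difference is one round trip versus N+1 of them.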

I have a very hard time thinking of any kind of object oriented abstraction that turned out to be a very good idea.


OO abstractions are very effective in UI toolkits, but because a smaller proportion of folks work on desktop apps any more, people tend not to get exposed to that use case.

I also think it depends on the type of ORM. Active Record? Really problematic except for simple use cases, but because it makes those simple cases easy, it gets popular on that basis. Data Mapper? Actually fine and very useful, but harder to write well and a bit harder to use.
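For anyone unfamiliar with the distinction, a rough sketch of where the two patterns put the SQL (plain Python with sqlite3; the `users` table and class names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# Active Record: the domain object carries its own persistence logic.
class UserRecord:
    def __init__(self, name):
        self.name = name

    def save(self, conn):
        cur = conn.execute("INSERT INTO users (name) VALUES (?)", (self.name,))
        self.id = cur.lastrowid

# Data Mapper: the domain object stays plain; a separate mapper owns the SQL.
class User:
    def __init__(self, name):
        self.name = name

class UserMapper:
    def __init__(self, conn):
        self.conn = conn

    def insert(self, user):
        cur = self.conn.execute("INSERT INTO users (name) VALUES (?)", (user.name,))
        user.id = cur.lastrowid

    def find(self, user_id):
        row = self.conn.execute(
            "SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
        return User(row[0]) if row else None

UserRecord("alice").save(conn)
mapper = UserMapper(conn)
mapper.insert(User("bob"))
```

Active Record is less to write for CRUD screens, which is exactly why it wins the simple cases; Data Mapper keeps persistence out of the domain model at the cost of an extra layer.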


Yes, that's actually a good point. UI widgets map nicely to objects, at least usually.


I can't imagine that programmers who use reasonably well-tested and documented ORMs and still fire a minimum of 20 queries would be capable of writing a nice and equally bug-free SQL interface in place of that ORM...


I think the use of ORMs by people who understand databases is fine. It's when ORMs turn into "you don't need to know SQL!" that things go off the rails.

Everyone (management) wants a way to get productivity without loads of experience, and tools that promise that tend to be misleading.


Are there really software jobs where managers are the ones making decisions like whether or not to use ORMs? Some of these comments make me feel like I've had a truly blessed career.


You better believe you have had a truly blessed career.

In most places the moment you say "this will take 67 minutes and not the 65 minutes you thought it would", all sorts of idiots that know nothing about programming will come and dispute your every technical decision.

Exaggeration of course, but not far from the truth. Worst of all are the CTOs that already sing only the CEO's tune, come and review your entire project in exhaustive detail, conclude that you did 99% of everything correctly and well and exactly how they would do it, then proceed to fire you over the missing 1%.

Again, kind of an exaggeration, but again, not far from the truth either.

In most places people are treated like interchangeable cogs. Better hold tight to your warm position; it's not a given you'll find a better one if you decide that you must leave.

I dream of a workplace where after I prove my worth -- 1-2 months -- I'll just be left alone to produce and correct code and protect the company's interests without being second-guessed, even though I am one of the most productive devs there. I dream of that. I might cry if one day I realize that I've actually landed such a job.

And I am a senior. 21 years in the profession. Food for thought for you.

It's all about who you know and drink coffee with, apparently. Your abilities as a professional barely matter, it seems.


I’ve only worked in a few large companies where nobody even cared about code quality unless you brought down a production system with a bad change or something.

You’re seriously saying you’ve had 21 years of constantly being second guessed? What kinds of places are you working at?


I've been a contractor in the last 7 years and I've come to dearly regret it.

Never has my work been so second-guessed and ripped apart. I suppose my price was too high for them, not sure.

I'm still reflecting and have no good conclusions to offer.

(But part of the time I was in bad health and my work quality suffered. So that explains a percentage of the cases at least.)


I've been doing this for 15 years and I've been dealing with second-guessing, or managers acting like we don't know what we are doing, the whole time. The problem is that the managers are MBAs who know next to nothing about software development. They don't like being told anything different from whatever narrative they come up with, and they focus primarily on cost.

The company I work for considers itself an MSP. The parent company is a sales company selling non-software products.

In any case, yeah it happens.


It's usually indirect: "only use libraries from these approved internal systems", "new projects use these templates"... etc. Eventually, going against this set of standards makes you appear to be a squeaky wheel while everyone else seems to be working just fine.


Right, but in those cases were these libraries selected by engineering managers and not developers? I can definitely appreciate being stuck in a pattern, but everywhere I've worked the standards were at least set by technical contributors (competent or not).


At a tech-focused business, probably, but think about all the programming jobs and IT departments that ultimately roll up to very non-technical people. Often what happens is a vendor becomes the chosen vendor for some expensive tool (think Microsoft, Red Hat/IBM, Oracle, whatever), and they have a solution for almost everything (theoretically). You start having to justify why you want to use a different thing to people who don't care.

Say you have 200 Windows servers, and someone wants to use Linux: "Why, what are you trying to solve?" Sometimes the industry forces the issue (think marketing people using Macs), but generally people are trying to streamline things they don't care about. "Use the tools that come in this box we buy from this vendor who we pay all our bills to; everything else is a weird liability I have to worry about hiring for and keeping track of."

The end result tends to be pretty mediocre though.


I don't think you are wrong, and that's the actual cause of the problems, not the tools. But that's for another day...


The whole concept of a database as a separate system is an abstraction. An extremely successful one.


ORMs are okay, people just don’t know when to use them. They were always meant for OLTP, not OLAP workloads.

How else would you update this record, plus 10 other records pointing to it? That's the point of ORMs: "writing" the boilerplate SQL for you, and also the other direction, mapping records to objects.
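A minimal sketch of the two directions, in Python terms with sqlite3 (the orders/items schema is invented for illustration); this is the boilerplate an ORM generates for you:

```python
import sqlite3
from dataclasses import dataclass

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE items (id INTEGER PRIMARY KEY, order_id INTEGER, status TEXT);
    INSERT INTO orders VALUES (1, 'open');
    INSERT INTO items VALUES (1, 1, 'open'), (2, 1, 'open');
""")

@dataclass
class Order:
    id: int
    status: str

# Direction one: update the record plus the records pointing to it,
# atomically (the `with` block commits as one transaction).
with conn:
    conn.execute("UPDATE orders SET status = 'shipped' WHERE id = ?", (1,))
    conn.execute("UPDATE items SET status = 'shipped' WHERE order_id = ?", (1,))

# Direction two: map the record back to an object.
row = conn.execute("SELECT id, status FROM orders WHERE id = ?", (1,)).fetchone()
order = Order(*row)
```

That's fine for OLTP-shaped work — one entity and its neighbours — and exactly the wrong shape for analytical scans.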

It's not the tool's problem that people refuse to learn new things and misuse it.


> I think ORMs fit that bill. For some reason they were (are?) incredibly popular, but I don't think they stood the test of time.

I don't think there is any truth at all to this personal belief. In some domains it's unthinkable to use anything other than the standard ORM. In C# you'd need to be nuts to roll SQL by hand instead of using Entity Framework, and Django speaks for itself.

The only drawback of ORMs is that they promise developers won't need to learn the intricacies of SQL and SQL-related design patterns, while in practice developers need to learn the intricacies of SQL, the ORM framework, and the SQL generated by the ORM. Naive developers might believe they are better off reinventing the wheel with ad-hoc SQL stuff, but that's another problem.


I am nuts! I often work on two main C# projects. One uses a data adapter with data table objects. The other is about to have Entity Framework completely removed, reduced to only handling database connections.

The original developers used code-first. I looked into the dynamic SQL created by the framework and found overly complex statements. I replaced it with simple hand-crafted SQL and Dapper, used just to bind the results to objects. This cut transaction time down dramatically, from about 30 seconds to less than two. I could still cut that down further with batched transactions that are no longer logically grouped.

Simply combining SQL, CSS, and HTML with a custom DSL allows for creating interlinked reporting. I did this on a µC/OS-II embedded system with SQLite, running on hardware with 8 MB flash and 64 MB RAM. Port the DSL, keep the same table structure, and the same reports can be used in a new environment with new hardware or a different database backend.

Entity Framework might be useful in some places. I have yet to work in a domain where it is.


Yeah I think you are right to add the “depending on your environment” caveat. I have no experience in the Windows domain.


I second that OO abstractions shine in UI systems. Most complex UI applications use some form of ECS these days.


I've yet to encounter a single project which does not use an ORM. Everyone uses them, so they definitely stood the test of time. And the popularity of ORMs says more about the issues with the bare-SQL approach. IMO SQL is just bad: developers want JSON, not CSV; developers want to specify conditions with lambdas, not with strings.


I don't the "everyone uses it" argument works here. Sure, it has a lot of momentum, but I'm just not seeing the proposed productivity enhancements. I actually just spent a significant chunk of time to de-ORM a system to get some performance and, more importantly, clarity back.

IMO it's way, way easier to look at some queries instead of some leaky abstraction (on top of the already leaky abstraction that is SQL).

Again, I really don't think popularity is a sane metric anymore. (React, "Astro", Angular, web-dev in general)

Edit: I'm very much in a bubble. A lot depends on your tooling and environment. For example, I can appreciate it being different in an MS environment with stuff like LINQ. I'm not in that environment.


Relational databases are an operational anti-pattern. ORMs have proven that.


Relational databases have shown, and will continue to show more longevity than object orientation.

Facts and the relationships between facts are a deeper foundational principle than the concept mashup of modularity, hidden [mutable] state, dynamic dispatch, and interface subtyping that object orientation is formed from. Databases are longer-lived than applications, so applications tend towards adapting to the form of the data rather than the other way around.


The concept mashup of modularity, hidden mutable state, dynamic dispatch, etc. has also stood the test of time, and so have relational databases.

There is a mismatch between the two models, but it can make sense to think of the entities at hand as objects at times, and as relational data at other times. That's why ORMs are a thing, and also why most popular programming languages nowadays are multi-paradigm, so they can also support a more data-oriented approach.


Modeling a business’s processes does not equate to modeling a business’s data.

Starting with modeling data will immediately cripple software engineers from building what the business needs.

Starting with business models and then deciding what data storage is appropriate is a more practical way of designing software. Operational data very often fits in schema-less document storage.

Relational databases are excellent tools for analytics and reporting.

I stopped architecting software with relational databases “first” 7 years ago and will never go back.

Document databases are highly adaptive and denormalized data is inherently faster.

I’ve since found very rare cases where a boundary requires a relational database.

Welcome to my Domain-Driven Design rant.


Relational databases are backed by relational algebra, an actual formal math definition. You can prove theorems like: the output of Query X is equal to the output of Query Y, but Query Y is likely to run faster, so that is what we will run.

You can't outdate math (though you can outdate math notation by finding better ways to express the same information). To be fair, SQL is not 100% relational-algebra compatible, but it is close enough.
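A toy demonstration of that equivalence (Python + sqlite3; the emp/dept schema is invented): the same restriction written two ways returns the same rows, and a planner is free to run whichever is cheaper:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emp (id INTEGER PRIMARY KEY, dept_id INTEGER, name TEXT);
    CREATE TABLE dept (id INTEGER PRIMARY KEY, city TEXT);
    INSERT INTO dept VALUES (1, 'Oslo'), (2, 'Lima');
    INSERT INTO emp VALUES (1, 1, 'a'), (2, 2, 'b'), (3, 1, 'c');
""")

# Query X: restriction expressed with a subquery.
qx = """SELECT name FROM emp
        WHERE dept_id IN (SELECT id FROM dept WHERE city = 'Oslo')
        ORDER BY name"""

# Query Y: the same restriction written as a join -- relationally
# equivalent, and often what an optimizer rewrites X into.
qy = """SELECT e.name FROM emp e JOIN dept d ON e.dept_id = d.id
        WHERE d.city = 'Oslo' ORDER BY e.name"""
```

Both queries produce identical output; only the plan differs.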


I have the opposite feeling on RDBMSs, based on my limited experience. I think you need a good reason to deviate from them, which is often very large scale. I'd say 99% of applications written are nowhere near this scale.


> But abstraction in programming is never a value in itself and its use needs to be weighed carefully.

I think this generalises well to anything in (or outside) programming. Slavish adherence to any principle without consideration for the underlying merits of the situation is likely to become a negative.


Actually it would have to be numeric codes, as Assembly is also an abstraction. :)



