
Why didn't Ada take over the world? It was literally the first programming language taught at university.


Mostly,

- Compiler prices

- On the few UNIX vendors that supported it, like Sun, it was an extra purchase on top of the C and C++ compilers; why pay more when C and C++ were already in the box?

- Hardware requirements

Still, there are around 7 Ada vendors in business:

https://www.adacore.com/

https://www.ghs.com/products/ada_optimizing_compilers.html

https://www.ptc.com/en/products/developer-tools/apexada

https://www.ddci.com/products_score/

http://www.irvine.com/tech.html

http://www.ocsystems.com/w/index.php/OCS:PowerAda

http://www.rrsoftware.com/html/prodinf/janus95/j-ada95.htm


The vendor list is really telling of the issues with Ada's adoption:

- From my quick scan, only AdaCore supports Ada 2012 (where a lot of the really good stuff is implemented); the rest are stuck on Ada 95. (There's a sketch of that "good stuff" at the end of this comment.)

- None of the vendors seem to list transparent pricing.

If you are only selling based on volume and through a sales rep, then right off the bat you are excluding most startups, SMEs, and bootstrap-curious folks in general.
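
To give a flavour of that Ada 2012 "good stuff": a minimal, spec-only sketch using contract aspects (Pre/Post). The names are invented for illustration; any Ada 2012 compiler should accept the spec, though a body is still needed to link.

    package Accounts is
       subtype Amount is Integer range 0 .. 1_000_000;

       type Account is private;

       function Balance (A : Account) return Amount;

       --  Ada 2012 contract aspects: checked at run time,
       --  or proved statically if you go the SPARK route.
       procedure Withdraw (A : in out Account; Sum : Amount)
         with Pre  => Balance (A) >= Sum,
              Post => Balance (A) = Balance (A)'Old - Sum;

    private
       type Account is record
          Total : Amount := 0;
       end record;
    end Accounts;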


The Community version (based on GNAT from the FSF) is free:

https://www.adacore.com/community

https://alire.ada.dev/

But yes, GNAT "just" supports Ada 2012, 2005, 95, and 83:

https://en.wikipedia.org/wiki/GNAT


You forgot Ada 2022 :) https://learn.adacore.com/courses/whats-new-in-ada-2022/inde...

AdaCore technically retired the Community version a few years ago, so the most up-to-date free version is GNAT from the FSF (which is what Alire brings in automatically with `alr toolchains --select`).


Ada 2022 has some amazing features too - like user-defined literals. You can now create custom string types : )

https://learn.adacore.com/courses/whats-new-in-ada-2022/chap...
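
For the curious, a minimal sketch of what that looks like. The names are invented for illustration, it needs a GNAT with -gnat2022, and the two units would normally live in separate files.

    with Ada.Strings.Unbounded;

    package User_Ids is
       --  Ada 2022: the String_Literal aspect lets plain string
       --  literals denote values of your own type.
       type User_Id is private
         with String_Literal => From_Literal;

       function From_Literal (Text : Wide_Wide_String) return User_Id;
    private
       type User_Id is record
          Name : Ada.Strings.Unbounded.Unbounded_String;
       end record;
    end User_Ids;

    with Ada.Characters.Conversions;

    package body User_Ids is
       function From_Literal (Text : Wide_Wide_String) return User_Id is
       begin
          return (Name => Ada.Strings.Unbounded.To_Unbounded_String
                            (Ada.Characters.Conversions.To_String (Text)));
       end From_Literal;
    end User_Ids;

With that in place, a declaration like `Admin : User_Ids.User_Id := "alice";` is legal, and the literal is routed through From_Literal.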


Going forward there is only one GNAT, and it is under the Apache 2 license; there was also a FOSDEM talk about it this year.

Just clarifying for other folks.


If the price is not on the website, you can't afford it.

Predicates are great, but most of the good stuff, including privacy, proper subtyping, and abstract data types, came with Ada 83. Rust can't hold a candle even to Ada 83 imo.
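
For anyone who hasn't seen it, a small spec-only sketch of what that Ada 83 era already bought you (names invented for illustration; only the predicate at the end needs Ada 2012):

    package Stacks is
       subtype Element is Integer range 0 .. 255;  --  Ada 83: range subtype

       --  Ada 83: a private type is a proper abstract data type;
       --  clients can only use the operations below, never the
       --  representation in the private part.
       type Stack is limited private;

       procedure Push (S : in out Stack; E : Element);
       procedure Pop  (S : in out Stack; E : out Element);

       --  Ada 2012: subtype predicates go beyond plain ranges.
       subtype Even is Integer
         with Dynamic_Predicate => Even mod 2 = 0;
    private
       type Contents is array (1 .. 100) of Element;
       type Stack is record
          Data : Contents;
          Top  : Natural range 0 .. 100 := 0;
       end record;
    end Stacks;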

Ada has practically no mindshare; Rust does. Just like, say, Scala: things can be technically 'good', but without adoption they aren't going to get the attention and visibility to compete with everything that does.

Enough mindshare to keep 7 compiler vendors in business.

How many would pay for a Rust compiler?


We're talking about mindshare, not commercial incentives. There are plenty of things sold to small groups of buyers with no significant mindshare. Mindshare does not equal commercial viability or sales numbers.

As for who would pay for a Rust compiler: probably not a whole lot of people, unless that compiler has something special to offer that the normal compiler does not.

The same goes for a C compiler: there are Intel compilers that are supposed to be 'better', but as it turns out, in most cases not 'better' enough for people to pay for them. But even then, I would not be surprised if more people pay for ICC than for Ada (though I would also not be surprised if Ada compiler sales rack up more money than ICC sales).


I agree there's probably far more commercial support than Rust, but:

- Ada is only a small part of what Green Hills offers.

- PTC mostly isn't even a development tools company; their main products are PLM and CAD (Windchill and Creo).

- I think RR is a one-man company.


PTC still maintain PERC (for some definition of 'maintain'). At least I hope so.

It works for plenty of other products, but yeah, I do concede it could be better.

There are at least two more supporting Ada 2012, if you go into their docs.


AdaCore does support Ada 95 and this was 7 years ago.

Your first point cancels out your fourth.

I don't know shit about law, but I'm assuming it's not allowed for someone to build their own compiler and make their own business out of the Ada programming language?

The specification is open, if you want to make your own implementation you can.

This sounds like a fun business to start

Why would you assume that?

Because I know nothing about the way licensing around programming languages works, lol.

I don't think there's anything preventing anyone from selling an Ada compiler and tooling. It's an ISO standard and you can read its spec (the holy Ada Reference Manual) and rationale for free.

I'd say the biggest hurdle is that Ada is a very complex language to implement. I've been writing Ada for more than 20 years, and writing static analysis tools with libadalang (an amazing toolbox...) for 5 years now, and I still find it hard to grasp the whole combinatorics of features.

Easy to use, but very hard to implement.


How much harder would it be to implement than say, JS? 3 orders of magnitude more difficult?

I wonder if there's a deno/bun style business in here. Would be cool to see Ada compete with Go/nodejs/java etc. Or do you think that would be a terrible fit for the language?


> How much harder would it be to implement than say, JS? 3 orders of magnitude more difficult?

We can look at the size of the GCC frontends for various languages to get some idea of the complexity compared to other languages:

    gcc/ada/*.ad*     : 712194 lines (excludes C files and gcc-interface folder)
    gcc/ada/libgnat/  : 438117 lines (standard library)
    gcc/ada/libgnarl/ :  61705 lines (runtime)
    gcc/c/            :  63816 lines
    gcc/cp/           : 279732 lines
    gcc/c-family/     :  53110 lines
    gcc/d/            : 212457 lines (185253 are DMD)
This probably isn't the best metric, and I may be missing some files as I'm not very familiar with GCC internals, but if I'm getting this right then we can see that just the Ada frontend has more lines than the C, C++, and D frontends combined.

> I wonder if there's a deno/bun style business in here.

You could do this with a library, without anything special in Ada. You don't need to dig deep into the internals like you do with JS.


I think Ada is seen as some esteemed language that only the military and the space industry would have the expertise to use. I had that opinion for maybe a decade, but it was very wrong. The other issue was that it is so powerful that compilers had bugs, and by the time an affordable open-source compiler came along, I guess C++ was gaining libraries, and the militaries don't release libraries.

The ironic thing is that Ada was designed to be cost-effective over a project's lifetime. C costs you a lot further down the line.


One of the reasons for the whole safety hype going on is that companies have finally started mapping developer salaries, devops, and infra costs to fixing all those CVEs.

> I think Ada is seen as some esteemed language that only the military and space would have the expertise to use.

I know Boeing is technically a space company these days, but they weren't when they created the 777. Several of its systems are written in Ada, including the cockpit interfaces, electrical generator systems, and primary flight computers.


Because Ada is maintained by big actors who think Ada is for "mission critical" stuff and not our lowly web apps.

Problem is, the web is the new BASIC: many devs will start there, and they will see Rust first. And where's that Ada game engine?

Ada can definitely claim tons of successes and very powerful constructs, but mindshare is clearly not one of its selling points.


With respect, Rust isn't a great choice for web either. At least, not for web _sites_. I would still argue it's not very good for 90% of backend API development either, but that depends on what's being done in the backend.

C is the new BASIC.

To me it seems that Python is the current BASIC.

Rust is a bad choice for web and a meh/bad one for game development. It's certainly not good enough at either to have much of a technical edge over Ada.

There are worse choices in the compiled web language niche. Most of the competitors don't qualify, because people don't want to do formal verification for their side projects.

I think with the recent flurry of languages focusing on safety, Ada has been making a comeback (see: ada-lang.io, getada.dev, alire, etc).

This presentation in particular was in the Ada dev room at FOSDEM (I gave a presentation in that room as well), and there were over 100 people there; we actually ran out of seats in the auditorium.


FOSDEM was packed overall. I can't think of a subject track I went to that had spare seats. I noped out of more than one because I just couldn't get in.

I gave a presentation in the Quantum Computing track and I couldn't sit down to attend the other presentations :/

Rule number one of FOSDEM: track what you want to see, sit close to the door, and always miss the last five minutes of a talk.

There were a few where you could get a spot when arriving late, but, indeed, it was pretty packed overall.

There are many other factors that influence language popularity besides technical quality, like:

  - marketing;
  - big companies using it;
  - familiarity;
  - history of the creators;
  - history of the influencing languages;
  - timing;
  - luck;
  - regional usage;
  - etc.
Despite some programmers seeing themselves as fully rational, making cold decisions, we're like everyone else.

> - marketing; - big companies using it;

These are the deciding factors.

If you look at which newish languages have gotten popular over the last few years, it was Rust, Kotlin, Swift, Go and Typescript.

Building a language and ecosystem around it takes a huge amount of resources, and often tedious work that doesn't happen if people aren't paid for it.

The street cred of "hey, large company X is using it, it must be good" is also very important.

(of course Swift and Kotlin are somewhat distinct as the platform languages for iOS and Android)


> The street cred of "hey, large company X is using it, it must be good" is also very important.

Yes, and also, "large company X is spending lots of money on it, so they aren't just going to abandon it once it's no longer the newest, coolest thing."


if "Space industry" isn't big I don't know what is

"first mover advantage" of those who later won?

Ada was way ahead of its time, so compilers were slow and resource (RAM) hungry, and worse: they were inaccessible to hobbyists and learners.

In contrast, Pascal/Turbo Pascal was ubiquitous, and then Turbo C++. You easily knew someone who could organize a copy and "keys" for it.


> It was literally first programming language taught at university.

Do you mean "It was literally the first programming language I was taught at university"? Because the first language ever taught was more likely to be one of the autocoder/assembly variants, or FORTRAN.


"First language" is generally taken as a term of art in this sector. It is the first language we're teaching students who we expect to learn other languages as well, so emphasis on "first" here unlike for say a "taster" course in another discipline where you're learning only say, Python, with no expectation you will ever learn other languages.

Edited to expand: for a First Language you can choose a language that you don't expect your students will actually end up using, for pedagogic reasons, just as we might spend time proving fundamental things in mathematics even though those are already proved and you'll never do that "in real life" after studying. A language which has good properties for learning about programming is not necessarily also the right language to actually write yet another web site, database front end, AI chat bot, or video streaming service.

I've spent considerable time thinking about this, and I believe Oxford and Cambridge were right to choose an ML as First Language. The MLs have desirable properties for teaching, even if you expect your students to end up writing Python or C++ after they graduate. I am entirely certain that my exposure to an ML at university made it much easier to pick up Rust than it was for people whose nearest previous languages were C and C++.


It's indeed hard to imagine that the first programming language taught at any university would be Ada. That would at least mean that the university started teaching programming and computer science very late. There's been a number of main programming languages taught over the years. Back in the late seventies/early eighties, some universities in my region used Simula in their programming courses, for example.

Agreed. My mum learned basic Fortran at university in the early 1970s, before Ada existed. (It was done on punch cards, and they had to wait a day or so to find out if their programs worked!)

Unix came with a C compiler and its own source code, which could be easily bootstrapped and ported to other architectures. You cannot beat that.

At my engineering school they taught Ada. I never had the opportunity to learn Ada because they switched to Java a decade before I started.

My university had a model train set hooked up to some ancient 386 machines (and we're talking late 2010s here), and it was used for a real-time programming course which was taught in Ada.

Unfortunately, the lecturer that ran the unit retired the year I started my degree, and by the time I had the prereqs required to do the course, the faculty had run it once without the original lecturer. It was apparently a disaster, so they canned the unit until it could be rewritten from scratch, sans train set ... and in Java.

I still think about missing out on programming a train set, years later.


> I still think about missing out on programming a train set, years later.

Sounds like a nice hobby project to indulge in during your free time! It's easier than it has ever been, and it's a nice rabbit hole to dive into.

They even have open-source DCC decoders nowadays!

https://www.opendcc.de/elektronik/opendecoder/opendecoder_e....

https://ilabs.se/news/introducing-our-new-open-source-dcc-de...


The University of Waterloo has a similar course, CS452: Real-time Programming.

It’s not quite the same as having physical access to the train set, but a student eventually wrote a simulator for the Märklin train set [0]. Another student wrote an emulator for the TS-7200 used for the class [1] if you don’t want to test your kernel in QEMU.

[0] https://github.com/Martin1994/MarklinSim

[1] https://github.com/daniel5151/ts7200


The university where I work used to teach Hard Real Time to electronics students with toy elevators. When (not if) you screw up, your elevator smashes into the roof of the elevator tower and self-destructs the toy, reminding you that this is serious business, and that if you fuck up that badly in real life you'll be lucky if the victims live to sue you...

Sounds like finally satisfying that desire could be a fun Christmas project.

Many factors, but one is interesting: the existence of a public test suite, ACATS: http://www.ada-auth.org/acats.html

It is a good thing to have a test suite for a language, but from a business perspective it raises the barrier to entry. Why? 1/ you start your new compiler with 10000 bugs (the failing tests in the public test suite); 2/ you get no clients, since clients want a compiler that passes the public test suite; 3/ no clients means no money, and this until you fix the 10000 bugs.

With programming languages that do not have a credible public test suite, you can get away with shipping the compiler ASAP and getting money from customers, then concentrate on fixing customer-impacting bugs.

All in all a blessing and a curse, life is full of compromises :)


besides all the reasons others listed:

- "the common dev" isn't familiar with it

as well as the chicken/egg (half) problem of

- if most users are "specialists", then you don't need compilers/tooling/docs with very good UX for non-specialists; but until you have very neat UX even for non-specialists, you will not get a lot of traction with them, and in turn it might seem pointless to fix the UX

I say half problem because in my experience fixing it is also beneficial for "specialists".

This problem also overlaps with perceived (and sometimes actual) gate keeping and/or elitism.

Basically the same reasons why e.g. Haskell is much less widely used than it could be (and why most usage is in companies mostly staffed with people from universities which have it as a required (or strongly recommended) course).

Just with more issues (listed in some of the other responses) added on top of it.


> I say half problem because in my experience fixing it is also beneficial for "specialists".

This is critical. Once I realized that even experts didn't necessarily have homogeneous advanced knowledge of the entire language, it became easier to justify to myself spending time on improving the diagnostics of uncommon errors for clarity and learnability. An expert requires less explanation, but they still require one.

And I also suspect that spending time on better DX at the right time of the adoption curve has an outsized impact on it.

The bad part is that it is a hard, slow, endless and somewhat thankless job (you're going against the grain of the compiler design half the time, adding logic that needs to be maintained, and for every case you handle ten more pop up elsewhere that need to be handled in a different way).


It hurts my eyes (more seriously, the Pascal family looks lost vs the C one; it's a popularity thing), and from my understanding, it took too long to become truly usable in the FOSS world (a bit like CL).

Better to hurt your eyes (which is nonsense, unless a book hurts your eyes) than your brain. The syntax is optimised for the common operation: reading.

> It hurts my eyes

This is a common complaint I read, and I have never understood it.

My eyes are not strong: I wear spectacles of about -6.5 dioptres.

If text size is fairly small, it is REALLY difficult to distinguish

(...)

from

{...}

... on large screensful of text. And

[...]

... is not much more visible. Making that significant is terse, yes, but Sendmail is terse. Terseness is not an unambiguous virtue.

Secondly, indentation: I learned 2 or 3 languages in the 1980s before I learned C, and it's a given that you indent opening and closing control structures to match. That's how it works: you structure blocks of code to show the hierarchies.

But most curly-bracket language coders have their own weird schemes where the opening { is in a totally different column from the closing }. And they fight like alley cats about it.

https://en.wikipedia.org/wiki/Indentation_style

ONLY the GNU style listed there is sane.

I mean at least Allman and Horstmann styles are consistent.

It is much MUCH easier to pick out

    BEGIN
       ... stuff ...
    END

... than it is to try to pick out { and } in some random blasted column.

And yet, all the fans squee at curly brackets. As smdiehl wisely said:

« C syntax is magical programmer catnip. You sprinkle it on anything and it suddenly becomes "practical" and "readable". »

https://x.com/smdiehl/status/855827759872045056

I never got it. It's obfuscatory. It is famed for being write-only. There's a competition to write the least-readable C!

C style hurts my eyes.

Pascal and Ada are vastly more readable.
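
One concrete advantage beyond looks, sketched here with invented names: Ada's closing markers are named, so the compiler checks that each end matches its opening construct instead of leaving it to your eyes:

    procedure Scan is
    begin
       Outer : for I in 1 .. 10 loop
          for J in 1 .. 10 loop
             if I = J then
                exit Outer;   --  leaves the *named* outer loop
             end if;          --  "end if", never an anonymous brace
          end loop;
       end loop Outer;        --  the loop name is required and checked
    end Scan;                 --  optional here, but checked if present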


If curly braces are not visible enough, you can also use <% %>. Still less of an eyesore than BEGIN and END.

I completely disagree. I could not disagree more.

A pair of readable words, words of different lengths so they are easily distinguishable without reading them character-by-character, are far more legible than a pair of line-noise characters -- especially if those characters on their own have wildly different meanings in the language, and -- to make matters even worse -- are also extremely close to the characters for code comments!

Look, to spell this out:

You have your own opinion and that is fine. You are perfectly entitled to it.

But you are making out that your preference is some global truth, and I think you need to realise that what is easier or clearer for you is in fact less clear and much LESS readable for other people.

Those words were chosen for good reasons, and they have persisted through whole generations of programming languages for something like half a century.

The C designers chose things shorter and easier to type and that is their privilege. I personally, and thousands of others, dislike their choice and prefer Wirth's.

Nobody is absolutely objectively right or wrong here.

The point is that there are good reasons for both choices, and nobody -- including me -- gets to go and say one is better and clearer and the other is worse.

What you like, I hate. What I like, you hate. But we both have our reasons. I think mine are good reasons. Presumably you think yours are, although you haven't explained them. You just assert them as obvious global truths. They are not.


What language is that?

It's just C/C++ syntax (digraphs).

Thanks. I was aware of the trigraphs but the digraphs had passed me by.

Indentation is a readability thing. Curly braces are about scope.

Well obviously!

That is not terribly germane here, IMHO.



