The vendor list says a lot about the issues with Ada's adoption:
- From my quick scan, only AdaCore supports Ada 2012 (where a lot of the really good stuff is implemented); the rest are stuck on Ada 95.
- None of the vendors seem to list transparent pricing.
If you are only selling based on volume and through a sales rep, then right off the bat you are excluding most startups, SMEs, and bootstrap-curious folks in general.
AdaCore technically retired the Community version a few years ago, so the most up-to-date free version would be GNAT from FSF (which is what Alire brings in automatically with `alr toolchains --select`).
Predicates are great, but most of the good stuff, including privacy, proper subtyping, and abstract data types, came with Ada 83. Rust can't hold a candle even to Ada 83 imo.
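To give a rough idea in code, here's a minimal sketch (made-up names, not from any real codebase): range-constrained subtypes and private types were already there in Ada 83, while subtype predicates only arrived with Ada 2012.

```ada
package Accounts is

   --  Ada 83: proper subtyping through range constraints.
   subtype Percent is Integer range 0 .. 100;

   --  Ada 2012: arbitrary boolean constraints via subtype predicates.
   subtype Even_Percent is Percent
     with Dynamic_Predicate => Even_Percent mod 2 = 0;

   --  Ada 83: privacy / abstract data types; clients see the operations,
   --  not the representation.
   type Account is private;
   function Balance (A : Account) return Natural;

private

   type Account is record
      Value : Natural := 0;
   end record;

   --  Completed as an Ada 2012 expression function, so no body is needed.
   function Balance (A : Account) return Natural is (A.Value);

end Accounts;
```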
Ada has practically no mindshare; Rust does. Just like, say, Scala: something can be technically 'good', but without adoption it isn't going to get the attention and visibility to compete with everything that does.
We're talking about mindshare, not commercial incentives. There are plenty of things sold to small groups of buyers with no significant mindshare. Mindshare does not equal commercial viability or sales numbers.
As for who would pay for a Rust compiler: probably not a whole lot of people, unless that compiler has something special to offer that the standard compiler does not.
The same goes for a C compiler: there are Intel compilers that are supposed to be 'better', but as it turns out, in most cases not 'better' enough for people to pay for them. Even so, I would not be surprised if more people pay for the ICC than for Ada compilers (but I would also not be surprised if Ada compiler sales bring in more money than ICC sales).
I don't know shit about law, but I'm assuming it's not allowed for someone to build their own compiler and make their own business out of the Ada programming language?
I don't think there's anything preventing anyone from selling an Ada compiler or tooling. It's an ISO standard, and you can read its spec (the holy Ada Reference Manual) and rationale for free.
I'd say the biggest hurdle is that Ada is a very complex language to implement. I've been writing Ada for more than 20 years, and writing static analysis tools for 5 years with libadalang (an amazing toolbox...) now, and I still find it hard to grasp the whole combinatorics of features.
How much harder would it be to implement than, say, JS? 3 orders of magnitude more difficult?
I wonder if there's a deno/bun style business in here. Would be cool to see Ada compete with Go/nodejs/java etc. Or do you think that would be a terrible fit for the language?
This probably isn't the best metric, and I may be missing some files as I'm not very familiar with GCC internals, but if I'm getting this right then we can see that just the Ada frontend has more lines than the C, C++, and D frontends combined.
> I wonder if there's a deno/bun style business in here.
You could do this with a library, without anything special in Ada. You don't need to dig deep into the internals like you do with JS.
I think Ada is seen as some esteemed language that only the military and space industry would have the expertise to use. I had that opinion for maybe a decade, but it was very wrong. The other issue was that it is so powerful that compilers had bugs, and by the time an affordable open-source compiler came along, I guess C++ was gaining libraries, and the militaries don't release libraries.
The ironic thing is that Ada was designed to be cost-effective over a project's lifetime. C costs you a lot further down the line.
One of the reasons for the whole safety hype going on is that companies have finally started mapping developer salaries, devops, and infra costs to fixing all those CVEs.
> I think Ada is seen as some esteemed language that only the military and space would have the expertise to use.
I know Boeing is technically a space company these days, but they weren't when they created the 777. Several of its systems are written in Ada, including the cockpit interfaces, electrical generator systems, and primary flight computers.
With respect, Rust isn't a great choice for web either. At least, not for web _sites_. I would still argue it's not very good for 90% of backend API development either, but that depends on what's being done in the backend.
Rust is a bad choice for web and a meh/bad one for game development. It's certainly not good enough at either to have much of a technical edge over Ada.
There are worse choices in the compiled web language niche. Most of the competitors don't qualify, because people don't want to do formal verification for their side projects.
I think with the recent flurry of languages focusing on safety, Ada has been making a comeback (see: ada-lang.io, getada.dev, alire, etc).
This presentation in particular was in the Ada dev room at FOSDEM (I gave a presentation in that room as well), and there were over 100 people there; we actually ran out of seats in the auditorium.
FOSDEM was packed overall. I can't think of a subject track I went to that had spare seats. I noped out of more than one because I just couldn't get in.
There are many other factors that influence language popularity besides technical quality, like:
- marketing;
- big companies using it;
- familiarity;
- history of the creators;
- history of the influencing languages;
- timing;
- luck;
- regional usage;
- etc.
Despite some programmers seeing themselves as fully rational, making cold decisions, we're just like everyone else.
> The street cred of "hey, large company X is using it, it must be good" is also very important.
Yes, and also, "large company X is spending lots of money on it, so they aren't just going to abandon it once it's no longer the newest, coolest thing."
> It was literally first programming language taught at university.
Do you mean "It was literally first programming language I was taught at university"? Because the first language ever taught was more likely to be one of the autocoder/assembly variants, or FORTRAN.
"First language" is generally taken as a term of art in this sector. It is the first language we're teaching students who we expect to learn other languages as well, so emphasis on "first" here unlike for say a "taster" course in another discipline where you're learning only say, Python, with no expectation you will ever learn other languages.
Edited to expand: For a First Language you can choose a language that you don't expect your students will actually end up using, for pedagogic reasons, just as we might spend time proving fundamental things in Mathematics even though those are already proved and you'll never do that "in real life" after studying. A language which has good properties for learning about programming is not necessarily also the right language to actually write yet another web site, database front end, AI chat bot, and video streaming service.
I've spent considerable time thinking about this and I believe Oxford and Cambridge were right to choose an ML as First Language. The MLs have desirable properties for teaching, even if you expect your students to end up writing Python or C++ after they graduate. I am entirely certain that my exposure to an ML at university made it much easier for me to pick up Rust than it was for people whose nearest previous languages were C and C++.
It's indeed hard to imagine that the first programming language taught at any university would be Ada. That would at least mean that the university started teaching programming and computer science very late.
There have been a number of main programming languages taught over the years. Back in the late seventies/early eighties, some universities in my region used Simula in their programming courses, for example.
Agreed. My mum learned basic Fortran at university in the early 1970s, before Ada existed. (It was done on punch cards, and they had to wait a day or so to find out if their programs worked!)
My university had a model train set hooked up to some ancient 386 machines (and we're talking late 2010s here), and it was used for a real-time programming course taught in Ada.
Unfortunately, the lecturer who ran the unit retired the year I started my degree, and by the time I had the prereqs required to take the course, the faculty had run it once without the original lecturer. That was apparently a disaster, so they canned the unit until it could be rewritten from scratch, sans train set ... and in Java.
Years later, I still think about missing out on programming a train set.
The University of Waterloo has a similar course, CS452: Real-time Programming.
It’s not quite the same as having physical access to the train set, but a student eventually wrote a simulator for the Märklin train set [0]. Another student wrote an emulator for the TS-7200 used for the class [1] if you don’t want to test your kernel in QEMU.
The university where I work used to teach Hard Real Time to electronics students with toy elevators. When (not if) you screw up your elevator smashes into the roof of the elevator tower and self-destructs the toy, reminding you that this is serious business and if you fuck up that badly in real life you'll be lucky if the victims live to sue you...
Having a public test suite for a language is a good thing, but from a business perspective it raises the barrier to entry. Why?
1/ You start your new compiler with 10,000 bugs (the failing tests in the public test suite).
2/ You get no clients, since clients want a compiler that passes the public test suite.
3/ No clients means no money, and that lasts until you fix the 10,000 bugs.
With programming languages that do not have a credible public test suite, you can get away with shipping the compiler ASAP and getting money from customers, then concentrate on fixing customer-impacting bugs.
All in all a blessing and a curse, life is full of compromises :)
- If most users are "specialists", then you don't need a compiler/tooling/docs with very good UX for non-specialists; but until you have very neat UX even for non-specialists, you will not get a lot of traction with them, and in turn it might seem pointless to fix the UX.
I say half problem because in my experience fixing it is also beneficial for "specialists".
This problem also overlaps with perceived (and sometimes actual) gate keeping and/or elitism.
Basically the same reasons why, e.g., Haskell is much less widely used than it could be (and why most of its usage is in companies filled mostly with people from universities which have it as a required, or strongly recommended, course).
Just with more issues (listed in some of the other responses) added on top of it.
> I say half problem because in my experience fixing it is also beneficial for "specialists".
This is critical. Once I realized that even experts didn't necessarily have homogeneous advanced knowledge of the entire language, it became easier to justify to myself spending time on improving the diagnostics of uncommon errors for clarity and learnability. An expert requires less explanation, but they still require one.
And I also suspect that spending time on better DX at the right time of the adoption curve has an outsized impact on it.
The bad part is that it is a hard, slow, endless and somewhat thankless job (you're going against the grain of the compiler design half the time, adding logic that needs to be maintained, and for every case you handle ten more pop up elsewhere that need to be handled in a different way).
It hurts my eyes (more seriously, the Pascal-family look lost out to the C one; it's a popularity thing), and from my understanding it took too long to become truly usable in the FOSS world (a bit like CL).
This is a common complaint I read, and I have never understood it.
My eyes are not strong: I wear spectacles of about -6.5 dioptres.
If text size is fairly small, it is REALLY difficult to distinguish
(...)
from
{...}
... on large screensful of text. And
[...]
... is not much more visible. Making that significant is terse, yes, but Sendmail is terse. Terseness is not an unambiguous virtue.
Secondly, indentation: I learned 2 or 3 languages in the 1980s before I learned C, and it's a given that you indent opening and closing control structures to match. That's how it works: you structure blocks of code to show the hierarchies.
But most curly-bracket language coders have their own weird schemes where the opening { is in a totally different column from the closing }. And they fight like alley cats about it.
A pair of readable words, words of different lengths so they are easily distinguishable without reading them character-by-character, is far more legible than a pair of line-noise characters -- especially if those characters on their own have wildly different meanings in the language, and -- to make matters even worse -- are also extremely close to the characters for code comments!
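To make that concrete, a tiny made-up Ada fragment (purely illustrative): each closing delimiter is a word that names the construct it closes, so you don't have to match brace columns by eye.

```ada
procedure Count_Evens is
   Count : Natural := 0;
begin
   for I in 1 .. 10 loop
      if I mod 2 = 0 then
         Count := Count + 1;
      end if;    --  this closer can only end an "if"
   end loop;     --  and this one can only end a "loop"
end Count_Evens;
```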
Look, to spell this out:
You have your own opinion and that is fine. You are perfectly entitled to it.
But you are making out that your preference is some global truth, and I think you need to realise that what is easier or clearer for you is in fact less clear and much LESS readable for other people.
Those words were chosen for good reasons and have persisted through whole generations of programming languages for something like half a century; there are good reasons for that persistence.
The C designers chose things shorter and easier to type and that is their privilege. I personally, and thousands of others, dislike their choice and prefer Wirth's.
Nobody is absolutely objectively right or wrong here.
The point is that there are good reasons for both choices, and nobody -- including me -- gets to go and say one is better and clearer and the other is worse.
What you like, I hate. What I like, you hate. But we both have our reasons. I think mine are good reasons. Presumably you think yours are, although you haven't explained them. You just assert them as obvious global truths. They are not.