Clearing up Myths about Ada (pyjarrett.github.io)
138 points by jayp1418 on Sept 10, 2021 | 115 comments




I’m kind of dismayed that none of these vendors seem to have a form with a buy button. Leads me to expect a lot of licensing problems.


But when you're coming from an open source perspective, or just expect tools to be free, like they mostly are for other languages (and should be, imo), then those other vendors don't matter, since you cannot afford them anyway, and AdaCore becomes the only vendor. You can't afford GNAT Pro either, btw.


Not everything has to be free beer.


Yeah, only if you want an appreciable number of people to use it.


There will never be an appreciable number of people writing the kind of software that justifies the use of Ada. If your project needs it, the cost isn't a concern.


Ada is a general-purpose programming language with an emphasis on safety and contract-oriented programming. There's a lot of software written in e.g. C++ today where Ada would be perfectly justified, if only it were more widespread and accessible.

Here's one unusual example, a BSD package manager written in Ada: https://github.com/jrmarino/synth


You definitely could write a lot more in Ada. The world would be a lot better off for it. Unfortunately, with the way things are going right now, I don't think it will happen. It takes a lot of active convincing to keep people from writing absolutely everything in JavaScript, so I don't think we'll be able to get to a point where Ada's ecosystem will be built out enough for it to be a common general-purpose language.


There are enough to keep 7 companies alive selling their compilers.


Like Windows, Mac, Oracle, z/OS, Photoshop or IDA Pro?


Green Hills Software is such an excellent producer of software, near perfection in every product... and no, the homepage is not a buyable product of theirs ;)


If there are 7 companies selling compilers, plus FSF GNAT, how can AdaCore be the only vendor?


Clearing up myths about Ada. You overlooked that context.


> I’m writing these in the truthful form, rather than stating the myth.

And it seems you overlooked the article. Anyone coming to the comments after reading the article can be forgiven for being confused by the reversal that happened here.


That is the myth, duh.


I also misread it at first, tbh; putting the statement in quotes would have made it more obvious.


Thinking before writing would have made it more obvious too.


Two other beliefs about Ada:

1. It was designed by committee.

According to Wikipedia, it was designed by a small team at Honeywell Bull in France. A different connotation from the label 'committee'. What is the more accurate description? It may seem unimportant, but most programming languages are designed by one or two individuals and the word 'committee' is perceived negatively by many developers. (Aside: Julia was designed by multiple authors, but no-one says the language was designed by committee.)

2. Ada is a very large language. Not suitable for small projects.

This belief about the size of the language is true, but the question is: does it matter? I prefer small languages over larger ones, but I would be interested in the opinions of Ada developers. Is the language size irrelevant? And is Ada suitable for projects of any size?


While it is a large language, by now it is probably still smaller than C++20, C# 10, Common Lisp, Python 3.7 or GHC Haskell. :)


Thank you. That's busted a wrongly-held belief I've had about Ada: that its size dwarfs other languages. Good to be corrected!


The language was naturally huge for 1983 hardware, and when compared to what 8 and 16 bit home computers were capable of.

Except we are now in 2021, and while Ada 2012 has naturally gained features since 1983, there are other ecosystems that have grown beyond what is available in Ada, like my list above.


It's definitely significantly smaller than C++.


To me the size of a language and the size of a project are totally different metrics. One could make the case that a large language provides developers with a more complete toolset, suitable even for smaller projects. My own experience with Ada may be different from that of others; however, I've really seldom needed to use any non-core libraries to accomplish what I've needed to do. If you factor in the STL, I'm sure Ada would be far smaller than modern C++. Rust is no doubt smaller than Ada, though; however, I've never liked the idea of needing to use third-party libraries for basic functionality that the language itself could provide.


You can absolutely use Ada as merely Pascal with a more pedantic type system. So very reasonably sized.

Importantly though, Ada is much more coherent (and less TIMTOWTDI) than, say, C++, so as you start bringing more of the language into your codebase, the "everyone uses a different 10% of the language" problem of C++ isn't much of a problem for Ada.


It was designed by Jean Ichbiah and his team, but he was the main designer.

You don't have to use the entirety of the language; you can use pragma Restrictions to get the compiler to ensure you don't use certain features. Some features don't even work on really tiny devices. Even tasking can be done on small systems, like a Z80, for example.
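For example, here is a hedged sketch of a few standard Restrictions pragmas (these are configuration pragmas; with GNAT they would typically go in a gnat.adc file, and the exact set supported varies by compiler and runtime):

    pragma Restrictions (No_Exceptions);                 --  no exception handling at all
    pragma Restrictions (No_Task_Hierarchy);             --  tasks only at the library level
    pragma Restrictions (No_Implicit_Heap_Allocations);  --  compiler may not allocate behind your back

The compiler then rejects any unit that violates these, so the subset is enforced rather than being just a coding convention.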


Regarding 2: you can write simple procedural programs in Ada; no need to use every last feature in a project.
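A minimal sketch of what that looks like (Sum_Demo is an invented name): a complete program that is just one procedure, with no packages, generics, or tasking:

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Sum_Demo is
       Total : Integer := 0;
    begin
       for I in 1 .. 10 loop
          Total := Total + I;
       end loop;
       Put_Line ("Sum:" & Integer'Image (Total));
    end Sum_Demo;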


Two more:

Ada is a name, not an abbreviation. It should be written "Ada" and not "ADA".

---

Keywords in Ada should be written in lower case letters:

It is a common misconception that Ada enforces or encourages upper case, but here is the official Ada 83 Language Reference Manual on the subject: http://archive.adaic.com/standards/83lrm/html/lrm-02-09.html...

> Reserved words differing only in the use of corresponding upper and lower case letters are considered as the same (see 2.3).

The manual itself doesn't even use upper case:

> For readability of this manual, the reserved words appear in lower case boldface.

So even the Ada standard recognizes that upper case is less readable and undesired.


I have seen a claim that Ada now has or is getting something akin to C++'s destructors or Rust's Drop trait. What would Ada's version be called, and what would it look like?

To me, the destructor is the single most distinctive and powerful feature of C++, adopting it is what made Rust viable, and lacking it makes Zig overwhelmingly less interesting than it might have been. If Ada got destructors, that would promote the language, in my view, from dead to potentially viable.


Ada's version of this is called controlled types. A type can have a finalizer method that gets called when a value needs to be destroyed.
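A minimal sketch of a controlled type (names invented; Initialize and Finalize are the standard hooks from Ada.Finalization):

    with Ada.Finalization;

    package Handles is
       type Handle is new Ada.Finalization.Controlled with record
          Id : Natural := 0;
       end record;

       overriding procedure Initialize (H : in out Handle);
       overriding procedure Finalize   (H : in out Handle);
    end Handles;

    with Ada.Text_IO;

    package body Handles is
       overriding procedure Initialize (H : in out Handle) is
       begin
          Ada.Text_IO.Put_Line ("acquire");  --  acquire the resource here
       end Initialize;

       overriding procedure Finalize (H : in out Handle) is
       begin
          Ada.Text_IO.Put_Line ("release");  --  release the resource here
       end Finalize;
    end Handles;

Declaring "H : Handles.Handle;" in any scope then prints "acquire" when the object is created and "release" when its scope exits, whether the exit is normal or via an exception.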


Is it possible to precisely control when it gets called?

If not, how predictable is it?


Yes, you have complete control over this. In short, it's a hybrid between manual and automatic memory management. Memory allocation is a complex subject in Ada, and I'm not familiar enough with it to explain all the nuances, but here [1] is a good discussion.

[1] https://stackoverflow.com/questions/67131931/does-ada-deallo...


Zig's thread on the topic is interesting reading - it has gone from open to closed several times, and includes interesting examples from the stdlib where resources were not closed properly: https://github.com/ziglang/zig/issues/782


Doesn't the Ada.Finalization package (since Ada 95) give you this anyway?


No. To be useful it needs to be a core language feature.


Languages like Ada provide hooks for the runtime and compiler to collaborate, because safety is more relevant than performance at all costs.

Any variable declared as controlled type will have their "destructor" called just like C++, and that is what matters.

Likewise, RAII doesn't do anything if the object is heap-allocated and, due to a bug, never deleted or freed. And someone has to write the destructor code, just like someone has to write the controlled type.


> Languages like Ada provide hooks for the runtime and compiler to collaborate, because safety is more relevant than performance at all costs.

Does Ada come with optional garbage collection, at least? Memory leaks are a big safety concern, perhaps the biggest in practical code, and IIRC Ada still doesn't have GC "by default" for some unimaginable reason.


Ada 83 had one; however, production deployment scenarios never required it, so Ada 2005 dropped it, and nowadays, thanks to SPARK and controlled types, it is not required anyway.

Additionally, Ada manages many types directly, like strings and vectors, and provides storage pools.

I recommend the FOSDEM talk about Ada memory management.


The unimaginable reason is that one of Ada's core competencies is realtime embedded systems. Very often, these don't allocate or free memory ever.


Don’t a lot of those systems allocate and free slots in large arrays, which is almost as risky?


Ada arrays are actual types, they're bounds-checked, the language provides some niceties for indexing, and Ravenscar and SPARK are available. You're going to have fewer issues with Ada in that case than with just about any other language out there.
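A hedged sketch of that slot-pool pattern (all names invented): a fixed array indexed by a constrained type, so every access is range-checked:

    procedure Slots is
       type Slot_Index is range 1 .. 64;
       type Slot is record
          In_Use : Boolean := False;
          Value  : Integer := 0;
       end record;
       Pool : array (Slot_Index) of Slot;  --  fixed size, no heap
    begin
       Pool (Slot_Index'First).In_Use := True;  --  fine, checked
       --  Pool (65) would raise Constraint_Error at run time (GNAT already
       --  warns at compile time) instead of silently corrupting a
       --  neighbouring slot.
    end Slots;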


`Ada.Finalization` is a core language feature, it's a package defined as part of the standard. Like many Ada features, RAII (via Controlled Types) is something you can opt into, such as if you need more than just record initialization.


I am corrected.


It is a core language feature as of Ada 95. Ada, unlike most other languages, generates calls into the standard library to implement complex features; controlled types and tasks are two such features. And no, I don't mean generating simple calls to compiler intrinsics.


It is a core language feature. You don't understand what Ada packages are.


Core language feature and package are not necessarily the same thing. No reason to bump heads. Instead you could provide a link to the RM: http://www.ada-auth.org/standards/12rm/html/RM-7-6.html



What do you use destructors for?


Everything, really. They are the only bit of runtime automation ever invented besides garbage collection, but they manage any sort of resource, most trivially memory. Without them, everything needs to be programmed by hand, everywhere, fallibly. With them, libraries can do the work, invisibly and with absolute reliability.


Absolute reliability? Well,

    {
        raii_handle *my_type = new raii_handle(new type);
        ....
    } /// oops: the handle itself was heap-allocated, so its destructor never runs

Naturally it doesn't happen just like that; rather, it's buried in a 500-line function, in code that has already been through 10 contracting agencies, including some offshored ones.


Oh, so all the smart pointers I have written in Ada don't finalise when Finalize is called by the runtime?


The example is about C++.


He was complaining of no destructors in Ada, when there are.


While asserting the reliability of RAII in C++, which only works out if you don't have devs that do stupid stuff like the example above.

You might consider that example stupid and argue no one would ever write code like that.

To which I would answer you haven't reviewed enough offshored C++ projects.


Would maybe be nice if there were a static check in C++ to verify whether something was declared on the stack; then that could be used to eliminate some of the potential RAII mishaps.


Ignoring libraries does not demonstrate anything about their reliability.

  {
     auto p =
       std::make_unique<type>();
     ...
  } /// no oops.


  {
     auto p =
       new std::make_unique<type>;
     ...
  } /// oops again.
The fact being, the "reliability" is as good as the hundreds of developers that touched the code.

Truly reliable features don't depend on their skills to actually work 100% of the time, and doing something like newing a RAII type would be a compile error, but alas it isn't.


Anybody can write crap code in any language, Ada very much included. So, the example is meaningless BS.

What matters is whether it is easier to write crap that compiles than correct code, and whether the easiest sort of crap compiles. That is where there has been progress. If you never, ever need to write "new" anymore, why would you accidentally write it?

The BS example above does not, in fact, compile, because make_unique<> is not a type. So, it illustrates the progress I cite.


Yes, the example is wrong; I was just trying to make a point while copy-pasting on the phone.

One thing Ada was designed for, and C++ was not, is to prevent crappy developers from messing up.

Anyone that has reviewed C++ code from offshoring Asian companies is well aware of what I mean.


Your point was wrong. Your example illustrated that your point was wrong.

Bad code from dodgy, offshore outsourcing services may be in any language. Bad Ada code is no better than bad code in any other language. Bad Ada code could be worse if it often works by accident, where other code might have failed visibly and been rejected.


> Bad Ada code is no better than bad code in any other language.

Ada's strictness allows it to catch many more errors at compile time than, say, C.

> Bad Ada code could be worse if it often works by accident, where other code might have failed visibly and been rejected.

This is backward. Errors will be more easily detected in Ada than in, say, C. C is full of footguns and undefined behaviour. Vanilla Ada still isn't actually a safe language, unfortunately, but as a matter of degree, it's much safer than C.


What about Python's "with" statement and context managers? Or Common Lisp's "with-" macros? Or Java's try-with-resources? I'm sure many other programming languages have automatic exception-safe resource management constructs as well.


"With" must appear in client code, much like "finalize" clauses in Java and in similar GC languages.

That is the opposite of automation: everywhere you need cleanup, you have to repeat the incantation, and get it right, again, every time. The point of destructors is that they are wholly contained in the library. It takes no extra client code to run them, and they are exercised identically on every single use of the type, so are well tested. Identically the same code runs on a throw, so there is much less risk from usually poorly exercised failure cases.


Freeing memory, releasing locks, closing files (though destructors are less expressive than separate close methods because destructors can't return error codes, which is why I want linear types which can make calling a close() method mandatory), committing transactions (though some argue it should be an explicit function call)...


In .NET land there are Roslyn analysers that trigger a compiler error if you don't wrap an IDisposable type in a using statement.


Implementing RAII. It's the single most fundamental C++ idiom.


Indeed, and although already possible in the very first C++ compilers, it is still foreign enough that many include it as part of modern C++.


It is utterly foundational to modern C++. You could not leave it out and still make sense.


It is foundational to C++, period. Even in Turbo C++ 1.0 for MS-DOS, released in 1992 or thereabouts.

The fact that many still think it only came with C++11, or whatever "modern C++" is, is where the problem lies.


Even in Zortech C++'s 1987 release!


Thanks for the heads up.

Borland and Microsoft ruled the compiler party in the Iberian Peninsula during those days. :)


It is the foundation of C++. Even in pre-modern C++, destructors were the heart of it, even if we didn't use them as much as we do now with modern C++.


We have always relied on destructors fully as much as we do now. What is new is that we hardly ever need to write one, anymore, or even to declare one. That is a consequence of a more mature standard library.

I wonder now whether Ada auto-generates destructors for types with a member that defines one. And whether its standard library is such as to make a need to code them yourself vanishingly rare.


Releasing allocated resources automatically, for instance. There are many possible applications.


I work with Ada professionally and there's a lot to love about it. However, I have one big gripe that constantly gives me problems. It has nothing to do with the myths in the article, but it's rare that I get to talk about this language, so I'm taking my opportunity.

Ada's syntax makes differentiating variables/arrays from functions unnecessarily difficult. This is because parentheses are used for both function calls and array accesses. And function calls without parameters do not require parentheses at all.

Let's play a game. See if you can figure out which is a variable, an array, or a function.

    A := B;
    C := D;
    E := F(G);
    H := I(J);
A, B, C, E, H, and J are variables. F is an array. D, G, and I are functions.
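For illustration, here is a hedged sketch of declarations (as a package spec; all types invented, function bodies omitted) under which the four assignments above are legal:

    package Demo is
       A, B, C, E, H : Integer := 0;
       J : Integer := 1;
       F : array (1 .. 10) of Integer;            --  F is an array
       function D return Integer;                 --  so "C := D;" is a call
       function G return Integer;                 --  its result indexes F
       function I (X : Integer) return Integer;   --  so "H := I (J);" is a call
    end Demo;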

The only way to really tell what's what in many cases is to look up definitions. That would be fine... except there is very little competition among Ada IDEs. I use AdaCore's GNAT Programming Studio (GPS), but the interface is clunky and often very slow compared to modern JetBrains IDEs.

This syntax issue is a constant frustration and it's something which could have very easily been avoided.


> but it's rare that I get to talk about this language

comp.lang.ada, reddit.com/r/ada, irc, gitter, telegram, etc.

> A, B, C, E, H, and J are variables. F is an array. D, G, and I are functions.

That was by design: when the language was designed, IDEs like we have now didn't exist, so being able to change a variable into an array or a function (or vice versa) without touching call sites was valuable. There is another reason, too: mathematically, an array is a function.


Hence the strong need to suffix the variable name to denote what type the variable is.


> comp.lang.ada, reddit.com/r/ada, irc, gitter, telegram, etc.

Yeah, but I don't want to join a community just to make a single complaint about a 40-year-old language.

> That was by design: when the language was designed, IDEs like we have now didn't exist, so being able to change a variable into an array or a function (or vice versa) without touching call sites was valuable.

What a bizarre decision for a language built around such a strict and elaborate type system. I'd consider it being difficult to swap functions and variables to be a feature.


Why would you care, though? An array is effectively a function that maps indices to values, so why not reuse the syntax? And it's not like Ada is the only one that made that choice - FORTRAN, BASIC, PL/I are some other examples.

If anything, I'd say that the missed opportunity was to make arrays compatible with function access types.
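A hedged sketch of that interchangeability (invented names): clients write Squares (N) either way, so the representation can change without touching call sites:

    package Tables is
       function Squares (N : Positive) return Positive;
       --  Previously this might have been a precomputed table:
       --    Squares : constant array (1 .. 10) of Positive := (1, 4, 9, ...);
       --  Client code "X := Tables.Squares (3);" compiles unchanged either way.
    end Tables;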


"However, there’s a version of GNAT released by AdaCore called “GNAT Community Edition” which is similar to FSF GNAT but does not provide the runtime exception."

Ok, so if I download the GNAT Community Edition from AdaCore, how do I switch GPS to use the FSF GNAT toolchain? How about the static analysis tools? Windows and Linux?

Every time I look at Ada, I want to start doing a project with it. What a wonderful tool.


I added a note about how you can use the Alire 1.1 release candidate to install toolchains.

> Ok, so if I download the GNAT Community Edition from AdaCore, how do I switch GPS to use the FSF GNAT toolchain?

Toolchains are customizable within GNAT Studio. If you use `alr edit`, your Alire-configured toolchain should be selected; I would check to be sure. I don't work on Alire, but that worked last time I tried. (I've been using Visual Studio Code lately as my editor.)

> Windows and Linux?

I've built and run several projects on Windows and Linux without a problem.


For anyone wondering about how to use the FSF version of the GNAT compiler: There are packages available in Ubuntu package repositories, very likely others too. GNAT's GPRTools are also available in a separate package. When I first tried Ada I was up and running in no time.


There are also toolchains shipped as part of Alire since 1.1:

https://github.com/alire-project/alire/blob/release/1.1/doc/...

So you have a workflow similar to cargo in Rust:

* Install package manager
* Let package manager install toolchains
* ???
* Profit


I’ve wanted to try Ada and SPARK for a while, mainly to try formal verification and because a professor I listened to for a while kept banging on that most bugs don’t need to be written if you use Ada properly. (Shrug.) Who knows, I’d like to try it sometime.


C++ is probably going to get contract support, which, if it works out, will allow the same tools. I've read the latest paper; it isn't as powerful as SPARK, but it should get there.


Not trying to start a war, but do you genuinely believe that C++ even with solid contracts is a conducive environment to bug-free programming?

I am not very familiar with it (I have published 1 small library and 1 PR to a large, popular framework), but when learning the language after having written most other things, it felt like it had the most footguns, the most random rules you had to memorize, and the most instances of undefined behavior.

I've never published anything with Ada (mostly because almost nobody uses it, so I've not had the same practical usecases) but I do follow the language passively and have experimented with it.

I feel much more comfortable that I could write a correct Ada program than a correct C++ program. I think I could write C++ for 5 years and not feel sure my program was entirely correct, even with e.g. "cppcheck", "clang-static-analyzer", "ASAN/TSAN/UBSAN", etc.


A very qualified yes.

First, contracts are step one: they enable additional tools that can do static formal proofs using those contracts. It is impossible to prove anything about code that has a buffer overflow. In theory we could write those tools without contracts, but the runtime for anything more than hello world is unacceptable, so nobody has even written them; contracts are believed to help bring that under control.

Second, I assume only the latest modern C++ is used. I expect tools to give up as soon as they see constructs that are hard to analyze (new/delete is obvious, but they reserve the right to add more constraints).

Third, modern C++, if you stick to only that, makes it a lot easier to write safe code than C++98. Things are getting better, and where they are not, we can treat them like Rust treats unsafe: sometimes you need to do tricky things that tools can't work with, so isolate them to only those areas, have your best people write the code, test it hard, and pray it works.

Fourth, "conducive to bug-free development" is relative. I won't claim C++ will be the most conducive to bug-free code. Part of that is intentional: C++ aims to be useful in places where you are willing to make trade-offs; if you are not willing to trade at least some productivity for runtime speed, C++ is not the right language for you.

Note the last: I never claimed it will be better than Ada/Rust/whatever. I have no experience with either, but tend to believe those who claim they are more productive.

I maintain a lot of C++, which interoperates with other C++: adding another language is probably lower productivity than writing more C++ in the long run just because those interfaces between languages tend to be where things are the hardest. So I'm taking a different approach: getting involved with the C++ standard to attempt to make it more productive.


Fair enough, I appreciate the genuine and thought-out answer -- thank you.

  > So I'm taking a different approach: getting involved with the C++ standard to attempt to make it more productive.
This is fantastic -- kudos to you!


Let's see if it has better luck than the version that was temporarily accepted in C++20.


Well the people doing it did learn lessons from that. Only time will tell of course.


Your professor was assuming most bugs are bugs that Ada language features protect against?


> most bugs don’t need to be written if you use Ada properly.

I think both Airbus and Boeing have been doing their best to demonstrate the limitations of Ada and SPARK.


Ironically, you're right. Given the incredibly strict regulatory environment they work within, and the incredibly small margin of error, the fact that two of the biggest manufacturers of commercial aircraft choose to use Ada/Spark speaks volumes.


How many fewer bugs would they have if they used a language with a real type system, like Haskell?


One question I would like to know:

"Why should I learn Ada if I am not programming AIM-9X heatseeking missiles in my day to day job?"


>> "Why should I learn Ada if I am not programming AIM-9X heatseeking missiles in my day to day job?"

Because it changes the way you think about programming. Talking about Lisp, Eric Raymond said:

"Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself a lot." Ada provides a different kind of enlightenment than Lisp:

1) Ada has a powerful type system that lets you define exactly what values are allowed and how they should work:

https://learn.adacore.com/courses/intro-to-ada/chapters/stro...

https://learn.adacore.com/courses/intro-to-ada/chapters/more...

2) Ada has built-in concurrency primitives that are very simple to understand and use:

https://learn.adacore.com/courses/intro-to-ada/chapters/task...

3) Ada supports design-by-contract programming to precisely define the semantics between different parts of your code. Design-by-contract is ideal for large codebases and helps to catch errors similar to unit tests, but within the code itself:

https://learn.adacore.com/courses/intro-to-ada/chapters/cont...

Even if you aren't writing code for heatseeking missiles, learning about Ada can make you a better programmer.
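To give a flavour of (1), (2), and (3) together, here is a hedged sketch (all names and bounds invented; with GNAT, compile with -gnata so the Pre/Post contracts are checked at run time):

    with Ada.Text_IO;

    procedure Demo is
       --  (1) a type that states exactly which values are allowed
       type Celsius is digits 6 range -273.15 .. 1_000.0;

       --  (3) a contract: the caller must pass Lo <= Hi, and the result
       --  is guaranteed to land in that range
       function Clamp (T, Lo, Hi : Celsius) return Celsius
         with Pre  => Lo <= Hi,
              Post => Clamp'Result in Lo .. Hi
       is
       begin
          if T < Lo then
             return Lo;
          elsif T > Hi then
             return Hi;
          else
             return T;
          end if;
       end Clamp;

       --  (2) a task: declared like any other entity, runs concurrently
       task Greeter;
       task body Greeter is
       begin
          Ada.Text_IO.Put_Line ("hello from a task");
       end Greeter;
    begin
       Ada.Text_IO.Put_Line (Celsius'Image (Clamp (150.0, 0.0, 100.0)));
    end Demo;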


I loved using Ada for concurrency at university. I'm always surprised how few languages seem to have such a good built-in concurrency toolkit.

I guess it's of more use in the contexts where Ada is normally found.


It seems like Ada is an early attempt at what Haskell gives you. Why bother with it now?


Haskell and Ada have very different goals. Ada is designed, in part, to work on embedded and real-time systems.

It isn’t really an “attempt” either. The language exists and is used and has developed as experience was gained. Why bother with it? Because you could maybe learn something new despite it being old.


Ada was one of the first languages I learned academically, but I've never had the chance to use it again. It's not one of those fancy languages everyone wants to use for their toy projects. However, I remember it was a pretty cool language: better static checks than C, a robust module system, clear semantics.

I'd like to give it a try again as an experienced programmer and see how the language has evolved over time.


Did you go to the University of York? I did, and I've never met anyone from another uni who learned Ada.

I think I was lucky to go and work for a company where I got to do both Ada (specifically SPARK) and Prolog in a professional capacity. It wasn't until I used them "for real" that I really appreciated the benefits of both when used for what they were designed for.

Sadly Ada has a bad rep, in part, I think, because it was mandated on so many terrible death-march defence projects. It seems to be having a minor renaissance, with people like Nvidia using Ada 2012 and SPARK to code their RISC-V security chip.


No, a French institution in the late 90s. At that time, common choices for beginner languages were C/C++, which weren't that great compared to Ada. The usual criticism was that Ada wasn't mainstream, but I think it was a good pedagogical choice nonetheless (and it's not like we learn only one programming language anyway). It taught me to think in terms of abstract data types without the added complexity of class-based languages.

Eventually, they switched to Java, which became popular around that time.


I agree, I think it's a great pedagogical language for teaching a lot of concepts. Our algorithms and data structures course was taught in it. It was great for learning real-time systems, as its built-in concurrency primitives are so good.


When I was in high school (the Swedish version of it) in the early 2000s, the introductory programming course used Ada. When I later entered university, their introductory programming course used C.

As a beginner language I recall it being quite nice compared to C.


Auburn University in the United States was big on Ada in the 90s but eventually Ada was mostly replaced by Java after 2000. One of the popular Ada IDEs, GRASP, was developed there.


I went to Bradford, and learnt Ada there because Andy Wellings (I think it was him, or the other guy) used to work there before moving to York.


I'm pretty sure I had Andy for the real time systems course in my third year. A good lecturer if I remember correctly.


Here's one question not on the list: what's the Ada job market look like?

As a web dev looking to deep-dive on some non-web topic, with the pragmatic constraint that it must be a hireable one, Ada seems promising, but I wouldn't know where to start looking for jobs.


Not good. It was supposed to be THE language for military work in the US (e.g. the F-22 Raptor was mostly Ada), but they had trouble hiring people and got rid of the "use Ada" requirement, or so I've been told. Newer planes like the F-35 JSF have used much smaller amounts of Ada (mostly C & C++, IIRC... there is a breakdown somewhere comparing the languages used in both planes). It is also used in some critical infrastructure projects, like trains and stuff.

In short, C & C++ took over. SpaceX surprisingly used a LOT of LabVIEW in their mission control.


Is anyone aware of any small/medium size companies in a regulated industry that are actually using FSF GNAT in production?

I come across a lot of medical device startups/contracts that could be a really great fit for an embedded application in Ada but won't have the budget for AdaCore's GNAT. The production examples on AdaCore's website are typically very large companies in regulated industries.


Sadly I’ve only ever seen Ada used in legacy codebases at large companies.

I did see a very cool-looking job listing the other day at a small company, but it involved getting rid of Ada.


Ada's use is certainly not limited to legacy codebases (see Nvidia's recent foray into Ada + SPARK).

In particular, in the 'industrial provable' space, there's almost no competition I think. If you have to write the source code for an artificial heart, what are your options?


That is still a major problem. I helped a guy get FSF GNAT running on Yocto for his startup; they didn't use Ada in the end, though.


Are there any open source projects written in Ada?


Anyone else click on this expecting a post about Cardano?


lol



