A lot of radar systems run on Ada. It takes a little getting used to, but it's a good language.
Ada has a C binding package so it can call down to the OS libraries (networking, shared memory), which is great, but it kills some of the niceness of living in an all-Ada world.
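A minimal sketch of what that looks like, for the curious; Interfaces.C is the standard binding package, and getpid is just an arbitrary libc call picked for the example:

with Interfaces.C;
with Ada.Text_IO;

procedure Show_Pid is
   -- Bind the libc getpid(2) call using the C calling convention.
   function getpid return Interfaces.C.int
     with Import, Convention => C, External_Name => "getpid";
begin
   Ada.Text_IO.Put_Line ("pid =" & Interfaces.C.int'Image (getpid));
end Show_Pid;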
When I was trying out Go, I got a little Ada flashback for some reason.
The package system was good.
I disliked Ada strings, though.
When I left the industry, they were looking for alternative languages for new projects.
The GCC Ada compiler (GNAT) was being considered for maintaining older projects; it seemed decent.
http://libre.adacore.com/tools/gnat-gpl-edition/
Although I've never coded in Ada, I have had to read existing code to figure out what it was doing.
Ada struck me as a language that would be a pain to write in (at least it would take a long time) but a pleasure to read. Even without an intimate knowledge of the language (I was a C++ guy), I was able to figure it out.
The alternative to Ada is often writing Ada in some other language, more painfully :( I think it's sad that companies move to a restricted C/C++ subset just because there are more people with C/C++ on their resumes.
It's verbose, but I guess a good IDE would write a lot of it for you anyway. When the language is statically typed, the IDE can help you a lot.
It touches on the difference between what you want to read and what you want to write in software. But nowadays, things don't have to be typed out in full; they can be conveyed with color, font, underlining, etc.
But the real shocker is that you can't really write serious Ada for free: GNAT is a pain, and the free toolchain is a joke. My understanding is that there are good, expensive proprietary compilers out there. A few months back, I drank the Kool-Aid and started a project in Ada, and the tools were a pain. The 'real time' runtime was ticking at a constant 1 kHz, and making it tickless looked tricky for a beginner. The formatting tool failed without an error message, the cross-compilation toolchain for ARM is a pain, etc.
I took a course on it in the '80s when I worked for a government contractor. It was fussier than Pascal, but once you got it to compile, the resulting code was pretty good. I imagine that over time you'd just adapt to the environment and not even notice it unless you also coded in other languages.
I got to use it a bit in a Real-Time Systems course I took recently. I wasn't too excited about it starting out; it didn't initially feel like much more than a different C. But having spent some time with Clojure's core.async, I was really pleasantly surprised when I realized that Ada offers a really nice CSP implementation! I didn't go much further than that, but it seems like a language I wouldn't mind using more.
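For anyone curious, here's a toy sketch of that style (my own example, not from the course): an Ada task whose entry gives you a synchronous, CSP-like rendezvous between caller and callee.

with Ada.Text_IO;

procedure Rendezvous_Demo is
   task Printer is
      entry Put (Msg : String);  -- callers block until the task accepts
   end Printer;

   task body Printer is
   begin
      loop
         accept Put (Msg : String) do
            Ada.Text_IO.Put_Line (Msg);
         end Put;
      end loop;
   end Printer;
begin
   Printer.Put ("hello");
   Printer.Put ("world");
   abort Printer;  -- crude shutdown, just for the demo
end Rendezvous_Demo;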
Hmm? Ada uses the apostrophe for character literals and to introduce attributes. Strings are arrays; indexing uses the same `array(index)` syntax as any other Ada array.
S: constant String := "Hello";
H: constant Character := S(1);
True, I was thinking about the conversion from String to Character. IIRC you have to use the ' operator, something like my_string'first (not sure, I last needed it 10 years ago); otherwise my_string(1) gives a String of length one, not the first Character.
It depends on how the array is defined. Ada allows any discrete type as your array index, so to reliably get the first element of an array you need my_string(my_string'first).
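A quick illustration, building on the snippet above (a slice keeps the index range of the original string):

S : constant String := "Hello";
T : constant String := S (2 .. 5);      -- the slice "ello"; its indices are still 2 .. 5
E : constant Character := T (T'First);  -- 'e'; T (1) would raise Constraint_Error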
I would have enjoyed seeing a list like this in university where we had the option to learn Ada for a software engineering course.
Ada was mandated to be used for military applications but became largely irrelevant for commercial applications. It's now considered a white elephant [1] by many.
To my knowledge, Ada is used in embedded/realtime systems where reliability is of the utmost importance. The European Ariane rockets have code written in Ada on board, I think (these are civilian rockets, not military). There is a subset of Ada called SPARK that allows, to a degree, automated verification of code.
Not saying I like it or anything like that, but it definitely has its place.
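For a flavor of what that verification looks like: SPARK (building on Ada 2012) expresses contracts as Pre/Post aspects that the tools try to prove statically. A toy example of mine, not from any real project:

function Clamp (X, Lo, Hi : Integer) return Integer
  with Pre  => Lo <= Hi,
       Post => Clamp'Result in Lo .. Hi
is
begin
   return Integer'Min (Hi, Integer'Max (Lo, X));
end Clamp;

The idea is that, given that body, the provers discharge the postcondition without running any tests.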
For sure. The "Who's Using Ada?" list that was posted gives a good overview of the kinds of applications it's used for. You'll see many where lives are on the line (be it for good or bad).
The attribute mechanism is something a lot of languages would benefit from. Need to know the maximum value of an integer? Just ask the integer type to tell you. No going to a library to look up inscrutable constants.
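A quick example of that in Ada (assuming Ada.Text_IO.Put_Line is visible):

Put_Line (Integer'Image (Integer'First));  -- smallest Integer on this platform
Put_Line (Integer'Image (Integer'Last));   -- largest Integer
Put_Line (Integer'Image (Natural'Last));   -- attributes work on subtypes too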
It's like a million pages. A second language handbook shouldn't be thicker than K&R. Do you know any short books that really assume I already know how to program?
There is a Wikibook on Ada, although I cannot say anything about its quality.
Also, the Ada language standard and the rationale document are available for free. The standard is really more useful as a reference document, but it is surprisingly readable for a standards document.
This is certainly an impressive list of projects. Ada seems to be favored for critical software that can't ever fail. I've been considering how to achieve this type of software reliability for more "enterprisey" applications. Formally verifiable languages are certainly intriguing, but would be a hard sell for the managers at the Fortune 100 company where I work.
Reading the Wikipedia page for Ada didn't really give much insight into why the language is so heavily favored for applications like avionics software. Built-in task-based concurrency is certainly nice, but not a game-changer. Can someone more familiar with Ada explain what makes it so popular for high-reliability applications?
Or is it a question of culture and tooling rather than the language itself?
Like C, it's been around forever and is considered battle-proven and figured out. Unlike C, it has real-time features and a type system that isn't trying to kill you.
Also, the syntax is designed to be as easy to understand as possible, spelling out every word and using no three-letter abbreviations.
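A taste of that type system (a made-up range type; the constraint is part of the type itself, so out-of-range values are errors rather than silent wraparound):

procedure Check_Altitude is
   type Altitude_Feet is range 0 .. 60_000;  -- the constraint lives in the type
   A : Altitude_Feet := 35_000;
begin
   A := A + 30_000;  -- out of range: raises Constraint_Error instead of wrapping
end Check_Altitude;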
In general, it's a language that was way ahead of its time but survived because the DoD mandated it. Its main competitor is C, but that's mostly because the number of engineers who know Ada out of school is really low, which I think is a shame. Rust is way too new, and Haskell has GC.
> Or is it a question of culture and tooling rather than the language itself?
Pretty much. I write software for medical devices, and culture and process are what it boils down to. Tooling helps. We use C, C++, and C#, among other languages. Language makes a little difference in reliability, but not much.
What matters in a nutshell:
Know what you're building: need good Requirements
Analyze those Requirements for correctness, clarity and uniqueness (no conflicts)
Make sure your design meets all the Requirements and doesn't add anything: if you add a feature not in a Requirement, how will the testers know it's there?
Verify the design: i.e., does it do what you need, and does it do it "well"? Does it barely function, or will it handle what you expect to be thrown at it, or fail gracefully?
Separate the concerns: e.g., need to be sure the temperature is correct? The temperature control code and the temperature verification code should be decoupled from each other.
Code: probably where language makes the most difference, although different languages will influence your design.
Verify all design artifacts: review your Requirements, Designs, and all Code against predetermined quality standards. Reject anything that doesn't meet the standard. Build what you planned to build, and document what you built. Most errors will stem from Requirements problems.
I used Ada on my first coding job, writing test scripts for Airbus engines back in the mid-90s. Nice to see they're still using Ada, though tbh I don't remember a single thing about the language except that you can't dynamically create objects.
Ada used to be the basis for the CS curriculum at Cal Poly Pomona (circa '92-'98). Ada teaches good practice. One prominent feature is that there is no implicit casting: add a float and an int, and you get a compile error. It's good to know that you have to state exactly what you want and that the compiler enforces this.
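For example:

F : Float   := 1.5;
I : Integer := 2;
-- X : constant Float := F + I;       -- rejected at compile time
X : constant Float := F + Float (I);  -- explicit conversion required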
Haskell is similar in this sense, and so is Go (IIRC; I've only coded a few hundred lines of it, two years ago). Here's an example from a Haskell REPL, where Haskell won't even upcast an Int (machine int) to an Integer (unbounded int, Python style):
Prelude> let a = 1 :: Int
Prelude> let b = 2 :: Integer
Prelude> a+b
<interactive>:4:3:
Couldn't match expected type `Int' with actual type `Integer'
In the second argument of `(+)', namely `b'
In the expression: a + b
In an equation for `it': it = a + b
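(The fix is an explicit conversion: fromIntegral a + b typechecks, since fromIntegral converts the Int into an Integer.)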
Whenever I talk about Haskell I always try to make my comparisons to Java, C, or Python for maximum relatability.
I think saying "unbounded int" is enough, but then people might ask "what exactly do you mean by unbounded?", so I say it's about the same as a Python int. Haskell itself predates Python.
And IBM vacuum-tube machines had arbitrary-precision integers in the mid-'50s, and transistor-based ones in the '60s ;)
> Whenever I talk about Haskell I always try to make my comparisons to Java, C, or Python for maximum relatability.
Okay, well, then, thank you for giving me the opportunity to educate people on a bit of history :-)
> IBM vacuum tube machines had arbitrary precision integers in the mid 50s, and transistor based ones in the 60s
I seriously doubt it. They may well have had variable precision, but I don't think they had unbounded precision. That is, there was always a limit such that a result over that limit would either wrap around or trap (I don't know which), but that limit could be different for different instructions. (I'll bet there was a hardware-enforced maximum limit too, on the order of 12 or 15 digits.)
It wasn't until 1969 that Knuth published algorithms for arbitrary-precision arithmetic. Also consider that arbitrary precision requires heap allocation, which certainly wasn't being done in hardware in the 1950s and '60s.
If you can substantiate your claim, I'll be suitably impressed, but I'm extremely skeptical.
To add some specificity to gamegoblin's reference, I quote from page 28 of the IBM 1401 Data Processing System: Reference Manual[1]:
The two factors to be combined are added within core storage without the use of special accumulators or counters. Because any storage area can be used as an accumulator field, the capacity for performing arithmetic functions is not limited by standard-size accumulators or by a predetermined number of accumulators within the system.
Okay, very good. I stand corrected. (Well, mostly. I would pedantically argue that you may have arbitrary-precision arithmetic, but you don't have arbitrary-precision integers until they're a functional datatype, so you can write simply 'a + b' to add them -- with the automatic allocation that that in general requires. But this is a quibble.)
I was actually a little excited when I found out that the university I went to was an Ada school. Except, it turned out to be outdated information; it was actually a C++ school when I was there, and it started the process of becoming a Java school before I left.
Wow, I went to GW and learned Ada in CS051. I didn't have much of a point of reference at the time to compare it to other languages, but it was pretty easy to learn and I enjoyed the class.
I'm class of '05, so my intro class was the last one to use Ada. I also remember it being very easy on beginners. At the time I was annoyed that the underclassmen got Java, but now that I know Java: no, thank you.
Nice list. I programmed in Ada for eight months in 1997 when I was on a co-op at Rockwell Collins, working on their simulation testbed for their general aviation flight management systems -- I recognize several of the planes listed in the Commercial Aviation section.
I ended up really enjoying programming in Ada and was unhappy about having to return to C++ when I went back to school that fall. It was a bit more work to write, but it was easy to read and when it compiled it usually did what you expected it to. It's a language that makes it difficult for you to shoot yourself in the foot.
As a college student, is Ada still worth learning for (Canadian) development jobs in critical systems? I'd like to think it would give me an edge coming out of school, but I keep reading that new applications are being written in C/C++.
Edit: In fact, I interned at an avionics company and they're slowly trying to entirely switch to model-based development, using certified software that generates the code for you.
We used Ada in my software engineering class in 1992 to try to build a system to do route finding on a map and then generate directions. Each build took 40 to 50 minutes on a pretty nice PC at the time. I don't miss those days.
This is a strange thing to say without explaining why. Would you care to expand on your point? I would expect Ada to run orders of magnitude faster than Python for a task like this.
I think that when you are doing exploratory programming and you're not quite sure about what you're doing you should have a language that gets out of the way as much as possible, which Python does.
Ada and other bondage languages are for when correctness is most important or you need structure because you know the codebase will become large.
> I think that when you are doing exploratory programming and you're not quite sure about what you're doing you should have a language that gets out of the way as much as possible, which Python does.
I agree with this, but am not sure it's on point. Every software engineering class I've heard of is about as far as you can get from exploratory programming.
I used Ada during my first internship (avionics software). However, the group I was part of was in the middle of phasing out Ada in favor of C/C++. This was around 2010-2012.
I worked on two geosynchronous communication satellites built on Ada (late 1980s, using the TLD compiler). Both had a maximum of 48KB of ROM and 16KB of RAM. That included attitude control, command, telemetry, thermal/power management, a fault manager, and a little OS.
Since then, I've done pretty much nothing but C, but I do miss Ada's features from time to time.