Who's Using Ada? Real-World Projects Powered by the Ada Programming Language (gwu.edu)
115 points by dodders on Dec 11, 2014 | 73 comments



A lot of radar systems run on Ada. It takes a little getting used to, but it's a good language.

Ada has a C binding package so it can call down to the OS libraries (networking, shared memory), which is great, but it kills some of the niceness of living in an all-Ada world.

When I was trying out Go, I got a little Ada flashback for some reason.

The package system was good.

I disliked Ada strings, though.

When I left the industry, they were looking for alternative languages for new projects. The GCC Ada compiler (GNAT) was being considered for maintaining older projects, and it seemed decent. http://libre.adacore.com/tools/gnat-gpl-edition/


Yep, the Aegis Combat System is almost all in Ada (but simulated and developed mostly in C++ and MATLAB).

http://en.wikipedia.org/wiki/Aegis_Combat_System


Although I've never coded in Ada I have had to read existing code to figure out what it was doing.

Ada struck me as a language that would be a pain to write in (at least it would take a long time), but it was a pleasure to read. Even without an intimate knowledge of the language (I was a C++ guy), I was able to figure it out.


The alternative to Ada is often writing Ada in some other language, more painfully :( I think it's sad that companies move to a restricted C/C++ subset just because there are more people with C/C++ on their resumes.

1. The JSF air vehicle C++ coding standards http://www.stroustrup.com/JSF-AV-rules.pdf

2. MISRA-C:2004 Guidelines for the use of the C language in critical systems http://caxapa.ru/thumbs/468328/misra-c-2004.pdf

3. MISRA C++:2008 Guidelines for the use of the C++ language in critical systems http://frey.notk.org/books/MISRA-Cpp-2008.pdf


I have been visiting FOSDEM every few years, and the Ada presence in Europe has been slowly rising.

Just check the archives.

Most likely because, for writing fail-safe software, it is better to use a language with built-in support for safety than to rely on crutches.


That's really encouraging! The C++/Java/Python monoculture coming out of schools sucks.


Thanks for the links - I have been wanting to read the MISRA guidelines for some time.


Why can't they just learn Ada?


Because employers want new employees to start coding NOW.


It's verbose, but I guess a good IDE would write a lot of it for you anyway. Since it's statically typed, the IDE can help you a lot. It touches on the difference between what you want to read and what you want to write in software, but nowadays stuff doesn't have to be written out; it can be conveyed with color, font, underline, etc.

But the real shocker is that you can't really write serious Ada for free: GNAT is a pain, and the free toolchain is a joke. My understanding is that there are good, expensive proprietary compilers out there. A few months back, I drank the Kool-Aid and started a project in Ada, and the tools were a pain. The 'real-time' system was ticking at a constant 1 kHz, and making it tickless looked tricky for a beginner. The formatting tool was failing without an error message, the cross-compilation toolchain for ARM is a pain, etc.


Yeah, the companies writing Ada code don't care about open source, and amateurs don't write Ada code, so nothing gets done.


Try Visual Basic, it's pretty similar.


I took a course on it in the '80s when I worked for a government contractor. It was fussier than Pascal, but once you got it to compile, the resulting code was pretty good. I imagine that over time you'd just adapt to the environment and not even notice it unless you also coded in other languages.


Ada is a language that deserves more love.


I got to use it a bit in a Real-Time Systems course I took recently. I wasn't too excited about it starting out; it didn't initially feel like much more than a different C. But having been spending some time with Clojure's core.async, I was really pleasantly surprised when I came to realize that Ada offers a really nice CSP implementation! I didn't go much further than that, but it seems like a nice language that I wouldn't mind using more.
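
For anyone curious, here's a minimal sketch of the rendezvous mechanism that gives Ada its CSP flavor (the task and entry names are made up for illustration):

    with Ada.Text_IO;

    procedure Rendezvous_Demo is
       --  A task with an entry: callers block until the task accepts,
       --  which is the synchronous hand-off that CSP is built on.
       task Printer is
          entry Put (Msg : String);
       end Printer;

       task body Printer is
       begin
          loop
             select
                accept Put (Msg : String) do
                   Ada.Text_IO.Put_Line (Msg);
                end Put;
             or
                terminate;  --  let the task end when the main program does
             end select;
          end loop;
       end Printer;
    begin
       Printer.Put ("hello");
       Printer.Put ("from a rendezvous");
    end Rendezvous_Demo;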


Besides being intimidating and having a few syntax curiosities (' as a string index operator), it really is a nice language.


Hmm? Ada uses the apostrophe for character literals and to introduce attributes. Strings are arrays; indexing uses the same parenthesized My_Array (Index) syntax as any other array.

    S: constant String := "Hello";
    H: constant Character := S(1);


True, I was thinking about type conversion from String to Character. IIRC you have to use the ' operator, something like my_string'first (not sure, I needed it 10 years ago); otherwise my_string(1) gives a String of length one, not the first Character.


It depends on how the array is defined. Ada allows you to have any discrete type as your array index, so to get the first element of an array, you need to use my_string(my_string'First).
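
A quick sketch of why 'First matters (the non-1-based index range here is my own example):

    with Ada.Text_IO;

    procedure First_Demo is
       --  An array whose index range starts at 100, not 1.
       type Buffer is array (100 .. 104) of Character;
       B : constant Buffer := "Hello";
    begin
       --  B'First is 100, so this prints 'H'.  B (1) would raise
       --  Constraint_Error, since 1 is outside the index range.
       Ada.Text_IO.Put (B (B'First));
    end First_Demo;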


I would have enjoyed seeing a list like this in university where we had the option to learn Ada for a software engineering course.

Ada was mandated to be used for military applications but became largely irrelevant for commercial applications. It's now considered a white elephant [1] by many.

1: http://en.wikipedia.org/wiki/White_elephant


To my knowledge, Ada is used in embedded/real-time systems where reliability is of the utmost importance. The European Ariane rockets have code written in Ada on board, I think (these are civilian rockets, not military). There is a subset of Ada called SPARK that allows - to a degree - automated verification of code.
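
To give a flavor of the SPARK side: contracts are written as Ada 2012 aspects, and the prover tries to discharge them statically. A minimal sketch (the procedure itself is invented for illustration):

    procedure Increment (X : in out Natural)
      with Pre  => X < Natural'Last,  --  checked against every call site
           Post => X = X'Old + 1      --  checked against the body
    is
    begin
       X := X + 1;
    end Increment;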

Not saying I like it or anything like that, but it definitely has its place.


For sure. This "Who's Using Ada?" list that was posted gives a good overview of the kinds of applications it's used for. You'll see many where lives are on the line (be it for good or bad).


Ada was used on some of the modules on India's Mars mission: http://www.reddit.com/r/india/comments/1ujcmo/we_are_three_i...


The attribute mechanism is something a lot of languages would benefit from. Need to know the maximum value of an integer? Just ask the integer type to tell you. No going to a library to look up inscrutable constants.
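
For example (a small sketch; the exact values printed depend on the platform):

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Attributes_Demo is
    begin
       Put_Line (Integer'Image (Integer'First));  --  smallest Integer value
       Put_Line (Integer'Image (Integer'Last));   --  largest Integer value
       Put_Line (Integer'Image (Integer'Size));   --  bits in the representation
    end Attributes_Demo;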


In Haskell, you have typeclass values. In other words, generic values:

    class Bounded a where
        minBound, maxBound :: a

So it is always the same constant name, and its value changes with the type context.


C++:

  #include <limits>

  std::numeric_limits<T>::max()


Ada attributes provide a lot more meta-information about types and objects than just integer limits.

http://en.wikibooks.org/wiki/Ada_Programming/Attributes

Some of these would be methods in modern OO languages, but many are unique to Ada.


So does C++; both C++ and Ada have traits the other doesn't. Good ideas go around.


In C# you have Int32.MaxValue. I always assumed it was pretty standard everywhere.

Btw, Ada has been on my list of languages to learn since 2011. I guess I will put it first on the 2015 good-resolutions list :-)


When you get started, here are the two books that seem to get recommended the most frequently:

Ada as a Second Language: http://amzn.com/0070116075

Programming in Ada 2005: http://amzn.com/0321340787


Thanks, I didn't know the first book, but the second one has a more recent edition covering Ada 2012.


Interesting, I didn't realize there is a new edition:

Programming in Ada 2012 by John Barnes: http://amzn.com/110742481X

I wonder if Ada wouldn't be better if they were subtracting features rather than adding them.


I have an old copy of 'Ada as a Second Language'.

It's like a million pages. A second-language handbook shouldn't be thicker than K&R. Do you know any short books that really assume I already know how to program?


I don't know of any short books on Ada. Given the size of the language, and the fact that it's still growing, I doubt there are any.


There is a Wikibook on Ada, although I cannot say anything about its quality.

Also, the Ada language standard and the rationale document are available for free. The standard is really more useful as a reference document, but it is surprisingly readable for a standards document.


This is certainly an impressive list of projects. Ada seems to be favored for critical software that can't ever fail. I've been considering how to achieve this type of software reliability for more "enterprisey" applications. Formally verifiable languages are certainly intriguing, but would be a hard sell for the managers at the Fortune 100 company where I work.

Reading the wikipedia page for Ada didn't really give much insight into why the language is so heavily favored for applications like avionics software. Built in task based concurrency is certainly nice, but not a game-changer. Can someone more familiar with Ada explain what makes it so popular for high reliability applications?

Or is it a question of culture and tooling rather than the language itself?


It has a lot of safety checks built into the runtime, which make it a little slower.

Say x is an int between 1 and 200.

If x gets assigned a value out of that range, an exception is thrown. So we weren't manually checking everything.
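
In Ada that example looks something like this (a minimal sketch; the subtype name is made up):

    with Ada.Text_IO;

    procedure Range_Demo is
       subtype Score is Integer range 1 .. 200;
       X : Score := 200;
    begin
       X := X + 1;  --  201 violates the range: Constraint_Error at run time
       Ada.Text_IO.Put_Line ("never reached");
    exception
       when Constraint_Error =>
          Ada.Text_IO.Put_Line ("range check failed");
    end Range_Demo;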

Also, I think it was mandated as a language for US government contracts, for a variety of reasons.


As with asserts in C, you can compile with those checks turned on or off.


Like C, it's been around forever and is considered battle-proven and figured out. Unlike C, it has real-time features and a type system that isn't trying to kill you.

Also, the syntax is designed to be as easy to understand as possible, spelling out every word and using no three-letter abbreviations.

It's in general a language that was way ahead of its time but survived because the DoD mandated it. Its main competitor is C, but that's mostly because the number of engineers who know Ada out of school is really low, which is a shame, I think. Rust is way too new, and Haskell has GC.


> Or is it a question of culture and tooling rather than the language itself?

Pretty much. I write software for medical devices, and culture and process are what it boils down to. Tooling helps. We use C, C++, and C#, among other languages. Language makes a little difference in reliability, but not much.

What matters, in a nutshell:

Know what you're building: you need good Requirements.

Analyze those Requirements for correctness, clarity, and uniqueness (no conflicts).

Make sure your design meets all the Requirements and doesn't add anything: if you add a feature not in a Requirement, how will the testers know it's there?

Verify the design: i.e., does it do what you need, and does it do it "well"? Does it barely function, or will it handle what you expect to be thrown at it or fail gracefully?

Separate the concerns: e.g., need to be sure that the temperature is correct? The temperature control code and the temperature verification code should be decoupled from each other.

Code: probably where language makes the most difference, although different languages will influence your design.

Verify all design artifacts: review your Requirements, Designs, and all Code against predetermined quality standards. Reject anything that doesn't meet the standard. Build what you planned to build, and document what you built. Most errors will stem from Requirements problems.

That's Software Quality in a tiny nutshell ;-)


I used Ada on my first coding job - test scripts for Airbus engines back in the mid-'90s. Nice to see they're still using Ada, though TBH I don't remember a single thing about the language except that you can't dynamically create objects.


Ada used to be the basis of the CS curriculum at Cal Poly Pomona (circa '92-'98). Ada teaches good practice. One prominent feature is that there is no implicit casting: add a float and an int, and you get a compile error. It's good to know that you have to state exactly what you want and that the compiler enforces this.
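
For example (a small sketch; uncommenting the first assignment is what triggers the compile error):

    with Ada.Text_IO;

    procedure Mixed_Demo is
       F : constant Float   := 1.5;
       I : constant Integer := 2;
       R : Float;
    begin
       --  R := F + I;      --  rejected at compile time: no implicit conversion
       R := F + Float (I);  --  the Integer-to-Float conversion must be explicit
       Ada.Text_IO.Put_Line (Float'Image (R));
    end Mixed_Demo;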


Haskell is similar in this sense. And Go (IIRC; I've only coded a few hundred lines in it, two years ago). Example from a Haskell REPL, where Haskell won't even upcast an Int (machine int) to an Integer (unbounded int, Python style):

    Prelude> let a = 1 :: Int
    Prelude> let b = 2 :: Integer
    Prelude> a+b

    <interactive>:4:3:
        Couldn't match expected type `Int' with actual type `Integer'
        In the second argument of `(+)', namely `b'
        In the expression: a + b
        In an equation for `it': it = a + b


> unbounded int, Python style

Arrrgh!! You mean "Lisp style". Lisp has had arbitrary-precision integers ("bignums") since 1971 -- long before Python was invented.


Whenever I talk about Haskell I always try to make my comparisons to Java, C, or Python for maximum relatability.

I think saying "unbounded int" is enough, but then people might ask "what exactly do you mean by unbounded?", so I say it's about the same as a Python int. Haskell itself predates Python.

And IBM vacuum-tube machines had arbitrary-precision integers in the mid-'50s, and transistor-based ones in the '60s ;)


> Whenever I talk about Haskell I always try to make my comparisons to Java, C, or Python for maximum relatability.

Okay, well, then, thank you for giving me the opportunity to educate people on a bit of history :-)

> IBM vacuum tube machines had arbitrary precision integers in the mid 50s, and transistor based ones in the 60s

I seriously doubt it. They may well have had variable precision, but I don't think they had unbounded precision. That is, there was always a limit such that a result over that limit would either wrap around or trap (I don't know which), but that limit could be different for different instructions. (I'll bet there was a hardware-enforced maximum limit too, on the order of 12 or 15 digits.)

It wasn't until 1969 that Knuth published algorithms for arbitrary-precision arithmetic. Also consider that arbitrary precision requires heap allocation, which certainly wasn't being done in hardware in the 1950s and '60s.

If you can substantiate your claim, I'll be suitably impressed, but I'm extremely skeptical.


To add some specificity to gamegoblin's reference, I quote from page 28 of the IBM 1401 Data Processing System: Reference Manual [1]:

"The two factors to be combined are added within core storage without the use of special accumulators or counters. Because any storage area can be used as an accumulator field, the capacity for performing arithmetic functions is not limited by standard-size accumulators or by a predetermined number of accumulators within the system."

[1] http://bitsavers.informatik.uni-stuttgart.de/pdf/ibm/140x/A2...


I was paraphrasing from http://en.wikipedia.org/wiki/Arbitrary-precision_arithmetic#...

So in the '50s they had integers with up to 2^9 - 1 (= 511) decimal digits.

In the '60s, the transistor-based machines could do integers as big as your memory; the largest memory offered gave you up to 60K digits.


Okay, very good. I stand corrected. (Well, mostly. I would pedantically argue that you may have arbitrary-precision arithmetic, but you don't have arbitrary-precision integers until they're a functional datatype, so you can write simply 'a + b' to add them -- with the automatic allocation that that in general requires. But this is a quibble.)


ADA at university - never touched it again. It was ADA 83, and the ADA 95 object extensions looked horrible.

I remember seeing PL/SQL circa Oracle 8i and it looking a lot like ADA.


> I remember seeing PL/SQL circa Oracle 8i and it looking a lot like ADA.

Not a coincidence. The designers of PL/SQL were largely inspired by Ada's basic syntax and structure.

It's not an acronym, by the way, so no need to all-caps it.


The object extensions aren't bad; they just suffer from a combination of an unusual object model and an even more unusual use of terminology.


I was actually a little excited when I found out that the university I went to was an Ada school. Except, it turned out to be outdated information; it was actually a C++ school when I was there, and it started the process of becoming a Java school before I left.


Wow, I went to GW and learned Ada in CS051. I didn't have much of a point of reference at the time to compare it to other languages, but it was pretty easy to learn and I enjoyed the class.


I am class of '05, so my intro class was the last one to use Ada. I also remember it being very easy on beginners. At the time I was annoyed that the underclassmen got Java, but now that I know Java, no thank you.


Nice list. I programmed in Ada for eight months in 1997 when I was on a co-op at Rockwell Collins, working on their simulation testbed for their general aviation flight management systems -- I recognize several of the planes listed in the Commercial Aviation section.

I ended up really enjoying programming in Ada and was unhappy about having to return to C++ when I went back to school that fall. It was a bit more work to write, but it was easy to read and when it compiled it usually did what you expected it to. It's a language that makes it difficult for you to shoot yourself in the foot.


I'd love to hear about remote Ada jobs that don't require US citizenship...


Europe loves it some Ada in the avionics space.


Yes, but in those types of organizations remote jobs aren't really encouraged.


Though it would be cool if you could SSH into an A320 flight management computer from your apartment :-)


Or they could lend me one for some time... :)


As a college student, is Ada still worth learning for (Canadian) development jobs in critical systems? I'd like to think it would give me an edge coming out of school, but I keep reading that new applications are being written in C/C++.

Edit: In fact, I interned at an avionics company and they're slowly trying to entirely switch to model-based development, using certified software that generates the code for you.


Ada is still extremely popular in mission critical software. I know the US military and NASA use it a lot.


We used Ada in my software engineering class in 1992 to try to build a system to do route finding on a map and then generate directions. Each build took 40 to 50 minutes on a pretty nice PC at the time. I don't miss those days.


Wrong tool for the job. I would do that in Python.


Not in 1992 you wouldn't.

This is a strange thing to say without explaining anyway. Would you care to expand on your point? I would expect Ada to run orders of magnitude faster than Python for a task like this.


I think that when you are doing exploratory programming and you're not quite sure about what you're doing you should have a language that gets out of the way as much as possible, which Python does. Ada and other bondage languages are for when correctness is most important or you need structure because you know the codebase will become large.


> I think that when you are doing exploratory programming and you're not quite sure about what you're doing you should have a language that gets out of the way as much as possible, which Python does.

I agree with this, but am not sure it's on point. Every software engineering class I've heard of is about as far as you can get from exploratory programming.


I used Ada during my first internship (avionics software). However, the group I was part of was in the middle of phasing out Ada in favor of C/C++. This was around 2010-2012.


Similar for me. My first ever programming assignment for a real company was converting an Ada project to C for the Boeing 787.


How did you feel about the C codebase compared to the Ada one? Did one feel more comprehensible or reliable than the other?


Ada was my first language in college. At JC, our professors were DoD contractors at the nearby Air Force Base. Good times.


I worked on two geosynchronous communication satellites built on Ada (late 1980s, using the TLD compiler). Both had a maximum of 48KB of ROM and 16KB of RAM. That included attitude control, command, telemetry, thermal/power management, a fault manager, and a little OS.

Since then, I've done pretty much nothing but C, but I do miss Ada's features from time to time.



