Why Ada isn't Popular (1998) (adapower.com)
118 points by Jtsummers on May 30, 2014 | 112 comments



As a Rust contributor, I'm perpetually trying to find people who are familiar with Ada to comment on our design. I know nothing of the language except that it's allegedly designed for both safety and efficiency, which is exactly the niche that we're targeting. If there's anything that Rust can learn from Ada, we'd love to know while we still have the leeway to make breaking changes.

While we're at it, I'd love to see some Erlang programmers take a look at Rust as well and tell us where we're falling down on concurrency. Sadly, both of these languages seem underrepresented in the circles that I frequent.


Inviting others to critique your work. Great to see. I will have to have a serious look at Rust sometime then.


We've talked to Tucker Taft about Ada and Rust before, though it was a while ago and I mostly recall that we discussed aspects of Ada's standardization process (I think Graydon was a fan of Ada's spec). Niko would probably remember more.


And FWIW here's the research presentation Tucker gave while he was at Moco about his new language Parasail: https://air.mozilla.org/region-based-storage-management-para...


I don't know anything about Rust, but things I disliked about Ada (besides horrendous tooling support, free or paid) are:

- A complex language that isn't matched by an equally thorough language standard (a lot of undefined behavior, or gaps where things are not explained as fully as they could be; I don't remember any specific instances, though, so maybe I'm crazy on this one)

- Too many places where the syntax lets you do something only in certain situations, or forbids it only in certain situations. From what I can see, the only possible reason for these bizarre restrictions is to lower the complexity of the compiler. For example, why can't I put a package instance in an array, when generic packages are so powerful that they can pretty much be (ab)used as object-oriented structs? Try searching Barnes' Ada book for phrases like "only when", "except when", "however".

- Reuse of keywords that mean very different or slightly different things in one context versus another. An example would be delay: two lines of code might both say delay 10, but one actually means sleep the task for 10 seconds, while the other means go to the next line of code if one of the previous points of entry in the lines above is called.

- The ability to tiptoe very easily around the safety mechanics of the language and start using it as if it were C, which is very tempting for two reasons: 1. Most programmers coming to Ada were/are from a C background, and the first things they pick up are the ways to do things in Ada like you could in C, without ever moving past that. It sort of makes sense not to move past that if you aren't given time to fully digest the Ada way of doing things. 2. It takes fewer lines of code to program something up if you ignore all the safety features the language can provide. I would not have allowed the ability to ignore all the things that make Ada stand out. If you're going to write C in Ada, it makes sense not to use Ada at all, because programming C in Ada is going to be slower than programming C in C any day of the week.

- Hacked-on OO. You can hardly find any Ada code online using the OO language features added in Ada 95, but that makes sense, because they're a bunch of keyword boilerplate on top of the existing records, very unwieldy to use and ugly.

On the topic of Ada not taking off: I think one big reason Ada can't get off the ground right now is that you can't find anything about it online. Search for some question you have and you'll find no answers; search for the same question pertaining to the Java or C version, and you'll get a bunch of answers, blogs, even unanswered questions. Ada greatly lacks that web presence. Where are people going to talk about Ada and develop modern tools and frameworks for it if such places don't exist? (comp.lang.ada is the only location on the internet with an active Ada community.)

It seems like that kind of community can only appear out of thin air when a language is new, when a sudden influx of individuals feel they can contribute and their energies combine. But if you get an individual here and there who wants to help out, with their efforts separated by a week or a month, then nothing for Ada will ever come about.

On the topic of concurrency, you may want to look into Ada for that too: concurrent structures are baked into the syntax of the language, and can be quite complex/powerful/concise.


Seems like I can't edit my post, but I just re-read a bit and I meant to say "goto the next line of code if one of the previous points of entry in the lines above is not called within 10 seconds"

not

"goto the next line of code if one of the previous points of entry in the lines above is called"


What are some good resources for Rust? I've been coding Erlang professionally for one and a half years now (spearheading a lot of our development), and will continue to work primarily in it for some time to come, and Rust is on my list of languages to look at. I don't suppose there's any equivalent to Erlang's Learn You Some Erlang For Great Good (I'd assume not, given that the language is still in flux)?


Right now, the best two are Rust for Rubyists (bias: I wrote it) and Rust By Example.

The official docs are undergoing a lot of work, so they'll be excellent too in the near-ish future.


ATS is also doing some very interesting stuff around adding algebraic datatypes and linear types to a C-level language.

http://www.ats-lang.org/


Did you know that Truman State University used Ada in its CS curriculum?


The University of Adelaide in Australia's CS department also taught Ada in the 1990s, due to demand from defence organisations in the city. It was thus the second programming language I learned, after Pascal (not going to count BASIC).

Unfortunately, or fortunately, pretty much everything I knew about the language has been GC'ed from my brain...


Thanks for doing what you do.


I think the biggest factor why Ada never 'made it' was that the compilers that were available were priced ridiculously.

One of the big drivers in language popularity in the long run is how many people can become familiar with a language by using it, and by having all the implementations be terribly expensive you're almost automatically limiting the number of people exposed.

I think companies that did this (much less so today than in the past) saw the languages as their profit centers, not the ecosystems. Microsoft, Borland, Zortech and a whole pile of others were in a never-ending battle to get programmers on board. Every computer came with at least one programming language pre-installed (usually some form of BASIC). What you did after learning BASIC (and some of its limitations) was dictated by need and funds.

For most people that meant that their second language was some variety of assembler. For others it meant re-implementing Forth (since that was doable). A C compiler was something you could still buy on a regular salary.

To get exposure to Lisp, Ada, Modula-2 or any one of a whole bunch of other high-level languages required shelling out significant bucks. So during what we might refer to as my 'formative years' as a programmer, none of these really nice languages made it to my computers.


It really wasn't ridiculous. I know, from the point of view of your budget, it really was, but...

The compiler writers were serving a tiny sliver of a market. At the time it took a lot of resources to build such a compiler, and then you had to support a bunch of not-off-the-shelf, militarized and embedded chips. You had to pass a huge suite of tests, and if a customer submitted a bug, well, you just had to fix it in a timely manner. I'm sure we paid a ton for our compiler, but they also ended up flying people to us regularly from NYC to DC. The compiler writers were all but on speed dial. I'm sure we got our money's worth out of it.

Now, what we could have gotten from the same amount of money applied to a C compiler plus extensive tooling, testing, and so on is an interesting question...


It was ridiculous. These languages are now all dead, because they never gained the mindshare compared with the languages that were free, which eventually led to their decline. That's the point.


I understand the point perfectly. It does not change the economics of the situation for the compiler writers. There is no way a company could have offered an Ada compiler for, say, $100, or even $500.


Isn't there some way they could have implemented tiered pricing? An $X00 Learning/Evaluation Edition and an $X,000 (or $X0,000) Professional Edition?


That's a good point. I wasn't in the business of writing the compilers, just using them. At my University we had Ada for free (I studied with Mike Feldman, listed below); I don't know what the licensing was for it to the school.

But, you know, this was heavy iron. At work the compiler ran on a VAX and compiled to 1750A. Who wants that besides DoD? So there was probably an x86 back end, I don't know. But still, we were buying extra memory modules and processors for the VAX just to get it to be able to handle the compiler. No way was that going to run on the boxes of the day (where 'the day' = the 1990-94 time frame). Even then, we were "submitting" jobs - you'd start a compile, go to lunch, come back, and maybe, just maybe, have an image ready to go.

Okay, so let's say they engineer it to somehow require little memory so it'll compile on a PC of the time. Then what? The prevailing opinion was pretty hostile to Ada, even within the DoD community where it was being used. People didn't want Ada, they weren't clamoring for it; the only market for it was the programs where it was mandated, and programs would often do whatever they could to get exemptions written for them. Maybe it was different in Europe, I don't know. But at the time it was considered to be bloated, 'designed by a committee', with a lot of unnecessary features and limitations. I don't agree with that assessment, and of course there were people with different views on the subject, but it was a pretty hostile marketplace.

Most of the positive press for Ada was accompanied by praise for heavy process: waterfall, design-first type stuff. That didn't appeal, for good reason, and it is hard, without experience, to recognize that the language did not require working that way. Heck, we did iterative development with it long before XP and Rapid Development were coined.

The average consumer just did not want Ada. Are you going to go into a company strategy meeting and suggest giving away your very expensive compiler, which won't really work well on the machines of the day, risking your career? It was a different world; we didn't talk about gaining 'mind share' and the like back then. Things like gcc, while available, were openly scoffed at, except by the uber-nerds. I tried to push gcc early on, and just got laughed at. OF COURSE we are going to throw money at Microsoft; that's what responsible businesses do.

As I stated above, Feldman was initially my advisor in grad school; he was pushing Ada adoption very hard - you can see where that went. Dismissing all of that history and experience with 'ridiculous' is pretty hostile and condescending, not to mention unthinking (I know you didn't say that). We may be gray beards, but we know a few things too! :)

It was a language before its time. I guess it's a 'shame', but there is plenty of room for a free, strongly typed, engineering language. Perhaps Rust is it, perhaps something else is. I don't blame 1980s companies for not operating as if it was 2014.


Something I see missing from computers these days (and lots of days previous) is a programming language designed for non-programmers.

The likes of Ruby and Python are _approachable_, but everything is a mess of libraries, RVM and its lookalikes, dependencies, and a whole lot of complication. This is great for the target audience but less so for your average user.

Excel and the like from Microsoft are currently filling the role, and doing so poorly.

The scientific community has things like R and MATLAB which do a good job for scientists.

Everybody else _can_ learn a language, but it can be daunting and it's definitely geared towards hackers and professionals.

In an ideal world all computers would ship with a native programming language, a big fat manual for users, and a set of software whose core design allows for, and is meant for, everyday programming by its users.

This language would have to be relatively simple, stable over time, and more monolithic than most. It would have to sacrifice speed and functionality for approachability, but if it were done well it could be world-changing.

Sadly the market is moving in the opposite direction (but in some sense the same direction too). Everybody these days is using an app with simple buttons that doesn't do much at all. It's successful because it's so approachable, but scary because it's so controlled by the vendor.


Audio/music: puredata [1] is pretty visual to start with and you can play around with things; it gets away from the REPL completely. ChucK [2] and SuperCollider [3] require typing code into a terminal, but are feasible perhaps because of the minimal complexity of most live-coding sessions.

Graphic artists have Processing [4]. REPL but with simplified grammar.

Perhaps the end-user programming future is in domain specific programming environments (R like but for different disciplines?)

[1] http://puredata.info/

[2] http://chuck.cs.princeton.edu/

[3] http://supercollider.sourceforge.net/

[4] http://www.processing.org/


Processing is a wonderful library/programming environment. An intro to programming class taught in Processing would be cool.


The languages aren't the problem. It's the ecosystem around them. The software that we use nowadays is highly complex, multifaceted and nested under tons of layers of abstraction.

There's only so much languages can do about this. Programming these days requires a lot of domain-specific knowledge like never before, if you intend on doing something larger than text adventure games.


It's a bit dead now, but there is a great little Pascal-derived language called Turing that was used in Ontario schools. It ships as a single exe that is both a text editor and a runner: F1 runs your code, and F8-F10 bring up the docs.

It made it super simple to get stuff on the screen and start making a simple game, which made our high school programming classes awesome. There were no third-party libraries or anything, but we could implement everything for a 2-D game ourselves, and it was pretty great. There were never any setup problems or pain like that, which was also awesome. Also, a code formatter was on F2, so everyone's code looked sane.


We're working on that at the moment. We had a post on HN recently with some thoughts about tooling - https://news.ycombinator.com/item?id=7760790 . Our goal is to make something that is as approachable as Excel or VB6 to begin with but has a smooth learning curve into a general purpose language with sane tooling.


> In an ideal world all computers would ship with a native programming language, a big fat manual for users,

The Archimedes I grew up with had BBC BASIC (with its assembler) and more of the machine's user's guide was programming language manual than anything else.

Python is sort of close, given that it's designed as a teaching language.

Many of my fellow Perl hackers came from a humanities background and say they found the human-language-ness of Perl more approachable than anything else.

I wonder whether an analysis of what Python and Perl do and don't have in common with BASICs might help answer the question of how to make a language as accessible as possible...


That is the description of Python, Ruby, Scriptol and some others. But the movement of programmers from Python to Go shows that experienced programmers are often more concerned about execution speed, even at the cost of development time.


Execution speed is important for some, but simplicity seems like it's a larger part of the Go adoption story.


> In an ideal world all computers would ship with a native programming language, a big fat manual for users, and a set of software whose core design allows for, and is meant for, everyday programming by its users.

AppleScript?

In practice it's kind of awful — glaring omissions and nasty ambiguities abound — but it is on a lot of computers, and the ambition is pretty much in line with your request.

And with a bit of persistence it does let you automate a load of stuff pretty well (e.g. I have a part-AppleScript, part-Ruby script to back up my iPhoto library to Flickr).

Plus, of course, there's Automator, for the real non-programmers. :)


Take a look at Racket.


Lua: simple syntax like BASIC; powerful and conceptually similar to JavaScript; used as a scripting language in many video games (from World of Warcraft to Crysis), much as VBA is used in Office; and it has a state-of-the-art, very fast JIT virtual machine, LuaJIT, comparable to JS engines like V8 or the JVM.


What about GNAT? I remember sniffing at Ada back in the mid-90s, and at that time GNAT had implemented most of Ada 95 (which had a lot of new OO features that seemed on par with Borland Pascal). I did not choose Ada because Borland was so much better -- IDE, big community, many libraries -- and then Delphi came along; Ada looked more advanced, but not fun.


By then the damage was done. I'm talking about the early-to-mid '80s.


Learned Ada in '80 at university, so the compiler cost was not a factor. Like many others in the UK I did not have a computer at home.


Lucky you! I spent the first decade or so of my professional life buying machines + software just trying to keep up with the world, I barely ate in those days. (the downsides of not finishing school...)


I remember there being quite a lot of interest when Ada95 came out, actually. I don't think it was too late at that point.


It was also initially considered to be unimplementable by a number of compiler people (including me). We were wrong, but it was a while before a working Ada compiler was produced.

C had the great advantage that a C compiler would fit in MS-DOS's address space with some room left over for the symbol table.


What were your main reasons for thinking Ada could not be implemented?


It was a very large language, relative to others of the day, and I had no idea how to do it.

Ironically, I picked C++ to implement, having no idea that it was unimplementable :-)

(It took until the aughts for EDG to be the first to do a complete implementation of C++.)


Thanks (both for the answer, and for ZC++, my first real C++ compiler on Windows after 'cfront' (on Xenix IIRC, but it's ages ago...), and MWC before that).


My first C++ compiler was cfront and SAS.

It is difficult to think of Ada as a 'large language' compared to everything else. I'd love to see Ada and Modula-3 have a resurgence.

http://en.wikipedia.org/wiki/Modula-3


What benefits would you see Modula-3 offering over the numerous other languages available today?

There wasn't anything particularly special about it when I last used it in the 1990s. It wasn't a bad language, but there wasn't anything particularly compelling about it or its implementations, either. There was nothing that really made it more useful than C++, or even Delphi and Java.


I can't add much over what Wikipedia [0] says. One of the most attractive features for me was the rapid compilation times and the module system (they go hand in hand).

Modula-3 predated Java by about 10 years, and I'd say its type system [1] still surpasses Java's. Modula-3 was nearly everything that Java ended up being: a safe, performant systems language.

[0] http://en.wikipedia.org/wiki/Modula-3

[1] http://www.cs.cmu.edu/~rwh/courses/modules/papers/cardelli-e...


This is an especially interesting question given how much smaller and faster Pascal implementations were for DOS than C compilers. What did Ada add that was so complex (and how's it compare to the complexity that C++ added to C)?


Ada has packages, generics, exceptions, concurrency, default values, operator overloading, user-defined types, fixed point, subtypes (so you can, for example, limit an integer to the range 0..100, and the compiler then has to check that it is used correctly everywhere), pooled memory, built-in run-time checking, pragmas, etc., etc.
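
A minimal sketch of just the subtype point (names invented):

    with Ada.Text_IO;

    procedure Range_Demo is
       subtype Percent is Integer range 0 .. 100;  -- compiler adds range checks
       P : Percent := 100;
    begin
       P := P + 1;  -- 101 is outside Percent, so this raises Constraint_Error
       Ada.Text_IO.Put_Line ("never reached");
    exception
       when Constraint_Error =>
          Ada.Text_IO.Put_Line ("out of range, caught at run time");
    end Range_Demo;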


Pascal also had subtypes, or at least Borland's did.


The standard library is robust... The language itself isn't terribly big; it does support threads, though.

NYU made a free Ada compiler in the 90s; it's part of GCC now. Ada's problem wasn't only compilers, it was the culture. Hackers hate COBOL; it's still routinely used to cut down a language ("Java is the new COBOL"), yet very few have ever used it. It just has that rap. Ada is a DoD-specified language that was designed with real software engineering problems and science in mind, and it has a bad rap because of that.

It's a different world now; the tooling alone changes the equation. (At the time both Ada and Pascal were seen as too verbose: begin and end were a lot more letters than braces. Now our editors do that stuff for us.) Being a DoD-specified cousin of Pascal is what hurt Ada, and then there was the compiler selection... It always seems like Ada is ready to be attractive, but there is a community around Rust and Go.


I started programming in the early 80s, and never used Ada. At the time, Ada had a reputation as "that over-bloated language that people in the defense department have to use." This perhaps undeserved reputation is addressed in section 7 of the linked article.

From my point of view at the time, any language that the US Government would mandate as a required technology MUST be flawed. A language embraced by a large bureaucracy must be full of large bureaucratic nonsense. That was the thinking, at any rate. Ada, along with Cobol, was a frequent target of snarky jokes. In the mid-80s, the cool kids were programming in C (while being paid to program in Basic or Cobol or Fortran).

I freely admit these notions were likely borne out of ignorance, but I imagine this reputation stifled its adoption outside of government circles. Would you want to use a language mandated by the DMV?


One other factor I would mention is the massive uncoolness of Ada. This seems to come from three directions:

1. Invented by and for a bureaucracy known for its extravagance and wastefulness.

2. Verbosity of a degree only before seen in COBOL.

3. A focus on avoiding errors above all else. For people who have a positive/benefits focus like most hackers, this negative focus is very unappealing and provokes references to anal-retentive pedants and so forth.

I would agree with others who pointed out the other problem: "won't run on a computer I can buy - and anyway I can't afford the compiler". This is not to criticise the compiler developers, but nonetheless it was a problem.

http://en.wikipedia.org/wiki/Worse_is_better


Oh please, the comparison of Ada to COBOL is invidious. Ada simply requires you to be explicit about your intent. This is a feature, not a flaw. You mean to say that you've never been burned by an implicit conversion in C, or a misplaced semi-colon? Ada is no less verbose than Java.


I think the grandparent is referring to what appears to be gratuitous verbosity, i.e.

    procedure Foo is
    begin
    ...
    end Foo;
    
vs Java

    void foo() {
    ...
    }
Even if one wanted to be explicit about indicating what "end" refers to, this could have been

    proc Foo
    ...
    end Foo
EDIT: when I write "appears" I mean "there may be a perfectly good reason for it, but it isn't obvious", not that it is in fact gratuitous.


I am an Ada developer, but I think it is objective to say that anyone who opposes a language because their fingers will have extra work probably doesn't belong in this field. If you consider the development process as a whole—research, planning, development, verification, etc.—those extra keystrokes add an exceptionally marginal amount of time to the development process, but save so much more by making the code more intuitive to read. Don't let me lead you to believe that Ada's words make it intuitive; that would be disingenuous. But the syntax has been shaped since its inception to be readable by developers and non-developers alike. This is an important distinction from something like Java (never mind that you don't have to explicitly instantiate generics in Java). One of the key objectives of Ada is code that is especially intuitive to non-developers. There's a lot going on in the language. I hope this helps.


Variable and type declarations happen between the procedure foo and the begin.
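
I.e., a minimal sketch (names invented):

    procedure Foo is
       Count : Natural := 0;        -- declarations live here,
       subtype Index is Positive;   -- between "is" and "begin"
    begin
       Count := Count + 1;
    end Foo;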


I would mention here Richard Gabriel's essay "The End of History and the Last Programming Language", which you can find at http://www.dreamsongs.com/Files/PatternsOfSoftware.pdf

His argument that "Languages are accepted and evolve by a social process, not a technical or technological one." seems to apply here, as does (in the context of the time) "Successful languages must have modest or minimal computer resource requirements."


I am a 25-year-old student at one of the top CS engineering schools in France. People who enter there usually have a heavy math background, but are fairly new to programming.

In the first semester all the students are introduced to basic programming concepts, and the language used for that is Ada. Ada is then used to teach the algorithms and compilation classes. In fact, in the second year we had a full-time project where we wrote, in Ada, a compiler for a language close to Java (in terms of syntax and features).

I remember being in my Python phase at the time, and bitching about Ada's verbosity. But I realized how comfortable it is to let the compiler do most of the debugging for you, especially on very large projects that are structured as a pipeline (e.g. a compiler, where every part depends heavily on the others). When it compiles, it works 99% of the time. Static typing/subtyping, generic packages, and all the time spent in the console with GNAT yelling at you really teach you how to structure, secure and bulletproof your code. Ada really is a great teacher!
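
As a small, hypothetical illustration of the generics point (a generic procedure for brevity; generic packages follow the same pattern):

    with Ada.Text_IO;

    procedure Generic_Demo is

       generic
          type Item is private;     -- any non-limited type will do
       procedure Swap (A, B : in out Item);

       procedure Swap (A, B : in out Item) is
          Tmp : constant Item := A;
       begin
          A := B;
          B := Tmp;
       end Swap;

       procedure Swap_Int is new Swap (Integer);  -- instantiation is fully type-checked

       X : Integer := 1;
       Y : Integer := 2;
    begin
       Swap_Int (X, Y);
       --  Swap_Int ("a", "b");  -- rejected at compile time: wrong type
       Ada.Text_IO.Put_Line (Integer'Image (X) & Integer'Image (Y));  -- prints " 2 1"
    end Generic_Demo;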


(Fellow ENSIMAG-er, I presume?)


yeah :) a safe bet!


Even though Ada is now completely forgotten by most people, it's worth mentioning that the GNAT compiler is being actively maintained by the company AdaCore (whose core business, as the name suggests, is Ada). http://www.adacore.com/


Right. Also note that GNAT is just GCC. The most recent language standard is Ada 2012. (I was involved in the design of the Ada standard container library, which originally appeared in Ada 2005.)


Glad to see that this old post of mine (originally posted on comp.lang.ada) has been revived on Hacker News!


I first learned Ada in a university programming languages class in 1984. It was chosen by our professor as the culmination of classic programming language ideas. We did not have access to a full Ada compiler, but we used "Janus Ada", which implemented a subset. I really liked it because it was very consistent and readable and you could create sophisticated data types, and the compiler strictly enforced their use.

A few years later (1988), I was working on a large military aircraft project that used Ada. We did a huge up-front design with reams of design documentation. The first code written was a package of data types to ensure consistency. Unfortunately, there was very little Ada experience on the team, and we sometimes designed ourselves into corners. Such problems were often solved by using "unchecked conversions" to get around the type enforcement, which defeated the purpose. The problems we had were similar to the problems huge C++ projects would have a few years later.
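
For those who haven't seen it, "unchecked conversions" refers to the language's standard escape hatch, the generic Ada.Unchecked_Conversion. A minimal, invented sketch (assuming a 32-bit Integer on the target):

    with Ada.Text_IO;
    with Ada.Unchecked_Conversion;

    procedure Loophole is
       type Word is mod 2 ** 32;  -- a 32-bit modular type
       function To_Word is new Ada.Unchecked_Conversion (Source => Integer,
                                                         Target => Word);
       W : constant Word := To_Word (-1);  -- reinterprets the bits; no checks at all
    begin
       Ada.Text_IO.Put_Line (Word'Image (W));  -- " 4294967295" if Integer is 32 bits
    end Loophole;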

The other problem we had was with the compiler. The government had placed high performance demands on the compiler, so it was highly optimized. Unfortunately, it would sometimes optimize away code that you needed. Fortunately, we could generate a code listing that interspersed the assembly output of the compiler with the corresponding Ada source code and thus find the problem. Usually adding an intermediate local variable to break up a complex calculation solved the problem.

In 1990, I got hired by a start up that was doing military contract work simulating radars for flight simulators. They had done all of their work in C, but now they had a contract that required Ada, so I became their "Ada expert." All of the C guys hated Ada because of its strictness. Ada wouldn't let you mix numeric data types without an explicit conversion, and that really chafed the C folks. It reminds me of today's dynamic/static typing debate.

The Navy ended up canceling the aircraft so they no longer needed our radar simulator. The next project I was on decided to use the new C++ language everyone was excited about. Ada was starting to fall out of favor, so waivers for the Ada mandate were getting easy to obtain.

At the time, I was thrilled to use C++. It seemed edgy compared to the bureaucratic nature of Ada. Now, however, I don't know if C++ was an improvement overall. It has some advantages, but also some disadvantages. I'm happy to see that the Rust guys are trying to learn from Ada. I was an inexperienced kid when I used Ada, so I am by no means an expert, but from what I do know, I think it still has something to teach us.


This is horribly pragmatic, but a reason that can contribute to a language's success and popularity comes from how easy it is for new users to "pick it up", "find libraries that they find useful" and build working prototypes for "itches they want to scratch".

I'm making broad generalizations, but as examples: if you wanted to build a program to perform some numerical analysis, or maybe build an analytical simulation, you could pick Fortran and you'd likely find libraries and examples to help you out. If you wanted to build a simple web page backed by a database, you could grab PHP and some of its libraries and you would quickly be constructing web pages. With Java you could find examples of tools and libraries for churning through databases and also plugging into various web application frameworks. If you wanted to write a PhD about computer languages, you could use Lisp :P. I've found great value in Python personally, because the examples were decent and, more importantly, it was easy to see how to build the programmatic bridges into the other languages and libraries I wanted to use. I'd strongly argue that the popularity of Ruby and Node.js can be partially traced to the "fun" and/or neat examples that are shown off in tutorials. They show how to leverage the constructs provided by each language's common libraries, and the resulting programs are interesting/neat to some number of potential adopters.

In the 90s, when I looked at Ada, there wasn't much in terms of libraries that I could leverage to explore problems that interested me at the time.

Again, not every language has to be ideally suited to writing video games or web pages to be useful. Ada the language might have outstanding academically interesting aspects. If it doesn't help me solve real world problems I care about though it's less likely I'll invest the time to learn about it.


I like to categorize many features that lower initial difficulty as "deferred technical debt." There are type system features that ensure a higher degree of correctness, but it's not fun debugging compiler errors when you're trying to get something working.

For example, being forced to handle errors immediately via return codes or option types is not "fun". By comparison, exceptions act as a giant GOTO and no one blinks an eye.

Dynamic typing is also another form of deferred technical debt. It is preferable to handle type errors at runtime or through testing instead of at compile time. People who claim very few bugs are a result of type errors typically do not have experience with ADTs or stronger type systems than Java / C++ / C.[0]

> Most programmers think that getting run-time errors, and then using a debugger to find and fix those errors, is the normal way to program. They aren't aware that many of those errors can be detected by the compiler. And those that are aware, don't necessarily like that, because repairing bugs is challenging, and, well, sorta fun.

I don't think this attitude has changed in the past 16 years. People prefer to debug a run time stack rather than deal with compile errors, perhaps even more so with the rise of dynamic languages.

[0] The Clojure community likes to argue that bugs arise from mutability more than type safety, but I don't have the experience to comment either way.


> People prefer to debug a run time stack rather than deal with compile errors

...probably because most popular languages give you compiler errors that sound like "#@#E WTF @#@$ WTF2x @#/&^( WTF3x..." whereas a stack trace is something that one can easily follow and make some sense of, even if you are just touching the language for the first time.


I disagree strongly with exception versus error codes. It sounds reasonable until you look at what real programmers (the kind that isn't perfect) will do and how it affects production code.

Real-life programmers don't know the stack up and down and haven't read the documentation for the stuff they use. They will not have thought through every possible error case. Sucks, but that's real life. So any solution that depends on either of those will simply fail. So there are two ways to have parts of your program signal errors to other parts. The question that matters is: what will mediocre programmers do?

If you force them to give errors through return codes, they will flat-out ignore the errors. That means the errors, and all meta-information about them (like which file it is that won't open, or which host doesn't respond, that sort of thing), disappear into a black hole. They MAY end up in some log file, or they may not. Crucially, sometimes they disappear into a log file that is custom to the library being used (at least in C). Needless to say, you will have memory leaks when this happens, and you will have invalid state. It may also lead to crashes directly; if it doesn't, it will lead to "delayed crashes" (like OOM) and all sorts of fun behavior (which is why some system engineers prefer direct crashes: they make the point where the problem gets created easy to identify).

If you give them exceptions, they will not handle the exceptions, nor will they get try ... finally correct. This will lead to memory leaks and inconsistent data. And it may lead to program crashes, BUT, crucially, it will preserve the error that was originally detected and log it in the main program, which makes it possible to actually fix the error.

Is this "deferred technical debt" ? I can understand your reasoning, but I think checked exceptions are a much, much better solution than return values.

What I find especially irritating is the moronic attitude that testing somehow makes up for not having static types. I have never seen a program that has half the tests any statically typed language automatically executes. Not even once.
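
Since this is an Ada thread, a rough sketch of that "the original error survives to the top level" property in Ada terms (the filename is made up, and a real program would handle specific exceptions rather than a blanket others):

    with Ada.Text_IO;
    with Ada.Exceptions;

    procedure Error_Demo is
       F : Ada.Text_IO.File_Type;
    begin
       Ada.Text_IO.Open (F, Ada.Text_IO.In_File, "no_such_file.txt");  -- raises Name_Error
    exception
       when E : others =>
          -- the original failure reaches the top level intact, message and all
          Ada.Text_IO.Put_Line (Ada.Exceptions.Exception_Information (E));
    end Error_Demo;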


Yeah I totally agree here.

There is a lot of hand-wringing about hiring 'the best programmers' etc., but the reality is everyone is human and humans make mistakes. Let's make the systems resilient to the mistakes that humans make - strange how this seemingly simple statement is actually quite controversial around here!

In truly large systems, all of the above happens and more. This is why error handling in C is so problematic, and why we tossed it with Java. That Go is bringing it back is a little worrisome. Then again, Go might be a hacker's language, and might ultimately end up being the repository of write-once programs that aren't maintained.


> I disagree strongly with exception versus error codes. It sounds reasonable until you look at what real programmers (the kind that isn't perfect) will do and how it affects production code.

I prefer option types or checked exceptions, but mentioned return codes because Go is making them popular again.

The most recent GnuTLS bug comes from an ignored return code:

http://blog.bro.org/2014/03/dissecting-gnutls-bug.html


> What I find especially irritating is the moronic attitude that testing somehow makes up for not having static types.

Or that static types are useless, because "you will need to make unit tests, anyway". As if the static types don't save you from making a lot of unit tests.

But I guess it's useless to wear a seatbelt, because if you crash head-on with a semi-trailer you will die anyway.


> There are type system features that ensure a higher degree of correctness, but it's not fun debugging compiler errors when you're trying to get something working.

> For example, being forced to handle errors immediately via return codes or option types is not "fun".

> It is preferable to handle type errors at runtime or through testing instead of at compile time.

Is that right.


Yeah, the wording kind of sucked there. It's better stated as, "Dynamic language users prefer to handle type errors at runtime or through testing instead of at compile time."


To put it another way

"It's the ecosystem, stupid."

Apologies to Bill "It's the economy, stupid" Clinton.


Poor James Carville. According to Wikipedia he was the one who coined that phrase. But Clinton gets credit for it, much like (IIRC) Edison got a lot of credit for the work of his minions.


My dad is a retired DoD engineer who worked extensively with Ada. Sent him this link, and this was his response:

Yeah, I liked Ada because it was very readable, and decomposition into smaller units was easy. I always liked the package concept.

Our compilations weren't all that slow. Maybe an hour or so for 500,000 lines of code. It was always best to compile units individually, not to build everything and compile all at one time. It could take days to get through compile-time errors that way. A lot were due to mismatching data types.

Once we tried to upgrade to a new version of the compiler, but couldn't get it to work with our legacy code. We had the vendor send us an engineer from Phoenix, and he could never get it to work, so we gave up and kept using the old compiler.

I hear Ada is still popular in Europe. There were a lot of useful improvements in Ada95 that eliminated some of the clunky features.

On large systems like ours we had subdivided the code into many separate functional units. Each unit consisted of a directory of multiple packages that all worked to perform a certain function. Each FCI directory would usually contain an interface spec and code for interfacing to any other FCI that needed to share data with it. There were a couple of ways to share data. One way was to send an FCI a message that we were putting data into shared memory, and then the other would grab the data. Another way was to just send a message containing the actual data. This worked ok for small amounts of data. I think to communicate with external devices at the lowest level we were calling c code.

Software engineering on a large project is actually more fun than in a small group, I think. There's more activity and hustle and bustle. Also, it does force a certain amount of discipline that one would rarely apply on an individual basis. Some of the code reviews could be brutal. Or at least so it seemed.

Testing was always a challenge on the real hardware because it always involved multiple computer systems networked together sometimes with wireless devices, and it could be a challenge just to get them all stable and talking together, let alone getting your own code working. When we went to a test facility for a week or two to test we always kept our fingers crossed. Usually nothing would work for the first two or three days and then on Thursday and Friday the problems started to go away and were magically solved, and we could go home successful. It was always very nerve wracking because of course there was a whole management schedule that depended on it.

Of course the boring part was writing software requirements documents, software design documents, test procedures and the like. We weren't supposed to design while coding, and weren't supposed to write any code until our design was complete. But a lot of times we didn't know how to design it, so we would write code in advance and call it prototyping. But if we reused our prototype code we could get huge code counts during the code phase, because we had already written that during the design phase and charged it to design.

Since we got assessed partly by the amount of code we could write in a given time, sometimes you would find people who, instead of writing a subroutine, would just duplicate the same code multiple times. It increased their code count. It really didn't pay to write really tight, highly efficient code because of the development time constraints. However, then during testing you might have to go back and fix it to perform better.


> I think to communicate with external devices at the lowest level we were calling c code.

As someone who spent a few years maintaining the Ada-to-C code wrappers our team would use, I would say he's correct.

We used Ada wrappers to make the system calls to C to do networking and Unix IPC. While Ada can call C directly, we had to support HP-UX and Sun, so sometimes we had to write C code and call that (#ifdef HPUX). This was a pain.
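
A minimal sketch of what such a thin wrapper can look like, binding the POSIX close(2) call (the package and Ada-side names are invented):

    with Interfaces.C;

    package Posix_Calls is
       -- thin binding: the Ada declaration mirrors the C prototype
       function C_Close (FD : Interfaces.C.int) return Interfaces.C.int;
       pragma Import (C, C_Close, "close");
    end Posix_Calls;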

The package system was very good, especially at dividing work among a large team (30+ programmers).

We had a lot of fake external hardware simulators we used to test our code against. This worked well as long as our simulators worked like the actual external hardware. It did, with some frustrating exceptions discovered, of course, during the final test.

Dealing with time is a pain in all languages (UTC vs GPS, with sidereal time thrown in for fun).

I remember some of those procedures (design -> code -> integrate -> formal test -> ship) that DoD projects had. We counted SLOC (Source Lines Of Code), but thankfully our reviews were never based on them. Code reviews could be useful or just frustrating. I suspected some people were just being difficult so they wouldn't be invited to as many of them.

Interesting to hear others' perspectives on it.


> Most programmers think that getting run-time errors, and then using a debugger to find and fix those errors, is the normal way to program. They aren't aware that many of those errors can be detected by the compiler. And those that are aware, don't necessarily like that, because repairing bugs is challenging, and, well, sorta fun. You are not giving a programmer good news when you tell him that he'll get fewer bugs, and that he'll have to do less debugging.

What. the.


Sure, it might sound crazy, but I've met plenty of programmers who were proud of using tools that make them do lots of extra work. It is a skill, even if it's not the most cost-effective way to get a working product.


In university I actually tried to get into Ada, especially since I studied compilers with Professor Ploedereder (great guy!), who was working on (and at some point chairing) the ISO standardization. I loved the story about how the committee got into a serious fight over whether Sunday or Monday should be the first member of the week enum.

It just wasn't my thing, although I deeply admire the language. There has been fantastic new stuff in the 2005 and 2012 versions. Like the Ravenscar profile for embedded and safety-critical systems.

And just like the "Young Lispers" did with Lisp, Ada had some sort of renaissance, where a lot of Ada stuff was going into Debian and young Open Source people dreamt of re-writing the Internet. :-)

I think it was partly the tool chain that repulsed me. There is really only one compiler in existence if you're a hobbyist: GNAT.

Nowadays AdaCore is pretty open, having a big open-source compiler download page and all, but back then you could get GNAT, as shipped with GCC, or "big GNAT".

GNAT, as shipped in GCC, was always at least one version behind, and since AdaCore always hired everyone who got familiar with the GCC frontend, it was highly dependent on AdaCore. So there was no "second source" of GCC maintainers who could step up if AdaCore ever left. The Ada frontend was just a second-class citizen.

"Big GNAT" was something everyone would have liked to use, but it was expensive. So that was out of the question.

Well, I got it, as part of a special university cooperation program, but only after signing that I would only use it for the assigned coursework, that I would never distribute it to anyone, etc.

Oh, and how did AdaCore manage to keep it closed? I mean, wasn't the predecessor on which GNAT was based GPL'ed?

Yes, and "big GNAT" also was. Kind-of.

Rumor has it that AdaCore basically told all of their customers unofficially: "sure, you're legally free to distribute it, it's GPL after all. Just don't be surprised if we don't pick up the phone for a few hours when you need support".

As far as I know it never really leaked. For some time I was actively looking for some "big GNAT" archive on the net, but never found one.


Ada's main popularity problem appears to me to be due to its advantages not being immediately obvious when writing small programs. For example, comparing a small program written in C and Ada, the Ada version would appear extremely verbose to most C programmers.

Ada's advantages start to really show when you deal with much larger pieces of software; what appears verbose or overly-strict at a smaller scale provides valuable assistance when dealing with large codebases.

Unfortunately, many seem to dismiss Ada after doing little more than looking at small code examples and complaining about verbosity...


We learnt Ada at University in the late 90s. It was OK, but nothing amazing.

Then I went into the real world, and my first job was with Delphi (Object Pascal).

It's surprising no one discusses Delphi alongside Ada, because it had many of the same features and the same Pascal ancestry, but better tooling, and it was aimed at commercial as opposed to defense work.

For a while it was very successful. It lost because of commercial reasons, but for a long time it was a better Visual Basic.


I used Ada(95) a lot. All the radars built by the company I worked for are using it today.

Ada has a lot of things going for it that I miss. It was robust. Custom types (this is an int from 0 to 100; if it goes out of range, throw an exception). Records (structs) were nicely implemented. The package system really worked well for large software.
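
Roughly that style, as an invented sketch: a range-checked fixed-point type and a record with defaulted components, wrapped in a package:

    package Radar_Types is
       -- a constrained fixed-point type: out-of-range values raise Constraint_Error
       type Bearing is delta 0.1 range 0.0 .. 359.9;

       type Track is record
          ID      : Positive;
          Azimuth : Bearing := 0.0;  -- record components can carry defaults
       end record;
    end Radar_Types;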

It was interesting. It suffered a lot from running on systems that are written in C. Although you can make system calls through a wrapper, it was odd, so we wrote packages to interface with the underlying C libraries we needed to call (Java did this as part of its core). We had a lot of libraries to make the networking and Unix IPC work the same across our HP-UX and Sun systems. The displays were written in C, as it just wasn't really feasible in Ada.

It never reached the critical mass to get great tools, so debugging could be a pain. It didn't have a lot of the useful third-party libraries that make a language powerful and fun.

When your Ada compiled, there was a good chance it would work.

I do miss parts of it. When I see Go code sometimes I get flashbacks.


I know this is a tangent, but as someone who hasn't programmed Ada, I'm struck by how similar the superficial syntax of VHDL is to Ada's.

See: http://en.wikipedia.org/wiki/VHDL#Design_examples vs: http://en.wikipedia.org/wiki/Ada_programming_language#Langua...

Same "use packagename". Same "entity foo is", same block definition by "begin [...] end entity"

Coincidence, or is there a historical reason for this?

Edit: The answer was in the VHDL article itself: "Due to the Department of Defense requiring as much of the syntax as possible to be based on Ada, in order to avoid re-inventing concepts that had already been thoroughly tested in the development of Ada,[citation needed] VHDL borrows heavily from the Ada programming language in both concepts and syntax."


VHDL was modeled after Ada.

And yes, the languages look almost identical.


Right, VHDL was designed by the team at Intermetrics (which also had a long association with Ada: Ben Brosgol was the designer of the Red language that lost out to Green).


Holy Shizzle, you met Jean Sammet?! I am super curious as to what she thought was wrong with Ada (not that I would disagree).

Reading her paper on FORMAC now, http://dl.acm.org/citation.cfm?id=155372.

I chatted with Ivan Sutherland a couple years ago. Nice, prescient fellow. His Fleet project is really cool, http://arc.cecs.pdx.edu/


So who here (at HN) is using Ada, what for, and do you expect it to become more or less widely used in the future?


Until leaving Oracle about a year ago, I'd been doing a lot of development in PL/SQL, which is extremely similar to Ada and compiles to the same intermediate representation (called DIANA).

Most of Oracle's E-Business Suite is written in PL/SQL, although some components are being re-written in Java (ADF) in the new Fusion Middleware stack.

If your company is using Oracle for payroll or financials (this includes lots of companies, including Apple, Google etc.), it's likely some of the customizations have been developed recently, and they've been developed in PL/SQL. Many of the components in EBS have not changed in decades.

So, to answer your question: developers at Google right now are writing in Ada (well, almost):

> https://www.google.com/about/careers/search/?#!t=jo&jid=1144...

It's not easy to rewrite either. Writing ERP software is extremely hard, because it can't be cleanly modeled in a few abstractions. There are a lot of edge cases. I would suggest reading Ward Cunningham's http://c2.com/cgi/wiki?WhyIsPayrollHard


Interesting nugget, I never knew that. I have done a lot of PL/SQL development and thought it was pretty OK. Not my favorite language, but not the worst either. I do recall the occasional obscure "Cannot generate diana for an object" errors and never really knew what those meant; the Oracle documentation was very vague and basically said it was a "contact support" type of error.


Yes, PL/SQL occurred to me too. It has its flaws, but I greatly prefer it to T-SQL.


I use Ada every day, developing a predictive dialer and contact management system for call centers, plus its web-based front-end. Interest has been steadily growing since I started with the language in 2006. One (possibly useless) indication is that the #ada channel on Freenode used to hover in the single digits and now regularly hovers around 70. People regularly enter the channel asking how to get started.


We still have some Ada code lingering in a suite of avionics simulation tools; I am actually right now in the process of migrating our code to GNAT from a proprietary Ada vendor.

But we don't tend to write new code in Ada. My observation in the avionics / defense world is that Ada is being used less and less, in favor of C, C++, or sometimes Java. Even some old Ada code is being rewritten in other languages.


I hear Kongsberg writes their missile software in Ada.


Technically not the linked page's title (the delightfully uninformative "Ada - AdaPower.com - The Home of Ada"), but the title of the post on it.


That's just fine. More than fine: it looks like the author's title for what he wrote, which is what we want.

From https://groups.google.com/forum/#!topic/comp.lang.ada/JwLB7Y... I surmise that the content is from 1998?


Yes, forgot to look that up to put in the submission title. Thanks.


When I first started college, the intro to computer science courses were taught in Ada (this was back in '97). I don't have too many memories of it, but I do remember how verbose the language was, and that drove me pretty crazy. It's not very good for teaching programming to someone who has no idea what they're doing, but with much more experience, I can appreciate the language for what it is and what it can do.

And yes, I do realize that the verbosity and safety are good qualities and can certainly teach a lot, but for a 17/18-year-old who may have never seen a programming language before, it's pretty crazy. It should definitely be taught at higher levels, so as to reinforce good programming standards and practices. Have a few years of wild screw-ups and crazy bugs and gain an understanding of why you need safety and so on.


I too encountered Ada in college (around 1990, give or take a year), and yes, compared to other languages I was familiar with at the time (BASIC, Pascal, assembly, C, Fortran) it was quite verbose, and the compiler (we had to log into the VAX system to use it) was very fussy - to the point where we joked that once we got past the compiler, the program would not crash (now, whether it would produce the correct output remained to be seen).

It's amusing to think these days that C++ is probably larger and more complex than Ada ever was (or even is), but C++ crept slowly to that complexity level, whereas Ada was "large" at the start and hasn't gotten much bigger. It seems that if you want a large language to be popular, you should start with a small language and grow it over a twenty-year period.


Accessibility was a problem. As a poor student I couldn't afford to get a cheap Ada compiler to play with, while Assembly, BASIC, C, Scheme, Lisp, and Pascal were readily available for free or for little cost.


I think Ada is a really nice language. It's not my go-to because you have to type so much. Even with vim auto-completion it still feels like I'm typing a lot of code.


Nice list, incomplete though:

- For a long time now, nobody has known (or cared) what Tony Hoare said in 1980 about Ada, yet Ada isn't taking off.

- GNAT is free, yet Ada isn't taking off.

So?

It's difficult to say, but IMHO what he forgot is: the syntax. Ada was designed to be verbose, and I think programmers don't like a verbose syntax. Has anyone tried to make a terser syntax for Ada? Elixir seems to be reviving interest in Erlang, so...


I would love to know some examples of 12-to-1 language design arguments that Jean Ichbiah vetoed. What is a good source for the Ada design process?


maybe http://archive.adaic.com/standards/83rat/html/Welcome.html

i.e.

    The choice of identifiers for reserved words and attributes depends primarily on convention.
    Preference is given to full English words rather than abbreviations since we believe full words to be simpler to read.
    For instance procedure is used rather than proc (in Algol 68) and constant rather than const (in Pascal).
    Shorter words are also given preference: for example access is used in preference to reference, and task is used in preference to process.
There should be an older version too.


"and nowadays, everyone seems to think exceptions are a Pretty Good Idea." - really? I don't think so.


Side question: is there any C or C++ compiler extension that forbids some bad programming practices, or some stricter C/C++ dialect?

Ada seems to have safety in mind, but isn't this achievable with static analysis? Aren't there attempts to attach static analysis to a compiler?


Oh, yes, that's what the SPARK language does. There's also CodePeer, which does a static analysis of your Ada code. GNAT itself also does many static checks.
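
As a rough flavor of what that looks like: a hypothetical Ada 2012 function with Pre/Post contract aspects, which GNAT can check at run time and the SPARK tools can attempt to prove statically:

    function Clamp (X, Lo, Hi : Integer) return Integer
      with Pre  => Lo <= Hi,                    -- caller's obligation
           Post => Clamp'Result in Lo .. Hi     -- our guarantee
    is
    begin
       if X < Lo then
          return Lo;
       elsif X > Hi then
          return Hi;
       else
          return X;
       end if;
    end Clamp;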


Why Ada Isn't Popular (2014): Everyone outside the DoD wrote it off as dead in 1998.


Not just the DoD (though, admittedly, there are a large number of military projects in this list):

http://www.seas.gwu.edu/~mfeldman/ada-project-summary.html


Sadly, the DoD doesn't require Ada anymore and many software failures can be attributed to bugs that Ada would have caught at compile time. If I were doing flight control software it would probably be Ada checked with Agda.


I know, it's frustrating.

My previous employer chose C over Ada for all their projects, and they paid for it when you looked at the overruns and the additional money spent on testing and analysis tools to get what Ada provides as a language feature.


Ada-based languages (such as SPARK) are still used in developing modern aviation systems.


http://en.wikipedia.org/wiki/SPARK_(programming_language)

It would be interesting to have a version of SPIN [0] using this. This [1] looks really fresh. And I love the idea of creating new languages out of a subset. Almost all teams create an ad hoc language subset; formalizing it can be really powerful.

[0] http://en.wikipedia.org/wiki/SPIN_(operating_system)

[1] http://www.spark-2014.org/about/


Hah! This brings back memories as in 1998 I had Professor Feldman as a CS freshman at GWU. And, you guessed it, we learned Ada as our first language.

The main thing I remember is that it was very annoying to fix indentation issues in a terminal text editor, although learning about packages was very cool. Irrationally, indentation strictness annoys me in modern languages like Python, too :) That, and using := instead of = felt like a waste!


Love the subtle approach into "Ada isn't popular because people are irrational and believe things that are totally off-base from reality."

I've never hacked Ada; it seems to have been totally eclipsed by languages like OCaml that give you a rich type system and a good degree of speed, but when I was learning about programming and languages it seemed insane that people didn't go for Ada. Was being able to be sloppy really that alluring? Did people just love segfaults?

It makes a lot more sense to think of it in terms of irrational actors making choices for largely social/political reasons rather than calculated cost-benefit models.

Basically everything makes more sense that way. HN just had a big thread about the death penalty that illustrated this perfectly. Some people just believe things because they want to. It's really, really hard to be accurate to reality.

What could have been done to make Ada popular? How can we hack humans to prevent huge losses like this from occurring? What's the Ada of 2014?


> What's the Ada of 2014?

Please post more meta questions. This is important.



