
True, but nevertheless Ada is an important language in the history of programming languages.

After 1970, there were fewer and fewer innovations in programming languages, in the sense of features that had not existed in any earlier language.

Many new languages have been introduced since then, and some of them might be better than most previous languages, but usually the new languages offer only new combinations of features that previously existed in different languages, not anything really new.

Ada is important, because in 1979 it included a few features never provided before. Some of those features have been introduced only recently in more popular languages, while others are still missing from most languages.

For anyone who wants to create or improve a programming language, Ada is on a long list of mandatory programming languages that must be understood well before attempting to make anything intended to be better. Sadly, there are many examples of ugly misfeatures in recent programming languages which demonstrate that their authors were not aware of the state of the art 40 or 50 years ago, so they solved again, but badly, problems that had already been solved well in the distant past.

Unfortunately Ada also had defects, some of which were mandated by the Department of Defense requirements.

The defect that is most universally accepted is that it is too verbose.



>> The defect that is most universally accepted is that it is too verbose.

Another huge barrier (especially to early adoption) was the cost of Ada toolchains.

Even today, there are proprietary Ada implementations that cost thousands of dollars per seat.


There are also C and C++ toolchains that cost similar amounts (if you want to use them for safety critical systems). But they do have more free or open source options than Ada does. Fortunately FSF GNAT is free and unencumbered (unlike AdaCore's release of GNAT GPL).


> But they do have more free or open source options than Ada does. Fortunately FSF GNAT is free and unencumbered (unlike AdaCore's release of GNAT GPL).

Sorry, I don't quite get what you're trying to say here. You mean unencumbered by being free or open source?


GNAT GPL removes the runtime library exception, so if you build something with it linked against its standard library, your program is also supposed to be open-sourced. This means you can't (in a legal sense, though nothing technically stops you) make closed-source software with it. FSF GNAT keeps that exception, so it can be used to release closed-source software.

That's the encumbrance that GNAT GPL imposes and FSF GNAT does not.


Got it, thanks!


>Some of those features have been introduced only recently in more popular languages, while others are still missing from most languages.

Can you give an example of an Ada feature missing from most languages? I know Ada is supposed to be good for writing reliable software; are there any important features related to that which other languages could adopt?


A feature that was missing from most languages for a long time, but which has been adopted by many during the last decade, is accepting separator characters in numeric literals, to improve the readability of long numerical constants.

Ada introduced this in 1979, by allowing "_" in numbers. Cobol, in 1960, had allowed hyphens in identifiers for better readability. Because hyphens can be confused with minus, IBM PL/I, in 1964, replaced the hyphen with the low line, which remains in use today in most programming languages. Ada extended its usage from identifiers to numbers.

Most languages have followed Ada and also use "_" for this purpose, except C++ 2014, which, based on a rationale that I consider extremely wrong, substituted the single quote for the low line.

While this is an example of an Ada feature that could be easily adopted in any other language, other features are more difficult to adopt without important changes in the language, so they did not spread much.

An example is the specification of the procedure/function parameters as being of 3 kinds, in, out and inout.

This feature was not invented by the Ada team, but by one of the authors of the DoD IRONMAN language specifications, maybe by David Fisher, but the DoD documents do not credit any authors.

In the predecessor of Algol, IAL 1958, procedure parameters had to be specified as in or out. However, this feature was dropped in ALGOL 60. Nevertheless, there was a programming language, JOVIAL, which, unlike most programming languages, was derived directly from IAL 1958 and not from the later version, ALGOL 60.

So JOVIAL inherited the specification of parameters as in or out. JOVIAL happened to be used in many DoD projects, and because of this it influenced the initial versions of the DoD requirements, which eventually resulted in Ada.

The first DoD requirements included the specification of in and out parameters, but in the beginning the authors did not have a good understanding of how the parameter specification should be used, so the requirements were badly worded, implying that this specification was meant to determine whether parameters would be passed by value or by reference.

After several revisions of the DoD requirements, the IRONMAN requirements were issued in 1977, and they were very much improved. By the time IRONMAN was written, the authors had realized that whether parameters are passed by value or by reference is an implementation detail that must be decided by the compiler and must be transparent to the programmer.

Moreover, they realized that three categories must be specified, i.e. out and inout must be distinct, because their semantics are very different and the compiler must take different actions to implement them correctly.

Many current programming languages are much more complicated than necessary because they lack this 3-way distinction of parameters.

The language most affected by this is C++, which struggled for 30 years, from 1980 until 2011, before it succeeded in including in the language the so-called "move semantics", to avoid redundant constructions and destructions of temporaries. Even now that the extra temporaries can be avoided, doing so requires a contorted syntax.

All such problems could have been trivially avoided from the beginning if C++ had taken the "out" and "inout" specifications from Ada. During the transition from C with Classes to C++, in 1982-1984, C++ was nonetheless strongly influenced by Ada in its introduction of overloaded functions, overloaded operators and generic functions (templates).

While C++ introduced reference parameters to avoid writing large quantities of "&" and "*" as in C, it would have been much better to apply the Ada method, where it is completely transparent whether parameters are passed by value or by reference, and programmers never have to deal with "&" unless they use explicit pointers and pointer arithmetic, i.e. wherever pointers are really needed for their extra features, not for telling the compiler how to do its job.

That purpose is as obsolete as using the keyword "register" to tell the compiler where to allocate variables. Even C/C++ compilers ignore the fact that the programmer wrote that an input parameter shall be passed by value, and they pass it by reference anyway if the parameter is too large. This should have been the rule for every kind of parameter.


Thanks, interesting! However I can't help but notice that the in/out/inout stuff wouldn't do much for a modern dynamic language like Python or Ruby... are there features of Ada that those languages could do well to adopt?


> many examples of ugly misfeatures in recent programming languages

I've seen more misfeatures that seem to come from people not knowing the history of JCL. The lesson would be to plan for expansion of the semantic space.

> The defect that is most universally accepted is that it is too verbose.

It feels less verbose than Java or C#.


How verbose is it compared to others like Go or Rust ?


Somewhat more verbose.

A great part of the verbosity is due to the fact that, unlike CPL/BCPL/B/C and the languages they inspired, which replaced the Algol statement brackets begin and end with "{" and "}" or similar symbols, Ada uses relatively long words as statement brackets, e.g. loop and end loop.

On the other hand, a good feature of Ada is that it followed Algol 68 in having different kinds of statement brackets for different program structures, so you never need to spend time wondering whether a closing "}" matches one opened by a "for" or by an "if" many lines above.


Moreover, besides the long statement parentheses, the other main source of verbosity in Ada is that Ada does not use abbreviations.

Most programming languages use a set of abbreviations that appeared in either PL/I or Algol 68, but Ada does not use them; for example, Ada uses constant, procedure, character, integer instead of const, proc, char, int.


Even setting aside keywords, the real verbosity with Ada is that nearly everything is explicit, not implicit. In C you have many implicit conversions between (similar) types, in Ada these are always explicit. In C++ you have implicit instantiation of templates when they get used, in Ada you must explicitly instantiate a generic before it's used.

On the other hand, arrays carry their range information with them, so you don't need to pass it explicitly as in C. And having types with explicit ranges means you can use them and trust that they'll work correctly (which may include erroring out when used incorrectly, like adding 1 to the largest value), whereas in most other languages you'd have to include explicit range checks at potentially numerous locations throughout the code (did we start with a correct value, did we end with a correct value?).

Tradeoffs.


You are right.

However, the Department of Defense requirements prohibited implicit conversions of any kind, without making any distinction between safe conversions, which preserve the value and are reversible, and unsafe conversions, such as truncation, rounding, or signed-to-unsigned conversion.

The complete lack in Ada of some very frequently needed implicit conversions is annoying, and it does not decrease the likelihood of bugs; rather, it increases it, because the resulting code bloat can obscure the erroneous absence of some meaningful operation.

However, this defect is on the DoD, not on the Ada authors.


Honestly, I don't get why people care so much about typing loop and end loop etc. in their code. But after using Nim, where I can just ignore the default style and write everything in snake_case (which brings me joy, as I find camelCase harder to read and annoyingly ugly), I think more languages should be style-agnostic (with tools for converting between styles for others' reading benefit). Ada could support curlies AND verbal curlies, letting people who see no benefit in loop/end loop just use {}.


With good tooling, it might be possible to have autocomplete once you write the b of begin or the l of loop, which would reduce the typing verbosity.


Like autocompletion, syntax coloring also becomes more important with verbose programming languages, to help the relevant text stand out from the large areas occupied by keywords.


That's a good point too.



