Hacker News
Doctor Fortran in “The Future of Fortran” (intel.com)
25 points by nkurz on March 28, 2015 | 17 comments



Fortran is still going strong. At a High-Performance Computing seminar I attended a few years ago, the speaker said that around 50% of high-performance computing applications are written in Fortran (you know, those shiny little programs that can hog a million CPUs for hours, weeks, or months). The other half is mostly C and C++.

However, programming in Fortran is uncomfortable. Just like programming in C. The advantage of Fortran over C in High-Performance Computing is that it's less uncomfortable to write maths and things with arrays in Fortran.

If you ever need to handle enormous arrays without breaking a sweat or solve systems of equations with hundreds of thousands or millions of variables, you can give Fortran a try. The syntax has come a long way since FORTRAN 77.

Fortran's strength has always been speed. In the early days most Fortran programs would run faster than C programs. The trick is to intentionally leave out features that bloat the language, so that everything left in it can easily be optimized by the compiler. Only with the C99 standard (I think) did C catch up to Fortran in most applications, thanks to stricter aliasing rules.


The main reason Fortran is so extensively used today is not the merits of its new features but the incredible amount of legacy code (intrinsically complicated, and all the more entrenched because it is highly optimized). I guess that most of that 50% running in HPC is FORTRAN 77 written many, many years ago, with a bit of Fortran 90/95 and only an insignificant fraction of Fortran 20xx.


At NASA Goddard, they run climate models on our supercomputer and they are mostly in Fortran. I asked one of the developers once and he said the bulk of it was in Fortran 90/95. But there is still a lot of Fortran 77 code around, particularly in well-debugged libraries. This developer wanted to use a more modern dialect and take advantage of some of the new features, like object orientation, but it was hard to get the scientists to learn them. That's a big part of why Fortran endures: simple inertia.

During development of the original ANSI standard for C, the Cray rep and others wanted to add features that would enable C to kill Fortran. This meant adding complex numbers (which happened in C99 and was made optional in C11) and doing something about unrestricted pointers. The latter led to the noalias fiasco, but ultimately we got restrict, which accomplishes the same goal.


Assuming the FORTRAN 77 code was optimized for the hardware of its day, back when FORTRAN 77 wasn't considered an ancient language, is it still optimized on modern hardware? It seems the vast increase in the relative importance of cache alone, not to mention the great changes in cache sizes, would make some of those hand optimizations fairly pessimal these days (for example, loop unrolling blowing out the I-cache).


Languages aren't optimized, compilers are, and Fortran compilers, commercial and free, are released regularly.


> Languages aren't optimized, compilers are

Actually, specific pieces of code get optimized, and compilers that can "undo" optimizations put into the source code by hand are so rare I daresay they don't exist.


The Intel compiler is reputed to generate executables that run about twice as fast as those from gfortran. I haven't seen any recent tests of Clang/LLVM yet. The statement that compilers, not languages, are optimized generally holds true for Fortran.


"Considered" maybe, but not measured in practice. Of course you need equivalent optimizations to generate similar code -- Intel even defaults to incorrect optimizations. GCC is also substantially more reliable in my experience watching users who insist on using Intel on HPC systems without actually measuring anything.

Of course compilers are relevant, but Fortran has long presented optimization possibilities that aren't typically available in other languages, like being able to assume that procedure arguments don't alias.


It's also that for many HPC science projects, the programming effort isn't anywhere near the main effort. The science behind the code takes years, while the code's execution takes hours, days, or months. The people writing the code are scientists, not programmers, and aren't all that interested in the languages they use being 'nice'.


This is my experience. It is so much easier to get a postdoc up to speed in Fortran than in pretty much any other language. The best-case scenario is that professional programmers write the frameworks and infrastructure, and non-programmers produce subroutines and other functional parts of the code. The worst-case scenario is when the entire code base is produced by non-programmers...


No, Fortran is just plain fast. It's not just institutional inertia.

Luckily there are things like Julia now that may eventually change the landscape.


Can Julia compete with Fortran and C? I thought Julia still uses BLAS.


So do Fortran and C. Optimized BLAS implementations are written in assembly.


> I guess that most of that 50% running in HPC is FORTRAN77 written many, many years ago

Is this blind speculation?


It's a guess based on my limited experience.


unpossible.

       DO 100 I=1,10
  100  SUM = SUM + X(I)
No longer supported? What? And don't get me started about COMMON and EQUIVALENCE.

To be blunt, FORTRAN 77 was probably the most practical language ever devised. What is this monster?

I predict more code written in FORTRAN 77 is being used today than all of the code written in later dialects combined. And this will still be the case for the next 20 years.


Note that none of these will actually be removed from any compiler you are likely to use, but their use will be flagged if you ask for standards checking.



