Fortran Web Framework (fortran.io)
272 points by neatze on Sept 13, 2021 | 167 comments



Let's not forget about a C++ committee member tweeting that Fortran does a better job at package management:

https://twitter.com/blelbach/status/1387618008185524225


I wanna begin my journey with learning C (again). Why isn't there a normal package management system? I mean, there's been a lot of time to solve this issue; is it a technical issue or just a lack of innovation in C?


Lack of innovation (which has tended to focus on other, less limited, languages), different culture (pre-web greybeards, embedded software), and fragmentation (there are far more C compilers in use than for almost any other language; most other languages have only a single implementation).

Plus many of the implementations are closed source.

C also tries to make as few assumptions about the platform as possible. It was only recently that support for ones'-complement arithmetic was dropped (http://www.open-std.org/jtc1/sc22/wg14/www/docs/n2218.htm), despite such machines not having been used for decades (https://stackoverflow.com/questions/12276957/are-there-any-n...).

C is also somewhat unique in supporting cross-targeting, where you build a program on platform A that will run on platform B (such as a microcontroller), and the target platform may be very different (such as no filesystem, no OS, different instruction set). Most other modern languages use intermediate bytecodes and try to be "WORA" (write once run anywhere).


> I wanna begin my journey with learnin C (again). Why isn't there a normal package management system?

That sounds like a distro.


Could you explain further?


As absurd as it sounds, perhaps the simplest way to manage the dependencies of your C projects is to install the library package offered by your distribution.

C has a terrible dependency management story.


Why is it like that though? Do C programmers generally have less of a need for 3rd party libraries?


The C programming language predates the time when using third party libraries was common.

When it became commonplace to depend on third party or system libraries, OS and distro developers - because of the special status C enjoys in the software infrastructure - bent over backwards to accommodate C linkage, providing pre-packaged dynamic linking objects.

The status quo is actually rather unfortunate. Several package management tools exist that attempt to address the issue, but there is no de facto standard, and every project makes its own choice.


> The C programming language predates the time when using third party libraries was common.

The English language predates spaceflight, but has adapted. PHP predates proper security practices, but has adapted. JavaScript predates the Document Object Model, but has adapted.

For that matter, JavaScript also predates the time when using third party libraries was common.


I'm not passing a value judgement, that was simply an "historical" account of what happened.

C had such a powerful influence on computing that it didn't need to adapt to the world, the world adapted to C.


Yet language still takes time to adapt. Formally, it takes years for words to enter the dictionary. Informally, it takes time for new vocabulary to spread. For example, in the early days of computers, it was very common for terminology to be used incorrectly.

Of course, there is more going on with C than the use of new "words" introduced through libraries. Even though it is viewed as a portable language, most of that portability is between hardware architectures and operating systems of the same class. Almost everything above that is handled by vendor specific libraries so there was likely less of a drive to standardize on library management. This has negative consequences in today's environment, but it would have taken foresight to predict that outcome.


The English language doesn't really maintain backward compatibility, though.


Be the change you want to see, then?


You're not the first to suggest that I need better package management... what has my ex been telling you?!?


In a manner of speaking (and historically), yes. Code that people write in C is often low-level enough so that it only requires the standard library and the OS API.


Yes, what you see in JS is totally insane compared to what people used to do in C and C++.

And you just install packages from your Linux distro repository. If you want to distribute your own code, then you make a package out of it, and that will handle your dependencies for you.


This doesn't work very well in the common scenarios such as:

- you are not using Linux

- you need a newer version of the library than your distro provides

- you want to use a package that is not popular enough to be packaged by your distro


Because Unix is the C package manager.


No, it is that most 3rd party libraries a C programmer needs are already offered as part of the OS distribution.


I disagree with this. There is no C equivalent to the C++ STL included as part of any OS distribution, which means that any C code needing something as basic as a hashtable has to rely on a nonstandard third party library.



libdb


Well, C packages are usually just libraries you install with a distro's package manager. Distros (distributions, e.g. Linux distributions) are mainly an effort to integrate many programs and libraries into a usable, complete OS. The package manager of a distro serves the purpose that a language-specific "package manager" would, but for C programs as well as C++, or anything else they wanted a package for in their distro.


A lot of questions arise: 1) How do you do deployments? E.g., in Node/Ruby/Python you can simply have a package.json (or Gemfile) with all the dependencies. 2) How do you handle dependencies? Let's say I am building a new C library that depends on 5 other dependencies; how do I let users know?


> How do you handle dependencies? Let's say I am building a new C library that depends on 5 other dependencies; how do I let users know?

Via a Dockerfile? ;)

But seriously, this is typically done in the INSTALL.txt installation instruction file for your library, and it’s up to the user to make sure any dependencies are properly installed and configured.

For dependencies that aren’t commonly installed, tools will often include their source and build them from scratch.

It’s misleading to say C package management is broken, because it doesn’t have any.


1. The same way you deploy Ruby/Python to your system? They are written in C, and most teams have no issue getting those deployed. If your answer is that you are deploying Ruby and Python using the system package manager, then the answer here is that you deploy the dependencies for your C code using your package manager.

2. cmake and autoconf will both let you list dependencies and provide error messages to tell users to install them. It won't auto-install dependencies like a package.json or tell a user where to get the dependencies, but the message that something needs to be installed can be passed to the user.
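
For autoconf, a minimal configure.ac sketch using pkg-config might look like this (mytool and libcurl are just example names):

    AC_INIT([mytool], [1.0])
    AM_INIT_AUTOMAKE([foreign])
    AC_PROG_CC
    # configure aborts with a clear error if libcurl >= 7.0 isn't installed
    PKG_CHECK_MODULES([CURL], [libcurl >= 7.0])
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

It won't fetch anything for you, but the user at least gets told exactly what's missing.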


> How do you do deployments?

For executables: actual real statically linked binary. You have a single executable containing all the dependencies and all necessary runtime support.

For libraries: no standard mechanism. On Linux, you can generally make a "libfoo" package that specifies the dependencies. On Windows, this is a bit more of a problem, but people can and do just hand out a DLL.

Or you make people build the dependencies from source as well.

(As an old school C developer, I see containers as occupying the same conceptual space as statically linked binaries for distribution. It's just that very few other languages will let you make a true standalone executable.)


You can do deployments with dependencies by building a distribution package of your own software, using the distribution's packaging tools. The tools do work quite well. The real problem is that this only works reliably for a single OS distribution. To support another Linux distribution, you need to redo the packaging. Even distributions that use the same package format (such as rpm or deb) are not quite compatible, since the names and versions of the dependencies vary.


> 2) How do you handle dependencies? Let's say I am building a new C library that depends on 5 other dependencies; how do I let users know?

https://cmake.org/cmake/help/latest/command/find_package.htm...
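
A rough sketch of what that looks like in a CMakeLists.txt (mytool and the two dependencies are placeholders):

    cmake_minimum_required(VERSION 3.16)
    project(mytool C)
    # REQUIRED makes configuration fail with an error if a package is missing
    find_package(CURL REQUIRED)
    find_package(ZLIB REQUIRED)
    add_executable(mytool main.c)
    target_link_libraries(mytool PRIVATE CURL::libcurl ZLIB::ZLIB)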


I just make a PKGBUILD file for my project. Lets me specify dependencies, build and install instructions, anything it needs. Then I can build the package and install it on any Arch Linux system.
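
For anyone who hasn't seen one, a stripped-down PKGBUILD sketch (names and versions made up):

    pkgname=mytool
    pkgver=1.0
    pkgrel=1
    arch=('x86_64')
    depends=('curl' 'zlib')      # runtime dependencies
    makedepends=('cmake')        # build-time dependencies
    source=("mytool-$pkgver.tar.gz")
    sha256sums=('SKIP')

    build() {
      cd "$pkgname-$pkgver"
      make
    }

    package() {
      cd "$pkgname-$pkgver"
      make DESTDIR="$pkgdir" install
    }

Then makepkg builds it, and pacman -U installs it, pulling in the dependencies.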


In C++ (and by extension C) you can create very robust cross-compilable projects with complex dependency trees using CMake + Hunter + toolchain files.

There is a bit of a learning curve, but everything is there for a reason.

- Toolchain files define your target system and build toolchain.

- Hunter specifies your dependencies within CMake (using git and hashes) and does proper encapsulation/namespacing (which isn't present in "normal" CMake), as well as setting up the proper "inheritance" of headers and stuff.

- CMake does the final linking of build artifacts and headers and does the final build command generation.

A proper setup can produce both dynamic and static builds for any system with virtually no platform-specific switches.
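
For orientation, the top of such a CMakeLists.txt looks roughly like this (URL and SHA1 are placeholders; note HunterGate has to be called before project()):

    cmake_minimum_required(VERSION 3.16)
    include("cmake/HunterGate.cmake")
    HunterGate(
        URL  "https://github.com/cpp-pm/hunter/archive/vX.Y.Z.tar.gz"
        SHA1 "<archive sha1>"            # dependencies pinned by hash
    )
    project(mytool C CXX)
    hunter_add_package(ZLIB)             # fetched and built by Hunter
    find_package(ZLIB CONFIG REQUIRED)
    add_executable(mytool main.c)
    target_link_libraries(mytool PRIVATE ZLIB::zlib)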


Not sure why folks think a “nodejs” type package manager is required…

For instance, what is this if not package management?

apt install libpng-dev


Sure that’s a package manager, but it’s a distro specific one. Now make one that works on all Linux distributions. And now make one that works on all operating systems.

This is the problem that npm, pip, and whatnot solve. They work the same (almost) everywhere.

C/C++ don’t have a proper answer for it.


Well, the difference between C and Python or Node though is that C is used by the operating system APIs and so it is somewhat tied to each system and their package managers. In contrast, Python/Node exist for the most part at the application layer.


You can install apt / dpkg (or dnf / rpm, pacman, portage, nix, ...) on pretty much any distro. You are really just asking why there's no de-facto standard package manager for C like there is for those other languages.


It's distro-specific, for one.


At least it's not language specific.


It's pretty much C-specific. Packages for other languages tend to be pretty difficult to work with when installed this way (e.g. you end up with problems if you have 2 projects depending on different versions of the library)


How so? It's normal for incompatible versions to get parallel installations.


I feel the same way. I’m trying to learn C but there are a lot of things about it that just make it more difficult than it needs to be to learn. Like why is * used in so many different ways!? Why is it that I need to learn multiple different programs to build and install my programs (nmake, cmake, make, shell script or even .bat). Why don’t I just have a compiler already installed on Windows?

And then there is specifying linking files and include files, and headers, it makes it so challenging to even run other people’s code


For the same reason you don't have a compiler already installed on commercial UNIXes, it is extra revenue for the platform owner and regular users don't need it.


For C there isn't really a problem that a package manager would solve. I can think of some problems it would cause, though.


Citation needed. I am confident that it is possible to write reusable C code, and thus there is a need to package such reusable code and manage it.


And C has a solution for that:

Package: tarball of code.

Manage: download tarball of code and build/install libraries using configure/make. Then tell the compiler where the headers are and the linker where the libraries are. Easy peasy!
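
In shell terms, something like this (libfoo is a stand-in):

    curl -LO https://example.org/libfoo-1.2.tar.gz
    tar xzf libfoo-1.2.tar.gz
    cd libfoo-1.2
    ./configure --prefix=$HOME/.local
    make && make install
    # then point your own build at it:
    cc -I$HOME/.local/include -L$HOME/.local/lib myprog.c -lfoo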

(OK, we know it’s not quite _that_ easy in many cases.)

The elephant in the room is that people on HN and elsewhere violently nod about the need for a “proper package manager” when they are actually in disagreement about what a “proper package manager” needs to do. Meanwhile, it turns out that you can do without one, as the C community and all the “non-proper package managers” out there amply demonstrate.


The nearest thing to a standard solution may be autoconf, which is horrendous.

C targets an incredibly broad spectrum of hardware and software platforms, compiler implementations, and OS layouts. That's both a blessing and a curse. I think it would be worth at least trying to converge on a "package" system, even if it takes the next 20 years.


> The nearest thing to a standard solution may be autoconf

I believe that is changing. Quite a large number of influential projects have standardized on Meson, including systemd, Xorg, PipeWire, GLib, libui, HexChat, pacman, and (almost) all GNOME projects. It seems fair to say that most people (a) like it, and (b) see it as the most likely path forward for build systems.
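
For flavor, a complete meson.build for a small C program with one dependency (names made up):

    project('mytool', 'c')
    # resolved via pkg-config and friends; fails with a clear message if absent
    zlib = dependency('zlib', version : '>=1.2')
    executable('mytool', 'main.c', dependencies : [zlib])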


Like the user you’re replying to, I too find the concept of a C package manager baffling. I don’t know what your background is, but here are some reasons for my reaction:

- a plethora of details that are implementation-specific;

- different compilers and platforms that differ in important details;

- as a consequence of the above, most code ends up targeting some limited set of compilers and platforms;

- no standard build system to build package management on top of;

- the header file system, which allows API declaration and implementation to be decoupled.

By the time you’ve narrowed down enough to make these tractable, you might as well use a given operating system’s package management tools as your “C package manager”.


I once worked on a C program. It has been the worst experience of my programming career, right next to updating Excel VB macros to meet newer business needs.

The mere fact that I could not build the project on my local machine (Windows) and had to log in to a Unix box (IBM AIX) was baffling. Yep, Unix, not even Linux. And because it was a business-critical program which worked, the business refused to make any drastic changes to it unless they were absolutely necessary, even if it was a bug fix for a frequently occurring issue.

And I maintained it till 2018 when I left that job. I am pretty sure it is still there in same state.

So, I understand the build tool woes and cross-platform issues. However, I take that to mean that a package manager is not feasible in the current environment, not that it would not add value. Recently there was a post on Zig, which aims to provide a standard build tool for C (& Zig), and later a package manager for C (and Zig).

So, I think such efforts are not only beneficial, but will become reality in the future.


C++ has all of these problems, but Conan and vcpkg still exist. They aren't perfect, but they're so much better than nothing.

You're also ignoring the fact that Linux isn't the only operating system in the world. Some of us write multi-platform software, and package managers make it so much easier.


I recently started working in C++ again and my experience with C++ package managers was pretty dismal (I tried vcpkg and Conan). Just trying to set up a PoC project with only two dependencies took me multiple days, and neither Conan nor vcpkg worked. And this was on an Intel MBP, so not exactly some exotic platform.


I agree wholeheartedly with you on that. It's certainly a hard problem right now, which is why I don't like people that just say "it's not a problem with C/C++" or "just use the system's package manager".


Are you saying the standard library is powerful enough for most things?


Are you saying you need a package manager to use something other than the standard library?


There is, Nix :)


How does this work in practice with its "rolling release" nature? Can you designate specific package versions for all your dependencies? Like having a very new openssl and an older zlib version?

With complex software, just grabbing the latest version of every dependency will lead to broken builds (or ones that only work on x64 but are broken on other platforms).

Last I looked at Guix (which I guess is similar), you are tied to a release "version" of the package manager with whatever package versions it comes with. But maybe I misunderstood.


> Last I looked at Guix (which I guess is similar), you are tied to a release "version" of the package manager with whatever package versions it comes with. But maybe I misunderstood.

Guix has Inferiors [0] for this nowadays.

There's also a lighter-weight but less robust approach: define a new package inheriting the package in question, changing only the version number and source URI.

[0] https://guix.gnu.org/manual/en/html_node/Inferiors.html
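
That lighter-weight approach looks roughly like this (a sketch; the hash is elided):

    (define-public my-zlib
      (package
        (inherit zlib)
        (version "1.2.11")
        (source (origin
                  (method url-fetch)
                  (uri "https://zlib.net/fossils/zlib-1.2.11.tar.gz")
                  (sha256 (base32 "...")))))) ; fill in the real hash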


> How does this work in practice with its "rolling release" nature? Can you designate specific package versions for all your dependencies? Like having a very new openssl and an older zlib version?

NixOS (the distribution) is not actually rolling: it has stable channels that are released twice a year, and an "unstable" channel that normally updates every few days (depends when all tests passed). You could call the latter "rolling", but it's a bit different compared to actually rolling distributions like Arch Linux.

> Can you designate specific package versions for all your dependencies? Like having a very new openssl and an older zlib version?

Normally only the latest version is packaged, but you can change the version of the package and install multiple versions of the same library without conflicts. Some big libraries have multiple branches for backward compatibility (e.g. boost175, boost174, openssl_1_1); for the others you have to override the package (it's pretty easy, but you'll likely lose the binary cache).
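
An override sketch, assuming you wanted an older zlib (hash elided):

    myZlib = pkgs.zlib.overrideAttrs (old: rec {
      version = "1.2.11";
      src = pkgs.fetchurl {
        url = "https://zlib.net/fossils/zlib-${version}.tar.gz";
        sha256 = "...";   # fill in the real hash
      };
    });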

> With complex software, just grabbing the latest version of every dependency will lead to broken builds (or ones that only work on x64 but are broken on other platforms)

If you really need that kind of stability, a common solution is to pin the version of the whole Nixpkgs. Something a little less drastic is simply using the stable channel, which only receives security updates.
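
Pinning is just importing Nixpkgs from a fixed revision (the rev is a placeholder):

    let
      pkgs = import (fetchTarball
        "https://github.com/NixOS/nixpkgs/archive/<rev>.tar.gz") {};
    in pkgs.openssl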

> Last I looked at Guix (which I guess is similar), you are tied to a release "version" of the package manager with whatever package versions it comes with. But maybe I misunderstood.

I'm not familiar with Guix, but I think you can do what I described above with it too.


Thank you for clarifying and explaining :)

I mixed things up a bit. Guix doesn't have a stable channel but Nix does. Twice a year sounds quite fast - but maybe that's just what I'm used to from Ubuntu LTS. I can't imagine being a package maintainer and having to deal with build issues that often (I guess it only targets x64 so it's not too bad)

> If you really need that kind of stability

I guess I don't really 'need' it... but I'd just rather my system not be constantly updating and changing under me.

Looks like you can run stable and add unstable packages as needed:

https://discourse.nixos.org/t/installing-only-a-single-packa...

(like running LTS and adding PPAs)


You may be interested in Spack [1] if you're fine with Linux/macOS. It gives you all the versions, and you can specify lower and upper bounds for dependencies. Not only that, it also gives you conditional dependencies through variants, compilers, architectures, etc. It also lets you compile everything from source for your microarchitecture.

[1] https://github.com/spack/spack
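
For example, Spack's spec syntax lets you pick versions, compilers, and per-dependency constraints right on the command line (mytool is a made-up package):

    spack install mytool@1.0 %gcc@11 ^openssl@1.1.1 ^zlib@:1.2.8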


For C++ package/lib management we use Conan + CMake. I understand that Conan also works for plain C projects.


Could be worth looking at CMake; you can even link to GitHub repos, like Go offers.


I assume it's because C compiler output is deeply intertwined with the host operating system as a whole.


The closest thing to a module metadata system is pkgconf, but it does nothing about actual module import/building. You have to use external tools to achieve the latter.


C++ has package management?


There's like 50 different ones with different models of what a package is, capabilities, etc. But at least when it's such a pain you don't end up with JS mess where everything depends on everything and deleting a trivial package breaks half the web.


> and deleting a trivial package breaks half the web.

I assume you’re talking about the left-pad incident? It happened over 5 years ago, and policies have been put in place to stop it from happening again. It was a mistake; they learned from it and moved on.

Secondly, this issue isn’t specific to JS or NPM; a few years ago someone deleted their Golang project from GitHub and broke the ecosystem in a similar way. If anything, it’s less likely to happen on NPM today, as it acts as a cache.


I don't think your parent comment was hating on NPM / Node specifically, but rather was pointing out that the biggest determiner of whether programmers on a particular platform depend willy-nilly on whatever libraries they can find just to make coding small features more convenient is whether the language ecosystem they're in makes that easy to do.

For that reason, your point about Go only further reinforces their point, since Go was (I think) the first language to make importing another project from the web completely trivial, just one line of code.

On the one hand, I think it's bad for a language ecosystem to repeatedly get in the way of doing something that in many cases it does make sense to do, but on the other hand, if you think (most) programmers are going to immediately take us to modern programming dependency hell without that, you can start to see it as a kind of silver lining.


> But at least when it's such a pain you don't end up with JS mess where everything depends on everything and deleting a trivial package breaks half the web.

JS doesn't have real issues. When I say JS, I mean it in the context in which it was created, AKA the browser. Browsers have a huge API where everything and its contrary are possible.

The problem is Node.js delegating the most basic things to a for-profit company, NPM, and it was 100% by design... with Node.js, you can't even parse a multipart request without a 3rd party package... the whole "unix philosophy" for packages was purely marketing bullshit, and someone got very rich exploiting Node.js's bad choices (Isaac...)


> you can't even parse a multipart request without a 3rd party package

Of course you can (if you want to), how do you think all of those 3rd party packages are built in the first place? Using NodeJS APIs of course, that you can also use if you want to.

But why re-invent the wheel when someone already created it? Just make sure the library you include serves one purpose, has a light amount of code, actually does what you want it to, and doesn't change its own API willy-nilly. Following these guidelines (for any language ecosystem you use) leads to a lot less hassle when it comes to dependencies.


> Of course you can (if you want to), how do you think all of those 3rd party packages are built in the first place? Using NodeJS APIs of course, that you can also use if you want to.

Don't be obtuse; by that logic you don't need any third party package to write a professional app backed by a database in Node.js, just write your own MySQL/Postgres driver? hey /s

My point is I shouldn't have to download a package or write a multipart request parser to manage files sent to an HTTP server.

Someone made a profit out of that stupid situation with a terrible package manager, NPM, all by design.

Node.js's creator himself said that relying on NPM was a terrible mistake; that's why he went on to create Deno.


> My point is I shouldn't have to download a package or write a multipart request parser to manage files sent to an HTTP server.

Honestly, how many languages ship something like a multipart parser with the core API? And to be frank, I don't think I'd want the language I'm using to do this; only ~30% of my projects touch web-related stuff anyway.

> Someone made a profit out of that stupid situation with a terrible package manager, NPM, all by design.

I agree that NPM/NPM Inc is horrible, but for lots of other reasons. Also, I don't think it was on purpose, just poor and rushed design.

> Node.js's creator himself said that relying on NPM was a terrible mistake; that's why he went on to create Deno.

So? Doesn't mean he is right; who knows what would have happened without NPM? Maybe NodeJS would never have taken off in the first place. Brendan Eich also apparently doesn't like gay people; does that mean every JS developer needs to think like him?


> So? Doesn't mean he is right; who knows what would have happened without NPM? Maybe NodeJS would never have taken off in the first place. Brendan Eich also apparently doesn't like gay people; does that mean every JS developer needs to think like him?

He is absolutely right, NPM was a grift all along. The whole "unix philosophy" argument to justify a paper-thin std lib was a farce. NPM's architecture is terrible to begin with. NPM is designed the way it is (was: fetching the same package X times instead of a flat dependency tree) because NPM corp targeted growth as a startup, not ecosystem stability.


Again, I agree that, architecturally/technically speaking, NPM and its registry are horrible.

But the claims that NPM was a grift, that the std lib was a farce, and that everything was designed to fuel the growth of NPM Inc are going to need some more evidence behind them than your feelings.

And please, I really do hope you have proof of this, as I'd like it very much if NPM Inc got put in their place. But I find it unlikely they designed things for this purpose. In the end (at least for me), the quest for truth is more important than what I think is right.


Node.js already has a pretty extensive standard library though, sure not quite Python but still much more than C


Python's standard library is 1000x more complete and useful than anything JS offers out of the box.


As someone that uses Node a lot: could you give some examples?


> Node.js already has a pretty extensive standard library though, sure not quite Python but still much more than C

No it doesn't, by any modern language standard (Go, Python, PHP, Java...)


I think Node.js still has a bigger standard library than Rust. You can write an HTTP server (including HTTPS and HTTP/2) easily with only the standard library.


I disagree with literally all you said. Modules in the browser didn't exist until very recently. NodeJS/npm is a very decent ecosystem. It's not perfect; it's software. NodeJS explicitly strove to keep its standard library small, unlike e.g. Python, whose stdlib is also a huge legacy.


> I disagree with literally all you said. Modules in the browser didn't exist until very recently. NodeJS/npm is a very decent ecosystem. It's not perfect; it's software. NodeJS explicitly strove to keep its standard library small, unlike e.g. Python, whose stdlib is also a huge legacy.

Node.js's creator created Deno because he thought relying on NPM was a terrible mistake. If even the creator of Node.js said that, then he knows that a bunch of grifters profited from the bad choices made back then.


Can you name a single better package manager functionally comparable to npm? Delegating the whole packaging system to its own world is indeed questionable. However, in the context of Node it works very well because it is designed with npm in mind. As for your corporate grievances: managing a packaging ecosystem is a huge maintenance and cost burden. You have to host the packages, provide a CMS, deal with abuse, security, user complaints and all the other nonsense. These are inevitable things with a centralized package manager. If you don’t like it, there’s always the git way.


You're not getting it. Node.js itself should have had a more substantial standard library, that's all I'm saying. Who effectively owns Node.js since NPM is bundled with Node.js? Whoever owns NPM, therefore Microsoft now.

Weren't people here outraged at Copilot? Do people really believe that Microsoft isn't running Copilot on NPM packages?


I think Maven is better managed, and simpler, than npm.


Julia's package manager?


Yes: vcpkg, NuGet and Conan.


Conan among others.


Written in Python, incidentally.


Which is why I refuse to use it, and use vcpkg.


Previous discussions:

https://news.ycombinator.com/item?id=11938405 (227 points by mapmeld on June 20, 2016 | 90 comments)

https://news.ycombinator.com/item?id=13226174 (165 points by da02 on Dec 21, 2016 | 113 comments)

https://news.ycombinator.com/item?id=22120285 (422 points by lelf on Jan 22, 2020 | 247 comments)


I think of Fortran as one of the half-dozen most important languages in use. But I think of computers as devices for computing things, rather than primarily for serving advertising over the web.

But I wouldn’t use a Fortran web framework. It’s not well-suited to that.

Fortran is one of only two languages with a concise array syntax that’s used at the most demanding levels of high performance computing. The other is Julia.


Fortran's array syntax isn't really all that common in HPC codes, in my experience. The initial implementations in the 90's weren't great, and then things got themselves into a death spiral of HPC codes avoiding features that weren't performance-portable and compiler vendors avoiding work on features that weren't essential to current HPC applications.

(This has been a repeated pattern in Fortran since then, too, unfortunately. J3 invents something, often with flaws that go unnoticed until somebody tries to implement it; compilers don't invest in as-yet unused features, especially flawed ones; codes don't use them because they're not portable, or not performance-portable; repeat. DO CONCURRENT is the latest example -- J3 defined it as a serial construct and included semantics that can preclude parallelization...)


I could imagine using a Fortran Web framework when the goal is to build a monitoring console or API into a large Fortran project.

In situations like that, the effort to make it so that you can build the Web-facing parts in a more Web-friendly language is often far greater than the effort to just build the bits you need in the main language.


As someone who works professionally with large Fortran applications - the last thing you want to do is put Fortran anything on the web. Everything, and I do mean everything, is such a pain in the ass with Fortran. To the point where large scale applications are riddled with trivial overflow, race condition, and injection bugs because the language actively fights against doing things the right way.


APL?


It's efficient and certainly has concise array syntax. But it's not actually used for high performance simulations. Nobody is running APL on petaflop machines to simulate galaxy mergers.


I actually have a surprising amount of experience in Fortran, as both of my major "research" projects in college had to do with legacy Fortran code bases. The first was fitting a series of potential energy calculation libraries with Mathematica bindings, and the second was converting a library that did a particular kind of blur and glob detection to FITS datasets to C.

The second one was extra fun because the library was actually written in "FORCE", an extensive macro-extension of Fortran, for which we no longer had a compiler. All I had was docs, the original published paper, the original source, and a handful of input-output examples on substantial datasets

(for context, the datasets were essentially a 1024 x 1024 x 1024 array of floats that needed to be convolved with two different Gaussian kernels and compared pixel by pixel in an interesting way).


Always interesting to me to see how many "IT-first" educated people think that Fortran is a dead language, while when you learn physics first you quickly see how very much alive it still is (the newest standard is "Fortran 2018", btw). Every weather forecast you see was computed using Fortran, for example :P


Why is that, though? Is it merely because it's been institutionalized, or is there an inherent advantage that Fortran has (over say C or Rust) for things like weather forecasting?

By no means am I suggesting that it wouldn't have an advantage; I'm just a young whippersnapper who has never had the chance to write any Fortran.


It's designed first and foremost to be a language to help compilers turn large matrix math operations into automagically vectorized and parallelized operations. If Rust is the language for close-to-metal memory-safe systems programming instead of C, modern (F90 and later) Fortran is the language for close-to-metal high-performance computational programming instead of C.

Thinking about it, I'd say the comparison to Rust for mastery of its particular domain is actually quite apt.
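
As a tiny illustration of that concise, vectorizable array syntax (Fortran 90 and later):

    real :: a(1024,1024), b(1024,1024), c(1024,1024)
    call random_number(a)
    call random_number(b)
    c = matmul(a, b)        ! whole-array math, no explicit loops
    a(:,1) = 2.0 * b(:,1)   ! slice assignment

The compiler is free to vectorize and parallelize these whole-array operations however the target hardware likes.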


It's a beautiful language!

I cut my teeth on Matlab, and flipped a coin for Python or Fortran as my language to focus on after Matlab tried to charge our HPC center per core.

Python won the toss, but I've always had a love for Fortran.


Rust is actually the first serious contender in this domain, given that it has native support for parallel, concurrent and distributed programming and also deals elegantly with the aliasing issue that hampers optimization in C/C++. It's nice to see that LLVM is getting an official Fortran frontend, it will ease interop (including with Rust) and make it easier to compare code generation across projects and languages.


It's been my understanding that Fortran is loved by physicists due to its procedural approach, which doesn't require learning objects and inheritance, etc.

There's a lot to learn in higher level physics, so having the simplest language, without tons of features to fiddle with, while also being very, very performant is preferable.

I have a feeling that much of this work will end up in Julia, however.


No. Fortran is loved by physicists because it has an excellent 'impedance match' with the problems they are trying to solve. It has a multi-decade track record of expressing physics problems, so it's the computing lingua franca there. And it has an extensive, high-quality, and (perhaps most of all) tested ecosystem they can work in. It's not rocket science (unless, of course, it is); right tool for the job and all that.

FWIW...Fortran has had explicit OOP features since at least the 2003 standard (e.g. "type extends"). You aren't "required" to learn those features to use Fortran, but that's true of any number of ostensibly OOP languages.

Lastly...endless people (virtually always non-physicists who wrote a couple of lines of Fortran-77 in college) are continuously popping into the conversation with "well of course dump musty old Fortran for the NewHotness language because ew Fortran". Hasn't happened yet, and the Fortran folks have evolved from -77 to -90, -95, -2003, -2008 and lately -2018, so why would it?


Modern Fortran has OOP with one major design mistake:

They use % as the accessor instead of . (Yes, I am joking around) ;)


Was there once a battle between Fortran and APL and its derivatives for dominance of that niche?

I’m wondering why Fortran is dominant in physics yet APL in finance.


I think that at least part of it is APL's dependence on specialized hardware (at least back when I last encountered it in the 80s). At the Claremont Colleges, most computing access was on VAXes running VMS, although there was a single Unix machine at Harvey Mudd and Pomona College had an IBM minicomputer with 3270 terminals (plus a couple of the IBM graphic terminals) and it was Pomona's system that was the only one that had APL because the 3270s could use the specialized character set while Mudd (the science/engineering school) did not. The big language push at Mudd back then was towards APL, perhaps because a lot of aerospace companies sponsored clinic projects (all engineering students and some other majors did a sponsored real-world project in their senior year).

I remember writing some data analysis code in Fortran for my freshman physics lab and the TA was surprised to see my choice of language.


Of C, C++, and Fortran, Fortran has the lowest barrier to entry for doing fast linear algebra from scratch.

Python and Matlab, despite their low barrier to entry, are too slow for this anyway.

I’m not going to comment on Julia because I’m still drinking coffee. In theory… blah blah, maybe it should fill this role.

Sit down, write stupid-simple Fortran code to solve a large linear algebra problem (a discretized partial differential equation system), and things like matrix multiplication and numpy-style mat(:) access are built in. (Numpy borrowed the syntax from Fortran, actually.)

For a scientist, this is way easier than wading into C++ and getting it right, or sticking with C and avoiding the footguns.

Fortran gives you maximum serial speed with minimum effort; it is easy to write efficient serial code while knowing almost nothing of programming. Memory management, sure, but even there, out of the box there are no pointers to fumble, and your allocations are going to be contiguous. Optimal array memory access comes down to knowing that Fortran is column-major.

Ah, but then you need to go parallel, where the documentation is more in C these days.

But you’re already comfy with Fortran now, so, scientist that you are, you dig and experiment until you get it running in OpenMP, MPI, CUDA, etc.


There's a long explanation, but the summary is that FORTRAN was one of the first languages to target the highest computing demands, so there was functional and financial interest in making it highly performant. Hardware was often designed around FORTRAN, and a tight relationship between those who wrote FORTRAN, those who wrote compilers and tooling for it, and those who sold the hardware it ran on in large high-performance/supercomputing settings worked to make things run stupidly efficiently, and they had decades to do so.

C has a similar legacy but came a little later to the game and wasn't as intuitive to the people driving the money in computing as FORTRAN was. It's also heavily optimizable from a performance perspective, but it missed the initial buy-in and momentum, though it developed much of its own.

So FORTRAN has a lot of investment and strategic advantages for things like weather modeling, at least in specific underlying libraries. Modern work tends not to start greenfield in FORTRAN; it often starts in C, or really a higher-level language like Python, to provide a theoretical proof of concept, and just writes/uses wrappers around highly optimized FORTRAN codebases for the numeric fundamentals in the model. Lots of scientific glue code these days, with not a lot of stuff making it to the optimization levels you see in BLAS, LAPACK, ScaLAPACK, etc.


It is faster than C for numerical computing. One of the reasons is that aliasing is not allowed by the language, so the compiler can make some optimizations that C compilers can't, because e.g. the C compiler cannot assume that two pointer arguments to a function do not point to the same memory location.


This was the historical reason for Fortran’s speed advantage over C. But more recent C compilers have a flag to turn off checking for aliasing, so you can write numerical C code with Fortran performance. It’s more verbose and annoying to write, though, because it lacks Fortran’s array syntax: loops for everything.


Flags to turn off aliasing checks do exist in C compilers, but they result in 'slower', less optimized code. They're intended for the common case where types are reinterpreted in a way that isn't expressly allowed by the C/C++ standards, in which case the compiler can't use type information to infer that two pointers don't alias. FORTRAN can assume lack of aliasing in many other cases.


Oh, interesting. Maybe I was thinking of the C `restrict` keyword, rather than a compiler flag. This is an instruction to the compiler, used in a pointer declaration, that there is promised to be no aliasing. It's supposed to allow C to get Fortran speeds in functions where it's used.
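
For reference, a minimal sketch of what that looks like:

    /* restrict promises the compiler that x and y never alias,
       so the loop can be vectorized without runtime overlap checks */
    void axpy(int n, double a, const double *restrict x,
              double *restrict y) {
        for (int i = 0; i < n; i++)
            y[i] += a * x[i];
    }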


It could allow C to close the gap in the future, but it's not there yet. Fortran is fundamentally and pervasively noalias, and its compilers have decades of experience operating under that assumption. C/C++ doesn't use it nearly as much, and as a result the parts of the compilers that use it aren't very mature.

Rust is also pervasively noalias, but it's taken years for it to actually enable "noalias" on LLVM. Rust code uses that feature more than it's ever been used before, which keeps exposing codegen bugs that no one noticed before. Once those get flushed out, I still expect it will take a while for the resulting optimizations to reach the quality & maturity that Fortran has had.


That's exactly why it matters that LLVM is officially getting a Fortran frontend. This will make it possible to leverage these optimization strategies and bug fixes across languages and projects.


It also allows the existence of a Fortran REPL, something I never thought I would see: https://lfortran.org/


This whole conversation has been very insightful.

From the noalias = no panacea

To the Fortran repl

Thank you. This was inspiring.

Maybe it’s time I learn me some llvm too. Cheers.


Think programmer hours are expensive? Go look at the price of physics and engineering PhD programming hours.

Pretty much everything you would ever want to do related to partial differential equations or computational fluid dynamics has already been written in FORTRAN, and is damn good code.

It helps that the language itself is more than capable of great performance and that NASA has poured massive amounts of cash into making sure it stays performant on newer hardware.


Back in college I had an internship at an aerospace company. I remember having the exact same thought, that "Fortran is dead". I could not have been more wrong; Fortran, at least at engineering shops, is alive and well.


I love how Fortran and Matlab are in their own industrial bubbles.


I have seen quite a lot of industrial machinery that is programmed in... LabVIEW!


What is Matlab's moat? Did free software not catch up?


It's the toolboxes and also SIMULINK. SIMULINK + Signal Processing Toolbox + Control Systems Toolbox is incredibly powerful for Physics and I don't think free implementations really touch that


NASA switched from SIMULINK to ModelingToolkit (Julia) for modeling spacecraft dynamics [0]. They found that the latter was much easier and gave a 15,000x (!!!) performance improvement.

[0]: https://www.youtube.com/watch?v=tQpqsmwlfY0


Thousands of professors that can't be bothered to learn Julia.


I teach an undergraduate control course that uses Matlab. I'd love to use Julia (I use Julia for all my research), but last time I checked, it was not straightforward to interface with hardware using Julia (our labs use Matlab to control motors). There is a control systems package for Julia, but it is nowhere near as complete or polished as the Matlab control systems toolbox (or even the control systems package in Octave).


No clue. Octave is here for us… so at least we can run their code and ask any pertinent questions to expedite the port. ;)

(Yes, I know it’s Simulink for many, but I’ve met some pure computational folk who swear by Matlab as their prototyping tool, and to them Python is a bug-ridden rat's nest of footguns.)

C’est la vie


It did - there's a free implementation known as Octave.


Octave does exist, but it’s not covering the full set of things Matlab does. If your work is focused on the computational math side it’s fine, but if you have any interest in the simulation stuff, Octave hadn’t caught up and wasn’t even trying to, at least when I last looked a few years ago.


The third-party, domain-specific apps built on top of MATLAB are their moat.


It's good to know that it did; I switched over to general software engineering a few years ago (worked in aerospace before). My college taught Matlab as a class, and required you to buy the student version.


Well, Fortran was intended for use in scientific computation. In the 1950s and 1960s, the percentage of computing time spent on scientific computation was much larger than it is today. So, naturally, Fortran had a much more prominent position back then.

These days, scientific computing is a relatively minor niche, because people have found so many other things to do with computers that science is just one of many use cases.

Fortran was aimed at a specific audience, scientists and engineers, and that audience seems - AFAICT - happy to keep using it. One may consider it an impressive success story.

Personally, I have never used Fortran for anything beyond the hello-world-level, so I have no strong opinion on the language as such; I do know it has evolved a lot, probably more so than C.


Granted, this was about 20 years ago, back when Fortran was already labelled as dead. Part of my undergraduate Physics degree involved programming in BASIC; the reason for this was that they felt it was both a readily available language and that it would make for a smoother transition to Fortran, provided that we followed their stylistic guidelines. Sure enough, I was dealing with Fortran code a couple of years later. (In that case, it was legacy code, but it didn't take much poking around to discover that modern versions were used for HPC.)


Fortran _is_ dead. I work with it professionally. The tooling is next to non-existent, the compilers are trash, and the standards committee is in some weird backwards-compatible-with-FORTRAN-66 circle jerk, to the point where you can't have a sane modern revision (think the complete opposite of the Python 2 -> Python 3 problem).

The only reason Fortran is still around is institutional inertia.


I was talking to a maths prof a year or so ago. It seemed that there was some Fortran stuff around, although C++ seems to be the favourite.

Personally, back in the day, I never really got into the NAG libraries. I found it easy enough to roll my own. Maybe some of the stuff can save you time, but I ended up preferring to hand-code my algorithms.

The greatest thing to happen to Fortran was Fortran 90. None of that column malarkey. Modules were kinda OK, but if I wanted to write well-structured code I probably wouldn't be starting with Fortran anyway. I don't think ANYONE I knew at the time used the new features other than free-form source.

I tried poking around with allocatable arrays at one point, but didn't like it. I guess it was Fortran's way of trying to be like C.

One thing that is rarely discussed is that a programming language isn't just a specification; it is a culture and philosophy shaped by the programmers themselves. One guy made a reference to Cobol, and how object orientation was an unused feature. He said that what the designers failed to consider is that Cobol programmers just don't "do" object orientation.

Fortran - even Fortran 77 (IIRC) - has some nice little features, like being able to specify parameters in a separate file and read them in one line. I doubt most of the other guys in the faculty were aware of that feature at the time.

And my standard anecdote... when I went to a new job, they actually had a little bit of programming in Fortran. We had Visual Fortran. I had to set up the environment to fiddle with something for a client. Set-up was straightforward. I opened the project file, and the thing opened and compiled without a single hitch. I was shocked - shocked I tell you - at how simple the set-up was. Normally one would expect endless futzing around to get the programming environment and libraries in place.

But that's not the real anecdote. One day I was passing by a meeting room. I overheard an outside consultant in discussion with some of our guys about a replacement for Fortran. He was going on about how flexible the system would be, and hey, if you needed to set up extra parameters you could always add them to an XML configuration file he was proposing.

And I thought to myself... my God, that's complicated. In Fortran, you can read an array from a file in one statement. Bam! His method would have involved external libraries and some serious effort to get going.

I always joke that people should be forced to write Fortran for at least a year. For those that want to be JavaScript developers, two years. That should make people think much more simply about what it is you're trying to do.

A year ago I added programming microcontrollers to my list of things programmers should be forced to do. A resource-constrained environment should focus their minds a little more.


> extra parameters you could always add them to an XML configuration file

This is what Fortran's namelists are for.
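
A quick sketch of how that reads (file and variable names made up):

    ! params.nml contains:  &params dt=0.01, tmax=10.0, nsteps=1000 /
    real :: dt, tmax
    integer :: nsteps
    namelist /params/ dt, tmax, nsteps
    open(unit=10, file='params.nml', status='old')
    read(10, nml=params)
    close(10)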


> Does it still work

> If yes, it's not dead

Another point: This thinking cheapens learning. If you only learn, "what isn't dead", you stifle your curiosity and don't learn the whole picture of a thing.


Nobody is suggesting you only learn "what isn't dead". Experimenting with new tech is good. But you should also know _something_ about real, relevant tech and not just the bleeding edge, which will all disappear in a year.


I remember seeing this in practice at a meetup - I believe it was at the Boston AWS meetup at Vistaprint several years ago. Someone was running FORTRAN web services as CGI applications, inside of nginx, inside of a container, running on top of Docker Swarm. Say what you will - it seems pretty practical.


You gotta love it when the first example includes an SQL injection: https://fortran.io/static/model.html


The "s" in FORTRAN stands for "security"


I remember a couple of years ago someone created an MVC web framework for COBOL, but I don't think it was intended for serious use. More like the web framework equivalent of esoteric programming languages.

This looks, let's say, less obviously whimsical. Is there a serious use case for this, or is this another case of "because we can"? Honest question.



That's the one!


I'm still waiting on a MUMPS web framework to make old Perl code look legible.


I got curious, but I'm disappointed to say that mustache does not currently have a Fortran implementation.

Of course, there's a C implementation and bindings are only ever a few lines away...


Surely this is going to be the Next Big Thing.


I hope not; it's been on my list for ages because barely anyone (compared to other languages) still works with it, but a lot of legacy systems still rely on it.

If hipsters come in and take that from me, I'll have to learn something even worse.


I actually founded /r/fortran as a joke because I thought nobody used it anymore, but it was later adopted by people who in fact use modern FORTRAN. Later I met someone who has a very good job programming in FORTRAN for a 20,000 CPU supercomputer which is used to simulate nuclear explosions. It is in fact a current, relevant language that has specific unique strengths.


Well, I didn't want it to come to this, but COBOL it is then...

On a serious note, at least right now it is quite a niche language that appears to have more demand than supply, so it might still be a good idea. Even better, in fact, if it's at least attempting to be modern.


> I actually founded /r/fortran as a joke

WATFOR it..


Oh man, you just gave me a sense memory of the smell of the Waterloo documentation in the UIC computer lab.


not really legacy: https://en.wikipedia.org/wiki/Fortran#Science_and_engineerin...

LAPACK upstream won't move to another language unless there are significant advantages: https://en.wikipedia.org/wiki/LAPACK

And this is not in sight (yet) for operations on block data like matrices.


It might; hipsters love old stuff so...


And Linux devs love to reinvent the wheel, but this time as a square ;)


Surely this project should be named Fortran on Forklifts?


Let's try...

Fortran on Freightways

Fortran on Fibers

Fortran on Ferries

Fortran on Fast Lane

Fortran on Foundation

Fortran on Fences


Fortran on Flames


Any tutorials on Web Fortran?


Does it compile to WASM?


Getting Fortran running on WASM is a little bit of an exercise in madness [0], but is possible, if involving a lot of moving parts.

[0] https://chrz.de/2020/04/21/fortran-in-the-browser/


"the network is the supercomputer"(TM)


I was shocked to see this language pop up as a web framework. Before this, it was something that belonged in a computational science lab with a BIG project in the 1990s.


Could be useful for wrapping old FORTRAN code with an API and test suite that can later be reused if the implementation is changed.


Or you could just use Fortran's C bindings from your language of choice. String support in Fortran is irritating to work with at best.


You know that a language has horrible string handling when the C bindings to the language are an improvement...


There are zero job ads for FORTRAN or COBOL on my country’s job site.

Languages don’t get no deader than that.


𓂀𓅃𓀃?



