
> why some of the very best theoretical physicists, who were entirely free to work on whatever they wanted, continued working on string theory

Historically, it is not unusual for intellectual wrong turns to persist for decades or centuries. On the flip side, brilliant insights can also be abandoned too early.

The problem here is that data has dried up and cannot guide us. Some future tech will open up new data and then progress will resume. Without data, physics becomes theology.

Feynman's talk "Seeking New Laws", excerpt: "But the age that we live in is the age in which we are discovering the fundamental laws of nature. And that day will never come again. I don’t mean we’re finished. I mean, we’re right in the process of making such discoveries. It’s very exciting and marvelous, but this excitement will have to go.

Of course, in the future there will be other interests. There will be interests on the connection of one level of phenomena to another, phenomena in biology and so on, all kinds of things. Or if you’re talking about explorations, exploring planets and other things. But there will not still be the same thing as we’re doing now. It will be just different interests.

Another thing that will happen is that if all is known– ultimately, if it turns out all is known, it gets very dull– the biggest philosophy and the careful attention to all these things that I’ve been talking about will have gradually disappeared. The philosophers, who are always on the outside, making stupid remarks, will be able to close in. Because we can’t push them away by saying, well, if you were right, you’d be able to guess all the rest of the laws. Because when they’re all there, they’ll have an explanation for it.

For instance, there are always explanations as to why the world is three dimensional. Well, there’s only one world. And it’s hard to tell if that explanation is right or not. So if everything were known, there will be some explanation about why those are the right laws.

But that explanation will be in a frame that we can’t criticize by arguing that that type of reasoning will not permit us to go further. So there will be a degeneration of ideas, just like the degeneration that great explorers feel occurs when tourists begin moving in on their territory."


https://wg5-fortran.org/N2201-N2250/N2212.pdf is John Reid's complete document.


Mid 1980s, trying to connect from one London University computer (at Imperial, a CDC Cyber?) to another (at Queen Mary?), to try some symbolic algebra package that was only available on a Unix minicomputer: I spent a good two hours just finding the right terminal settings and configuration for all the intervening software. It left me feeling I would rather code what I could on an 8-bit, 2 MHz, 64K machine.


Yep - those were the days when there was a pretty good chance that people doing UK computer science degrees had a working knowledge of Z80 or 6502 machine code from hacking Elite/Jet Set Willy/Manic Miner on their ZX Spectrums/BBC Micros...


I suspect not. Children are exposed to far less text than LLMs. LLMs are parlour tricks that teach us nothing about how humans do it.


People mention the no-aliasing, the compilers, the intrinsics, the libraries, and the expressivity, but one aspect of the difference gets ignored, and it is this: C is a language for specifying the behaviour of hardware (a time sequence of detailed states); Fortran is a language for specifying the computation of values. Fortran abstracts far more of the hardware than C, and consequently a Fortran compiler could benefit from quantum processors, mind-readers or time machines, should they ever be invented.

A Fortran program that reads its input, calculates and finally writes out its output does not have to execute any particular instruction at all. As long as the answer is "AS IF" it had done the user-specified computation, the Fortran compiler has done its job. In between I/O, it submerges into the ineffable like a Cold War SSBN.

C is about the instruments, the players and the conductor, Fortran is about the music.
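
A minimal C sketch of the no-aliasing point mentioned above (the function names are my own, purely for illustration):

    /* The compiler must assume dst and src may overlap, so it cannot
       freely reorder or vectorise this loop without runtime overlap checks. */
    void scale_plain(double *dst, const double *src, double k, int n) {
        for (int i = 0; i < n; i++)
            dst[i] = k * src[i];
    }

    /* C99 restrict is the programmer's promise that there is no aliasing,
       recovering roughly the freedom a Fortran compiler has by default
       for dummy array arguments. */
    void scale_restrict(double *restrict dst, const double *restrict src,
                        double k, int n) {
        for (int i = 0; i < n; i++)
            dst[i] = k * src[i];
    }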


Maybe this was once true, but the hardware whose behavior C was designed to specify was a PDP-11. Nowadays you are programming an abstract C-ish virtual machine that provides certain semantic guarantees which don't necessarily map terribly well to the physical hardware. For example, if you write to a memory address, and then read from the memory address in the "next instruction", you expect the change to be immediate, even though the code is actually running on a pipeline that could be dozens of instructions deep, with several layers of cache between the core and system memory. So in a sense there's not really a qualitative difference between C and Fortran: they are both for specifying a sequence of operations on an abstract machine, relying on the compiler to implement that machine. Indeed, modern optimizing C compilers provide very few guarantees about the specific assembly instructions to be executed, happily rewriting or omitting code so long as it executes "as if" it ran on the C virtual machine.

See "C is not a low-level language" - https://queue.acm.org/detail.cfm?id=3212479
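
A small illustration of that "as if" freedom (my own sketch, not from the article): mainstream compilers will happily replace an entire loop with a closed form when they can prove the observable result is unchanged.

    /* At -O2, gcc and clang typically compile this to the closed form
       n*(n-1)/2 and emit no loop at all; the result is "as if" the loop
       had executed, which is all the standard requires. */
    unsigned sum_below(unsigned n) {
        unsigned s = 0;
        for (unsigned i = 0; i < n; i++)
            s += i;
        return s;
    }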


I don't see how C can match Fortran's abstraction level and still reliably control hardware that uses memory-mapped I/O.

C, as an operating system implementation language, is trying to do something fundamentally different than Fortran.

You live by memory address, you die by memory address.
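
For anyone who hasn't seen it, this is roughly what memory-mapped I/O looks like in C; a minimal sketch, with the device and its register address entirely made up:

    #include <stdint.h>

    /* Hypothetical UART transmit register at a fixed physical address.
       volatile tells the compiler that every access is an observable side
       effect, so it must not cache, reorder, or drop the stores. */
    #define UART_TX (*(volatile uint8_t *)0x4000C000u)

    void uart_putc(char c) {
        UART_TX = (uint8_t)c;   /* each write really reaches the device */
    }

As far as I know, standard Fortran gives you no way to pin a variable to a specific address like this, which is exactly the abstraction gap being described.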


> For example, if you write to a memory address, and then read from the memory address in the "next instruction", you expect the change to be immediate

This would also be true for assembly, hardly a high-level language.


On modern CPUs assembly is a high-level language (or rather, it's a language that doesn't have any of the advantages of traditional low-level languages, even if it also lacks the advantages of traditional high-level languages, much like C).


> you expect the change to be immediate, even though the code is actually running on a pipeline that could be dozens of instructions deep, with several layers of cache between the core and system memory.

I expect the hardware to handle cache coherency in that situation. What the compiler does should be irrelevant.


Right, but the point is that the hardware is still "meeting you halfway" to present the appearance of something which isn't actually happening. Those pointers in C aren't really "memory addresses" at all; they're keys in a key-value store managed by the hardware to present the illusion of flat, contiguous memory, as mandated by the C programming model.

So maybe it's accurate to say that C is "more compatible" with real hardware, in the sense that its abstract machine is more isomorphic to what's really happening than Fortran's is. But it's not exactly "closer to hardware" in the way we might be tempted to think; it's more of a lingua franca that your processor happens to speak.

If you're still tempted to consider C "close to hardware", consider that you can compile the same code for a Z80 and a Threadripper. What hardware exactly are you controlling that's common to both?


> as mandated by the C programming model.

As PhilipRoman said, this is also true of assembly (or any other programming language model[1]).

> If you're still tempted to consider C "close to hardware", consider that you can compile the same code for a Z80 and a Threadripper. What hardware exactly are you controlling that's common to both?

In both of them I can write to a memory-mapped I/O device, if it has one. I can write a custom memory allocator for a pool that I'm managing myself. I can't do either of those in Fortran or Javascript.

[1] Why does it have to be true of any other programming language model? Well, maybe I exaggerate slightly. But can you show me a (single threaded) programming language where "a = 1" does not mean that on the next line, a will be 1?


> But can you show me a (single threaded) programming language where "a = 1" does not mean that on the next line, a will be 1?

MIPS I.

https://en.wikipedia.org/wiki/Delay_slot#Load_delay_slot


> But can you show me a (single threaded) programming language where "a = 1" does not mean that on the next line, a will be 1?

Generally agree with your point, but just to play the devil's advocate: in a CPU with an exposed pipeline and no interlocks, setting a register to a value doesn't guarantee that a following instruction reading from that register will see the last value written.


With the D language it can be the music, the instruments, the players and the conductor all at once [1].

Fun fact: Walter Bright, the original author of the D language, wrote his popular Empire game in Fortran [2]. Some of the ideas that make Fortran fast were incorporated into D's design, and this makes D as easy to optimize as Fortran, if not easier [1].

[1]Numeric age for D: Mir GLAS is faster than OpenBLAS and Eigen:

http://blog.mir.dlang.io/glas/benchmark/openblas/2016/09/23/...

[2]A Talk With Computer Gaming Pioneer Walter Bright About Empire:

https://madned.substack.com/p/a-talk-with-computer-gaming-pi...


Oddly, when people mention Empire, I think of Peter Langston's Empire rather than Walter's.


Yes, you're correct. C was created to control a CPU; it is a low-level language with a comfortable syntax. C abstracts the hardware. But Fortran has nothing to do with hardware; it is just a notation for computing matrix algorithms. Fortran can be thought of as a primitive APL. You can do all kinds of optimizations in Fortran that you cannot do in C, because it doesn't care or know about the underlying hardware.


That was maybe true for C in the seventies, but there's practically no difference anymore; e.g. C has an "as if" rule too.


The point is, it shouldn't.

https://news.ycombinator.com/item?id=30022022 How ISO C became unusable for operating systems development


The only reason you would want bit-reproducibility is because you haven't done the numerical analysis and have no clue how many digits of your "answer" to trust.

As far as I know, two sectors claim they need it: finance and climate.

"Do you want a better answer?"

"No, I want the same wrong answer that I got last Tuesday."

Science/Mathematics can't fix this.


> The only reason you would want bit-reproducibility is because you haven't done the numerical analysis and have no clue how many digits of your "answer" to trust.

I can confidently say that this is not the only good reason. Other reasons include:

- You want to compare different runs by hashing outputs (e.g. to find the first computation step where they diverged). Very useful for debugging, and also useful to determine whether you accurately reproduced a result (e.g. a customer problem).

- If your program has a single floating point comparison, there is no such thing as "enough significant digits": with reasonable assumptions about the distribution of the unreproducibility, your logic is now divergent (and your output will jump between different values) with a certain probability. At that point we're no longer talking numerical analysis, it's straight up "divergent results" (see the sketch below).
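
A minimal C sketch of that divergence (the numbers are picked purely to expose it): floating-point addition is not associative, so merely changing the order in which a sum is accumulated, which a different compiler, optimization level, or thread count can do, is enough to flip a later comparison.

    #include <stdio.h>

    int main(void) {
        double a = 1e20, b = -1e20, c = 1.0;

        double left  = (a + b) + c;   /* (0.0) + 1.0           -> 1.0 */
        double right = a + (b + c);   /* b + c rounds to -1e20 -> 0.0 */

        /* Same mathematical sum, two different results; any branch that
           compares the sum against a threshold can now go either way. */
        printf("%g vs %g: %s\n", left, right,
               (left > 0.5) == (right > 0.5) ? "agree" : "diverge");
        return 0;
    }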


There's also "cover your ass". At least I've heard tales of major aerospace companies keeping warehouses of old Sun hardware in case they need to demonstrate that the simulations they ran back in the 90s were not fabricated...


I’ve yet to meet a customer that cares enough to pay for the necessary numerical analysis.


It is unsafe if the code was written by a numerical expert who understood the ISO/IEC 60559:2020 standard. Not many of those, so in all probability your garbage code continues to be garbage (but faster) with FMA.
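
For what FMA actually changes, a minimal C sketch (the value of a is chosen only to expose the single rounding): the fused operation rounds once, the separate multiply and add round twice, so both answers are defensible but their last bits differ.

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* a = 1 + 2^-27, so a*a = 1 + 2^-26 + 2^-54, which does not fit in a double. */
        double a = 1.0 + ldexp(1.0, -27);

        /* Build with -ffp-contract=off so the compiler does not fuse this line itself. */
        double two_roundings = a * a - 1.0;      /* the 2^-54 tail of a*a is rounded away */
        double one_rounding  = fma(a, a, -1.0);  /* exact product feeds the subtraction   */

        printf("%.17e\n%.17e\n", two_roundings, one_rounding);  /* differ by 2^-54 */
        return 0;
    }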


print *, norm2(x)

"The result of NORM2 (X) has a value equal to a processor-dependent approximation to the generalized L2 norm of X, which is the square root of the sum of the squares of the elements of X. If X has size zero, the result has the value zero."


Readers, please don't accept anything anyone writes about "FORTRAN", unless in a historical context. They probably last encountered the leading edge of the language 40 years ago.


Fortran (the name since Fortran 90) has SELECT CASE to make the computed GO TO obsolete. Maybe you are thinking of FORTRAN?


Sure, FORTRAN then. The language that Hamming was referencing.

Sadly, though, obsolete doesn't mean absent. I saw plenty of ostensibly professional code in the early 2010s that was developed using computed GO TO. It was "delightful" and totally humane code.

