Stan Ulam, John von Neumann, and the Monte Carlo Method (1987) [pdf] (lanl.gov)
77 points by Vannatter on Oct 28, 2016 | 19 comments



Think about it -- the notion of Monte Carlo could not have existed before the 19th century. Newton was proposing a purely deterministic model of nature in the 17th century, and the next 200 years were spent trying to develop his program, until statistical mechanics and quantum mechanics began to show this was impossible.

De Moivre and Laplace used calculus to develop some crude notions of probability as applied to gambling -- a very sinful topic. Ideas such as the random walk wouldn't be made formal until Kolmogorov in the 20th century.

Lastly, we did not have computers in the 19th century, so we would not have had the computational resources to implement even the simplest Monte Carlo calculation even if the idea had come about. When Boltzmann developed his ideas of entropy and the notion of applying statistics to describe a physical model, he was ridiculed so much that he committed suicide:

https://en.wikipedia.org/wiki/Ludwig_Boltzmann#Final_years


I don't agree with your first point. Determinism has nothing to do with using random numbers to solve a mathematical problem approximately. In Monte Carlo integration you just sample points uniformly in a domain, average the values of the function being integrated at those points, and scale by the size of the domain.
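A minimal sketch of that in Python (the integrand and interval are arbitrary choices for illustration, not anything from the article):

    import math
    import random

    def mc_integrate(f, a, b, n=100_000):
        # Average f at n uniform samples in [a, b], then scale by the interval length.
        total = sum(f(random.uniform(a, b)) for _ in range(n))
        return (b - a) * total / n

    # Estimate of the integral of sin(x) over [0, pi]; the exact value is 2.
    print(mc_integrate(math.sin, 0.0, math.pi))

The error shrinks like 1/sqrt(n) regardless of the dimension of the domain, which is the whole appeal.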

However, I agree with your last point: you need computers to do integrals this way. Before computers, sampling methods were indeed used (e.g. Gaussian quadrature), but the sample points were chosen very carefully to reduce their number and the work of human "computers".


>Newton was proposing a purely deterministic model of nature

He could've made a great probabilist. http://fermatslibrary.com/s/isaac-newton-as-a-probabilist



There is also something called the Crofton formula, which extends the logic of the needle hitting a line (Buffon's needle):

https://en.wikipedia.org/wiki/Crofton_formula
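The needle experiment itself makes for a tiny Monte Carlo simulation; here's a rough Python sketch (needle length and line spacing are arbitrary illustrative choices) that recovers pi from the crossing frequency:

    import math
    import random

    def buffon_pi(n=1_000_000, needle=1.0, spacing=2.0):
        # Drop n needles on a floor ruled with parallel lines `spacing` apart.
        # A needle crosses a line with probability 2*needle / (pi*spacing).
        crossings = 0
        for _ in range(n):
            center = random.uniform(0.0, spacing / 2)  # distance from the nearest line
            angle = random.uniform(0.0, math.pi / 2)   # acute angle to the lines
            if center <= (needle / 2) * math.sin(angle):
                crossings += 1
        return 2 * needle * n / (spacing * crossings)

    print(buffon_pi())  # roughly 3.14, converging slowly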


It's interesting how strictly deterministic thinking and a more authoritarian society appear to have been correlated.

When individual liberty became more entrenched in culture, suddenly scholars were considering non-deterministic phenomena everywhere.

(I'm sure there are a million counterexamples; I am just noticing a broad trend.)


As much as I'd like to agree with this, I can't help noticing that much of quantum mechanics was developed in early 20th century Germany (with some exceptions like Bohr, Fermi and de Broglie), which wasn't exactly following a liberal track at that time.


I beg to differ.

Regarding liberties, Germany actually had a pretty good track record up until WW1 (considering the standards of the times).

How could its economy have outpaced those of France and Great Britain by then? Through liberalization. It's what made Germany a Jewish "America" for Ashkenazis all over eastern Europe. Social security, the Kindergarten and other institutions were formalized there. Even with WW1 and the subsequent burden of the overreaching Versailles treaty, the people of Germany were living in a newly founded state bound to fail -- but they were still the same people.

Berlin in the 20s had the most vegan restaurants in the western hemisphere. Gays were more accepted than they would be anywhere in the West for a long time afterwards.

The scientists you cite got the educational groundwork they built upon from these decades of progress.

Decades of work that shouldn't get discounted so casually ("wasn't exactly...").

I know this isn't a history thread, but since we're using history to make helpful generalizations, getting the history right helps. And Germany was certainly not "all brownshirts" from 1900-1945 :-)


It's true that Germans enjoyed quite a few civil liberties at the time, and that Germany's constituent states had at times been very progressive in certain respects. That said, there seems to have been very little ideological emphasis on individual liberalism in the way you'd see in England, France or the US. Rather, public discourse was dominated by conservatives, nationalists and social democrats, none of whom put primacy on classical liberalism.

I interpreted the comment I replied to as saying that it was liberal values in their own right that created a good scientific climate for discovery. It can be argued that even though the German states tended toward a liberal trajectory well before unification, and continued to make headway afterwards, this was largely driven by pragmatic policy rather than any such ideology. Bismarck in particular was no liberal, but nevertheless enacted several reforms toward that end.

What Germany did have, though, was an ancient academic tradition going back hundreds of years, with significantly more universities than other countries. This was largely due to the fragmented nature of the states pre-unification and the many different denominations of Christianity. I'd argue that it was this tradition that came to fruition, and that it did so despite the nationalistic elements of German society.

An interesting insight into science in this era is "The Dilemmas of an Upright Man", about Max Planck.


Kolmogorov's contribution did not have much initial impact; it was met with a lukewarm reception. You might enjoy reading "The origins and legacy of Kolmogorov's Grundbegriffe" for a longer exposition on his most well-known work. Boltzmann was likely bipolar and suffered from bouts of intense depression. His suicide was partly due to his long feud with philosophy, but I think a larger contributor was probably his own mental issues.


Laplace, de Moivre, and their immediate intellectual heirs developed most of the basic components of probability theory in the eighteenth and early nineteenth centuries, and more or less all of the algebraic properties of probability spaces were developed before Kolmogorov formulated his axioms in the twentieth century.

Indeed, Kolmogorov's formulation of probability theory, and measure theory generally, is mostly relevant as a formalism that brought probability into step with the formal developments in analysis up to that time. Yes, Kolmogorov's foundation is far and away the dominant flavor of formal probability theory today, but it is possible to develop substantially all of the field, formally or informally, using finite spaces and the basic body of knowledge about them that existed even before Kolmogorov was born [0].

You are certainly right that the advent of the computer age has enormously enhanced the practical utility of probability theory, but again I think you're mistaken in believing that computers were necessary for the conception or basic use of techniques like Monte Carlo simulations. Twentieth-century mathematicians and engineers of the analogue age had books of random numbers (generated by physical techniques, like rolling dice) and knowledge of practical techniques for generating random-ish sequences that made many probabilistic approaches possible before computing time was cheap and readily available; for example, in his Mathematical Theory of Communication, Claude Shannon describes how he generated Markov chains modeling the English language by (1) flipping through books to find a sequence of characters or words, (2) noting the sequence that follows it, and then (3) flipping ahead to find that sequence's next occurrence in the text, before repeating the process to determine the next part of the sequence.
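A rough Python analogue of that hand procedure (the toy corpus and chain order below are placeholders, not Shannon's): build a table from each word prefix to the words seen following it, then sample a chain from the table.

    import random
    from collections import defaultdict

    def build_chain(words, order=2):
        # Map each `order`-word prefix to the list of words observed after it.
        chain = defaultdict(list)
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
        return chain

    def generate(chain, order=2, length=30):
        out = list(random.choice(list(chain.keys())))
        for _ in range(length):
            followers = chain.get(tuple(out[-order:]))
            if not followers:
                break
            out.append(random.choice(followers))
        return " ".join(out)

    corpus = "the quick brown fox jumps over the lazy dog and the quick grey cat".split()
    print(generate(build_chain(corpus)))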

(These sorts of pre-computational techniques also help to build intuition about probabilistic processes that seems to be missing in many people who turn to computers right away; I've found some mid-century papers and books on probabilistic topics to be particularly enjoyable and insightful because they present things in this pre-computational setting.)

0. To do so formally requires nonstandard analysis for many applications, but the nonstandard approach only rigorously develops techniques that were known, albeit without formal grounding, to probabilists of previous eras. One needs Kolmogorov's axioms and modern measure theory only to do highly abstract probabilistic sorts of things on uncountable spaces that cannot be sufficiently well approximated by nonstandard or informal models of "very large" spaces. I have never seen an application of probability outside of a pure maths setting that cannot be modeled by a nonstandard finite space, and for good reason: our world is finite, if not actually, then in our experience of it.


Stan Ulam was an incredible, fascinating man.

He thought of new methods to find truth over a game of cards, and new theoretical weapons while looking over his country garden.


Interesting seeing the name Robert Richtmyer in there. He was a family friend and had some great recollections of Stan Ulam, among many others including Von Neumann, Richard Feynman and Enrico Fermi.


Did he ever comment about Von Neumann's memory? There are many claims that he could read a book once and recall the whole book perfectly down to the page even years later.


I remember him mentioning that the two were having a conversation in which Von Neumann was writing out a complex mathematical formula, at which point he was interrupted by a phone call that lasted about 30 minutes, and upon his return he picked up writing out the formula as if there had been no interruption. Robert was just amazed at his powers of thought!

So many anecdotes that he related about all the prominent scientists involved in the Manhattan Project.


Do you believe that?


The thing is, Von Neumann is an extreme outlier in terms of human mental ability and achievement:

- liked to read books whilst driving

- on his deathbed, recited by heart, word for word, the first few lines of each page of Goethe's Faust

- always had a stack of books next to his bed that he was reading

It seems plausible that if he could retain everything he read he would want to read as much as possible.

I haven't come across any first-hand account of people explicitly saying this was bullshit, other than skepticism from people who never knew him. There's always going to be skepticism about outlier events.

Here's an interesting post about his memory: http://overweeninggeneralist.blogspot.co.za/2011/11/john-von...


Monte Carlo simulations may become more important than ever because they offer easy parallelism and thus are impervious to the breakdown of Moore's Law.
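A minimal sketch of that parallelism, assuming Python's multiprocessing and using a pi estimate as a stand-in workload: each worker draws its samples independently and the results are simply pooled at the end.

    import random
    from multiprocessing import Pool

    def count_hits(n):
        # Count random points in the unit square that fall inside the quarter circle.
        return sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))

    if __name__ == "__main__":
        samples, workers = 1_000_000, 4
        with Pool(workers) as pool:
            hits = sum(pool.map(count_hits, [samples // workers] * workers))
        print(4.0 * hits / samples)  # ~3.14; more workers just means more samples per second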


If you liked that, you might like

Matthew Richey's "The Evolution of Markov Chain Monte Carlo Methods"[1]

[1] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.295....



