John von Neumann was one of the genuine mathematical geniuses of the twentieth century. A particularly interesting window onto von Neumann's scientific work is provided by George Dyson in his book, Turing's Cathedral: The Origins of the Digital Universe. The book is as much an intellectual history of the mathematics and physics expertise of the Princeton Institute for Advanced Study as it is a study of any one individual, but von Neumann plays a key role in the story. His contribution to the creation of the general-purpose digital computer helped to lay the foundations for the digital world in which we now all live.
There are many interesting threads in von Neumann's intellectual life, but one aspect that is particularly interesting to me is the early application of the new digital computing technology to the problem of simulating large complex physical systems. Modeling weather and climate were topics for which researchers sought solutions using the computational power of first-generation digital computers, and the research needed to understand and design thermonuclear devices had an urgent priority during the war and post-war years. Here is a description of von Neumann's role in the field of weather modeling in designing the early applications of ENIAC (P. Lynch, "From Richardson to early numerical weather prediction"; link):
John von Neumann recognized weather forecasting, a problem of both great practical significance and intrinsic scientific interest, as ideal for an automatic computer. He was in close contact with Rossby, who was the person best placed to understand the challenges that would have to be addressed to achieve success in this venture. Von Neumann established a Meteorology Project at the Institute for Advanced Study in Princeton and recruited Jule Charney to lead it. Arrangements were made to compute a solution of a simple equation, the barotropic vorticity equation (BVE), on the only computer available, the ENIAC. Barotropic models treat the atmosphere as a single layer, averaging out variations in the vertical. The resulting numerical predictions were truly ground-breaking. Four 24-hour forecasts were made, and the results clearly indicated that the large-scale features of the mid-tropospheric flow could be forecast numerically with a reasonable resemblance to reality. (Lynch, 9)
A key innovation in the 1950s in the field of advanced computing was the invention of Monte Carlo simulation techniques to assist in the invention and development of the hydrogen bomb. Thomas Haigh, Mark Priestley, and Crispin Rope describe the development of the software supporting Monte Carlo simulations in the ENIAC machine in a contribution to the IEEE Annals of the History of Computing (link). Peter Galison offers a detailed treatment of the research communities that grew up around these new computational techniques (link). Developed first as a way of modeling nuclear fission and nuclear explosives, these techniques proved to be remarkably powerful for allowing researchers to simulate and calculate highly complex causal processes. Here is how Galison summarizes the approach:
Christened "Monte Carlo" after the gambling mecca, the method amounted to the use of random numbers (a la roulette) to simulate the stochastic processes too complex to calculate in full analytic glory. But physicists and engineers soon elevated the Monte Carlo above the lowly status of a mere numerical calculation scheme; it came to constitute an alternative reality--in some cases a preferred one--on which "experimentation" could be conducted. (119)
At Los Alamos during the war, physicists soon recognized that the central problem was to understand the process by which neutrons fission, scatter, and join uranium nuclei deep in the fissile core of a nuclear weapon. Experiment could not probe the critical mass with sufficient detail; theory led rapidly to unsolvable integro-differential equations. With such problems, the artificial reality of the Monte Carlo was the only solution--the sampling method could "recreate" such processes by modeling a sequence of random scatterings on a computer. (120)

The approach that Ulam, Metropolis, and von Neumann proposed to take for the problem of nuclear fusion involved fundamental physical calculations and statistical estimates of interactions between neutrons and surrounding matter. They proposed to calculate the evolution of the states of a manageable number of neutrons as they traveled from a central plutonium source through spherical layers of other materials. The initial characteristics and subsequent interactions of the sampled neutrons were assigned using pseudo-random numbers. A manageable number of sampled spaces within the unit cube would be "observed" for the transit of a neutron (10^4 observations) (127). If the percentage of fission calculated in the sampled spaces exceeded a certain value, then the reaction would be self-sustaining and explosive. Here is how the simulation would proceed:
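The sampling idea can be sketched in miniature. A minimal sketch in Python, assuming made-up event probabilities and fate categories (not the actual Los Alamos cross-sections); the point is only the structure: follow each sampled neutron through pseudo-random events and tally the fraction that end in fission.

```python
import random

def neutron_fate(rng, p_fission=0.12, p_absorb=0.30, p_scatter=0.48):
    """Follow one sampled neutron until it fissions, is absorbed, or
    escapes. A scattering event just sends it around the loop again.
    All probabilities here are illustrative placeholders."""
    while True:
        r = rng.random()  # pseudo-random draw, as in the 1940s scheme
        if r < p_fission:
            return "fission"
        elif r < p_fission + p_absorb:
            return "absorbed"
        elif r < p_fission + p_absorb + p_scatter:
            continue  # scatter: sample another collision
        else:
            return "escaped"

rng = random.Random(42)
n = 10_000  # the 10^4 observations mentioned above
fates = [neutron_fate(rng) for _ in range(n)]
fission_fraction = fates.count("fission") / n
# If the fission fraction exceeds a threshold, the chain reaction is
# self-sustaining (this threshold is made up for illustration).
self_sustaining = fission_fraction > 0.2
```

With the placeholder probabilities above, roughly a quarter of the sampled histories end in fission; only the tally-and-threshold structure corresponds to the historical method.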
Von Neumann went on to specify the way the simulation would run. First, a hundred neutrons would proceed through a short time interval, and the energy and momentum they transferred to ambient matter would be calculated. With this "kick" from the neutrons, the matter would be displaced. Assuming that the matter was in the middle position between the displaced position and the original position, one would then recalculate the history of the hundred original neutrons. This iteration would then repeat until a "self-consistent system" of neutron histories and matter displacement was obtained. The computer would then use this endstate as the basis for the next interval of time, delta t. Photons could be treated in the same way, or if the simplification were not plausible because of photon-matter interactions, light could be handled through standard diffusion methods designed for isotropic, black-body radiation. (129)

Galison argues that there were two fairly different views in play of the significance of Monte Carlo methods in the 1950s and 1960s. According to the first view, they were simply a calculating device permitting the "computational physicist" to calculate values for outcomes that could not be observed or theoretically inferred. According to the second view, Monte Carlo methods were interpreted realistically. Their statistical underpinnings were thought to correspond exactly to the probabilistic characteristics of nature; they represented a stochastic view of physics.
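The iterate-until-self-consistent scheme described above can be read as a fixed-point iteration. Here is a minimal sketch, with a made-up linear "kick" function standing in for the full recalculation of the hundred neutron histories:

```python
def self_consistent_displacement(kick, max_iter=100, tol=1e-9):
    """Iterate matter displacement against neutron histories until the
    two agree. `kick` maps an assumed matter position to the
    displacement the recomputed neutron histories would produce."""
    displacement = 0.0
    for _ in range(max_iter):
        # Place matter midway between its original and displaced
        # positions, then recompute the implied displacement.
        midpoint = displacement / 2.0
        new_displacement = kick(midpoint)
        if abs(new_displacement - displacement) < tol:
            return new_displacement  # self-consistent system reached
        displacement = new_displacement
    return displacement

# Toy stand-in for the neutron-history calculation: the kick weakens
# as matter moves outward. (Purely illustrative, not physics.)
result = self_consistent_displacement(lambda pos: 1.0 - 0.5 * pos)
```

The converged value would then serve as the starting state for the next time interval, delta t, exactly as in von Neumann's description.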
King's view--that the Monte Carlo method corresponded to nature (got "back of the physics of the problem") as no deterministic differential equation ever could--I will call stochasticism. It appears in myriad early uses of the Monte Carlo, and clearly contributed to its creation. In 1949, the physicist Robert Wilson took cosmic-ray physics as a perfect instantiation of the method: "The present application has exhibited how easy it is to apply the Monte Carlo method to a stochastic problem and to achieve without excessive labor an accuracy of about ten percent." (146)

This is a very bold interpretation of a simulation technique. Rather than looking at the model as an abstraction from reality, this interpretation looks at the model as a digital reproduction of that reality. "Thus for the stochasticist, the simulation was, in a sense, of a piece with the natural phenomenon" (147).
One thing that is striking in these descriptions of the software developed in the 1950s to implement Monte Carlo methods is the very limited size and computing power of the first-generation general-purpose computing devices. Punch cards represented "the state of a single neutron at a single moment in time" (Haigh et al link 45), and the algorithm used pseudo-random numbers and basic physics to compute the next state of this neutron. The basic computations used third-order polynomial approximations (Haigh et al link 46) to compute future states of the neutron. The simulation described here resulted in the production of one million punched cards. It would seem that today one could use a spreadsheet to reproduce the von Neumann Monte Carlo simulation of fission, with each line being the computed result from the previous line after application of the specified mathematical functions to the data represented in the prior line. So a natural question to ask is -- what could von Neumann have accomplished if he had Excel in his toolkit? Experts -- is this possible?
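The spreadsheet intuition is easy to mimic in code: each "row" is computed from the previous row by applying fixed update rules plus a pseudo-random draw. A minimal sketch, with made-up update rules (a random positional step and a fixed fractional energy loss) standing in for the third-order polynomial approximations used on ENIAC:

```python
import random

def next_row(row, rng):
    """Compute one row from the previous one, as a spreadsheet would:
    the update rules here are illustrative, not physical."""
    position, energy = row
    step = rng.uniform(-1.0, 1.0)  # pseudo-random scattering step
    return (position + step, energy * 0.95)

rng = random.Random(0)
rows = [(0.0, 1.0)]  # row 1: initial position and energy of one neutron
for _ in range(100):
    rows.append(next_row(rows[-1], rng))
```

Each punched card in the 1940s workflow played roughly the role of one of these rows: the full state of a single neutron at a single moment, from which the next state was computed.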
Hopefully I'm reading your open question correctly. Monte Carlo algorithms in computer science are pervasive and can be implemented in most any programming language. So his models could certainly be in Excel (I assume so, considering it lets you write mathematical functions), just as they could be implemented (probably more easily) in C, C++, Java, C#, Python, Matlab, PHP, Javascript, or most any other programming language.
The punch cards come down more to the move to fully digital machines, where memory (persistent storage, RAM, etc.) is now available to store data and programs, whereas before this was not the case. And of course computing speed has increased to a ridiculous degree since von Neumann's time. What von Neumann would have done with this extra power, I couldn't say.
These kinds of calculations are trivial in Excel, although the random number generator (at least in older versions of Excel) is wanting. But the question is how far it would have gotten him. On the one hand, he was at the forefront of his field, so faster calculations would have meant faster hypothesis-checking and a larger impact. On the other hand, hardly any physicists have used Excel to advance their field. That might be historical, though: the computing power leveraged by physicists blew up before spreadsheets were available.
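As an illustration of how little machinery the method needs (which is the point of the comments above about language choice), here is the standard textbook example, estimating pi rather than fission, in Python; the same few lines translate directly into any of the languages mentioned, or into spreadsheet formulas:

```python
import random

def estimate_pi(n, seed=0):
    """Sample n points in the unit square; the fraction landing inside
    the quarter circle approximates pi/4."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n
```

With 10^5 samples the estimate is typically within a few hundredths of pi, consistent with the roughly ten-percent accuracy Wilson reported for far harder problems.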
I'm sure there are many experts far better qualified than I to provide answers, but I see no comments so I'll offer something. I'm a professional programmer with a Physics BA from 35 years ago. I'm not a computer science academic.
Without knowing the specifics of the calculations, I guess the computations, which probably have many steps and intermediate results, would be clumsy to write directly in Excel cell formulas. One would probably need the features of a procedural language to code the algorithms. Using automation, one can write custom code/calculations in a separate program that can read and write Excel cells during execution. So it would be possible to do what you are suggesting. However, if the computation requires thousands or millions of intermediate results to reach the end-state of each time interval, it doesn't seem valuable to integrate the processing with the Excel presentation in this way. Most likely it would be far more efficient, in terms both of programming and program execution, to write a custom program in a language like C++ that did the computation and produced the series of end-states as data arranged in a CSV file as output. This output file could then be analysed using Excel's statistics and graphing capabilities.
"would be clumsy to write"
A strange critique, considering you're not manually punching cards anymore with Excel.
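The workflow suggested a couple of comments up (heavy computation in a program, end-states dumped to CSV for Excel) can be sketched in a few lines. Python here rather than C++, and a plain random walk standing in for the real physics; the file name and trial counts are arbitrary:

```python
import csv
import random

def end_state(rng, steps=1_000):
    """Run one simulated history to completion and return its
    end-state (a random walk, standing in for the physics)."""
    x = 0.0
    for _ in range(steps):
        x += rng.uniform(-1.0, 1.0)
    return x

rng = random.Random(1)
with open("end_states.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["trial", "end_state"])
    for trial in range(500):
        writer.writerow([trial, end_state(rng)])
```

The resulting CSV file opens directly in Excel, which then handles the statistics and charting, exactly the division of labor the comment proposes.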
An introduction or summary is needed at least for me to tell me how to read the essay and what I should learn from it. I am lost, even though I have what is taken as a superb background.