Are there any examples of calculations that took the Manhattan mainframes months to run that I can punch into an iPad to compare. Or any similar feats of calculating that are now rendered ordinary.
What do you tell an iPad with two black eyes?
Nothing, you’ve already told it twice…
Theories don’t require much number crunching, a rough calculation can be done on the back of a Toohey’s Old coaster.
Number crunching is where computer power comes into its own.
Working out how many grains of TNT are required to fire each piece of the critical mass at just the right velocity, the density of the neutron cloud and a myriad of other calculations would have benefited from better computers obviously.
Real atomic tests are no longer needed; it can all be simulated now in big mother computer programmes. I think that is where the real advantage of modern computer processing power lies.
just to note. the “computers” on the manhattan project were women.
http://www.mphpa.org/classic/HISTORY/H-06c18.htm
JudgeMental said:
just to note. the “computers” on the manhattan project were women.http://www.mphpa.org/classic/HISTORY/H-06c18.htm
So before there was cloud computing there was fog computing? Pretend i didn’t say that…….
:P
From: Dropbear
ID: 656746
Subject: re: January Chat
Curve, if you had a matrix of 2000 × 2000 numbers and you wanted to calculate some value for each point in that matrix.
Even if the calculation only took 1 CPU cycle (and this is dumbing it down), on a 4 kHz CPU (early mainframe) it would take a thousand seconds.
On a late-80s IBM PC, about 1 second
On an iPad, about 1/1000th of a second
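The arithmetic behind those three estimates can be checked in a few lines. The clock speeds below are the round numbers implied by the post (4 kHz, 4 MHz, 4 GHz), not precise specs for any real machine:

```python
# Back-of-envelope sketch: one operation per matrix cell, one CPU cycle
# per operation. Clock speeds are the illustrative round numbers implied
# by the estimates above, not specs for any particular machine.
cells = 2000 * 2000  # 4 million points

for machine, hz in [("4 kHz early mainframe", 4e3),
                    ("late-80s PC (~4 MHz)", 4e6),
                    ("modern tablet (~4 GHz)", 4e9)]:
    print(f"{machine}: {cells / hz:g} seconds")
# → 1000 s, 1 s, and 0.001 s respectively
```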
I recently read a biography of Turing; it describes matrices and how his idea was basically a yes/no operation. He also predicted a new kind of employment, the programmer, to allay the fears of mathematicians that his idea was going to do them all out of jobs.
Matrix manipulation is one of those things that lends itself to parallel operations: take a small chunk of the matrix, hand it off to a number of workers, and then collate the answers at the end.
Great strides have been made with this sort of parallel operation. These days GPUs (the graphics card in your computer) are MUCH better at this sort of thing than your CPU.
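The chunk-and-collate pattern can be sketched with nothing but the standard library. Threads are used here for simplicity; real speedups for CPU-bound work need processes or a GPU, but the splitting and stitching logic is the same:

```python
# Minimal sketch of "hand chunks off to workers, collate at the end".
# Each worker computes a value for every cell in its strip of rows;
# the strips come back in order and are stitched together.
from concurrent.futures import ThreadPoolExecutor

def process_rows(rows):
    # Stand-in per-cell calculation: anything independent per point works.
    return [[x * x for x in row] for row in rows]

def parallel_map(matrix, workers=4):
    chunk = max(1, (len(matrix) + workers - 1) // workers)
    strips = [matrix[i:i + chunk] for i in range(0, len(matrix), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process_rows, strips)  # preserves strip order
    collated = []
    for strip in results:
        collated.extend(strip)
    return collated

matrix = [[r * 10 + c for c in range(10)] for r in range(10)]
print(parallel_map(matrix)[2][:3])  # row 2 squared: [400, 441, 484]
```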
When I was a student at Melb Uni 1978 to 1981, we had to write a program in BASIC for a basic ray tracing of light through two lenses over a certain distance (I think). I seem to recall we had to do the paper calculations first by hand. Then write the program. You then had to get time on the computer to type in your program. There were a few stations in a dingy room over in the Physics department, I think. Then you waited a day or so to see if the program ran properly – only to discover that George had neglected to mention some really important grammar or something to make it run. So you wrote it again and tried again. Today such programs are off the shelf and run in a matter of seconds once you input the data.
Not quite what you asked, and a fairly small example, but nevertheless it shows the improvements.
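For a sense of how small that once-painful assignment is today: a paraxial trace through two thin lenses is a handful of matrix multiplications. The focal lengths and spacings below are made-up illustrative values, not the original coursework's:

```python
# Paraxial (ABCD matrix) ray trace through two thin lenses.
# A ray is (height y, angle theta); matrices apply right-to-left.
def mat_mul(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def free_space(d):   # propagate a distance d along the axis
    return [[1.0, d], [0.0, 1.0]]

def thin_lens(f):    # thin lens of focal length f
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

# 10 units to a f=50 lens, 20 units to a f=100 lens, then 50 units on.
system = mat_mul(free_space(50.0),
         mat_mul(thin_lens(100.0),
         mat_mul(free_space(20.0),
         mat_mul(thin_lens(50.0), free_space(10.0)))))

y, theta = 5.0, 0.0  # ray parallel to the axis at height 5
y2 = system[0][0] * y + system[0][1] * theta
theta2 = system[1][0] * y + system[1][1] * theta
print(y2, theta2)    # final height and angle of the ray
```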
Peak Warming Man said:
Theories don’t require much number crunching, a rough calculation can be done on the back of a Toohey’s Old coaster.
Number crunching is where computer power comes into its own.
Working out how many grains of TNT are required to fire each piece of the critical mass at just the right velocity, the density of the neutron cloud and a myriad of other calculations would have benefited from better computers obviously.
Real atomic tests are no longer needed; it can all be simulated now in big mother computer programmes. I think that is where the real advantage of modern computer processing power lies.
it’s also about perfect timing, and creating an explosive that burns perfectly, in a particular way, and at a varying speed
no such thing with explosives as “perfect”.
AwesomeO said:
Are there any examples of calculations that took the Manhattan mainframes months to run that I can punch into an iPad to compare. Or any similar feats of calculating that are now rendered ordinary.
Not sure I have an answer to that, but I can think of a few interesting computations.
1. The original collision that formed the Moon was computed using a computer program designed specifically for the hydrogen bomb project, that computed the impact of two atomic nuclei in the heart of a thermonuclear reaction. I think it used some sort of advanced version of the liquid drop model of the atomic nucleus. Instead of inputting two atomic nuclei into the calculation, the researcher input the mass of the Earth and the mass of Mars. The computation proved that the Moon formed from impact debris.
2. The original computation that lined up the coast of Africa and that of South America at the edges of the continental shelf was done in Cambridge as the first program run on the then fastest computer in the world. The error in fit was unexpectedly tiny. This computation proved sea floor spreading for the first time.
3. The books of the Cambridge mathematical tables. The Cambridge four figure mathematical tables were produced in 1946.
4. “A Million Random Digits with 100,000 Normal Deviates” by the RAND Corporation, from 1955. Apparently the computation took the world’s fastest mainframe computers eight years.
5. The stuff Feynman did for the Manhattan Project (1942–1946). His job there was to produce computations in advance of when they would be needed by the Manhattan scientists. I don’t know what he computed, but probably things like Bessel functions and other solutions of differential equations and integrals.
6. Computational Fluid Dynamics progressed relatively slowly because a full 3-D simulation at resolution n requires of order n^4 computation steps (n^3 grid points, times a number of time steps that also grows with n). The earliest CFD calculations that really pushed the limits of computing were weather predictions by Deardorff. Stuff such as “A numerical study of three-dimensional turbulent channel flow at large Reynolds numbers” by James W. Deardorff (1970).
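Item 4 gives the cleanest then-versus-now comparison in the list: RAND’s million random digits reportedly tied up the fastest machines for years, while any modern tablet-class CPU produces a million pseudorandom digits in well under a second. A sketch of the scale only; this uses a software PRNG, not RAND’s physical noise source:

```python
# Generate a million random decimal digits and time it. This is a
# pseudorandom sketch of scale, not a recreation of RAND's method,
# which sampled a physical electronic noise source.
import random
import time

t0 = time.perf_counter()
digits = [random.randrange(10) for _ in range(1_000_000)]
elapsed = time.perf_counter() - t0

print(len(digits), "digits in", f"{elapsed:.3f} s")
```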
It’d be nice to know if the software that the Apollo spaceflights used is still available.
mollwollfumble said:
It’d be nice to know if the software that the Apollo spaceflights used is still available.
I was watching a YouTube clip of an old NASA film about how they worked out how the flight was progressing. It talked about three observation points somehow intersecting to create a zone that the spacecraft was most likely in.