US builds the world's fastest supercomputer. Know what it's doing? Modeling climate change. (Original Post) mahatmakanejeeves Feb 2019 OP
Um. How can it be modeling something that doesn't exist? tymorial Feb 2019 #1
Don't tell Trump. Doodley Feb 2019 #2
+1 2naSalit Feb 2019 #4
I have worked on developing "expert" software for 25 years at140 Feb 2019 #3
And your point is? Boomer Feb 2019 #5
Point is straightforward... at140 Feb 2019 #6
This is a non sequitur Boomer Feb 2019 #7
ok, hope you are right, but I have reservations about computer speed at140 Feb 2019 #9
The article addresses more than just processing speed. littlemissmartypants Feb 2019 #10
The article emphasized speed more than anything else at140 Feb 2019 #12
Greater processor speed can make for better modeling OKIsItJustMe Feb 2019 #11
In a different thread you posited that the climate is changing because of the Sun NickB79 Feb 2019 #13
Not all Tennesseans ... GeorgeGist Feb 2019 #8

tymorial

(3,433 posts)
1. Um. How can it be modeling something that doesn't exist?
Fri Feb 1, 2019, 04:33 PM
Feb 2019

Surely the president and the Republicans wouldn't lie about it. They are our elected leaders. They look out for us!

Couldn't resist

at140

(6,110 posts)
3. I have worked on developing "expert" software for 25 years
Fri Feb 1, 2019, 04:42 PM
Feb 2019

and what I know for sure is that software can only be as good as its programmers.
I worked on a slow-ass IBM 1620, then on the much faster IBM 360, and then on an
even faster Burroughs computer. The faster machines helped me develop software
that generated correct results more quickly. But if the algorithms were faulty,
faster computers did not correct them.

Boomer

(4,170 posts)
5. And your point is?
Fri Feb 1, 2019, 05:18 PM
Feb 2019

Do you have reason to believe that current climate models use faulty algorithms? Is your accurate but vague generality somehow specifically tied to this project?

Given the sheer number of factors that affect climate, I would expect that increased computing power will be useful in adding more forcing variables to the models. And the faster a model is run, the faster climate scientists can compare the output to known results and fine-tune the algorithms.

From what I know of their methodology, the models are validated by hindcasting: trying to predict climate that has already happened. If the results match what actually occurred, then you have at least some verification of the model's ability to predict future climate events. It's not like they're just making this up and accepting the results without question. And so far, climate predictions based on those models are proving to be pretty reliable. There's always room for improvement, but it beats flying blindly into the future.
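As a toy illustration of that hindcasting check, here's a minimal Python sketch. Every number in it is invented for the example; real validation compares gridded model output against observational datasets:

# Toy hindcast check: does the model reproduce climate that already happened?
observed = [0.12, 0.18, 0.21, 0.30, 0.35]  # invented temperature anomalies (deg C)
modeled = [0.10, 0.20, 0.19, 0.28, 0.37]   # invented model output for the same years

# Root-mean-square error, one common measure of hindcast skill
rmse = (sum((m - o) ** 2 for m, o in zip(modeled, observed)) / len(observed)) ** 0.5
print(f"hindcast RMSE: {rmse:.3f} deg C")  # small error = more confidence in the model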

at140

(6,110 posts)
6. Point is straightforward...
Fri Feb 1, 2019, 05:22 PM
Feb 2019

if the current models are good and reliable, a faster computer will not change the results.

So if you missed my main point: based upon four decades of developing engineering and manufacturing automation software, the speed of computers is not as important as the validity of the algorithms used.

Boomer

(4,170 posts)
7. This is a non sequitur
Fri Feb 1, 2019, 05:40 PM
Feb 2019

I'm not sure why you keep arguing a point that no one was trying to make. The obvious advantage of a supercomputer is speed. Given the complexity of climate change models, more work can be done if the models can be run faster, especially if you have greater access to that supercomputer.

We're dealing with an emergency situation, so adding more speed to research progress is a big deal, even if the accuracy is the same. But running more models, more often, means that progress in accuracy is accelerated. This is a win-win situation.

at140

(6,110 posts)
9. OK, hope you are right, but I have reservations about computer speed
Fri Feb 1, 2019, 07:34 PM
Feb 2019

being the deciding factor, when supercomputers are already extremely fast.

littlemissmartypants

(22,840 posts)
10. The article addresses more than just processing speed.
Sat Feb 2, 2019, 01:32 AM
Feb 2019

It discusses the use of speed combined with artificial intelligence, which addresses the algorithm issue you bring up.

Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI’s frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.
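For scale, "a billion billion operations per second" pencils out like this. Only the exaflop rate comes from the article; the workload numbers are invented for illustration:

exaflop = 1e18           # operations per second, the rate quoted in the article
cells = 1e9              # assumed number of grid cells (illustrative)
steps = 1e5              # assumed number of time steps (illustrative)
ops_per_cell_step = 1e4  # assumed operations to update one cell (illustrative)

seconds = cells * steps * ops_per_cell_step / exaflop
print(f"about {seconds:.0f} second(s) at an exaflop")  # the same toy job at a petaflop would take ~1,000x longer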

at140

(6,110 posts)
12. The article emphasized speed more than anything else
Sat Feb 2, 2019, 01:18 PM
Feb 2019

And that is my only retort. Emphasize better modeling as the prominent feature; that would make me happy. A faulty model run at higher computer speeds will just generate faulty results faster.

OKIsItJustMe

(19,938 posts)
11. Greater processor speed can make for better modeling
Sat Feb 2, 2019, 12:40 PM
Feb 2019
https://www.wunderground.com/cat6/IBM-Introducing-Worlds-Highest-Resolution-Global-Weather-Forecasting-Model?cm_ven=hp-slot-5
IBM Introducing the World’s Highest-Resolution Global Weather Forecasting Model

Dr. Jeff Masters · January 8, 2019, 11:52 AM EST


Above: An August 2018 monsoon forecast for India, shown at left by a global weather model operating at 13-kilometer resolution. At right, the new IBM Global High-Resolution Atmospheric Forecasting System (GRAF) operates at 3-km resolution, showing much more detail, and updates 6 to 12 times more often than the current top global forecast models. Image credit: IBM.
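A quick back-of-the-envelope shows why 3-km resolution demands so much more compute than 13-km. Only the two grid spacings come from the article; the scaling is the usual rough rule of thumb (cell count grows with the square of the refinement, and finer grids typically also need shorter time steps):

coarse, fine = 13.0, 3.0            # grid spacings in km, per the article
area_factor = (coarse / fine) ** 2  # ~19x more grid columns to cover the same area
time_factor = coarse / fine         # finer grids typically need shorter time steps
print(f"roughly {area_factor * time_factor:.0f}x more work per forecast")  # ~81x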




Consider: One of the chief tasks of early digital computers was the "modeling" of trajectories:
http://zuse-z1.zib.de/simulations/eniac/history.html
Trajectory calculation

At the time of World War II, guided bombs had not yet been developed, so ground-based artillery was used to attack the enemy. Depending on the distance to the target and the type of artillery, the shell had to be fired at a certain angle. That angle also depended on the weather, especially the wind. To find the correct angle in a specific situation, the artillerymen used so-called firing tables. But those firing tables had to be computed first.

Those ballistic computations were also done at the Moore School of Electrical Engineering, part of the University of Pennsylvania.

Calculating a trajectory could take up to 40 hours using a desk-top calculator. The same problem took 30 minutes or so on the Moore School's differential analyzer. But the School had only one such machine, and since each firing table involved hundreds of trajectories it might still take the better part of a month to complete just one table. [1]


The speed-up in developing new artillery designs increased the need for computation power. In November 1942, US forces landed in French North Africa and entered terrain entirely different from anything they had encountered before. The existing firing tables turned out to be completely useless. Computation power thus became the bottleneck of the war machinery.

Under these circumstances, John Mauchly, a member of the Moore School's Engineering, Science, and Management War Training (ESMWT) program, wrote a five-page memo called The Use of Vacuum Tube Devices in Calculating. In this paper he proposed a machine that would add 5,000 10-digit numbers per second, more than 100 times faster than the fastest computer of the time (in 1942, mechanical relay computers at Harvard and Bell Laboratories performed 15-50 additions per second [1]).



As processor speeds increased, the accuracy of the models increased. Today, computer-guided munitions can calculate trajectories practically instantaneously, but in reality, it's still just modeling.
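For the flavor of what one firing-table entry involved, here is a minimal trajectory model in Python. All of the constants are invented for illustration, not taken from any real firing table:

import math

g, k, dt = 9.81, 0.0001, 0.01       # gravity (m/s^2), drag coefficient, time step (s)
v, angle = 400.0, math.radians(45)  # muzzle velocity (m/s) and elevation angle
x = y = 0.0
vx, vy = v * math.cos(angle), v * math.sin(angle)

while y >= 0.0:                     # step forward until the shell lands
    speed = math.hypot(vx, vy)
    vx -= k * speed * vx * dt       # quadratic drag opposes the motion
    vy -= (g + k * speed * vy) * dt
    x += vx * dt
    y += vy * dt

print(f"range: about {x / 1000:.1f} km")  # one (velocity, angle) entry; a table needed hundreds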


It may be worthwhile to review some papers on climate models through the years:

If you read through this progression, you'll see that the models have always been constrained by processor speed.

NickB79

(19,277 posts)
13. In a different thread you posited that the climate is changing because of the Sun
Sat Feb 2, 2019, 05:39 PM
Feb 2019

Funny, you trotted out one of the climate deniers' most popular arguments a few days ago (it's not humans, it's the sun), tried to back it up with another denier talking point when called on it (what about the Dust Bowl?), and here you are sounding like you're repeating yet another of their favorite arguments (the models are junk).

Curiouser and curiouser.......

https://www.democraticunderground.com/100211747334#post11
