
Bolts from the blue

DQI Bureau

Weather forecasting as we know it was born in the trenches of World War I, when a prophetic young British meteorologist named Lewis Fry Richardson had a stroke of genius. Richardson's idea was to lay a grid over the landscape and calculate the behavior of the atmosphere in each cell. The math would be done by an array of 64,000 people gathered in a tremendous amphitheater. Each person, armed with a mechanical calculator, would compute the weather in a given cell based on physical equations, observations streaming in from the field, and the results passed along from neighboring cells. In one fell swoop, Richardson had anticipated both modern forecasting and parallel computation.
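Richardson's scheme maps directly onto modern code. The sketch below is a toy illustration, not the real primitive equations of the atmosphere: it uses a simple four-neighbor stencil as a stand-in for the physics, and its 256 x 250 grid stands in for Richardson's 64,000 human computers, one per cell.

    import numpy as np

    def step(grid, alpha=0.1):
        """Advance every cell one increment using a four-neighbor
        stencil; a stand-in for the real physical equations."""
        new = grid.copy()
        new[1:-1, 1:-1] += alpha * (
            grid[:-2, 1:-1] + grid[2:, 1:-1]      # north and south neighbors
            + grid[1:-1, :-2] + grid[1:-1, 2:]    # west and east neighbors
            - 4 * grid[1:-1, 1:-1]
        )
        return new

    state = np.random.rand(256, 250)   # 256 x 250 = 64,000 cells
    for _ in range(10):                # ten forecast increments
        state = step(state)

Each interior cell depends only on its immediate neighbors, which is why the work divides so naturally among many arithmeticians, or many processors.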




Today, the computers at the National Centers for Environmental Prediction (NCEP) of the National Oceanic and Atmospheric Administration (NOAA) and elsewhere, fed by satellites and a global network of observatories, are a realization in silicon of Richardson's dream.

These computers do a pretty good job of predicting the weather by laying a coarse grid across the United States. The cells of the grid are currently 32 kilometers on a side, which makes it impossible to predict precisely where a storm is likely to strike. They can say if it will rain, but not if it will rain on your parade.



That was not good enough for the organizers of the 1996 Summer Olympics in Atlanta. They needed to know how clouds would affect equestrian events, if dew would endanger racers at the Velodrome, if the wind would be adequate for sailors or too strong for platform divers. And perhaps most important, they needed to know if a thunderstorm would threaten the elaborate closing ceremonies.




To satisfy the needs of the Olympic organizers, a team of IBM researchers led by Zaphiris Christidis set out to deliver precise local weather forecasts. Christidis adapted a well-known mathematical model of the atmosphere, the Regional Atmospheric Modeling System (RAMS), developed at Colorado State University, to run on an IBM RS/6000 SP parallel computer.



For the Olympics, the researchers divided the Atlanta area into a grid that could resolve weather events on a scale of between 1.5 and 5 miles and track their evolution in 12-second increments. To make this model run fast enough required careful tuning and optimization, an art at which Christidis excels. "The target was to produce a 24-hour forecast," Christidis explains, "but the constraint was to produce that forecast in just a few hours. After all, if you needed 24 hours to do a 24-hour forecast, you might as well just open a window and stick out your head." In the end, by optimizing the RAMS code and choosing an SP that consisted of 28 processor nodes, Christidis was able to generate a 24-hour forecast in less than three hours.
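The numbers quoted above pin down the computational budget. The arithmetic below follows directly from the article's figures; the per-step wall-time budget is our inference, not a published measurement.

    seconds_per_day = 24 * 3600              # 86,400 s of weather to simulate
    timestep = 12                            # seconds of weather per model step
    steps = seconds_per_day // timestep      # 7,200 steps per 24-hour forecast
    wall_budget = 3 * 3600                   # under three hours of wall time
    per_step = wall_budget / steps           # ~1.5 s of wall time per step
    speedup = seconds_per_day / wall_budget  # model runs at least 8x real time
    print(steps, per_step, speedup)          # 7200 1.5 8.0

In other words, the 28-node SP had to push the atmosphere forward at least eight times faster than it actually evolves.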



What brings the project under the umbrella of Deep Computing is the act of combining Christidis's optimized weather code with the visualization tools of IBM researcher Lloyd Treinish to create a system that can be used to make real-world decisions. "It's putting the pieces together," says Treinish. "All the pieces were already out there. We used a model that originated at a university, visualization software developed here at Research and available commercially, and the SP, which is also available commercially. But you can't just buy all these pieces and say, 'Ok, now we have an interactive forecaster.'"
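As a rough illustration of the "last mile" Treinish describes, the sketch below renders a grid of predicted rainfall as a map a non-meteorologist can read. The rainfall field here is synthetic; an actual system would load RAMS model output rather than fabricate a storm cell.

    import numpy as np
    import matplotlib.pyplot as plt

    nx = ny = 60
    X, Y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
    # Synthetic storm cell; a real pipeline would read model output here.
    rain = np.exp(-((X - 0.7) ** 2 + (Y - 0.4) ** 2) / 0.01)

    fig, ax = plt.subplots()
    im = ax.contourf(X, Y, rain, levels=10, cmap="Blues")
    ax.plot(0.5, 0.5, "r*", markersize=15, label="stadium")
    fig.colorbar(im, ax=ax, label="predicted rainfall (arbitrary units)")
    ax.legend()
    plt.show()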




The most dramatic test of the forecaster came on August 4, 1996, the day of the closing ceremonies. The National Weather Service's coarse-grained models predicted that the Atlanta area would have thunderstorms, which could have been disastrous if they struck the crowded stadium. Ordinarily, the organizers would have delayed the ceremonies, which would have meant an expensive anticlimax to the games. The IBM system produced an animated 3D visualization that showed storm clouds sprouting like mushrooms over the Atlanta area. Beneath the clouds were puddles of blue corresponding to predicted rainfall, which swiftly moved across the landscape, missing the stadium by 10 miles. The organizers decided to proceed with the ceremonies. The storm followed the precise track predicted by the IBM system, and the ceremonies went off without a hitch.



The forecaster has since been demonstrated around the United States. At a computer show in San Jose, the IBM system, running on an SP cluster that was also demonstrating Deep Blue, correctly predicted the pattern of the next day's rainfall. This feat so impressed reporters that the local press dubbed the system Deep Thunder.



While Deep Thunder is of obvious value to weather bureaus, it has sparked interest in other quarters as well, including insurance companies, airlines, utilities, and agriculture. Visualization will be the key to such varied applications. "You have to be able to disseminate the forecast to a user who may not be a meteorologist," Treinish says. "In some cases, you don't even show the weather directly."
