
Algorithms and parallel processing: Daring To Think Deep

DQI Bureau

In May 1997, when Kasparov, the reigning world chess champion, had just been defeated in a six-game match by the IBM supercomputer Deep Blue, a new genre of computing was recognized: Deep Computing, one that combines raw computing power and algorithmic virtuosity, once the province of high-end scientific computing, to solve complex real-world problems in areas as wide-ranging as weather forecasting, business computing and human genetics.




In May 1997, grandmaster Garry Kasparov looked into the Deep and knew fear.



Kasparov, the world's reigning chess champion, had just been defeated in a six-game match by an IBM supercomputer named Deep Blue. "I am a human being," a stunned Kasparov told the press. "When I see something that is well beyond my understanding, I'm afraid."






What had spooked Kasparov exhilarated scientists and engineers at IBM. "When we had gotten to the end of Deep Blue and had beaten Kasparov, we looked at it and asked, where do we go next?" recalls William R Pulleyblank, Director, Mathematical Sciences Department, Thomas J Watson Research Center.

It was clear to Pulleyblank and others at IBM that the convergence of technologies that had led to Deep Blue's victory in the 64-square world of chess could be used to make important moves in the real world. "This time," says Mark Bregman, then general manager of the RS/6000 division, "the winner will be everyone whose life is touched by information technology."




Scientists at IBM Research recognized that Deep Blue had defined a new genre of computing, one that combines the powerful parallel computers and advanced algorithms typical of scientific computing with the vast databases typical of business computing to support human decision-making. "It's doing science on nonphysical data," Pulleyblank explains. The new genre, dubbed Deep Computing, has already been applied to a wide variety of problems, from more precisely predicting the paths of thunderstorms to finding patterns in grocery purchases. The possibilities opened up by powerful computers and algorithms are so profound that IBM has recently formed a Deep Computing Institute under Pulleyblank's directorship to help explore and define this new field.



The most important factor in the emergence of Deep Computing is the availability of relatively inexpensive, fast and powerful systems such as the RS/6000 SP parallel computer, the guts of Deep Blue. "Each node in an SP is essentially a workstation," explains Marc Snir, Senior Manager for Scalable Parallel Systems. "The glue of the SP is a hardware switch that can connect dozens or hundreds of these nodes together." The SP architecture makes it relatively easy to scale up a system to the size required by the application.
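
The snippet below is not IBM SP code; it is a minimal sketch of the general idea the SP architecture enables, using Python's standard multiprocessing module as a stand-in for separate nodes: split a job into chunks, fan them out to independent workers, and combine the partial results.

```python
# Minimal sketch (not IBM SP code): spreading independent chunks of work
# across workers, analogous to scaling a job over SP nodes.
from multiprocessing import Pool

def score_chunk(chunk):
    """Stand-in for the per-node work: here, just a sum of squares."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_workers = 8  # on an SP this would be dozens or hundreds of nodes
    chunks = [data[i::n_workers] for i in range(n_workers)]

    with Pool(n_workers) as pool:
        partials = pool.map(score_chunk, chunks)  # fan work out to workers

    print(sum(partials))  # combine the partial results
```

Adding capacity then becomes a matter of raising the worker count, which is the property the SP's switched node design is built around.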



What inspired Deep Computing was the convergence of powerful computers and the massive data sets typical of business computing. "All of a sudden," says Pulleyblank, "it was possible to take the kind of processing used for scientific computing and apply it to the commercial sector to get real business impact." Deep Computing provides the methodology to uncover patterns and trends hidden in terabytes of data, and to do so rapidly enough to make informed and timely decisions.
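
As one illustration of what "finding patterns" can mean in practice, here is a minimal sketch over a hypothetical handful of grocery baskets; a real Deep Computing workload would apply this kind of co-occurrence counting, and far more sophisticated data-mining algorithms, to terabytes of transaction data.

```python
# Minimal sketch (hypothetical data): counting which pairs of items are
# bought together, the simplest form of a purchasing pattern.
from collections import Counter
from itertools import combinations

baskets = [
    {"bread", "milk", "eggs"},
    {"bread", "milk"},
    {"milk", "cereal"},
    {"bread", "milk", "cereal"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pairs are candidate buying patterns.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```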




For some tasks, such as airline scheduling or customer profiling, systems with fast processors like OS/390 mainframes or high-end RS/6000 workstations are fast enough. Other tasks demand parallel computers, like the RS/6000 SP, which yoke together many processors. But according to Moore's Law, which says that computing power doubles every 18 months or so, such hardware distinctions will gradually disappear. "In the long run," says Nick Bowen, Director, Servers, IBM Research, "Deep Computing will be something you do on any platform."



Processing speed is not the whole story behind Deep Computing. A clever algorithm can achieve overnight what progress in hardware would require decades to accomplish. Algorithms are the recipes computers use to solve problems: the sequences of simple steps they use to arrive at complex results.



If Moore's Law is like amassing a fortune through compound interest, an algorithm can be like winning a lottery. "The algorithmic things are really startling," says Pulleyblank, "because when you get those right you can jump three orders of magnitude in an afternoon." That's the equivalent of 15 years of Moore's Law progress. For example, the same airline scheduling problems that used to take many hours to solve on a powerful mainframe can today be solved in minutes on a ThinkPad laptop computer. While some of this improvement is due to better computer chips, much of the progress stems from faster algorithms.
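
The "15 years" figure follows from the numbers quoted above: a thousandfold speedup is about ten doublings, and at one doubling every 18 months that is roughly 15 years. The short sketch below simply works through that arithmetic.

```python
# Back-of-envelope check, assuming a 1,000x algorithmic speedup and
# Moore's Law doubling every 18 months, as stated in the article.
import math

speedup = 1_000                        # "three orders of magnitude"
doublings_needed = math.log2(speedup)  # about 10 doublings
years = doublings_needed * 18 / 12     # 18 months per doubling

print(f"{doublings_needed:.1f} doublings ~= {years:.0f} years of Moore's Law")
# -> roughly 10 doublings, about 15 years
```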



The final technology driving Deep Computing is the advent of an inexpensive global communications network. "This is absolutely crucial," says Pulleyblank. "I may never have a server capable of solving large weather models on my notebook computer. But because I can link into these servers seamlessly, I'll have the same capability."



In Pulleyblank's vision, Deep Computing is the key to making sense of our explosively complex world. Yet at the same time, it should be so widely adopted as to be taken for granted. "When you walk into a room and click the lights on," says Pulleyblank, "you don't even think about what it takes for that to happen. You just know that the switch turns the lights on. If we're successful, Deep Computing will have the same level of pervasiveness, the same level of transparency."
