
Tech’s Reliability Gap


It sounded like a reasonable response to the devastating attacks of September 11. Companies talked of reducing air travel, increasing their use of technologies such as videoconferencing and collaboration software, and further decentralizing operations through networks such as the Internet. In all of these measures, there’s an assumption that technology is the key to making businesses not only safer but more stable and resilient.


Boy, I’m not so sure. When was the last time you said, "Wow, I just can’t believe how dependable my personal computer is!"?

There’s no doubt that computers, software, global networks, and the like have made us more efficient as corporations and as individuals. Some security measures, such as dispersing people and computers and using networks to connect them, have made us even more dependent on information systems. But all too many of these technologies are not only wide open to attack by hackers and fanatics but also absurdly unstable and unreliable. Relying on them still more, without making huge improvements in how they’re designed and used, seems certain to compound the disruptions we’ve already seen.

The root problem is the very nature of technology development, notes Richard Pethia, director of the Computer Emergency Response Team Centers at Carnegie Mellon University’s Software Engineering Institute. "Technology evolves so rapidly that vendors concentrate on time to market, placing a low priority on the security of their products," he said in recent testimony before a House subcommittee.


Even more worrisome is the low priority placed on basic reliability. One of the biggest challenges to distributed corporate operations may be bad software. According to the information technology consultancy Standish Group International, bad software last year cost about $175 billion worldwide in lost business due to downtime, more than double the amount two years ago. It makes you wonder whether companies are ready to push more people into remote offices, or even home to telecommute, where technical support is shaky or nil. I can’t tell you how many times my PC has inexplicably crashed; on at least two occasions it had to be sent to techies across the country to fix.

To the Internet’s credit, key services such as e-mail and instant messaging held up amazingly well in the hours after the terrorist attacks. But the very strengths of the Internet may well increase the likelihood of widespread disruptions in the future. The nearly unstoppable global connections it forges among some 110 million computers online make it far easier to spread viruses and software bugs from a few computers to millions in an instant. Private networks may be better equipped to limit the impact of attacks and glitches. But even there, remote and small-office setups often use PC technologies that are more susceptible to viruses and just plain old breakdowns.

This shaky situation suddenly makes cyber-attacks much less farfetched than they once seemed. Viruses set loose by bored teenagers continue to bring corporations to their knees. It’s not hard to imagine the havoc that a dedicated group of fanatics could unleash on the systems that we intend to depend on more than ever. Moreover, says Standish Group Chairman Jim Johnson, disaster recovery plans at all but two of the 56 companies he recently surveyed are out of date, typically allowing them to restore only 10% of their critical software applications quickly. Says Johnson: "Disaster recovery is a disaster."

Most of these problems require big changes not only in technology, but in our thinking. Companies must make technology reliability a top priority and demand the same of their suppliers. Let’s not fool ourselves: None of this can be fixed overnight. But I can’t think of a better time to start.

By Robert D. Hof in BusinessWeek. Copyright 2001 by The McGraw-Hill Companies, Inc.
