2013: Odyssey Three

Prasanto K Roy, a former assistant editor of Dataquest, is editor, Computers@Home and PC Quest.

A planet with ten billion embedded micros, five times more warfare infotech than weapons, a million terabits of bandwidth. How did we get here?

hspace="2" width="180" height="239">I face="Arial">n the beginning was the game.



In the last year of the 20th century, when production chips first crossed the GHz barrier, it had seemed almost bizarre that it happened not in a massively-parallel supercomputer, but in a $ 299 TV game console.

Last year, 2012-13, Indian users spent $ 6 billion on game software. But even in the nineties, the real high-end stuff was driven by gaming, and it was among the most deadly serious things happening in commercial infotech.

It wasn't productivity apps and business software that triggered India's PC explosion at the turn of the century; it was kiddie games and adult games, and what were then ultra-powerful home theaters built around cheap GHz RISC chips, megadisplays, and Bose audio systems. The PC-entertainment convergence had begun then, somewhat later than predicted. Gaming was 40 percent of India's software sales then, and about a quarter of its software exports; India finally broke into shrink-wrapped software with games. Bandwidth was still scarce, and access to the early Internet was ISDN for most people, 56 K for the low-end, and cable for the serious gamers.

The gamers always drove the specs. As we crossed into this century, desktops and NCs were inadequate for gaming. They struggled even with bloated office software.

Interesting how the performance and reliability curve changed over time. In the last 10 years of the 20th century, it had actually declined, as conventional chips kept pushing clock speeds but could not keep pace with huge OSes and apps. Then, in the year 2004, came what is now accepted as a landmark in the history of computing. That was when the IEEE Real-Time Performance Initiative (RTPI) committee defined the acceptable lower limits of user interface and system response, driven largely by gaming industry initiatives.

RTPI 12.1 said that a system should respond completely to any user input or interrupt within 10 ms of the event, as long as less than 5 million cumulative integer and floating-point operations were involved, and as long as the output didn't need to be time-spread (such as a movie). But Revision 12.6 added the requirement that the application or OS could not terminate without completely committing all data and instructions to disk. Post such termination, the system must recover completely within 100 ms with no loss of data, instructions, or buffered user input. Effectively, they were saying: your system should be usable, fast, and crash-proof. Suddenly, military-class specs were brought into office and home PCs. Or, more precisely, gaming specs.
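
To make those thresholds concrete, here is a minimal sketch of what checking a system against them might look like. RTPI is, of course, this story's imagined standard; the event-log format and function below are illustrative assumptions, with only the 10 ms, 5-million-operation, and 100 ms figures taken from the spec described above.

    # Illustrative sketch only: a toy check against the RTPI 12.1/12.6 limits
    # described above. The event-log format here is a made-up assumption.
    RESPONSE_LIMIT_MS = 10        # respond to any input or interrupt within 10 ms...
    OPERATION_BUDGET = 5_000_000  # ...when fewer than 5M int/FP operations are involved
    RECOVERY_LIMIT_MS = 100       # recover fully within 100 ms of any termination

    def rtpi_compliant(events):
        # events: dicts with 'kind' ("input" or "recovery"),
        # 'latency_ms', and 'operations' fields.
        for e in events:
            if e["kind"] == "input" and e["operations"] < OPERATION_BUDGET:
                if e["latency_ms"] > RESPONSE_LIMIT_MS:
                    return False
            elif e["kind"] == "recovery" and e["latency_ms"] > RECOVERY_LIMIT_MS:
                return False
        return True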

This was challenged five times by Microsoft, three times by Intel, and four times by Compaq, but the committee stuck to its guns: you could do what you liked, but if you wanted an RTPI 12.1 compliance stamp, you conformed. It took two years for the major software and system vendors to respond with compliant systems.

A parallel IEEE committee spent two years defining a standard interface specification, covering conventional, speech, and video input and output. Finally, this was included as a usability extension to the same RTPI 12.1. Speech and video recognition became viable and appeared on public interface terminals everywhere. Each year experts predicted the death of the keyboard and mouse, and each year the world made more of them. Till today, there's no sign of the century-old QWERTY keyboard being threatened by speech and vision interfaces, especially in offices, where the cubicles have gotten smaller and closer together, and there's nothing as valuable as privacy.

But finally, computers and their software and interfaces began to get really usable by public utility users, housewives, fathers, hourly-wage labor, and not just by kids and a few million Internet-privileged gamers. By the year 2008, there were 3 billion computers, one for every three people on the planet. In addition, there were at least 10 billion embedded computers across every kind of electric and electronic device ever made.

How that really happened is another story.

I color="#000000" size="2" face="Arial">n the year 2002, the international federation of

central reserve banks got together and asked the heavily funded three-year-old Cybercrime

division of Interpol for help in killing salami-slicing.

Salami-slicing was an old cracker crime. Once confined to disgruntled and tech-savvy bank employees, it became popular among remote crackers. It was simple: take 1 million savings accounts, and round down each individual balance to the nearest rupee. Transfer those paise or cents in one sweep into a specific bank account. It was rare for account holders to notice the difference, and almost no one would complain about 63 paise becoming 00 paise.
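
The arithmetic was the whole appeal. A minimal sketch of how those shaved fractions add up, using entirely synthetic balances generated for the illustration:

    # Illustrative arithmetic only: truncated paise across a million
    # synthetic accounts. Balances are randomly generated for the sketch.
    import random

    random.seed(1)
    balances = (random.uniform(100, 100_000) for _ in range(1_000_000))
    skimmed = sum(b - int(b) for b in balances)   # paise lost per account, summed
    print(f"One sweep nets roughly Rs {skimmed:,.0f}")   # about Rs 500,000

Each account loses at most 99 paise, but a million accounts together lose about half a rupee each, every sweep.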

In the early years of widespread cybercrime at the turn of the century, the media continued to report it with a mix of amused indulgence, hype, and sensationalism. It clearly considered online pornography the more serious of the cyber issues. Then came the 2004 Cyberkiller, and a wave of original and copycat crimes across the world that relegated salami-slicing to the inside pages of web news services.

The Cyberkiller struck first in Germany. Knoll Pharma's extranet on Knollstrasse in Ludwigshafen was penetrated on June 22, 2004. The firewall retained some traces of the crack, and IS managers took the company's finance subnet offline and took it apart. They found nothing. They didn't look elsewhere. That's how they missed a soft agent left behind on an old NT6 server linked to the main CIM workstation network. The worm stayed dormant for 10 days, then activated, making a few small changes to the formulation for aspirin and a line of sleeping pills. The CIM system obediently transmitted the edited formulation to the manufacturing control computers. Another five days, and the altered drugs were out across the European Community (EC).

The first two deaths were reported from Paris, the next one from Berlin. Six casualties and 24 hours later, the problem was linked to the drugs. Another 12 hours passed before it was traced to the German source, during which time there were two similar deaths in London, and those victims had taken aspirin tablets not from Knoll but from Boots, UK. In another hour CNN Online kicked in with the story, followed by Reuters and AP. There was panic, and all drugstores were closed around the EC. The US FDA ordered all recent European drugs destroyed. The first recorded case of cyberterrorism claimed 14 lives and cost Europe and others about $ 40 billion.

The Cyberkiller struck three more times in four months before Interpol traced her to a suburb of Sydney, Australia. The 23-year-old Stanford graduate, who turned out to have several petty phone-phreaking convictions, was using a GPS2 pocket-comm device. She didn't know that it had an invisible tracer ID pattern put in on a US Defense Department request to the manufacturer. The tracer ID simply embedded the device code, along with the GPS-derived location of the device, accurate to within 1 meter on the planet's surface and 10 cm in altitude, and attached it to the header of any voice or data transmission.

The events had several effects. The Cyberkiller got a conviction and a death sentence within three months in an international court. Interpol's CyberCrime division got $ 100 billion in annual funding for counter-terrorism surveillance, equipment, and research. Even the NCRB in India beefed up its cyber crimes department drastically, as did central law-enforcement agencies across the world. The media turned away from its indulgent reporting of delinquent whizkids and reported on the events with restrained revulsion and a very uncharacteristic maturity.

The Cyberkiller left a lasting impression on the infotech industry long after she was executed in the electric chair in the year 2005. Her most significant 'contribution' to it was the year 2006 global agreement on equipment tracing. A committee involving all members of the UN Security Council, including India, and several standards bodies, including the IEEE, was formed. In record time it formulated, ratified, and mandated a standard for embedding tracer codes in all equipment that could communicate.

There had been four proposals for this tracer signature system. The first was from Microsoft, which already had in place just such a 64-bit code for tracing pirated software, and which offered it for 2 cents per license. The code was embedded into streaming video, and Microsoft offered $ 10 million to anyone who could crack, block, or edit it. A Finnish group of four teenagers calling themselves the GSM Phreakers Club won that money in 48 hours. Finally, an alternative proposal from the Linux User Group, which involved a long-range non-repetitive data pattern modification rather than a header code, won. The tracer signature format included a device ID, a GMT real-time stamp, and a configurable location stamp. The location was to be picked up either from the on-board GPS information, if the device had a GPS receiver built in, or from the electricity supply. By the year 2007, all electricity supply around the world was encoded with local-time and location information, added on at the local distribution points. The embedded tracer hardware, if it did not have an on-board GPS, would simply pick up this rough location information. Battery-run equipment picked it up easily out of stray RF from the mains; a year later, all portable devices mandatorily included a GPS device, adding $ 20 to their price tags.
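
As a rough sketch of the three components such a signature carried, the record reduces to a device ID, a GMT time stamp, and a location fix; the field names and types below are illustrative assumptions, and the point of the winning proposal was that this data rode as a non-repetitive pattern woven through the transmission rather than as a header.

    # Illustrative assumption of the tracer signature's contents; field names
    # and types are invented for the sketch, per the components listed above.
    from dataclasses import dataclass

    @dataclass
    class TracerSignature:
        device_id: int         # unique per-device code
        gmt_timestamp: float   # GMT real-time stamp, in seconds
        latitude: float        # location stamp, from on-board GPS...
        longitude: float       # ...or the rough fix encoded on the mains supply
        altitude_m: float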

During the year 2008 came the first serious cyberwar attack, as the media delighted in calling it. Over six days, crackers attacked US networks and services. The first attack was on two Internet-phone switching network hubs in New York and New Jersey, taking down all communications on the US East Coast for 10 minutes. The second was on the Pentagon. The intruder came in through an official's secretary's computer, penetrated a missile control network, and played with it for 24 hours before he was traced to a building in downtown Bangkok and caught. The US military announced the break-in much later, including the fact that the vulnerable missile control network had been an elaborate decoy system, going all the way down to controlling dud silos. This was part of a cyber-defense project started in 1997 and turned into an almost foolproof system over 11 years and many minor attacks.

The third attack was on a network belonging to Johnson & Johnson, where the intruder made changes to a database that controlled the manufacture of baby food formula. But this was an offline database, and when the system's replication kicked in at midnight, the CIM computers registered the change and reported it to the officer on duty. Manufacturing was halted before any of the alterations were executed. The changes would have made the baby food toxic.

The fourth attack hit four airports almost simultaneously: Gatwick in London; Frankfurt Main; Washington Dulles; and Schiphol, Amsterdam. It took out the air traffic control computers. However, the intruders did not try subtle, sophisticated changes; they simply brought the computers down. Security systems kicked in within seconds in all four places, isolating the damage and bringing redundant systems online in minutes. Emergency procedures put approaching aircraft in holding patterns or deflected them away, and on-board collision avoidance systems overrode manual controls and took over the avionics. There were no crashes or mid-air collisions.

Only a few of the intruders could actually be traced. Most were using older battery-powered equipment without GPS2 tracers, and short-burst sessions that interrupted the tracer patterns. A few were controlling remotely-placed handheld PCs with other remote computers, so all that could be traced and seized were the intermediate notebooks.

The media caught on quickly, despite a military clampdown on information about the attacks. But it reacted with more restraint than it might have, and that helped keep public panic limited.

What the media didn't catch on to then, but the military did, was that each attack was far less severe than it could have been. The attacks could have caused air crashes, severe power outages, or the mass killing of infants. It was as if the digital intruders were trying to prove a point, rather than actually kill. It was only years later that media columnists speculated on whether the intruders could have had business interests in the equipment manufacturing or software development industries....

The US military recommended to the UN Security Council and the Equipment Tracing committee that GPS3 units be implanted in all electronic devices that could communicate. This was quickly ratified and enforced: by then, single-chip GPS3 receivers cost less than $ 1, and atomic clocks and transmissions had progressed enough to give these tiny devices 10 cm location accuracy. The committee also reviewed and modified the tracer pattern spec to a shorter burst train.

Today, five years later, every electronic device made carries a tracer, whether or not it can talk. Most devices can: refrigerators and washing machines routinely dial up for remote diagnostics, and household electricity meters transmit power-consumption data back to the utility and billing data to the household computers. Every automobile has a GPS device and exchanges position, speed, and traffic-conformance data with control units, apart from the onboard computers that control most of the car's subsystems. The four major manufacturers of GPS equipment together total $ 260 billion in revenues.

But the most dramatic impact of all this has been on the military. Fifteen years ago, 8 percent of the world's military budgets was allocated to infotech; today, over 75 percent goes to information systems, digital counter-invasion hardware and software, network surveillance and monitoring, and redundant systems.

This has been the most significant influence on infotech development in the past 10 years. Sort of like going back to its roots. Quite a bit of infotech and Internet development had its origins in US military research 35 years ago. Then the two moved apart, with the military largely a customer of infotech for the next 20 years. Then, just into this century, it became apparent that wars were all going to be fought digitally, and the military stepped up its involvement, research, and use of infotech and networking in every country in the world. Investment in conventional hardware and ground troops declined. The infotech industry grew to seven times its size in these past 10 years, and India's software exports grew about 25 times.

There's an interesting footnote to all this, and it goes back to gaming. Today, in the year 2013, the big suppliers of military and commercial defense systems are also the world's biggest game software developers. In the early nineties, computer games drew on warfare themes and simulations. Then they went far ahead. Today, the military is largely dependent on technology developed for and adapted from game machines. The beta testers of many of the simulation modules, and of the strategy and counter-invasion software for military use, are teenagers paid on a part-time basis at these companies. We've come full circle.
