–SUGATA MITRA,
HEAD (R&D), NIIT
Editor's Note
On the 30th anniversary of DATAQUEST, we
decided to publish a free-flying article on the state of information technology in 2013.
Out of nostalgia, we decided to request Sugata Mitra, who had retired earlier this year,
to write it. The ageing and rather garrulous Mitra was hard to locate, and even harder to
convince. However, we did locate him at his Calcutta home, where he was attempting to
solar-roast a leg of lamb. Sugata refused to write, claiming he had cerebral palsy, but
did agree to put together a collection of articles from his files. Here are the results.
Editor
February, 2013.
From 'Letters To The Editor',
DATAQUEST, April 1, 2012
It is with some consternation
that I notice your reluctance to discuss paradigm shifts in the financial and legal issues
of information technology. While you have published many articles on issues relating to
intellectual property rights, invasion of privacy, media policies, and other policing
methods, there is no mention of changes in the legal and financial systems.
I know that there exists no way to protect copyrights on multimedia material on the
Internet. Is it not time to discuss how authors can get one-time payments for their
creations from service providers? How long should we continue to create for free and
watch our creations being distributed for free?
Is it not time that one realized that the concept of money is becoming more irrelevant
with each passing day? The world, at least as far as the Internet is concerned, works
more on subscriptions and barter than on money-based sales. Yet I see no effort to create
an internationally acceptable alternative to money-based transactions.
Negroponte pointed out as early as 1998 that taxes and duties would need to be eliminated
in a world where bits are the principal means of exchanging goods. When will we pass the
necessary legislation?
It is the lethargy of our government in this matter that has caused the sharp drop in
multimedia revenues in the last decade.
MONICA D'SOUZA, New Delhi
A Quarter Of
Humanity Uses The Internet
London, January 26,
2013
A recent study conducted at the London
School of Economics shows that about 2.5 billion people worldwide use the Internet
regularly. While email continues to be the most frequently used application, it now faces
considerable competition from other forms of collaborative computing. Video telephony is
the second most favored application on the WWW, while chat forums continue to be a
distant third.
Close to a billion people began using the
Internet in the last two years, primarily from the erstwhile third-world countries. It is
more than probable that the free-PC scheme launched by competitors to Wintel is the reason
for this spurt. At less than a dollar a month for a disposable PC, most of these countries
could begin to plan mass usage. Moreover, the wireless connection policy of 2010 made
Internet connectivity free at last.
In India, for example, over 30,000 primary
schools had computers and Internet connections last year and the number is expected to
increase sharply.
Speaking at a press conference in
Washington, USA, President Bates was recently quoted as saying that collaborative
computing in the ASEAN region may pose a threat to the developed economies.
Internet Appliances
Clog The Internet
Johannesburg, May 19,
2011
Embedded microprocessors that can
wirelessly connect to the Internet may be cool, but they do hog a lot of bandwidth. From
microwave ovens to electric vehicles, from weighing machines to video cameras, they are
all crowding for space on the Internet. The IEEE standard of the year 2005 that permitted
IP addresses for embedded microprocessors was hailed as a paradigm shift for consumer
electronics. After all, it is because of this standard that your car contacts other cars
and service stations as soon as it senses a breakdown. But it is a bit much when a heart
patient's cardiogram can't be bitstreamed to the hospital because microwave ovens are
chatting with each other! The Internet2 did try to allocate relative importance to data
packets, but manufacturers were quick to find ways to beat the system.
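The idea of ranking packets by importance is simple to sketch. The toy scheduler below,
in Python, shows what Internet2-style prioritization amounts to; the traffic classes and
priority numbers are invented for illustration and follow no real standard.

    import heapq

    # Toy priority scheduler: lower number = more important. The classes
    # and priorities are illustrative only; they follow no real standard.
    class Scheduler:
        def __init__(self):
            self._queue = []
            self._counter = 0  # tie-breaker keeps equal priorities FIFO

        def enqueue(self, priority, packet):
            heapq.heappush(self._queue, (priority, self._counter, packet))
            self._counter += 1

        def transmit_next(self):
            if not self._queue:
                return None
            _, _, packet = heapq.heappop(self._queue)
            return packet

    s = Scheduler()
    s.enqueue(9, "microwave oven chatter")
    s.enqueue(0, "patient cardiogram bitstream")
    s.enqueue(5, "car breakdown report")
    while (packet := s.transmit_next()) is not None:
        print(packet)
    # The cardiogram transmits first, unless, as noted above, manufacturers
    # simply label all of their traffic priority 0.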
In a recent meeting in San Antonio, the President of NEC called for a separate Internet
for appliances and other maintenance activity: in effect, a Web for machines to use. In a
world where networks are clearly out of control, this might just be asking for more
trouble.
The Saturation Of
Moore's Law
Los Angeles, February
12, 2013
As predicted over a decade ago, Moore's
Law, which so amazed the late twentieth century, is running out. The unofficial law,
which stated that computing power would double every 18 months or so, held true for over
three decades. However, over the last two years, no significant growth has been noticed
in CPU performance. Intel physicists have been pointing out for some time that current
component densities cannot be increased any further with traditional silicon VLSI (Very
Large Scale Integration) technology. In any case, they point out, the Wintel 4, the
current CPU, contains considerably more switches per cubic centimeter than the human
brain. While there are reports that organic semiconducting devices could achieve higher
speeds, scientists feel that it will be at least a decade before practical devices can be
constructed. It looks as though the Wintel 4, running at 4.8 GHz, will continue to be the
processing power available on PCs for some years to come.
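The arithmetic behind those three decades is worth spelling out. A short calculation,
assuming the canonical 18-month doubling period (the function name is ours), shows why
the run could not continue indefinitely:

    # Back-of-envelope check on the Moore's Law figures above, assuming
    # the canonical 18-month doubling period.
    def moore_growth(years, doubling_period=1.5):
        """Factor by which computing power grows over the given span."""
        return 2 ** (years / doubling_period)

    print(f"Growth over 30 years: {moore_growth(30):,.0f}x")  # 1,048,576x
    print(f"Growth over 10 years: {moore_growth(10):,.0f}x")  # roughly 100x

A million-fold improvement in thirty years; a few more decades at that rate would demand
component densities that silicon physics simply does not permit.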
A similar saturation has been visible for
some time in Random Access Memory (RAM) sizes. The 1 GB RAM has been the standard for the
last three years. While there is no restriction on adding more RAM, analysts feel that
current operating systems are unable to utilize even the existing RAM adequately, so
there is little justification for increasing it further.
With the saturation of Moore's Law, it is
expected that the PC will remain unchanged, with a continuously dropping price. The
current $200 barrier is expected to hold for no more than two years.
align="right" hspace="0" width="181" height="269"> face="Arial">Self-configuring Systems Herald The End Of SW Development
Vienna, May 25, 2012
Construction agents that can
put together objects and components to configure complete systems are beginning to
threaten the software development industry. Just as desktop publishing ended the
profession of phototypesetting operators in the mid-eighties, the profession of
programming too seems to be at the end of its tether. From being the gurus of the
information age in the nineties, the computer programmer's position in organizations has
declined steadily. At present, the programmer's job is to assemble objects into systems.
For a while the systems designers remained important, until the cognitive agents of 2001
began to demonstrate combinatorial creativity. Database applications were the first to
feel the impact of this technology, since such transaction processing systems are the
easiest to specify and require the simplest of interfaces. Oracle Corp. began the
transition in the year 2000 with its dialog agent, which could configure an RDBMS
application through a dialog with the end user. Microsoft's do-or-die was a leap forward
in self-configuration, with an agent that would not only create a database application
but also continuously improve its performance over time through 'children' of the
original agent that would survive only if the application got better.
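The 'surviving children' mechanism is easy to caricature. The sketch below is a toy
version of the idea, not Microsoft's actual agent: each generation spawns perturbed
copies of the current configuration, and a child replaces its parent only if a measured
application metric improves. The tunable parameter and cost function are invented for
illustration.

    import random

    # Toy 'surviving children' loop. The cost function stands in for a
    # measured application metric (say, mean query time) and is invented
    # purely for illustration; so is the tunable cache-size parameter.
    def app_cost(cache_mb):
        return (cache_mb - 64) ** 2 + 100  # best performance at 64 MB

    def evolve(generations=50, children_per_gen=4):
        parent = 8.0  # arbitrary starting configuration
        for _ in range(generations):
            children = [parent + random.gauss(0, 4)
                        for _ in range(children_per_gen)]
            best = min(children, key=app_cost)
            if app_cost(best) < app_cost(parent):  # survive only if better
                parent = best
        return parent

    print(f"Tuned cache size: {evolve():.1f} MB")  # converges toward 64 MB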
Games manufacturers fell next to agent technology, with the Japanese Sugata Agent from
Sony that would build games from existing resources on the Internet while continuously
monitoring what the (human) player wanted.
It is clear that computers will assemble their own programs entirely without human
intervention by the beginning of the 22nd century, maybe even as early as 2050.
Are Cognitive
Systems A Threat?
New Delhi, December
15, 2008
Cognitive systems that have been improving
steadily since the late nineties may now pose a threat to humanity, said U Pawar, a young
researcher on the subject. In a crowded plenary at the 14th International Conference on
Cognitive Systems (ICCS'08), she claimed that such systems could be beginning to exhibit
emergent behavior directed more toward their own survival than toward the original human
objective.
Cognitive systems were first defined in the
year 1999 as systems that respond to the physics of the external environment as well as
the psychophysics of the human user. In the first phase this resulted in computers that
could sense heat, light, and sound in their environment, as well as recognize users from
their voices, typing patterns, and mouse usage.
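Of these signals, typing rhythm is the easiest to illustrate. A minimal sketch follows;
the stored timing profiles and the nearest-profile rule are invented here, and real
systems of the period used far richer statistical models.

    import math

    # Mean inter-keystroke intervals, in milliseconds, per known user.
    # The numbers are invented for illustration.
    PROFILES = {"asha": [110, 95, 130], "ravi": [210, 180, 240]}

    def identify(sample):
        """Return the known user whose timing profile is nearest the sample."""
        return min(PROFILES, key=lambda user: math.dist(sample, PROFILES[user]))

    print(identify([115, 100, 125]))  # -> asha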
Cognitive systems acquired their adaptive
characteristics in early 2002 in the erstwhile Pakistan. These systems would modify their
behavior depending on the state of the environment and the user. It is ironic that the
country that had invented the computer virus ended the data security problem, through
systems that would recognize their owners and adapt to them. The effort was unfortunately
interrupted for several years by the breakup of Pakistan into Bakistan and Tutistan.
Self-modifying cognitive systems emerged
next from the poverty-ridden universities of Britain. However, like most British
inventions, they did not realize their full commercial potential in the country of their
birth.
The action shifted to United Korea in the
year 2006 with the now-infamous Isaac program. Built as an electronic pet, the program
was not only adaptive to both the environment and the users but also capable of
self-reproduction and mutation. Isaac was capable of connecting to the Internet and
released itself onto the WWW at some unspecified date. The subsequent monstrous Internet
Nessies are now history, and we have all learned to live with them.
Although there is legislation against
self-replicating programs, there is little one can do to wish them away. In fact, some
anthropologists have said that the human race is the first sentient species to have
created another.
Compression Versus
Bandwidth
Jerusalem, October 2,
2010
While the late twentieth century focused on
bandwidth as the only route to widespread usage of multimedia on the Internet, the early
twenty-first century's buzzword seems to be 'compression'. Large amounts of information
can be sent over the Internet quickly only if the information 'pipe' is wide. This
apparently obvious statement is what resulted in the large investments in bandwidth in
the years from 1998 to 2001, until it was realized that there is another way.
Large amounts of information can be reduced to far fewer bits with the right compression
technology, built from the right mix of mathematics and programming. Compression
technology has made steady progress in the last 15 years, mainly because mathematicians
can create the techniques with very few resources, sometimes as little as paper and
pencil. The impact of such methods can be devastating. This was seen in the destruction
of the audio CD industry in the years 1998-2000 due to the MP3 compression standard. The
subsequent development of the Winamp software player by a student, given away free on the
WWW, dealt the final blow. CD-quality music, traditionally requiring over 40 MB of
storage, was reduced to less than 3 MB. Almost overnight, every CD on earth was available
for free on the Internet.
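The numbers are easy to verify on the back of an envelope. Standard CD audio parameters
give roughly the quoted figures; we assume a common 128 kbit/s MP3 rate, and the
article's 'under 3 MB' corresponds to a slightly lower one.

    # Back-of-envelope check on the 40 MB -> 3 MB figure quoted above.
    SAMPLE_RATE = 44_100       # samples per second
    BITS_PER_SAMPLE = 16
    CHANNELS = 2
    SECONDS = 240              # a typical four-minute track

    raw_bytes = SAMPLE_RATE * BITS_PER_SAMPLE * CHANNELS * SECONDS // 8
    mp3_bytes = 128_000 * SECONDS // 8   # 128 kbit/s MP3

    print(f"Raw CD audio: {raw_bytes / 1e6:.1f} MB")   # ~42.3 MB
    print(f"MP3 at 128k : {mp3_bytes / 1e6:.1f} MB")   # ~3.8 MB
    print(f"Ratio       : {raw_bytes / mp3_bytes:.0f}:1")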
The MPEG6 standard of 2004 created a similar change in the video industry, and it is only
now that the financial world is beginning to comprehend the effects of the extinction of
copyrights on audio and video material.
In any event, it is now more than clear that compression technology has made the issue of
bandwidth a minor one. A simple POTS (Plain Old Telephone System) connection can today
deliver more information per second than a 100-megabit-per-second connection of 1998.
Financial and legal transformations are still awaited.
DNA Simulation: The
Tobacco Mosaic Virus
Beijing, November 25,
2012
Scientists at a local Genetic Informatics
(GI) company have reported 'complete success' in simulating the DNA of the Tobacco Mosaic
(TM) virus. The DNA sequences of the TM virus have been documented since the eighties.
However, the computing power necessary to simulate viral chemistry was not available
until recently. Using a 4 GHz PC and a proprietary genetic algorithm, the group from GI
was able to recreate the entire functionality of the virus. The simulated virus is
reported to exhibit behavior identical to that of the 'real' chemical virus. This is a
much-awaited development and is expected to generate a great deal of interest in the
pharmaceutical industry. This type of simulation technology, when combined with the
traditional techniques of Virtual Reality and 3D visualization, will enable the
development of 'designer' vaccines that can be used without the usual 15 years of animal
testing.
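The report does not describe GI's 'proprietary genetic algorithm', but the class of
technique is easy to illustrate. The toy example below evolves random strings toward a
target nucleotide sequence by selection, crossover, and mutation; the target and all
parameters are invented, and nothing here models actual viral chemistry.

    import random

    TARGET = "ATGGCGTACGTTAGC"   # arbitrary stand-in sequence
    ALPHABET = "ACGT"

    def fitness(candidate):
        # Number of positions matching the target.
        return sum(a == b for a, b in zip(candidate, TARGET))

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def mutate(candidate, rate=0.05):
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in candidate)

    population = ["".join(random.choices(ALPHABET, k=len(TARGET)))
                  for _ in range(100)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            break
        parents = population[:20]            # selection of the fittest
        population = [mutate(crossover(random.choice(parents),
                                       random.choice(parents)))
                      for _ in range(100)]

    print(f"Best after {generation + 1} generations:",
          max(population, key=fitness))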
Reacting to the Chinese announcement, the CEO of Roche, contacted by this correspondent
on Internet videophone, said he was very hopeful that the worldwide effort to develop
vaccines against the newly discovered viruses for schizophrenia and hypertension would be
greatly accelerated. Moreover, the cost of developing such vaccines would be orders of
magnitude less than that of conventional methods. He cited the high price of the AIDS
vaccine marketed in 2004 as a reason for funding more research in the area of chemical
simulation.
However, all is not positive for the
chemical simulation industry. The technology has tremendous relevance to chemical
warfare. The ability to create lethal viruses within a very short span of time could
start a new Cold War in the bipolar world.
On the other hand, the technology also
opens up the intriguing possibility of simulating organic life on computers. While
information technologists still put the possibility at least 75 years away, the media
have already started a debate on the ethics of creating digital life forms.