Autonomic Computing, Questions and Answers

From: J. R. Molloy (
Date: Sun Oct 21 2001 - 07:59:10 MDT

Autonomic Computing Q&A v1.1

What is autonomic computing?
Autonomic computing is an approach to self-managed computing systems with a
minimum of human interference. The term derives from the body's autonomic
nervous system, which controls key functions without conscious awareness or
involvement. Autonomic computing is an emerging area of study and a Grand
Challenge for the entire I/T community to address in earnest.

What are the origins of autonomic computing?
Autonomic computing is the evolution of a long tradition of understanding and
creating self-regulating systems. It's risen to the top of the I/T agenda
because of the immediate need to solve the skills shortage and the rapidly
increasing size and complexity of the world's computing infrastructure.

What is the goal of autonomic computing?
The goal is to realize the promise of I/T: increasing productivity while
minimizing complexity for users. It's time to design and build computing
systems capable of running themselves, adjusting to varying circumstances, and
preparing their resources to handle most efficiently the workloads we put upon them.

What does autonomic computing promise to deliver?
Most immediately, the automated management of computing systems. But that
capability will provide the basis for much more: from seamless e-sourcing and
grid computing to dynamic e-business and the ability to translate business
decisions that managers make to the I/T processes and policies that make those
decisions a reality. Ultimately, autonomic computing is a challenge that must
be met before the industry can deliver 'the next big thing.'

What is IBM doing to drive this idea to the rest of the industry?
First, IBM is focusing its R&D efforts around the challenges of autonomic
computing. In addition:
IBM will distribute 75,000 copies of the autonomic manifesto to the top
technical and scientific minds in the world;
Over the next few years, IBM will fund at least 50 grants to universities to
encourage additional research on the challenges of autonomic computing;
IBM will organize a scientific conference focusing on autonomic computing in
early 2002;
IBM will also host several conferences and symposiums on progress and
challenges, and continue to encourage our best minds to actively participate
in research outside of IBM;
Company-wide, IBM is developing products that contain key autonomic features
through Project eLiza.

How does IBM plan to work with other technology companies?
IBM has thousands of partners in all areas of the industry. IBM Research is
highly regarded for collaborating with leading national and university labs
around the world. In addition, we're supporting an extensive grant program,
and we'll form a consortium of experts from the I/T industry to help guide and
shape the future of this effort.

What will a world based on autonomic computing look like?
In the future, human intervention in most tasks associated with systems
management will seem as archaic and unnecessary as asking an operator for help
making a phone call seems today. Initially, you'll see more availability of
the systems that serve you -- your bank, your ISP, your travel agent. You'll
hear 'sorry, our systems are down' less often. Simultaneously, autonomic
features will begin to appear in client level devices so that your individual
PC will complete for itself many of the tasks that currently make you a
part-time administrator. If you had to stop and get out of your car every few
hours to check under the hood, make a minor adjustment, or restart the car
because of some unknown glitch, it would take a long time to reach your
destination.

What actually exists in autonomic computing today?
Much of the progress today has come from work on mainframes, but we've also
seen advances in open standards and storage. For instance, the TCP/IP standard
upon which the Internet is based has made possible an unprecedented level of
interoperability. In storage, Redundant Arrays of Independent Disks (RAID)
present a way of storing the same data in different places on multiple hard
disks, which improves performance and increases fault tolerance. To date, the
most progress in autonomic computing has been made with IBM's eLiza
initiative, launched to develop self-managing servers, software, and
networks. IBM's autonomic computing efforts have already yielded:
IBM's Intelligent Resource Director (IRD), a self-managing operating system
for the eServer z900, which allows the server to dynamically reallocate
processing power to a given application as workload demands increase;
Workload management, which is available for IBM's mainframes and is being
extended to heterogeneous platforms;
The self-healing cellular architecture of Blue Gene, a high-speed machine now
under construction at IBM Research, which will detect failed processors and
redistribute work to compensate for their loss without interruption;
Tivoli's Intrusion Manager, an integrated approach to security that reduces
the overall complexity of security management.
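The self-healing behavior attributed to Blue Gene above -- detect a failed processor and redistribute its work to the survivors without interruption -- can be sketched in a few lines of Python. This is a toy illustration with invented names, not IBM code:

```python
# Toy sketch of self-healing work redistribution, in the spirit of the
# Blue Gene design described above. All names are illustrative.

class Cluster:
    def __init__(self, nodes):
        # Map each node name to the list of tasks it is running.
        self.work = {node: [] for node in nodes}

    def assign(self, task):
        # Place new work on the least-loaded surviving node.
        node = min(self.work, key=lambda n: len(self.work[n]))
        self.work[node].append(task)

    def heal(self, failed_node):
        # A failed node is removed and its orphaned tasks are
        # redistributed across the remaining nodes.
        orphaned = self.work.pop(failed_node)
        for task in orphaned:
            self.assign(task)

cluster = Cluster(["n0", "n1", "n2"])
for t in range(6):
    cluster.assign(f"task{t}")
cluster.heal("n1")          # n1 fails; its tasks move to n0 and n2
print(sorted(len(v) for v in cluster.work.values()))  # [3, 3]
```

The same monitor-and-compensate loop underlies the workload-management and IRD features listed above: observe the system, detect a deviation, and rebalance resources without stopping the service.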

What is the difference between eLiza and autonomic computing?
Project eLiza is IBM's initiative to develop self-managing servers, software,
and networks. It's just one piece of the autonomic picture. Autonomic
computing is an emerging area of study and a Grand Challenge for the entire
I/T community to address in earnest: an approach to self-managed computing
systems with a minimum of human interference, encompassing all facets of
I/T -- software, hardware, architecture and more.

What's the difference between autonomic computing and your Intelligent
Infrastructure initiative of a few years ago?
Although the two are related -- autonomic computing actually includes all of
what we considered Intelligent Infrastructure -- the difference between them
is a matter of perspective. For Intelligent Infrastructure, we were basically
acting as technologists trying to plug holes in a dam we saw as about to
flood: not enough skilled I/T people to keep up with I/T demand. So the
solution was to improve the infrastructure by embedding intelligence in it.

Pretty soon, though, we realized that approach wouldn't address some of the
most pressing problems I/T customers faced. So with autonomic computing, we
are looking at things from the ultimate end user's perspective and the point
of view of the I/T customer: what behaviors do they need computing systems to
exhibit to get the most value and benefit from them?

Can you be more precise in defining grants and the dollar commitment?
We are just beginning the process of soliciting proposals and laying out a
strategic agenda -- we invite the academic community to help us in this. As we
evaluate proposals, we can better determine the specifics of individual grants
and amounts. But we are committed to funding a steady stream of work in this
area. Of course, we can't do it all by ourselves; that's why we're asking the
NSF, the I/T industry, and national labs around the world to target autonomic
computing as they fund crucial I/T research.

How do you think AC will change the way businesses are conducted?
One of the first examples is e-sourcing, which is gaining traction now.
e-sourcing is the ability to deliver I/T as a utility, when you need it, in
the amount you must have to accomplish the task at hand. Autonomic computing
will create huge opportunities for these emerging kinds of services.

Is autonomic computing being developed at places other than IBM?
Yes, and what we're trying to do now is focus all of that great work together
so the I/T community can address this Grand Challenge. There is already a
terrific amount of work being done at university labs like Berkeley, MIT,
University of Texas, University of Michigan and more. IBM has been doing work
in this area for years in aspects of self-managing servers, self-tuning
software and the like.

Isn't autonomic computing the same as creating intelligent machines?
This is partly a matter of definition. If "intelligent machine" means one
that embodies human cognitive powers, the answer is no. But if that term is
taken to mean systems that can adapt, learn and take over certain functions
previously performed by humans then autonomic computing does involve the idea
of embedding this kind of intelligence in computing systems.

Does autonomic computing replace AI?
No. In fact, Artificial Intelligence is a critical discipline that will help
bring about autonomic computing. AI-related research, some involving new ways
to apply control theory and control laws, can provide insight into how to run
complex systems that optimize to their environments. But to be clear,
autonomic computing does not require the duplication of conscious human
thought as an ultimate goal. In our opinion, this is not the primary issue
that needs addressing now.
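To make the control-theory connection above concrete, here is a toy proportional control loop that resizes a hypothetical server pool to hold response time near a target. It is an illustrative sketch of the idea, not an IBM algorithm; the plant model and gain are invented:

```python
# Toy proportional controller: resize a server pool so that a simple
# response-time model converges toward a target. Purely illustrative.

def response_time(servers, load):
    # Hypothetical plant model: response time is load shared per server.
    return load / servers

def control_step(servers, load, target, gain=0.5):
    # Proportional control: correct capacity in proportion to the
    # relative error between measured response time and the target.
    error = (response_time(servers, load) - target) / target
    return max(1, round(servers * (1 + gain * error)))

servers = 2
for _ in range(10):
    servers = control_step(servers, load=100.0, target=5.0)

# The pool should settle near load/target = 20 servers.
print(servers)
```

A real autonomic manager faces a far harder version of this problem -- noisy measurements, many interacting resources, shifting workloads -- which is exactly where the AI and control-theory research mentioned above comes in.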

What are the biggest technical hurdles to autonomic computing?
We don't know all of them yet; we're still exploring. That is why it's so
important for the I/T community to step up to this challenge. We know that a
great deal of innovation and hard work will be required: for example,
extending capabilities that exist on one platform to others, defining system
management identities and relationships, and figuring out how to effectively
map business policies to system policies. Clearly, standards will be an
important part of the solution.
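The business-to-system policy mapping mentioned above can be pictured as a translation table that an autonomic manager consults when configuring resources for a workload. All policy names and thresholds here are invented for illustration:

```python
# Illustrative sketch: translating a business-level intent into
# system-level settings. Names and thresholds are hypothetical.

BUSINESS_TO_SYSTEM = {
    # business intent            -> concrete system policy
    "gold-customer-response":    {"max_latency_ms": 200,
                                  "priority": "high",
                                  "min_replicas": 3},
    "batch-overnight-reporting": {"max_latency_ms": 60000,
                                  "priority": "low",
                                  "min_replicas": 1},
}

def system_policy(business_policy):
    # An autonomic manager would consult a mapping like this when
    # deciding how to allocate resources to a workload.
    try:
        return BUSINESS_TO_SYSTEM[business_policy]
    except KeyError:
        raise ValueError(f"no system mapping for {business_policy!r}")

print(system_policy("gold-customer-response")["min_replicas"])  # 3
```

The hard research problem is generating and maintaining such mappings automatically from stated business goals, rather than hand-writing them as done here.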

When will autonomic systems be available?
Truly autonomic systems are years away, although in the nearer term, autonomic
functionality will appear in servers, storage and software. Certain aspects of
autonomic systems are already available. IBM's eLiza initiative has already
determined a roadmap and will deliver aspects of this shortly. For instance,
IBM's z900 eServers have a self-managing operating system known as Intelligent
Resource Director (IRD).

What academic institutions are currently exploring autonomic computing?
We know of work at the University of California-Berkeley, University of Texas,
University of Wisconsin, MIT, and the University of Michigan that can apply to
this approach. And we are interested in learning about other work going on
around the world that may be directly or indirectly related. IBM hopes to
expand these efforts and spread the study of autonomic computing to other
institutions by awarding some 50 research grants dedicated to the study of
autonomic computing.

How does autonomic computing address the need for skilled I/T workers?
By embedding and automating I/T infrastructure complexity, autonomic computing
will require fewer I/T professionals to worry about configuring, maintaining,
fixing, securing, updating and generally running systems. More importantly, if
systems and networks become autonomic, I/T professionals will be able to work
at a higher level. Imagine not having to worry about making sure that mundane,
complicated tasks are handled correctly by humans -- repairing root causes of
failure, bringing servers to 'dry-dock', or reallocating resources.

This is the ultimate benefit of autonomic computing: systems that tackle the
complexity 'under the covers', freeing I/T professionals to drive creativity,
innovation and opportunity. Entire new business models will emerge.

--- --- --- --- ---

Useless hypotheses, etc.:
 consciousness, phlogiston, philosophy, vitalism, mind, free will, qualia,
analog computing, cultural relativism, GAC, Cyc, Eliza, cryonics, individual
uniqueness, ego, human values, scientific relinquishment

We move into a better future in proportion as science displaces superstition.

This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:14 MDT