What is Autonomic Computing?

R. Kayne

Autonomic computing is the next generation of integrated computer technology that will allow networks to manage themselves with little or no human intervention. It is named after the human autonomic nervous system, which sends impulses that control heart rate, breathing and other functions without conscious thought or effort.

Paul Horn of IBM Research first suggested the idea of autonomic computing on 15 October 2001 at the Agenda conference in Arizona. The need centers on the exponential growth of networking complexity. Not only is there a vast array of desktop and mobile devices interconnecting with and feeding into various types of networks using competing strategies, standards and interfaces, but businesses, institutions and even infrastructure have come to rely more and more on those networks. At the same time, there is a shortage of IT professionals, and it is virtually impossible for technicians to keep up with the continual onslaught of new devices, changing protocols, new online business solutions and mobile interfacing challenges. IBM and other technology giants foresee this problem getting worse.


The solution, according to IBM, is to create a foundation of industry-wide standards based on common protocols for handling data. "Shared root assumptions" would allow hardware and software from different manufacturers not only to work together, but also to support a multilevel autonomic computing system built on those assumptions. This would create an environment in which the system could perform critical administrative tasks without human intervention.

IBM sees eight basic criteria defining a pervasive autonomic computing system. In short, they are as follows, with an illustrative sketch of the control loop they imply after the list:

  • The system must be capable of taking continual stock of itself, its connections, devices and resources, and of knowing which are to be shared or protected.
  • It must be able to configure and reconfigure itself dynamically as needs dictate.
  • It must constantly search for ways to optimize performance.
  • It must perform self-healing by redistributing resources and reconfiguring itself to work around any dysfunctional elements.
  • It must be able to monitor security and protect itself from attack.
  • It must be able to recognize and adapt to the needs of coexisting systems within its environment.
  • It must work with shared technologies. Proprietary solutions are not compatible with autonomic computing ideology.
  • It must accomplish these goals seamlessly without intervention.
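
To make these criteria a little more concrete, here is a minimal, purely illustrative Python sketch of the kind of monitor-analyze-plan-execute loop an autonomic manager might run. It is not IBM's design; the class and method names (AutonomicManager, ManagedComponent, probe and so on) are hypothetical stand-ins for the self-monitoring, self-healing and self-optimizing behaviors listed above.

# Illustrative sketch only: a toy "autonomic manager" loop showing how a system
# might continually take stock of itself, reconfigure and self-heal.
# All names here are hypothetical and not part of any IBM specification.

import random
import time


class ManagedComponent:
    """A resource the manager watches, e.g. a service or device on the network."""

    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.load = 0.0  # fraction of capacity in use

    def probe(self):
        """Simulate a health and load reading (the system's self-knowledge)."""
        self.load = random.random()
        self.healthy = random.random() > 0.1  # roughly 10% chance of a fault
        return {"name": self.name, "healthy": self.healthy, "load": self.load}


class AutonomicManager:
    """Monitor -> analyze -> plan -> execute, with no human in the loop."""

    def __init__(self, components):
        self.components = components

    def monitor(self):
        # Take continual stock of every managed element.
        return [c.probe() for c in self.components]

    def analyze(self, readings):
        faults = [r["name"] for r in readings if not r["healthy"]]
        overloaded = [r["name"] for r in readings if r["load"] > 0.8]
        return faults, overloaded

    def plan_and_execute(self, faults, overloaded):
        for name in faults:
            # Self-healing: restart or route around the failed element.
            print(f"healing: restarting {name}")
        for name in overloaded:
            # Self-optimization: shift work off the overloaded element.
            print(f"optimizing: rebalancing load away from {name}")

    def run(self, cycles=3, interval=0.1):
        for _ in range(cycles):
            readings = self.monitor()
            faults, overloaded = self.analyze(readings)
            self.plan_and_execute(faults, overloaded)
            time.sleep(interval)


if __name__ == "__main__":
    manager = AutonomicManager([ManagedComponent(f"node-{i}") for i in range(4)])
    manager.run()

In a real deployment, the probe step would query actual devices and services rather than generate random readings, and the plan-and-execute step would call real reconfiguration interfaces, but the loop itself captures the self-managing behavior the eight criteria describe.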

While those are the eight proposed ingredients of an autonomic computing system, IBM hopes they will deliver three benefits to the end user: flexibility, accessibility and transparency. In short, users should be able to access data seamlessly from home, office or field, hassle-free and regardless of the device, network or connection method.

Several universities and companies, such as Sun Microsystems and Hewlett-Packard, are developing similar systems, but IBM claims its plans for autonomic computing are more far-reaching. Because this plan relies on a cooperative evolution of hardware and software, autonomic computing is to be implemented in stages over a period of several years.
