What is Distributed Computing?

Article Details
  • Written By: Brendan McGuigan
  • Edited By: Niki Foster
  • Last Modified Date: 01 November 2016
  • Copyright Protected:
    2003-2016
    Conjecture Corporation

Distributed computing uses a network of many computers, each accomplishing a portion of an overall task, to achieve a computational result much more quickly than a single computer could. Beyond raw computing power, distributed computing also allows many users to interact and connect openly. Different forms of distributed computing permit different levels of openness, and a higher degree of openness in a distributed computing system is generally considered beneficial.

The segment of the Internet most people are familiar with, the World Wide Web, is also the most recognizable public use of distributed computing. Many different computers cooperate to make everything one does while browsing possible, with each computer assigned a specialized role within the system.

A home computer, for example, runs the browser, decoding the information it receives and presenting it to the end user. A server at the Internet service provider acts as a gateway between the home computer and the greater Internet. These servers consult the computers that make up the domain name system, which translate the domain name in the URL the user enters into the address of the computer to contact. Finally, each web page is itself hosted on yet another computer.


Another type of distributed computing is known as grid computing. Grid computing consists of many computers operating together remotely, often simply using the idle processor power of ordinary machines. The highest-visibility example of this form of distributed computing is SETI@home, a project of the Search for Extra-Terrestrial Intelligence (SETI). The project harnesses the processing power of over five million home computers, achieving computational power far in excess of even the greatest supercomputers. SETI makes available a free piece of software that a home user can install. The software runs when the computer is left idle; each participating computer contacts a central server in Berkeley and downloads a roughly 250k file of data to analyze. The software then analyzes this data for specific patterns, which in theory could indicate a signal of intelligent origin.
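The work cycle described above — fetch a small unit of data, analyze it while the machine is idle, report the result — can be sketched in a few lines. This is a toy illustration with hypothetical functions, not SETI@home's actual client or protocol:

```python
# Toy sketch of the grid-computing work cycle: fetch a work
# unit, analyze it, report back. All three functions are
# stand-ins invented for illustration.

def fetch_work_unit():
    # Stand-in for downloading a small chunk of signal data
    # from the central server.
    return [1.0, 2.5, 2.5, 0.3]

def analyze(samples):
    # Stand-in analysis: flag any unusually strong readings.
    threshold = 2.0
    return [s for s in samples if s > threshold]

def report(results):
    # Stand-in for uploading results to the central server;
    # returns the number of candidate signals found.
    return len(results)

unit = fetch_work_unit()
hits = analyze(unit)
print(report(hits))  # 2
```

The appeal of the design is that each unit is small and independent, so millions of untrusted, intermittently available machines can contribute without coordinating with one another.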

Many home computers are also examples of distributed computing, albeit less dramatic ones. By using multiple processors in the same machine, a computer can run separate processes simultaneously and finish work faster than a single processor could. Many home computers now take advantage of multiprocessing, as well as a similar practice known as multithreading, to achieve much higher speeds than their single-processor counterparts.
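The idea of splitting one job across several workers can be sketched as follows — a minimal illustration of dividing a computation into independent chunks, here summing the numbers 1 through 100 in four pieces (real speedups depend on the workload and the hardware):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles one independent slice of the data.
    return sum(chunk)

data = list(range(1, 101))
# Split the job into four chunks of 25 numbers each.
chunks = [data[i:i + 25] for i in range(0, 100, 25)]

# Hand the chunks to a pool of workers and combine the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # 5050
```

Multithreading applies the same divide-and-combine pattern inside a single program, just as grid computing applies it across many machines.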


Discuss this Article

Charred
Post 3

@everetra - Yeah, the distributed computing model is better than the supercomputing model in that sense.

That’s one reason SETI uses it. I think many people don’t realize that before SETI adopted the grid computing approach, they used to use a supercomputer.

However, they realized that they could accomplish more with many “worker bees” (individual workstations) than they could with one mammoth computer. Cluster computing was the answer they needed.

everetra
Post 2

@hamje32 - Grid distributed computing is simply cheaper and more efficient than traditional supercomputing, which is what makes it so popular.

I have no doubt that your professor was using Linux on his workstations. Linux is free, and the workstations themselves probably were not that expensive.

Instantly, he had nearly the processing power of a supercomputer at a fraction of the cost.

hamje32
Post 1

Many times using distributed computing architecture is the only way to solve a problem.

In one of my computer classes in college, I had a professor who was trying to solve a problem using a genetic algorithm. He connected several Sun workstations together and had them working 24 hours a day, 7 days a week on the problem before he got it solved.

I think it was purely theoretical, meant for a peer-reviewed article he was working on. He needed all that raw processing power to get the job done. I can’t imagine anything in my current workload that would need that kind of computing power.
