What is a Computer Chip?

Phil Shepley

A computer chip is a small electronic circuit, also known as an integrated circuit, and one of the basic components of most kinds of electronic devices, especially computers. Computer chips are made of semiconductor material, usually silicon, on which many tiny components, including transistors, are embedded and used to transmit electronic data signals. They became popular in the latter half of the 20th century because of their small size, low cost, high performance and ease of production.

Robert Noyce was one of the first developers of the modern computer chip.

The modern computer chip traces its beginnings to the late 1950s, when two researchers working independently developed similar chips. The first was developed at Texas Instruments by Jack Kilby in 1958, and the second was developed at Fairchild Semiconductor by Robert Noyce in 1958. These first computer chips used relatively few transistors, usually around ten, and were known as small-scale integration chips. As the century went on, the number of transistors that could be placed on a computer chip increased, as did their power, with the development of medium-scale and large-scale integration computer chips. The latter could contain thousands of tiny transistors and led to the first computer microprocessors.

Computer chips are one of the basic components of most electronic devices.

There are several basic classifications of computer chips, including analog, digital and mixed signal varieties. These different classifications of computer chips determine how they transmit signals and handle power. Their size and efficiency are also dependent upon their classification, and the digital computer chip is the smallest, most efficient, most powerful and most widely used, transmitting data signals as a combination of ones and zeros.
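As a rough illustration of the "ones and zeros" a digital chip works with, the short Python sketch below (an analogy at the software level, not chip-level code) shows how a single character is represented as a pattern of binary digits and recovered from it:

```python
# A digital chip represents every piece of data as a pattern of ones and
# zeros. Here the letter 'A' is turned into its eight-bit binary form and
# then decoded back into a character.

char = "A"
code_point = ord(char)            # numeric value of 'A' in ASCII/Unicode: 65
bits = format(code_point, "08b")  # eight binary digits: "01000001"

print(char, code_point, bits)     # A 65 01000001

# Decoding: the same bit pattern interpreted back as a character
decoded = chr(int(bits, 2))
assert decoded == "A"
```

Every kind of data a chip handles, whether text, numbers, images or sound, is ultimately encoded and moved around as bit patterns like this one.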

Today, very-large-scale integration chips can contain millions of transistors, which is why computers have become smaller and more powerful than ever. Not only that, but computer chips are used in just about every electronic application, including home appliances, cell phones, transportation and just about every aspect of modern living. It has been posited that the invention of the computer chip was one of the most important events in human history. The future of the computer chip will include smaller, faster and even more powerful integrated circuits capable of doing amazing things, even by today's standards.

Tiny transistors play a key role in the operation of a computer chip.

Discussion Comments

anon942863

Who do I talk to to get a computer chip programmed for me to run in a machine I'm making?

anon325859

Why do computer chips have to go in computers, if they already work without them?

anon306680

What is the name of a chip that processes data in the serial form a computer can accept, making keyboards possible?

anon197012

There are images online of some old computers and computer chips from the 1960s, 70s and 80s: IBM, Intel, RCA, AMD, Hughes, Ma Bell, Samsung, Cray, Burroughs, Univac, MIT and many more semiconductor makers, from vacuum tubes to advanced integrated circuits.

anon110914

Well, say what you will concerning Binary. I had to program my assembly program to run my robot in 94 to graduate for my A.S. in electronics and it is still being taught! Ask Spock!

mcsquared

@anon79262 - Binary is absolutely a computer language, even though it is not one that people often program in. Programming languages like C++ and Java are translated into a binary file when they are compiled. Basically, they go from a language that programmers can understand to a language that computers can understand.

It is still possible to write a program entirely in binary; it would just be incredibly difficult and time-consuming. Binary represents the lowest-level (meaning, closest to the hardware) computer language, whereas the programming languages we are typically familiar with represent higher-level languages.

anon79262

Fair article but does not describe that ones and zeros are not computer language. Space and time would deserve a more thorough inspection.
