What is a Computer Chip?

Article Details
  • Written By: Phil Shepley
  • Edited By: Bronwyn Harris
  • Images By: Yurazaga, Oleg Zhukov, Intel Free Press
  • Last Modified Date: 11 May 2020
  • Copyright Protected:
    Conjecture Corporation

A computer chip is a small electronic circuit, also known as an integrated circuit, that is one of the basic components of most electronic devices, especially computers. Computer chips are small and are made from a semiconductor material, usually silicon, on which many tiny components, including transistors, are embedded and used to transmit electronic data signals. They became popular in the latter half of the 20th century because of their small size, low cost, high performance, and ease of production.

The modern computer chip saw its beginning in the late 1950s through two researchers who were not working together but developed similar chips. The first was developed at Texas Instruments by Jack Kilby in 1958, and the second at Fairchild Semiconductor by Robert Noyce in 1959. These first computer chips used relatively few transistors, usually around ten, and were known as small-scale integration chips. As the century went on, the number of transistors that could be placed on a chip increased, as did the chips' power, with the development of medium-scale and then large-scale integration. The latter could contain thousands of tiny transistors and led to the first computer microprocessors.

There are several basic classifications of computer chips, including analog, digital and mixed signal varieties. These different classifications of computer chips determine how they transmit signals and handle power. Their size and efficiency are also dependent upon their classification, and the digital computer chip is the smallest, most efficient, most powerful and most widely used, transmitting data signals as a combination of ones and zeros.
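To make the "ones and zeros" idea concrete, here is a minimal sketch (illustrative only, not tied to any particular chip) showing how a letter and a number look as the 8-bit patterns a digital chip would carry:

```python
def to_bits(value, width=8):
    """Return the binary string for an integer, padded to `width` bits."""
    return format(value, "0{}b".format(width))

# The letter 'A' is stored as the number 65, which a chip sees as bits.
print(to_bits(ord("A")))  # 01000001
print(to_bits(200))       # 11001000
```

Every piece of data a digital chip handles, from keystrokes to images, is ultimately reduced to patterns like these.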

Today, very-large-scale integration (VLSI) chips can contain millions of transistors, which is why computers have become smaller and more powerful than ever. Beyond computers, chips are used in just about every electronic application, including home appliances, cell phones, and transportation. It has been posited that the invention of the computer chip was one of the most important events in human history. The future of the computer chip will bring smaller, faster, and even more powerful integrated circuits capable of doing things amazing even by today's standards.



Discuss this Article

Post 14

Who do I talk to to get a computer chip programmed for a machine I'm making?

Post 13

Why do computer chips have to go in computers, if they already work without them?

Post 9

What is the name of a chip that processes data in the serial form a computer can accept, making keyboards possible?

Post 7

There are images online of some old computers and computer chips from the 1960s, 70s and 80s: IBM, Intel, RCA, AMD, Hughes, Ma Bell, Samsung, Cray, Burroughs, Univac, MIT and many more semiconductor makers, from vacuum tubes to advanced integrated circuits.

Post 5

Well, say what you will about binary. I had to write an assembly program to run my robot in '94 to graduate with my A.S. in electronics, and it is still being taught! Ask Spock!

Post 4

@anon79262 - Binary is absolutely a computer language, even though it is not one that people often program in. Programming languages like C++ and Java are translated into a binary file when they are compiled. Basically, they go from a language that programmers can understand to a language that computers can understand.

It is still possible to write a program entirely in binary; it would just be incredibly difficult and time-consuming. Binary represents the lowest-level computer language (meaning closest to the hardware), whereas the programming languages we are typically familiar with are higher-level languages.
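The translation this comment describes can be glimpsed from Python itself: even a one-line function is compiled into lower-level bytecode before it runs, loosely analogous to how C++ or Java source is translated toward the binary a chip executes (a sketch for illustration, not an exact model of hardware machine code):

```python
import dis

def add(a, b):
    return a + b

# The compiled function is already a sequence of raw bytes under the hood.
print(list(add.__code__.co_code))

# dis renders those same bytes as human-readable instructions.
dis.dis(add)
```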

Post 2

Fair article but does not describe that ones and zeros are not computer language. Space and time would deserve a more thorough inspection.

Moderator's reply: Thank you for visiting wiseGEEK and for contributing to the discussion forum. Unfortunately, it is impossible to cover every aspect of such a broad topic in a 400-500 word article.
