A Turing machine is a theoretical model of how a computer might function, devised in 1936 by Alan Turing, a famous English mathematician and logician of the 20th century. The ideas behind the Turing machine underlie all modern computer software and hardware systems that exist as of 2011, even though Turing's concept was never built as an actual device in his time and predates digital computers in any real form. A Turing machine is defined by a set of controls for input and output data, a mechanism for processing that data, and an established set of rules governing how the machine processes it.
The insight behind Alan Turing's work was that any consistent system of symbols representing meaningful information, such as mathematical notation or the letters of a language, could be processed mechanically by a machine, given a proper set of rules for manipulating those symbols. This pointed toward mechanical devices that could be posed logical questions about complex problems and quickly produce unbiased answers. In this respect, the Turing machine was a precursor to the computer algorithm, a precise list of instructions that the central processing units (CPUs) of computers execute as of 2011.
The design of the Turing machine was simple by 21st-century computing standards, and a literal implementation would have been impractical, but the ideas on which it was built had a solid foundation. The machine consisted of a tape or ribbon imprinted with symbols, which a read/write head scanned as the tape passed over it. Each symbol read would put the machine into a particular state, which in turn directed the motion of the tape and determined the values the machine wrote out. In the computer systems of 2011, the analog would be that the tape represents software code or algorithms, the head is the CPU, and the output goes to display and transmission systems such as monitors, speakers, printers, and network traffic.
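The tape, head, and state mechanism described above can be sketched in a few lines of code. The rule table and helper below are illustrative inventions, not anything from Turing's paper: each rule maps the current state and the symbol under the head to a symbol to write, a head movement, and a next state.

```python
def run_turing_machine(rules, tape, state="start", head=0, max_steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state), where
    move is -1 (left), 0 (stay), or +1 (right). The machine halts when
    it enters the "halt" state; blank cells read as "_".
    """
    cells = dict(enumerate(tape))  # sparse tape: index -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    # render the visited portion of the tape back into a string
    span = range(min(cells), max(cells) + 1)
    return "".join(cells[i] for i in span), state

# Example machine: flip every bit, then halt at the first blank.
flip_rules = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

result, final_state = run_turing_machine(flip_rules, "1011")
print(result, final_state)  # -> 0100_ halt
```

Even this toy machine shows the essential point: all of the "intelligence" lives in the rule table, while the machine itself does nothing but read, write, and move.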
The ideas behind the Turing machine were seen as capturing what it fundamentally means to carry out any series of calculations, and were also compared to how the human brain works. Turing himself and others of his day believed that the machine could be adapted to perform practically any imaginable computation and act as a universal machine for solving problems. A difficulty with the concept, however, came to be known as a Turing tarpit: although any self-consistent set of symbols can be processed by a Turing machine, getting such a machine to produce meaningful answers to questions relies entirely on increasingly complex and multi-layered sets of processing rules.
Computer science soon encountered problems with how software and hardware systems based on Turing machine principles could get bogged down in computations that never terminate, commonly called infinite loops. These logical limitations led to adaptations of the Turing machine, such as the probabilistic and quantum Turing machines. A probabilistic Turing machine introduces chance into the computation: at certain steps the machine chooses among several possible transitions at random, so repeated runs can produce different results, which can then be weighed against one another according to how likely each outcome is. Such machines reach conclusions in a manner loosely similar to how fuzzy logic software operates in advanced control systems as of 2011.
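The probabilistic variant can be sketched by a small change to a deterministic simulator: where an ordinary machine has exactly one rule per (state, symbol) pair, the probabilistic machine holds a list of candidate rules and draws one at random each step. The machine and rule names below are invented for illustration.

```python
import random

def run_probabilistic(rules, tape, state="start", head=0, max_steps=100):
    """Simulate a probabilistic Turing machine.

    rules maps (state, symbol) -> a list of (new_symbol, move, new_state)
    options; one option is drawn uniformly at random on each step.
    Halts on reaching the "accept" or "reject" state.
    """
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state in ("accept", "reject"):
            break
        symbol = cells.get(head, "_")
        new_symbol, move, state = random.choice(rules[(state, symbol)])
        cells[head] = new_symbol
        head += move
    return state

# A trivial coin-flip machine: on reading "1" it accepts or rejects
# at random, so repeated runs yield a distribution of answers rather
# than a single fixed answer.
coin_rules = {
    ("start", "1"): [("1", 0, "accept"), ("1", 0, "reject")],
}

tally = [run_probabilistic(coin_rules, "1") for _ in range(1000)]
print(tally.count("accept"), tally.count("reject"))  # roughly even split
```

Real probabilistic algorithms bias these coin flips so that the correct answer appears with high probability, and repeating the run shrinks the chance of error.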
A quantum computer based on the Turing machine principle would have a tape whose cells hold symbols in a superposition of states, undetermined until they are read. This would provide a form of parallel processing that, for certain problems, would be vastly superior to the data processing procedures used in computers as of 2011. Quantum Turing machines can, in effect, hold multiple values in a single cell of memory until it is measured, which standard logic-based computers cannot do.