Amdahl's law describes what happens when an algorithm is carried out partly in serial and partly in parallel. It states that the benefit of running in parallel (that is, carrying out multiple steps simultaneously) is limited by any sections of the algorithm that can only be run serially (one step at a time). The most common use of Amdahl's law is in parallel computing, such as on multi-core machines.
At heart, Amdahl's law is a mathematical formula. Put in its simplest form, it says that the biggest increase in speed that can be achieved by parallelizing a process, no matter how many processors are added, is equal to one divided by the proportion of the process that can't be parallelized, minus one. For example, if 80% of a process can be parallelized, then one divided by the remaining 20% gives five; taking away one leaves four. This means that parallelizing the process can at best make it run five times as quickly, a fourfold increase in speed. The formula also works where only a minority of the process can be parallelized: if 12% can be parallelized, one divided by the remaining 88% equals roughly 1.136; subtracting one leaves 0.136, a maximum increase in speed of 13.6%.
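The simple form of the law can be sketched as a short function. This is an illustrative sketch, not from the original text; the function name `max_speedup` is chosen here for clarity.

```python
def max_speedup(parallel_fraction):
    """Upper bound on speed-up under Amdahl's law, assuming the
    parallelizable portion can be made arbitrarily fast:
    1 / (fraction that must stay serial)."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / serial_fraction

# 80% parallelizable: at most five times as fast (a fourfold increase).
print(round(max_speedup(0.80), 3))
# 12% parallelizable: at most about 1.136 times as fast (a 13.6% increase).
print(round(max_speedup(0.12), 3))
```

Subtracting one from the returned factor gives the "increase in speed" figure used in the article's examples.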
The formula can be adapted for use in more complicated situations where different stages of the process get different speed increases from being parallelized. This involves producing a figure for each stage: the proportion of the original running time devoted to that stage, divided by the speed-up factor that stage achieves. Adding up these figures gives the new running time as a fraction of the old one. The formula then divides one by this total, giving the overall speed-up factor, and subtracts one from the result, giving the overall increase in speed.
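The multi-stage calculation described above can be sketched as follows. The stage fractions and speed-up factors in the example are hypothetical numbers chosen for illustration.

```python
def overall_speedup(stages):
    """stages: list of (time_fraction, speedup_factor) pairs, where the
    fractions are shares of the original running time and sum to 1.
    Returns the overall speed-up factor (generalized Amdahl's law)."""
    new_time = sum(fraction / factor for fraction, factor in stages)
    return 1.0 / new_time

# Hypothetical process: 50% of the time gets a 4x speed-up, 30% gets 2x,
# and the remaining 20% stays serial (factor 1).
stages = [(0.50, 4.0), (0.30, 2.0), (0.20, 1.0)]
print(round(overall_speedup(stages), 3))
```

Here the new running time is 0.125 + 0.15 + 0.2 = 0.475 of the original, so the process runs about 2.105 times as quickly, an increase of roughly 110%.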
The major area where Amdahl's law is used is in parallel computing, where multiple processors work on a task at once. This addresses one of the major limitations of a single computer processor: it works very quickly but can only carry out one action at a time. A multi-core processor achieves the same effect on a single chip, as each core acts as an independent processor.
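In the multi-processor setting, the law can also predict the speed-up for a fixed number of cores, showing the diminishing returns the article describes. This is a sketch, assuming the parallel share of the work divides evenly among the cores.

```python
def speedup(parallel_fraction, cores):
    """Amdahl's law for a fixed core count: the parallel share of the
    work is split across the cores; the serial share is not."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With 80% of the work parallelizable, adding cores gives diminishing
# returns: the speed-up can never exceed 1 / 0.20 = 5.
for cores in (2, 4, 16, 1000):
    print(cores, round(speedup(0.80, cores), 2))
```

Even a thousand cores yield a speed-up of just under five, because the 20% serial portion dominates once the parallel portion becomes negligible.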
While some people argue Amdahl's law is a misleading name and it should really be "Amdahl's argument," the name is a play on words relating to Moore's law. This is a theory based on a 1965 statement by Intel co-founder Gordon Moore. He predicted that technology would advance so that the number of transistors fitting on an integrated circuit would double every two years, a prediction that held remarkably well for decades.