What Does "Memoization" Mean?

Article Details
  • Written By: Alex Newth
  • Edited By: Angela B.
  • Last Modified Date: 11 November 2016
  • Copyright Protected:
    2003-2016
    Conjecture Corporation

Memoization, a term similar to memorization but specific to computing, is a technique for optimizing a program’s speed at the cost of the memory it uses. A program built with memoization remembers the results of a function’s calculations, storing each result in a cache. The next time the function is called with the same input, the program retrieves the result from the cache rather than redoing the calculation. Unlike strength reduction, which speeds up a program through a similar trade-off, memoization is portable and works the same way across many machines.

Users may not see it, but programs perform calculations all the time. When someone clicks a button or invokes one of a program’s functions, a calculation is required to make it work. Often, a user will call the same function several times with nothing changed. Without memoization, the program would need to redo a calculation it has just performed, making it slower than if it had stored the result.


The result of a calculation is stored in a cache, an area of memory set aside for fast retrieval. With memoization, when the user supplies the same or a similar input, the program draws the answer from the cache rather than performing the calculation again, saving time and making the program faster. When the input is similar but not identical, the cached result may not be entirely accurate, but the inaccuracy is usually slight and unlikely to cause errors.
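The caching process described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the article; the function names (make_memoized, slow_square) are invented for the example, and a plain dictionary stands in for the cache.

```python
def make_memoized(func):
    """Wrap a one-argument function so its results are cached."""
    cache = {}

    def wrapper(arg):
        if arg not in cache:
            cache[arg] = func(arg)   # first call: compute and store the result
        return cache[arg]            # later calls: retrieve from the cache

    return wrapper

calls = []   # records each time the real calculation actually runs

@make_memoized
def slow_square(n):
    calls.append(n)                  # stands in for an expensive calculation
    return n * n

slow_square(12)   # computed and stored in the cache
slow_square(12)   # answered from the cache; the calculation is not repeated
```

After the two calls, the underlying calculation has run only once; the second call is answered entirely from the cache.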

Memoization essentially performs a trade-off. All programs are built under size and time constraints; here, space is sacrificed so that speed can increase. Because results are stored in the cache, the program needs more memory. The amount of memory sacrificed is usually slight, because a cache is limited in the number of results it can store, but it still adds to the program’s space cost.
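Python’s standard library exposes this bounded-cache trade-off directly through functools.lru_cache, whose maxsize parameter caps how many results are kept. The Fibonacci function below is just a stock example of an expensive recursive calculation.

```python
from functools import lru_cache

@lru_cache(maxsize=128)   # at most 128 results are kept; older ones are evicted
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(30)             # fast, because intermediate results are reused from the cache
fib.cache_info()    # reports cache hits, misses, and the current cache size
```

The maxsize argument is exactly the space limit the article describes: a larger cache saves more recalculation but costs more memory.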

Another similar, but less reliable, trade-off optimization technique is strength reduction. Strength reduction does not reduce the strength of the program but the strength of the calculation, replacing an expensive operation with cheaper ones. For example, multiplication takes longer to process than addition, so a compiler may rewrite a formula to use the less time-intensive operation. This technique can speed up a program, but because the rewriting is applied at compile time for a particular target machine, the savings may not be seen on all machines.
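A compiler normally performs this rewriting automatically, but the idea can be sketched by hand. The two functions below are illustrative, not from the article: both produce the same list of multiples, but the second replaces the per-element multiplication with a running addition, which is the classic strength-reduction transformation.

```python
def multiples_naive(step, count):
    """One multiplication per element."""
    return [i * step for i in range(count)]

def multiples_reduced(step, count):
    """Strength-reduced version: a running addition replaces each multiply."""
    result, current = [], 0
    for _ in range(count):
        result.append(current)
        current += step   # addition stands in for the multiplication i * step
    return result
```

Both functions return identical results; only the cost of the operations inside the loop differs.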
