Direct mapping is a method of organizing a computer's cache so that stored information can be found quickly. Recently used information is kept in the cache so the computer can retrieve it faster the next time it is needed. In computing, a cache is a small section of random access memory (RAM) set aside for fast data retrieval. With direct mapping, each piece of data in memory is assigned a specific space in the cache, which it shares with other pieces of data. Cache data is constantly being overwritten as new data is needed.
A cache is organized into lines. Each line is only large enough to store one block of data and a tag that identifies where the data came from. When a user requests a piece of data, the computer first checks the cache to see if the information is there. If it is, the information is returned to the user. This is known as a cache hit, and it is quicker than retrieving the data from its original location.
The percentage of requests that result in cache hits is called the hit rate. If the data the user requested is not in the cache, the computer will find it in memory. A copy of the data is then deposited into the cache so it can be found quickly the next time the user requests it, in theory increasing the hit rate. All of this happens behind the scenes; the user does not know whether the data received came from the cache or from memory.
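The hit rate described above is simple arithmetic: the number of hits divided by the total number of requests. A toy calculation in Python, with the counts (`hits`, `requests`) chosen purely for illustration:

```python
# Hypothetical counters gathered while the cache was in use.
hits = 80
requests = 100

# Hit rate as a percentage of all requests.
hit_rate = hits / requests * 100
print(f"{hit_rate:.0f}%")  # → 80%
```

A higher hit rate means fewer trips to slower main memory, which is the whole point of keeping the cache.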
Direct mapping is one method of deciding where blocks of memory will be stored in the cache. Each block of memory is assigned a specific line in the cache. Since the cache is smaller than the memory, multiple blocks will share a single line in the cache. If a line is already full when a new block needs to be written to it, an old block will be overwritten.
Though direct mapping is a very simple and easy way to design a cache, it does present some problems. If a program continually accesses multiple blocks of data that share the same line in a direct-mapped cache, that line will be overwritten often. This results in a lot of misses, because the data the computer needs is less likely to be the data actually sitting in that cache line at the moment. As a result, direct mapping tends to have a lower hit rate than other cache mapping models.
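This worst case is easy to demonstrate with a self-contained snippet. In a hypothetical 4-line cache, blocks 1 and 5 both map to line 1 (both leave remainder 1 when divided by 4), so a program that alternates between them evicts the other block on every access and never hits:

```python
NUM_LINES = 4
lines = [None] * NUM_LINES  # each line holds one block number or None
hits = 0
misses = 0

# Alternate between two blocks that share the same cache line.
for block in [1, 5, 1, 5, 1, 5]:
    idx = block % NUM_LINES  # both 1 and 5 map to index 1
    if lines[idx] == block:
        hits += 1
    else:
        misses += 1
        lines[idx] = block  # overwrite the conflicting block

print(hits, misses)  # → 0 6: every single access misses
```

An associative cache, which lets a block occupy more than one possible line, avoids this particular pattern of conflict misses at the cost of more complex hardware.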
@David09 - I guess I can see the advantage of direct mapping or any kind of cache technology if you are using a disk drive and want to store information in RAM.
However, I don't think it matters if you're using flash memory. I have a small notebook that uses flash memory – there is no hard disk whatsoever.
In that case, I assume that all of the data read and write operations are already taking place in memory, and therefore a separate cache would not be useful at all.
That’s just my opinion; I can vouch that the flash memory is much faster than the hard disk in my older laptop.
I think that a cache, any cache, is always a faster way of accessing data than constantly reading from the main data storage area.
This is why I look for L1 memory cache when I buy a new computer. This is a cache of memory that is used to store frequent read and write operations like the article talks about.
I don’t know if it uses what you would call direct mapping or some other cache operation, but I do know that it helps to boost computer speed.
Similarly, when you are accessing a web page on the Internet, now and then you will find cached pages. These are pages that are frequently accessed and therefore are cached for immediate access. You can pull up these pages faster than if the page had to be freshly loaded from the live server every time.