Principle of locality: the same lines of code (and nearby memory locations) tend to be accessed repeatedly over a short interval. Most programs satisfy the principle of locality.
Cache is a technique that exploits the principle of locality to increase effective memory speed without significantly increasing cost.
Many microprocessors include cache on chip.
In DSP applications, program code observes the principle of locality, but the data usually does not.
- Cache may be used for program memory in a DSP.
- Cache is not used for data memory in a DSP.
Based on the principle of locality, cache memory can be an effective way to increase memory access speed. Some processors (e.g. the Intel Pentium) use a two-level cache.
Whether to use cache depends on whether the principle of locality is satisfied. For example, many programs have loops that are executed repeatedly; this is the ideal situation for cache memory. However, the input data of a signal processor is typically processed once as it streams in and is not reused. This is a case in which cache should not be used. When the Harvard architecture is adopted in a system, one may use cache memory for the program memory but not for the data memory.
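The contrast between a repeatedly executed loop and one-pass streaming data can be sketched with a toy direct-mapped cache model. The cache size (16 one-word lines) and the two access traces are illustrative assumptions, not parameters of any particular processor; the model captures temporal locality only.

```python
NUM_LINES = 16  # assumed: a tiny direct-mapped cache of 16 one-word lines

def hit_rate(trace):
    """Return the fraction of accesses in `trace` that hit the cache."""
    lines = {}          # line index -> stored tag
    hits = 0
    for addr in trace:
        index, tag = addr % NUM_LINES, addr // NUM_LINES
        if lines.get(index) == tag:
            hits += 1
        else:
            lines[index] = tag  # miss: fill the line from main memory
    return hits / len(trace)

# A 10-instruction loop executed 100 times: the same addresses recur,
# so after the first pass every fetch hits.
loop_trace = [addr for _ in range(100) for addr in range(10)]

# A single pass over 1000 streaming samples: no address ever recurs,
# so every access misses.
stream_trace = list(range(1000))

print(hit_rate(loop_trace))    # 0.99
print(hit_rate(stream_trace))  # 0.0
```

The loop trace hits 99% of the time because only the first pass misses; the streaming trace never hits at all, which is why caching such data buys nothing.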
A drawback of cache memory is that the access time is nondeterministic: when the required data is not in the cache, retrieving it from main memory takes much longer than a cache hit. Therefore, cache memory is not suitable for high-speed real-time applications, where the worst-case access time must be bounded.
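The gap between average and worst-case access time can be illustrated with the same kind of toy model. The latencies assumed here (1 cycle on a hit, 100 cycles on a miss) are illustrative numbers, not figures from any specific processor:

```python
HIT_CYCLES, MISS_CYCLES = 1, 100  # assumed, illustrative latencies
NUM_LINES = 16                    # assumed: 16 one-word cache lines

def access_times(trace):
    """Return the per-access latency (in cycles) for an address trace."""
    lines = {}  # line index -> stored tag (direct-mapped cache)
    times = []
    for addr in trace:
        index, tag = addr % NUM_LINES, addr // NUM_LINES
        if lines.get(index) == tag:
            times.append(HIT_CYCLES)
        else:
            lines[index] = tag
            times.append(MISS_CYCLES)
    return times

# 10-instruction loop run 100 times: only the first pass misses.
times = access_times([addr for _ in range(100) for addr in range(10)])
average = sum(times) / len(times)
print(average)     # 1.99 cycles on average
print(max(times))  # 100 cycles in the worst case
```

Even though the average access is close to one cycle, any individual access may take 100 cycles, which is exactly the uncertainty a hard real-time system cannot tolerate.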