Cache, or cache memory, is a high-speed data-storage mechanism in information technology used to hold data and instructions that are accessed frequently. Cache memory is intended to approach the speed of the fastest memory available while providing a large memory capacity at a lower price than other types of semiconductor memory.
The term cache is derived from the French word cache, which means hiding place. By definition, a cache is a place to store data temporarily. The mechanism is intended to speed up data transfer by keeping previously accessed data in the cache, so that subsequent accesses to the same data are served faster. Cache is SRAM-type memory: limited in capacity, but very fast and more expensive than main memory. Cache memory sits between the registers and RAM (main memory), so that data processing does not always have to refer directly to main memory.
Cache is a small memory used for temporary storage. Although its capacity is very small, its speed is very high. In hardware terminology, the term usually refers to high-speed memory that bridges the flow of data between the processor and main memory (RAM), which is usually much slower. The cache is used to minimize the bottleneck in the flow of data between the processor and RAM. In software terminology, the term refers to a temporary storage area for files that are frequently accessed (usually applied in a network). In the memory hierarchy, cache is generally placed among several levels, namely:
- Level 1 or Register –
Registers hold the data that is stored and operated on immediately inside the CPU. Commonly used registers include the accumulator, the program counter, and address registers.
- Level 2 or Cache memory –
After the registers, it is the fastest memory in the hierarchy, with a very short access time; data is stored here temporarily so that it can be accessed more quickly.
- Level 3 or Main Memory –
It is the memory the computer is currently working with. It is comparatively small, and once power is off its data is lost (it is volatile).
- Level 4 or Secondary Memory –
It is external memory that is not as fast as main memory, but data stored in it persists permanently.
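The software sense of caching mentioned above can be illustrated with a minimal sketch. The dictionary, the function names, and the "expensive" operation below are all made up for illustration; they simply show the pattern of checking a fast store before doing slow work:

```python
# Minimal sketch of a software cache: a dictionary holds the results of
# an expensive operation, so repeated requests for the same key are fast.
# The names and the "expensive" operation are illustrative only.

cache = {}

def expensive_lookup(key):
    """Stand-in for a slow operation, e.g. a disk or network read."""
    return key * key  # pretend this takes a long time

def cached_lookup(key):
    if key in cache:               # cache hit: return the stored copy
        return cache[key]
    value = expensive_lookup(key)  # cache miss: do the slow work
    cache[key] = value             # store it for next time
    return value

print(cached_lookup(12))  # miss: computed, then stored
print(cached_lookup(12))  # hit: served from the cache
```

The second call never touches `expensive_lookup`; that is the entire point of a cache at any level of the hierarchy.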
Cache memory can consist of several levels, for example L1, L2, and L3. The level 1 (L1) cache is located inside the processor (internal cache). It has the highest access speed and the highest price; typical sizes range from 8 KB to 64 KB or 128 KB. The level 2 (L2) cache has a larger capacity, ranging from 256 KB to 2 MB, but a lower speed than the L1 cache. The L2 cache is located separately from the processor and is called the external cache. The level 3 (L3) cache is found only in processors with more than one core, for example dual-core and quad-core chips; its function is to manage the data coming from the L2 cache of each processor core.
The important elements of cache memory design are as follows:
- Cache size, adjusted to the needs of the working memory it supports. The larger the cache, the slower it becomes, because more gates are involved in cache addressing.
- Mapping function, consisting of direct mapping, associative mapping, and set-associative mapping. Direct mapping is the simplest technique: it maps each main-memory block to exactly one cache line. Associative mapping overcomes the shortcomings of direct mapping by allowing any main-memory block to be loaded into any cache line. (This follows an article by Yulisdin Mukhlis, ST., MT.)
- Replacement algorithm, consisting of Least Recently Used (LRU), First In First Out (FIFO), Least Frequently Used (LFU), and Random.
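Direct mapping can be sketched in a few lines. The cache size and block size below are invented for illustration; the point is only the formula: a block maps to line (block number) mod (number of lines):

```python
# Direct mapping sketch: each main-memory block maps to exactly one
# cache line, computed as (block number) mod (number of lines).
# The sizes here are illustrative, not taken from any real processor.

NUM_LINES = 8      # cache lines in this toy cache
BLOCK_SIZE = 16    # bytes per block

def cache_line_for(address):
    block = address // BLOCK_SIZE
    return block % NUM_LINES

print(cache_line_for(0))           # block 0 -> line 0
print(cache_line_for(8 * 16))      # block 8 -> line 0 (collides with block 0)
print(cache_line_for(3 * 16 + 5))  # block 3 -> line 3
```

The collision between blocks 0 and 8 is exactly the weakness associative mapping removes: in a fully associative cache, either block could occupy any line.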
The replacement algorithm determines which block should be removed from the cache to make room for a new block. Closely related is the write policy, of which there are two methods: write-through and write-back. With write-through, the cache and main memory are updated simultaneously; with write-back, main memory is updated only when a word that has been modified in the cache is written back. The remaining design elements are:
- Block size: larger blocks reduce the number of blocks that fit in the cache, and because each fetched block overwrites older cache contents, a small number of blocks causes data to be evicted soon after it is fetched. As block size increases, each additional word lies farther from the requested word and is therefore less likely to be needed in the near future. (Excerpted from the article by Yulisdin Mukhlis, ST., MT.)
- Line size, and the number of caches: one or two levels, unified or split.
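Of the replacement algorithms listed above, LRU is the most common, and it can be sketched with an ordered dictionary. The capacity and the block tags here are invented for illustration:

```python
from collections import OrderedDict

# Sketch of the LRU (Least Recently Used) replacement policy for a
# tiny cache of CAPACITY blocks; the tags are illustrative only.

CAPACITY = 3
cache = OrderedDict()  # maps block tag -> data; insertion order = recency

def access(tag, data=None):
    """Touch a block; evict the least recently used one if the cache is full."""
    if tag in cache:
        cache.move_to_end(tag)     # hit: mark as most recently used
        return "hit"
    if len(cache) >= CAPACITY:
        cache.popitem(last=False)  # miss on a full cache: evict the LRU block
    cache[tag] = data
    return "miss"

for tag in ["A", "B", "C", "A", "D"]:  # "D" evicts "B" (least recent), not "A"
    access(tag)
print(list(cache))  # ['C', 'A', 'D']
```

FIFO would instead evict "A" at that point, because "A" entered the cache first; the difference between the two policies is exactly whether a hit refreshes a block's position.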
If the processor needs data, it first looks for it in the cache. If the data is found, the processor reads it immediately with very little delay. If the data is not found, the processor fetches it from RAM, whose speed is much lower. In general, the cache can supply most of the data the processor needs, so the effect of slow RAM is reduced. In this way the effective memory bandwidth increases and the processor works more efficiently. A larger cache capacity will also increase the overall working speed of the computer.
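The effect described above can be quantified as an effective access time: a weighted average of the cache and RAM latencies. The latency figures below are assumed round numbers for illustration, not measurements of any real hardware:

```python
# Sketch of how a high hit rate hides slow RAM. The latencies are
# illustrative assumptions, not measured values.

CACHE_NS = 1.0   # assumed cache access time (nanoseconds)
RAM_NS = 60.0    # assumed main-memory access time (nanoseconds)

def effective_access_time(hit_rate):
    """Average time per access, given the fraction of hits."""
    return hit_rate * CACHE_NS + (1 - hit_rate) * RAM_NS

print(effective_access_time(0.95))  # 3.95 ns: RAM's cost is mostly hidden
print(effective_access_time(0.50))  # 30.5 ns: half the accesses pay full price
```

With a 95% hit rate, memory appears roughly fifteen times faster than RAM alone, which is why even a small cache has a large effect.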
Two types of cache that are often used in the computer world are memory caching and disk caching. The implementation can be a reserved portion of the computer's main memory or a dedicated high-speed storage device.
There are several types of cache that are commonly used, including:
Memory cache is often also called RAM cache. It is a portion of memory made of high-speed static RAM (SRAM). This type of cache is effective because almost all programs access the same data or instructions repeatedly: the more information kept in SRAM, the more often the computer uses the cache instead of the relatively slower DRAM. Memory cache has three levels; you have probably seen the labels L1 Cache, L2 Cache, or L3 Cache on your computer's box or in its specification sheet. L1 is the name for the internal cache, which sits between the CPU and DRAM and has the highest access speed; typical sizes range from 8 KB to 64 KB or 128 KB. L2 is an external cache with a larger capacity, ranging from 256 KB to 2 MB, but in terms of speed L2 is actually slower than L1. Finally, the L3 cache is usually found in newer computer models that have more than one processor core, for example dual-core or quad-core models; it regulates the data exchanged between the L2 cache of each processor core.
Not much different from the memory cache, the disk cache is based on the same principle, but instead of high-speed SRAM it uses conventional main memory, that is, dynamic RAM (DRAM), which is relatively slower. When you run an application that requires data from disk, the application first checks whether the data is available in the memory buffer. The disk cache can give a significant boost to application performance, because this mechanism is much faster than retrieving the data from the hard disk itself.