
Buffer memory

Also known as cache memory
Cache, or buffer memory, is an important technology for overcoming the speed mismatch between the CPU and main memory. It is a small-capacity memory placed between the CPU and main memory whose access speed is faster than that of main memory. When main memory is configured with hundreds of MB, a typical Cache size is hundreds of KB. The Cache can supply instructions and data to the CPU at high speed, thereby speeding up program execution. Functionally, it is a buffer memory for main memory, usually implemented in SRAM. In pursuit of speed, all of its functions, including management, are implemented in hardware, making it transparent to programmers.
Chinese name
Buffer memory
Foreign name
cache
Characteristics
Provides instructions and data to the CPU at high speed
Features
Serves as a buffer memory for main memory and speeds up program execution

Brief introduction

Currently, with the further improvement of semiconductor integration, the buffer memory has been moved into the CPU itself, where its working speed is close to that of the CPU, so a two-level cache system can be formed. [1]

Working principle

The working principle of the buffer memory requires it to hold the most recently used data as far as possible. When a new main memory block needs to be copied into the Cache and the line positions allowed to hold that block are all occupied by other main memory blocks, a replacement is required. The replacement problem is closely tied to how the Cache is organized. For a direct-mapped Cache the problem is easy to solve: since a main memory block can be stored in only one specific line, the main memory block currently in that line is simply swapped out of the Cache. For fully associative and set-associative Caches, one line must be selected from the several specific lines that are allowed to hold the new main memory block. [1]
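To make the direct-mapped case concrete, here is a minimal C sketch of how an address fixes the one line a block may occupy; the geometry (a 16 KB cache with 256 lines of 64 bytes) is an illustrative assumption, not taken from the text.

    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical geometry: 256 lines of 64 bytes each (16 KB cache). */
    #define LINE_SIZE 64u
    #define NUM_LINES 256u

    int main(void) {
        uint32_t addr  = 0x0001A2C4u;        /* example main-memory address  */
        uint32_t block = addr / LINE_SIZE;   /* main-memory block number     */
        uint32_t line  = block % NUM_LINES;  /* the only line it may occupy  */
        uint32_t tag   = block / NUM_LINES;  /* stored to identify the block */
        printf("address 0x%08X -> block %u, line %u, tag %u\n",
               addr, block, line, tag);
        return 0;
    }

Because the line is fully determined by the address, a conflict in a direct-mapped Cache simply evicts whatever block currently occupies that line; only the associative organizations need a selection policy.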

Algorithm

How that line is selected is governed by the replacement strategy, also called the replacement algorithm, which is implemented in hardware. The commonly used algorithms are described below.
Least Frequently Used (LFU) Algorithm
The LFU algorithm assumes that the line whose data has been accessed the fewest times over a period should be replaced. To this end, a counter is set up for each line: after a line is newly loaded it starts counting from 0, and each access increments the counter of the accessed line by 1. When a replacement is needed, the count values of the candidate lines are compared, the line with the lowest count is replaced, and the counters of those lines are cleared. Because this limits the counting period to the interval between two replacements of these particular lines, the algorithm cannot strictly reflect recent access behavior. [1]
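A minimal C sketch of this counter scheme; the 4-way set size is an assumption for illustration, as the text does not fix one.

    #include <stdio.h>
    #include <stdint.h>

    #define WAYS 4  /* hypothetical set size: the "specific lines" of one set */

    static uint32_t count[WAYS];  /* one access counter per line */

    /* On a hit, bump the accessed line's counter. */
    static void lfu_on_hit(int way) { count[way]++; }

    /* On a miss, replace the line with the lowest count, then clear all
     * counters so counting restarts in the interval until the next
     * replacement, as the text describes. */
    static int lfu_victim(void) {
        int victim = 0;
        for (int w = 1; w < WAYS; w++)
            if (count[w] < count[victim]) victim = w;
        for (int w = 0; w < WAYS; w++) count[w] = 0;
        return victim;
    }

    int main(void) {
        lfu_on_hit(0); lfu_on_hit(0); lfu_on_hit(2);  /* simulated hits     */
        printf("evict way %d\n", lfu_victim());       /* way 1: count 0,    */
        return 0;                                     /* first on a tie     */
    }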
Least Recently Used (LRU) Algorithm
The LRU algorithm swaps out the line that has gone unaccessed for the longest time in the recent past. Again a counter is set up for each line, but on every Cache hit the counter of the hit line is cleared while the counters of the other lines are incremented by 1. When a replacement is needed, the count values of the candidate lines are compared and the line with the largest count is replaced. This algorithm protects data lines that have just been copied into the Cache, which matches the Cache's working principle and gives the Cache a high hit rate.
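A corresponding C sketch of the counter version, again assuming a hypothetical 4-way set.

    #include <stdio.h>
    #include <stdint.h>

    #define WAYS 4  /* hypothetical set size */

    static uint32_t age[WAYS];  /* per-line counter: larger = older */

    /* On a hit, clear the hit line's counter and age every other line by 1. */
    static void lru_on_hit(int way) {
        for (int w = 0; w < WAYS; w++) age[w]++;
        age[way] = 0;
    }

    /* On a miss, replace the line with the largest counter value. */
    static int lru_victim(void) {
        int victim = 0;
        for (int w = 1; w < WAYS; w++)
            if (age[w] > age[victim]) victim = w;
        return victim;
    }

    int main(void) {
        lru_on_hit(2); lru_on_hit(0); lru_on_hit(3);  /* way 1 never hit   */
        printf("evict way %d\n", lru_victim());       /* way 1: the oldest */
        return 0;
    }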
For a 2-way set-associative Cache, the hardware implementation of the LRU algorithm can be simplified. Because a main memory block can be stored in only one of the two lines of a specific set, choosing between them needs no counter at all, only a single binary bit. For example, the bit can be set to 1 when line A of a set is filled with new data and to 0 when line B is filled. When a replacement is needed, the bit is simply examined: line A is replaced if it is 0 and line B if it is 1, which again protects the newer line. The data cache in the Pentium CPU is a 2-way set-associative structure and uses this simple LRU replacement algorithm. [1]
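A C sketch of this single-bit scheme, following the text's bit convention; the line names A/B and the helper names are illustrative.

    #include <stdio.h>
    #include <stdbool.h>

    /* One LRU bit per set. Following the text's convention:
     * bit = 1 -> line A was filled with new data most recently,
     * bit = 0 -> line B was filled most recently.                */
    static bool lru_bit;

    static void fill_a(void) { lru_bit = true;  }
    static void fill_b(void) { lru_bit = false; }

    /* Replace the line that was NOT filled most recently:
     * bit 0 -> replace A, bit 1 -> replace B.                    */
    static char victim(void) { return lru_bit ? 'B' : 'A'; }

    int main(void) {
        fill_a();                             /* new data copied into line A */
        printf("evict line %c\n", victim());  /* prints B                    */
        return 0;
    }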
Random replacement
The random replacement strategy does not really require an algorithm at all: a line is selected at random from the candidate positions and replaced. This strategy is easy to implement in hardware and is faster than the two strategies above. Its disadvantage is that the randomly evicted data may well be needed again soon, which lowers the hit rate and the Cache's efficiency, but this defect diminishes as the Cache capacity grows. Research shows that the random replacement strategy performs only slightly worse than the other two. [1]
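A C sketch of random selection, using the standard C rand(); the 4-way set size is again an assumption.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define WAYS 4  /* hypothetical set size */

    /* Random replacement keeps no per-line state: any way may be evicted. */
    static int random_victim(void) {
        return rand() % WAYS;
    }

    int main(void) {
        srand((unsigned)time(NULL));  /* seed once at startup */
        printf("evict way %d\n", random_victim());
        return 0;
    }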

Application

Buffer memory is widely used in computers. Every hard disk contains a buffer memory that caches the disk's frequently used data to speed up system reads. Since the buffer memory is much faster than fetching data from the platters, a larger buffer is generally better; mainstream products currently offer around 16 MB.