In this post, we discuss associative mapping in cache memory. The defining characteristic of cache memory is its fast access time, so little or no time should be wasted when searching for words in the cache. The transfer of data from main memory to cache memory is referred to as a mapping process. Three types of mapping procedures are of practical interest when considering the organization of cache memory:
- Associative mapping
- Direct mapping
- Set-associative mapping
The fastest and most flexible cache organization uses an associative memory. The associative memory stores both the address and content (data) of the memory word. This permits any location in the cache to store any word from the main memory. The diagram shows three words presently stored in the cache. The address value of 15 bits is shown as a five-digit octal number and its corresponding 12-bit word is shown as a four-digit octal number. A CPU address of 15 bits is placed in the argument register and the associative memory is searched for a matching address.
[Figure: Associative mapping in cache memory]
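The octal notation above follows directly from the word sizes: each octal digit encodes 3 bits, so a 15-bit address needs five octal digits and a 12-bit word needs four. A quick sketch (the specific address and data values here are illustrative, not taken from the figure):

```python
# Each octal digit encodes 3 bits:
#   15-bit address -> 15 / 3 = 5 octal digits
#   12-bit word    -> 12 / 3 = 4 octal digits
address = 0o02777   # example 15-bit address
word = 0o6710       # example 12-bit data word

print(f"address = {address:05o}")  # printed as five octal digits
print(f"word    = {word:04o}")     # printed as four octal digits
```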
If the address is found, the corresponding 12-bit data word is read and sent to the CPU. If no match occurs, the main memory is accessed for the word, and the address-data pair is then transferred to the associative cache memory. If the cache is full, an existing address-data pair must be displaced to make room for the pair that is needed but not presently in the cache. Which pair is replaced is determined by the replacement algorithm that the designer chooses for the cache. A simple procedure is to replace cells of the cache in round-robin order whenever a new word is requested from main memory. This constitutes a first-in, first-out (FIFO) replacement policy.
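The lookup-miss-replace cycle described above can be sketched in software. This is a minimal illustration, not a hardware model: the class name, the three-line capacity, and the octal values are all assumptions for the example; an `OrderedDict` stands in for the associative memory, with insertion order providing the FIFO eviction order.

```python
from collections import OrderedDict

class AssociativeCache:
    """Sketch of a fully associative cache with FIFO replacement.
    Sizes and names are illustrative, not from a specific design."""

    def __init__(self, capacity=3, main_memory=None):
        self.capacity = capacity
        self.lines = OrderedDict()          # address -> data; insertion order = FIFO order
        self.main_memory = main_memory or {}

    def read(self, address):
        if address in self.lines:           # associative search: match on the address field
            return self.lines[address], True        # cache hit
        data = self.main_memory[address]            # miss: access main memory for the word
        if len(self.lines) >= self.capacity:
            self.lines.popitem(last=False)          # cache full: evict the oldest pair (FIFO)
        self.lines[address] = data                  # transfer the address-data pair into the cache
        return data, False

# Example main memory: 15-bit octal addresses mapped to 12-bit octal words.
memory = {0o01000: 0o3450, 0o02777: 0o6710, 0o22345: 0o1234, 0o00100: 0o7777}
cache = AssociativeCache(capacity=3, main_memory=memory)
for addr in (0o01000, 0o02777, 0o01000, 0o22345, 0o00100):
    data, hit = cache.read(addr)
    print(f"{addr:05o} -> {data:04o} ({'hit' if hit else 'miss'})")
```

In the final read, the cache is full, so the oldest pair (address `01000`) is displaced in round-robin order, exactly the FIFO policy the paragraph describes.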
Feel free to share this post if it helped solve your problem. Finally, if anything is still unclear, leave a comment and we will address the issue promptly.