Lec 05 - Cache Memory
Post on 05-Apr-2018
-
7/31/2019 Lec 05 - Cache Memory
Concepts and Principles
-
A small but fast memory element in which the contents of the most commonly accessed locations are maintained, and which can be placed between the main memory and the CPU.
-
Faster electronics can be used, which also results in a greater expense in terms of money, size, and power requirements.
A cache memory has fewer locations than a main memory, which reduces the access time.
The cache is placed both physically closer and logically closer to the CPU than the main memory, and this placement avoids communication delays over a shared bus.
-
The cache for this example consists of 2^14 slots into which main memory blocks are placed. There are more main memory blocks than there are cache slots, and any one of the 2^27 main memory blocks can be mapped into each cache slot (with only one block placed in a slot at a time).
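The address arithmetic behind this example can be sketched in Python. This is a minimal sketch, not from the slides; it assumes a 32-bit word-addressable main memory split into 32-word blocks, which is consistent with the 2^27 block count above.

```python
# Address breakdown for a fully associative cache.
# Assumption: 32-bit word addresses and 32-word blocks, giving
# 2^32 / 2^5 = 2^27 main memory blocks as in the slide's example.

ADDRESS_BITS = 32
WORDS_PER_BLOCK = 32                            # assumed block size
WORD_BITS = WORDS_PER_BLOCK.bit_length() - 1    # 5 bits select a word in a block
TAG_BITS = ADDRESS_BITS - WORD_BITS             # 27 bits identify the block

def split_address(addr):
    """Split an address into (tag, word) fields for associative mapping."""
    word = addr & (WORDS_PER_BLOCK - 1)   # low 5 bits: word within the block
    tag = addr >> WORD_BITS               # high 27 bits: which block
    return tag, word

tag, word = split_address(0xABCD1234)
print(f"tag = {tag:#x}, word = {word}")
```

Because any block can occupy any slot, the tag must carry the full 27-bit block number; there is no index field.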
-
Tag field - keeps track of which one of the 2^27 possible blocks is in each slot.
Valid bit - indicates whether or not the slot holds a block that belongs to the program being executed.
-
Dirty bit - keeps track of whether or not a block has been modified while it is in the cache. A slot that is modified must be written back to the main memory before the slot is reused for another block.
-
This scheme is called direct mapping because each cache slot corresponds to an explicit set of main memory blocks. For a direct-mapped cache, each main memory block can be mapped to only one slot, but each slot can receive more than one block.
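The direct mapping rule can be sketched as a short Python example, using the figures from the running example (2^27 blocks, 2^14 slots); the function name is illustrative.

```python
# Direct mapping: each main memory block maps to exactly one cache slot.

NUM_SLOTS = 2 ** 14       # slots in the cache (from the running example)
NUM_BLOCKS = 2 ** 27      # blocks in main memory

def direct_map(block):
    slot = block % NUM_SLOTS    # low 14 bits of the block number pick the slot
    tag = block // NUM_SLOTS    # remaining 13 bits distinguish the 2^13
    return slot, tag            # blocks that share this slot

# Blocks 0, 2^14, and 2*2^14 all compete for slot 0 but carry different tags.
for block in (0, NUM_SLOTS, 2 * NUM_SLOTS):
    print(block, direct_map(block))
```

Note how 2^27 / 2^14 = 2^13 different blocks contend for each slot, which is why a tag must be stored alongside the data.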
-
The set associative mapping scheme combines the simplicity of direct mapping with the flexibility of associative mapping.
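A minimal sketch of how the two schemes combine, assuming a 2-way arrangement of the same 2^14 slots (the slides do not fix the associativity; the 2-way choice is an assumption for illustration):

```python
# Set-associative mapping: the set is chosen as in direct mapping,
# then the block may occupy any slot within that set (associative search).
# Assumption: 2-way associativity over the example's 2^14 slots.

NUM_SLOTS = 2 ** 14
WAYS = 2                        # assumed associativity
NUM_SETS = NUM_SLOTS // WAYS    # 2^13 sets of 2 slots each

def set_map(block):
    set_index = block % NUM_SETS    # direct-mapped choice of set
    tag = block // NUM_SETS         # associative comparison within the set
    return set_index, tag

# Two blocks that would collide under direct mapping can coexist in one set.
print(set_map(0), set_map(NUM_SETS))
```

The set index is found with a simple modulus as in direct mapping, while the search within a set needs only a small number of tag comparators, which is the flexibility borrowed from associative mapping.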
-
Least recently used (LRU)
First-in first-out (FIFO)
Least frequently used (LFU)
Random
-
For the LRU policy, a time stamp is added to each slot, which is updated when any slot is accessed. When a slot must be freed for a new block, the contents of the least recently used slot, as identified by the age of the corresponding time stamp, are discarded and the new block is written to that slot.
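The time-stamp mechanism described above can be sketched as follows; the class and method names are illustrative, not from the slides.

```python
# Minimal sketch of LRU replacement using per-slot time stamps.

import itertools

class LRUCache:
    def __init__(self, num_slots):
        self.num_slots = num_slots
        self.slots = {}                  # block -> time stamp
        self.clock = itertools.count()   # monotonically increasing "time"

    def access(self, block):
        if block not in self.slots and len(self.slots) == self.num_slots:
            # Free the slot with the oldest time stamp.
            victim = min(self.slots, key=self.slots.get)
            del self.slots[victim]
        self.slots[block] = next(self.clock)   # stamp the slot on every access

cache = LRUCache(2)
for b in ["A", "B", "A", "C"]:   # "B" is least recently used when "C" arrives
    cache.access(b)
print(sorted(cache.slots))       # "A" and "C" remain
```

In hardware the "time stamp" is typically a small counter per slot rather than a full clock value, but the comparison against the oldest stamp is the same idea.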
-
The LFU policy works similarly, except that only one slot is updated at a time, by incrementing a frequency counter that is attached to each slot. When a slot is needed for a new block, the least frequently used slot is freed.
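The frequency-counter variant differs from the LRU sketch in only one line: the counter is incremented on the accessed slot rather than stamping it with a global time. A minimal sketch (names illustrative):

```python
# Minimal sketch of LFU replacement: one frequency counter per slot,
# incremented only on the slot that is accessed.

class LFUCache:
    def __init__(self, num_slots):
        self.num_slots = num_slots
        self.counts = {}                 # block -> access frequency

    def access(self, block):
        if block not in self.counts and len(self.counts) == self.num_slots:
            # Free the slot with the lowest frequency count.
            victim = min(self.counts, key=self.counts.get)
            del self.counts[victim]
        self.counts[block] = self.counts.get(block, 0) + 1

cache = LFUCache(2)
for b in ["A", "A", "B", "C"]:   # "B" has the lowest count when "C" arrives
    cache.access(b)
print(sorted(cache.counts))      # "A" and "C" remain
```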
-
The FIFO policy replaces slots in round-robin fashion, one after the next in the order of their physical locations in the cache.
-
The random replacement policy simply chooses a slot at random.
-
Write-Through
Write-Back
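The two write policies can be contrasted with a small sketch. Write-through sends every CPU write to main memory immediately; write-back updates only the cache copy, sets the dirty bit, and defers the memory write until the block is evicted. The counters and function names below are illustrative, not from the slides.

```python
# Sketch contrasting write-through and write-back for repeated writes
# to the same cached block.

memory_writes = {"through": 0, "back": 0}

def write_through(block):
    memory_writes["through"] += 1    # every CPU write reaches main memory

def write_back(block, dirty):
    dirty[block] = True              # only the cache copy is updated

def evict(block, dirty):
    if dirty.pop(block, False):      # deferred write happens on eviction,
        memory_writes["back"] += 1   # and only if the block is dirty

dirty = {}
for _ in range(10):                  # ten writes to the same block
    write_through("X")
    write_back("X", dirty)
evict("X", dirty)
print(memory_writes)                 # write-back needed only one memory write
```

The trade-off is the one implied by the dirty bit on the earlier slide: write-back reduces memory traffic for write-heavy access patterns, at the cost of having to write a dirty block back before its slot can be reused.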