10 - Caches

CS 4290/6290 Caches and Cache Hierarchies
CS 4290/6290 – Spring 2009 – Prof. Milos Prvulovic

Data Locality
- Temporal: if a data item is needed now, it is likely to be needed again in the near future
- Spatial: if a data item is needed now, nearby data is likely to be needed in the near future

Exploiting Locality: Caches
- Keep recently used data in fast memory close to the processor
- Also bring nearby data there
[Memory hierarchy figure: Register File → Instruction Cache / Data Cache (with ITLB / DTLB) → L2 Cache → L3 Cache → Main Memory (SRAM cache, row buffer) → Disk, with a bypass network; speed decreases and capacity increases moving away from the processor]
Memory latency of 60-100 ns is not uncommon.

Quick back-of-the-envelope calculation:
- 2 GHz CPU → 0.5 ns / cycle
- 100 ns memory → 200-cycle memory latency!

Solution: Caches
Cache: fast (but small) memory close to the processor

When data is referenced:
- If in cache, use cache instead of memory
- If not in cache, bring into cache (actually, bring the entire block of data, too)
- Maybe have to kick something else out to do it!

Important decisions:
- Placement: where in the cache can a block go?
- Identification: how do we find a block in the cache?
- Replacement: what to kick out to make room in the cache?
- Write policy: what do we do about stores?

Key: optimize the average memory access latency
Cache consists of block-sized lines
- Line size is typically a power of two, typically 16 to 128 bytes in size

Example: suppose the block size is 128 bytes
- The lowest seven bits determine the offset within the block
- Read data at address A = 0x7fffa3f4
- The address belongs to the block with base address 0x7fffa380
Placement: which memory blocks are allowed into which cache lines

Placement policies:
- Direct mapped: a block can go to only one line
- Fully associative: a block can go to any line
- Set-associative: a block can go to one of N lines
  - E.g., if N = 4, the cache is 4-way set associative
  - The other two policies are extremes of this (e.g., if N = 1 we get a direct-mapped cache)
Address fields: Tag | Index | Offset
- Offset: which part of the cache block?
- Index: to which line can the block go? (3 bits)
- The cache has 8 lines (0-7), so we need 3 index bits that tell us which line to go to (e.g., index 110 → line 6)

This note was uploaded on 04/12/2010 for the course CS 6290 taught by Professor Staff during the Spring '08 term at Georgia Institute of Technology.
