06-Computer-Architecture

CS107 Handout 06
Spring 2007, April 6, 2007

Computer Memory: Bits and Bytes
This handout was written by Nick Parlante and Julie Zelenski.

To begin, we are going to take a glimpse into the inner workings of a computer. The goal is that by seeing the basics of how the computer and compiler cooperate, you will better understand how language features work.

Basic Architecture
Almost all modern computers today are designed using the Von Neumann architecture of 1954. In the Von Neumann architecture, the computer is divided into a Central Processing Unit (CPU) and memory. The CPU contains all the computational power of the system, while the memory stores the program code and data for the program. Von Neumann's innovation was that memory could be used to store both the program instructions and the program's data. The instructions that constitute a program are laid out in consecutive words in memory, ready to be executed in order. The CPU runs in a "fetch-execute" cycle: it retrieves and executes the current instruction from memory, then fetches and executes the next instruction, and so on. The sort of instructions the CPU executes are detailed later in this handout.

Memory
The smallest unit of memory is the "bit". A bit can be in one of two states: on vs. off, or alternately, 1 vs. 0. Technically, any object that can have two distinct states can remember one bit of information. This has been done with magnets, gear wheels, and tinker toys, but almost all computers use little transistor circuits called "flip-flops" to store bits. The flip-flop circuit has the property that it can be set to one of two states, will stay in that state, and can be read until it is reset.

Most computers don't work with bits individually, but instead group eight bits together to form a "byte". Each byte maintains one eight-bit pattern. A group of N bits can be arranged in 2^N different patterns, so a byte can hold 2^8 = 256 different patterns. The memory system as a whole is organized as a large array of bytes. Every byte has its own "address", which is like its index in the array.

Strictly speaking, a program can interpret a bit pattern any way it chooses. By far the most common interpretation is to consider the bit pattern to represent a number written in base 2. In this case, the 256 patterns a byte can hold map to the numbers 0..255.
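As a small aside (not part of the original handout), the 2^N counting argument can be checked directly in C. The sketch below simply computes the number of distinct patterns for group sizes from 1 to 8 bits, using a left shift to form each power of two.

    #include <stdio.h>

    int main(void)
    {
        /* A group of n bits can be arranged in 2^n distinct patterns.
           Shifting 1 left by n positions computes 2^n for small n. */
        for (unsigned n = 1; n <= 8; n++) {
            printf("%u bit(s): %lu patterns\n", n, 1UL << n);
        }
        return 0;
    }

Running this prints 2, 4, 8, ..., 256, matching the claim that one byte (8 bits) can hold 256 different patterns.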
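The "array of bytes" picture can also be made concrete. The following illustrative C sketch (the array and its contents are invented for this example) shows that each byte has its own address, that consecutive bytes sit at consecutive addresses, and that each eight-bit pattern, read as a base-2 number, is a value in the range 0..255.

    #include <stdio.h>

    int main(void)
    {
        /* Four bytes of memory, each holding one 8-bit pattern. */
        unsigned char bytes[4] = { 0x00, 0x41, 0x7f, 0xff };

        for (int i = 0; i < 4; i++) {
            /* &bytes[i] is this byte's address; consecutive elements
               live at consecutive addresses. Interpreted in base 2,
               each 8-bit pattern is a number between 0 and 255. */
            printf("address %p holds value %d\n",
                   (void *)&bytes[i], bytes[i]);
        }
        return 0;
    }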
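Finally, the fetch-execute cycle described under Basic Architecture can be sketched as a toy interpreter. This is a deliberately simplified, made-up machine (the opcodes and the accumulator are assumptions for illustration, not the real instruction set the handout goes on to describe): the program is just data stored in consecutive memory slots, and the CPU loop fetches the instruction at the program counter, executes it, and moves on.

    #include <stdio.h>

    /* A toy instruction: an operation code plus one operand. */
    enum opcode { OP_LOAD, OP_ADD, OP_PRINT, OP_HALT };

    struct instruction {
        enum opcode op;
        int operand;
    };

    int main(void)
    {
        /* The "program" is laid out in consecutive slots of memory. */
        struct instruction program[] = {
            { OP_LOAD, 5 },    /* accumulator = 5   */
            { OP_ADD, 7 },     /* accumulator += 7  */
            { OP_PRINT, 0 },   /* print accumulator */
            { OP_HALT, 0 }
        };

        int accumulator = 0;
        int pc = 0;   /* program counter: index of the next instruction */

        /* Fetch-execute cycle: fetch the instruction at pc, execute it,
           advance to the next one, and repeat until HALT. */
        for (;;) {
            struct instruction instr = program[pc++];    /* fetch   */
            switch (instr.op) {                          /* execute */
            case OP_LOAD:  accumulator = instr.operand;  break;
            case OP_ADD:   accumulator += instr.operand; break;
            case OP_PRINT: printf("%d\n", accumulator);  break;
            case OP_HALT:  return 0;
            }
        }
    }

The toy program prints 12 and halts; a real CPU does the same fetch, execute, advance loop, only in hardware and over binary-encoded instructions.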