Lecture 1: Basic Structures and Definition of PageRank

Transcriber: Mary Radcliffe

1 A Glimpse of Spectral Graph Theory

Let $G = (V, E)$ be a graph, where $n = |V|$. Define the adjacency matrix of $G$, denoted by $A$, to be a matrix indexed by $V$, with
$$A(u, v) = \begin{cases} 1 & u \sim v \\ 0 & \text{else,} \end{cases}$$
where $u \sim v$ denotes adjacency (i.e., $u \sim v$ iff $\{u, v\} \in E$). We generally think of the graph $G$ as being fairly large, so we are interested in ways to simplify the computations required to glean information from the graph.

Because $A$ is a real-valued symmetric matrix, we know from linear algebra that it has real eigenvalues $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$. Moreover, $A$ is diagonalizable, so $A = T \Lambda T^{-1}$, where $\Lambda$ is the diagonal matrix with $\lambda_i$ in the $(i, i)$ position and the columns of $T$ form an orthogonal eigenbasis corresponding to $A$. Diagonalizing the matrix allows multiplication with the adjacency matrix to be done more simply, as $A^k = T \Lambda^k T^{-1}$.

Let $\mathbb{1}$ denote the length-$n$ vector of all 1s. Then $\mathbb{1} A^k \mathbb{1}^*$ is the number of length-$k$ walks in $G$.

Exercise 1. Show that
$$\lambda_1 = \lim_{k \to \infty} \left( \frac{\mathbb{1} A^k \mathbb{1}^*}{\mathbb{1} \mathbb{1}^*} \right)^{1/k}.$$

2 Introduction to Random Walks

Given $G$ as above, define the degree matrix of $G$, denoted by $D$, to be a diagonal matrix indexed by $V$ with $D(v, v) = d_v$, where $d_v$ denotes the degree of $v$. We define a random walk on $G$ to be a Markov chain with transition probability matrix $P$, where $P(u, v)$ denotes the probability that, given the random walk is...
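The diagonalization trick and the walk-counting identity above can be checked numerically. The following sketch (not from the lecture; the 4-cycle test graph is our own choice) builds $A$ for a small graph, forms $A^k$ via $T \Lambda^k T^{-1}$, and evaluates the Exercise 1 limit, which for this graph converges to $\lambda_1 = 2$:

```python
import numpy as np

# A small illustrative graph: the 4-cycle (our choice, not from the lecture).
n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1  # A(u, v) = 1 iff u ~ v

# A is real symmetric, so eigh gives real eigenvalues (ascending order)
# and orthonormal eigenvector columns T, i.e. T^{-1} = T^T.
eigvals, T = np.linalg.eigh(A)

# A^k = T Lambda^k T^{-1}: powering A reduces to powering a diagonal.
k = 5
Ak = T @ np.diag(eigvals**k) @ T.T
one = np.ones(n)
walks_k = one @ Ak @ one  # total number of length-k walks in G
assert np.isclose(walks_k, one @ np.linalg.matrix_power(A, k) @ one)

# Exercise 1: ((1 A^k 1^*) / (1 1^*))^(1/k) -> lambda_1 as k -> infinity.
for k in (5, 20, 80):
    Ak = T @ np.diag(eigvals**k) @ T.T
    est = ((one @ Ak @ one) / (one @ one)) ** (1 / k)
    print(k, est)
print("lambda_1 =", eigvals[-1])
```

For the 4-cycle, $\mathbb{1}$ is itself an eigenvector (every degree is 2), so the estimate hits $\lambda_1$ exactly; on irregular graphs the convergence is only in the limit.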
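The preview cuts off before the transition probabilities are specified. For the standard simple random walk (an assumption here, not stated in the surviving text), the walk at $u$ moves to a uniformly random neighbor, giving $P = D^{-1} A$, i.e. $P(u, v) = 1/d_u$ when $u \sim v$. A minimal sketch, reusing the hypothetical 4-cycle from above:

```python
import numpy as np

# Same illustrative 4-cycle as before (our choice of test graph).
n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1

# Degree matrix: D(v, v) = d_v.
d = A.sum(axis=1)
D = np.diag(d)

# Assumed simple-random-walk definition (the preview truncates before
# giving one): P = D^{-1} A, so P(u, v) = 1/d_u for each neighbor v of u.
P = np.linalg.inv(D) @ A

# Each row of a transition probability matrix must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
print(P)
```

In practice one would use `P = A / d[:, None]` rather than inverting $D$, but the explicit inverse mirrors the matrix definition.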
This note was uploaded on 09/30/2011 for the course MATH 262 taught by Professor Aterras during the Fall '08 term at UCSD.