Lecture 1: Basic Structures and Definition of PageRank

Transcriber: Mary Radcliffe

1 A Glimpse of Spectral Graph Theory

Let $G = (V, E)$ be a graph, where $n = |V|$. Define the adjacency matrix of $G$, denoted by $A$, to be a matrix indexed by $V$, with
$$A(u,v) = \begin{cases} 1 & u \sim v \\ 0 & \text{else,} \end{cases}$$
where $u \sim v$ denotes adjacency (i.e., $u \sim v$ iff $\{u,v\} \in E$). We generally think of the graph $G$ as being fairly large, so we are interested in ways to simplify the computations required to glean information from the graph.

Because $A$ is a real-valued symmetric matrix, we know from linear algebra that it has real eigenvalues $\rho_0 \geq \rho_1 \geq \cdots \geq \rho_{n-1}$. Moreover, $A$ is diagonalizable, so $A = T^{-1} \Gamma T$, where $\Gamma$ is the diagonal matrix with $\rho_i$ in the $(i,i)$ position and the rows of $T$ form an orthonormal eigenbasis corresponding to $A$. Diagonalizing the matrix allows multiplication with the adjacency matrix to be done more simply, as $A^k = T^{-1} \Gamma^k T$.

Let $\mathbf{1}$ denote the length-$n$ vector of all 1's. Then $\mathbf{1} A^k \mathbf{1}^*$ is the number of length-$k$ walks in $G$.

Exercise 1. $\displaystyle \rho_0 = \lim_{k \to \infty} \left( \frac{\mathbf{1} A^k \mathbf{1}^*}{\mathbf{1} \mathbf{1}^*} \right)^{1/k}$.

2 Introduction to Random Walks

Given $G$ as above, define the degree matrix of $G$, denoted by $D$, to be a diagonal matrix indexed by $V$ with $D(v,v) = d_v$, where $d_v$ denotes the degree of $v$. We define a random walk on $G$ to be a Markov chain with transition probability matrix $P$, where $P(u,v)$ denotes the probability that, given the random walk is...
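As a quick numerical sketch (not part of the lecture), the walk-counting identity and the limit in Exercise 1 can be checked on a small hypothetical example, here the 4-cycle, whose largest adjacency eigenvalue is $\rho_0 = 2$:

```python
import numpy as np

# Hypothetical example graph: the 4-cycle on vertices {0, 1, 2, 3}.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

# Adjacency matrix: A(u, v) = 1 if u ~ v, 0 otherwise.
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0

# A is real and symmetric, so eigh returns real eigenvalues.
w, _ = np.linalg.eigh(A)
rho0 = w.max()  # largest eigenvalue; 2 for the 4-cycle

# 1 A^k 1^* counts the walks of length k in G.
one = np.ones(n)
k = 20
num_walks = one @ np.linalg.matrix_power(A, k) @ one

# Exercise 1: (1 A^k 1^* / 1 1^*)^(1/k) -> rho0 as k -> infinity.
estimate = (num_walks / (one @ one)) ** (1 / k)

print(rho0, num_walks, estimate)
```

Since the 4-cycle is 2-regular, $A \mathbf{1} = 2\mathbf{1}$, so the estimate already equals $\rho_0$ at finite $k$; for irregular graphs the quantity only converges in the limit.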
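The transcript cuts off before stating $P$ explicitly, but for the simple random walk the standard choice is $P = D^{-1} A$, i.e., from $u$ the walk moves to a uniformly random neighbor. A minimal sketch, assuming that convention, on a hypothetical path graph $0 - 1 - 2$:

```python
import numpy as np

# Hypothetical example graph: the path on 3 vertices, 0 - 1 - 2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# Degree matrix D: diagonal, with D(v, v) = d_v, the degree of v.
d = A.sum(axis=1)
D = np.diag(d)

# Simple random walk: P(u, v) = A(u, v) / d_u, i.e. P = D^{-1} A.
# (This assumes the usual definition; the transcript is truncated here.)
P = np.diag(1 / d) @ A

# Every row of P sums to 1, so P is a valid Markov transition matrix.
print(P)
print(P.sum(axis=1))
```

From the endpoint 0 the walk moves to 1 with probability 1; from the middle vertex 1 it moves to either endpoint with probability 1/2.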
Fall '08 · aterras
Keywords: Math, Graph Theory, Vertex, Markov chain, adjacency matrix, Spectral graph theory
