In mathematics and computer science, a finite graph is often represented by its adjacency matrix. The adjacency matrix of a directed or undirected graph (with n vertices, say) is the n-by-n matrix whose entry in row i and column j gives the number of edges from the i-th to the j-th vertex. (An alternative way to store graphs in computers, especially graphs with few edges, is given by the incidence matrix.)
The modified adjacency matrix is generated by replacing all entries greater than 1 in the adjacency matrix by 1.
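As a minimal sketch of these two definitions, assuming vertices are labeled 0 to n−1 and edges are given as (i, j) pairs (the function names here are illustrative, not from any standard library):

```python
def adjacency_matrix(n, edges, directed=True):
    """n-by-n matrix whose entry (i, j) counts the edges from vertex i to vertex j."""
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] += 1
        if not directed and i != j:   # undirected edges count in both directions
            A[j][i] += 1
    return A

def modified_adjacency_matrix(A):
    """Replace every entry greater than 1 by 1."""
    return [[min(a, 1) for a in row] for row in A]

# Directed multigraph on 3 vertices with a doubled edge 0 -> 1.
edges = [(0, 1), (0, 1), (1, 2), (2, 0)]
A = adjacency_matrix(3, edges)
print(A)                             # [[0, 2, 0], [0, 0, 1], [1, 0, 0]]
print(modified_adjacency_matrix(A))  # [[0, 1, 0], [0, 0, 1], [1, 0, 0]]
```

Note that the doubled edge produces the entry 2, which the modified adjacency matrix collapses to 1.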
The adjacency matrix of an undirected graph is always symmetric.
Suppose two directed or undirected graphs G_{1} and G_{2} with adjacency matrices A_{1} and A_{2} are given. G_{1} and G_{2} are isomorphic if and only if there exists a permutation matrix P such that

P A_{1} P^{−1} = A_{2}.
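For small graphs, such a permutation can be found by brute force. A sketch, assuming vertices labeled 0 to n−1 (the function name is illustrative; this exhaustive search is exponential and only practical for tiny n):

```python
from itertools import permutations

def isomorphic(A1, A2):
    """Search for a permutation p with A2[p[i]][p[j]] == A1[i][j] for all i, j,
    i.e. a permutation matrix P with P A1 P^{-1} = A2."""
    n = len(A1)
    if len(A2) != n:
        return False
    return any(
        all(A2[p[i]][p[j]] == A1[i][j] for i in range(n) for j in range(n))
        for p in permutations(range(n))
    )

# A directed 3-cycle and a relabelled copy of it are isomorphic.
A1 = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]  # 0 -> 1 -> 2 -> 0
A2 = [[0, 0, 1], [1, 0, 0], [0, 1, 0]]  # 0 -> 2 -> 1 -> 0
print(isomorphic(A1, A2))  # True
```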
If A is the adjacency matrix of the directed or undirected graph G, then the matrix A^{n} (i.e. the matrix product of n copies of A) has an interesting interpretation: the entry in row i and column j gives the number of (directed or undirected) walks of length n from vertex i to vertex j. (Walks, unlike paths, may revisit vertices and edges.)
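A minimal sketch of this walk-counting interpretation, using plain-Python matrix multiplication (the helper names are illustrative):

```python
def matmul(A, B):
    """Product of two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matrix_power(A, n):
    """A multiplied by itself n times (A^0 is the identity)."""
    size = len(A)
    R = [[int(i == j) for j in range(size)] for i in range(size)]  # identity
    for _ in range(n):
        R = matmul(R, A)
    return R

# Directed graph: 0 -> 1, 1 -> 2, 0 -> 2.
A = [[0, 1, 1],
     [0, 0, 1],
     [0, 0, 0]]
A2 = matrix_power(A, 2)
print(A2[0][2])  # 1: the single walk of length 2 from 0 to 2, namely 0 -> 1 -> 2
```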
If the graph G has no directed cycles, then its adjacency matrix A is nilpotent and the matrix I−A (where I denotes the n-by-n identity matrix) is invertible. In this case, the inverse (I−A)^{−1} has the following interpretation: the entry in row i and column j gives the number of directed paths from vertex i to vertex j (which is always finite if there are no directed cycles). This can be understood using the geometric series for matrices, which terminates because A^{n} = 0:

(I−A)^{−1} = I + A + A^{2} + A^{3} + ⋯ + A^{n−1}.
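A small sketch of this path-counting series for a directed acyclic graph, summing the powers of A directly rather than inverting I−A (the function name is illustrative):

```python
def matmul(A, B):
    """Product of two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def path_count_matrix(A):
    """Sum the geometric series I + A + A^2 + ... for the adjacency matrix A
    of a DAG; since A is nilpotent, at most n terms are nonzero."""
    n = len(A)
    S = [[int(i == j) for j in range(n)] for i in range(n)]  # running sum, starts at I
    P = [row[:] for row in S]                                # current power A^k
    for _ in range(1, n):
        P = matmul(P, A)
        S = [[S[i][j] + P[i][j] for j in range(n)] for i in range(n)]
    return S

# DAG: 0 -> 1, 0 -> 2, 1 -> 2.
A = [[0, 1, 1],
     [0, 0, 1],
     [0, 0, 0]]
print(path_count_matrix(A))  # [[1, 1, 2], [0, 1, 1], [0, 0, 1]]
```

The entry in row 0, column 2 is 2, counting the two directed paths 0 → 2 and 0 → 1 → 2; the diagonal entries are 1, counting the empty path from each vertex to itself.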