more on the matrix-tree formula

Fix a weighted graph \(\Gamma=(V,E)\) with edge weights \(w:E\to \mathbb{R}\).

The Laplacian \(L\) of \(\Gamma\) is the symmetric matrix

$$
L_{u,v}=\left\{
\begin{array}{ll}
-w(uv)&\mbox{ if }u\neq v,\\
\sum_{u'\neq u} w(uu')& \mbox{ if }u=v\\
\end{array}\right..
$$

(Here we view the weights \(w\) as formal variables.)
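
Here is a minimal SymPy sketch of this construction (the triangle graph and the symbol names `a`, `b`, `c` are just an illustrative choice, not anything from the statement itself):

```python
import sympy as sp

# Illustrative example: the triangle on vertices 0, 1, 2 with formal edge weights a, b, c.
a, b, c = sp.symbols('a b c')
weights = {(0, 1): a, (0, 2): b, (1, 2): c}

def laplacian(n, weights):
    """Weighted Laplacian: -w(uv) off the diagonal, sum of incident weights on it."""
    L = sp.zeros(n, n)
    for (u, v), w in weights.items():
        L[u, v] -= w
        L[v, u] -= w
        L[u, u] += w
        L[v, v] += w
    return L

L = laplacian(3, weights)
print(L)  # Matrix([[a + b, -a, -b], [-a, a + c, -c], [-b, -c, b + c]])
```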

As we all know (this is the matrix-tree theorem), any principal minor of \(L\) obtained by deleting the row and the column of a single vertex equals the sum of the weights of the spanning trees of \(\Gamma\), where the weight \(w(T)\) of a spanning tree \(T\) is the product of the weights of its edges.
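
A quick sanity check on the triangle from the sketch above (still SymPy, still the illustrative weights \(a,b,c\)): deleting the row and column of vertex \(0\) leaves a \(2\times 2\) matrix whose determinant is \(ab+ac+bc\), one term per spanning tree of the triangle.

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
L = sp.Matrix([[a + b, -a, -b],
               [-a, a + c, -c],
               [-b, -c, b + c]])       # the triangle Laplacian from the sketch above

# Delete the row and column of vertex 0 and take the determinant.
print(sp.expand(L[1:, 1:].det()))      # a*b + a*c + b*c
```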

Another way to describe such a principal minor is as the determinant of the restriction of the quadratic form given by \(L\) to one of the coordinate hyperplanes. Turns out the hyperplanes need not be the coordinate ones: restricting to any (well, almost any) codimension one subspace gives you, up to a constant factor, the sum of the weights of the spanning trees.
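
A quick check of this on the same triangle (the vector \(x=(1,2,5)\) cutting out the hyperplane, and the basis SymPy picks for it, are arbitrary choices): the determinant of the restricted form comes out as a constant times \(ab+ac+bc\).

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
L = sp.Matrix([[a + b, -a, -b],
               [-a, a + c, -c],
               [-b, -c, b + c]])              # the triangle Laplacian again

# Restrict the quadratic form to the (non-coordinate) hyperplane <x, .> = 0.
x = sp.Matrix([[1, 2, 5]])
B = sp.Matrix.hstack(*x.nullspace())          # columns form a basis of the hyperplane
R = B.T * L * B                               # the restricted quadratic form
print(sp.factor(R.det()))                     # 64*(a*b + a*c + b*c)
```

The constant (here \(64\)) depends on the hyperplane and on the basis chosen for it, but the tree polynomial \(ab+ac+bc\) is always the other factor.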

More precisely, by the “Hesse trick,” the determinant of the extended Laplacian
$$
\tilde{L}=\left(
\begin{array}{cc}
0 & x\\
x^T&L\\
\end{array}
\right)
$$
is equal (up to a factor depending on \(x\)) to the determinant of the restriction of the quadratic form defined by \(L\) to the hyperplane \(\langle x,\cdot\rangle=0\).

Turns out,
$$
\det \tilde{L}=-(x_1+x_2+\ldots+x_{|V|})^2\sum_{\mbox{spanning trees }T} w(T).
$$
(Indeed, \(\det\tilde{L}=-x\,\mathrm{adj}(L)\,x^T\), and by the matrix-tree theorem all the cofactors of \(L\) are equal to the tree sum, so \(\mathrm{adj}(L)\) is the tree sum times the all-ones matrix.)
Very useful!
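
Here is the same kind of SymPy check for the displayed formula, now with a fully symbolic \(x\) (the triangle Laplacian is again the illustrative example from the sketches above):

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
x1, x2, x3 = sp.symbols('x1 x2 x3')
L = sp.Matrix([[a + b, -a, -b],
               [-a, a + c, -c],
               [-b, -c, b + c]])                        # the triangle Laplacian
x = sp.Matrix([[x1, x2, x3]])

# Border L with a zero corner, the row x and the column x^T.
Ltilde = sp.Matrix.vstack(sp.Matrix.hstack(sp.zeros(1, 1), x),
                          sp.Matrix.hstack(x.T, L))

tree_sum = a*b + a*c + b*c                              # the three spanning trees of the triangle
print(sp.expand(Ltilde.det() + (x1 + x2 + x3)**2 * tree_sum))  # 0
```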
