Archive | research


3.14 Vector Analysis And Classical Identities in 3D

\(\def\pd{\partial}\def\Real{\mathbb{R}}
\)

  1. There are two main types of products between vectors in \(\mathbb{R}^3\):
    The inner/scalar/dot product
    $$ A\cdot B = A_xB_x + A_yB_y + A_zB_z \in \mathbb{R} $$
    is commutative, distributive, and homogeneous.
    The vector (cross) product:
    $$ A\times B = \begin{pmatrix}
    A_yB_z-A_zB_y \\
    A_zB_x-A_xB_z \\
    A_xB_y-A_yB_x\end{pmatrix} \in \mathbb{R}^3 $$
    is homogeneous, not commutative, not associative, but linear with
    respect to each entry.
  2. The cross product \(A\times B\) is always perpendicular to \(A\) and \(B\),
    and its length equals \(|A||B|\sin(\phi)\), where \(\phi\) is the angle
    between the vectors (see the worked check below).
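As a quick worked check of both claims, take \(A=(1,1,0)\) and \(B=(0,1,1)\):
$$ A\times B = \begin{pmatrix} 1\cdot 1-0\cdot 1 \\ 0\cdot 0-1\cdot 1 \\ 1\cdot 1-1\cdot 0 \end{pmatrix} = \begin{pmatrix} 1\\-1\\1 \end{pmatrix}, $$
so \(A\cdot(A\times B)=B\cdot(A\times B)=0\), and \(|A\times B|=\sqrt{3}\), which agrees with \(|A||B|\sin(\phi)=2\cdot\frac{\sqrt{3}}{2}\), since \(A\cdot B=1\) gives \(\cos(\phi)=\frac12\).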

2.28 operators in Hermitian spaces and their spectra.

\(\def\Real{\mathbb{R}}
\def\Comp{\mathbb{C}}\def\Rat{\mathbb{Q}}\def\Field{\mathbb{F}}\def\Fun{\mathbf{Fun}}\def\e{\mathbf{e}}
\def\f{\mathbf{f}}\def\bv{\mathbf{v}}\def\i{\mathbf{i}}
\def\eye{\left(\begin{array}{cc}1&0\\0&1\end{array}\right)}
\def\bra#1{\langle #1|}\def\ket#1{|#1\rangle}\def\j{\mathbf{j}}\def\dim{\mathrm{dim}}
\def\ker{\mathbf{ker}}\def\im{\mathbf{im}}
\def\tr{\mathrm{tr\,}}
\def\braket#1#2{\langle #1|#2\rangle}
\)

1.
Given a bilinear form \(Q(\cdot,\cdot)\), or, equivalently, a mapping \(Q:U\to U^*\), one can easily bake new bilinear forms from linear operators \(A:U\to U\): just take \(Q_A(u,v):=Q(u,Av)\).
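For example, if \(U=\Real^2\), \(Q\) is the standard scalar product, and \(A\) is given by the matrix \(\left(\begin{array}{cc}1&2\\3&4\end{array}\right)\), then
\[
Q_A(u,v)=Q(u,Av)=u_1(v_1+2v_2)+u_2(3v_1+4v_2),
\]
a bilinear form whose matrix of coefficients is exactly the matrix of \(A\).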

This opens the door to exploiting the interplay between the normal forms of operators and the properties of the corresponding bilinear forms. One problem, though, is that the normal forms of endomorphisms involve complex eigenpairs, and the most natural bilinear form on Euclidean space, the scalar product, is not positive definite if one extends the ground field to the complex numbers: for a quadratic form, \(q(\lambda v)=\lambda^2q(v)\), which cannot be nonnegative for all \(\lambda\in\Comp\).…
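Concretely: if \(q(v)>0\), then taking \(\lambda=i\) gives \(q(iv)=i^2q(v)=-q(v)<0\), so positivity is immediately lost.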


midterm

problems and solutions here.…


solutions to 2.14

Exercise:
Let \(V\) be the space of real polynomial functions of degree at most \(3\). Consider the quadratic form \(q_1:V \to \mathbb{R}\) given by $$ q_1(f):=|f(-1)|^2+|f(0)|^2+|f(1)|^2. $$ Is this form positive definite?
Consider another form \(q:V\to \mathbb{R}\) given by $$ q(f):=\int_{0}^\infty e^{-x} |f(x)|^2\,dx. $$ Diagonalize this form using the Gram-Schmidt procedure, starting with the standard monomial basis \(\{1,x,x^2,x^3\}\).
Solution:
In order for \(q_1\) to be positive definite, it should vanish only for the zero polynomial, namely \(q_1(f)=0\) if and only if \(f(x)=0\).…
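A natural example to have in mind here: \(f(x)=x^3-x=x(x-1)(x+1)\) is a nonzero element of \(V\) vanishing at \(-1,0,1\), so
$$ q_1(f)=|f(-1)|^2+|f(0)|^2+|f(1)|^2=0, $$
which already shows that \(q_1\) is not positive definite.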


2.14 quadratic forms

\(\def\Real{\mathbb{R}}\def\Comp{\mathbb{C}}\def\Rat{\mathbb{Q}}\def\Field{\mathbb{F}}\def\Fun{\mathbf{Fun}}\def\e{\mathbf{e}}
\def\f{\mathbf{f}}\def\bv{\mathbf{v}}\def\i{\mathbf{i}}
\def\eye{\left(\begin{array}{cc}1&0\\0&1\end{array}\right)}
\def\bra#1{\langle #1|}\def\ket#1{|#1\rangle}\def\j{\mathbf{j}}\def\dim{\mathrm{dim}}
\def\ker{\mathbf{ker}}\def\im{\mathbf{im}}
\def\tr{\mathrm{tr\,}}
\def\braket#1#2{\langle #1|#2\rangle}
\)

1. Bilinear forms are functions \(Q:U\times U\to k \) that depend on each of the arguments linearly.

Alternatively, one can think of them as the linear operators
\[
A:U\to U^*, \mathrm{ \ with\ } Q(u,v)=A(u) (v).
\]
If \(U\) has a basis, the bilinear form can be identified with the matrix of its coefficients:
\[
Q_{ij}=Q(e_i,e_j).
\]

(Notice that the order matters!)
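For example, the form \(Q(u,v)=u_1v_2\) on \(\Real^2\) has the coefficient matrix
\[
\left(\begin{array}{cc}0&1\\0&0\end{array}\right),
\]
and indeed \(Q(e_1,e_2)=1\) while \(Q(e_2,e_1)=0\).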

Bilinear forms of rank 1 are just products of linear functions, \(\bra{u}\bra{v}\).…
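For instance, taking the covectors \(\bra{u}=(1,2)\) and \(\bra{v}=(3,4)\) on \(\Real^2\) gives the form
\[
Q(x,y)=(x_1+2x_2)(3y_1+4y_2),
\]
with coefficient matrix \(\left(\begin{array}{cc}3&4\\6&8\end{array}\right)\), whose rows are proportional, so the rank is indeed \(1\).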


solutions to 2.9

Exercise: consider the mapping that takes a quadratic polynomial \(q\) to its values at \(0,1,2\) and \(3\). Find the normal form of this operator.
Solution: let \(V\) denote the space of quadratic polynomials with the standard basis \(\{1,x,x^2\}\), in which \(p(x) = a_0+a_1 x + a_2 x^2\). The mapping \(T:V\to \mathbb{R}^4\) has the form $$ Tp = \begin{pmatrix} p(0) \\ p(1) \\ p(2) \\p(3) \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2 & 4 \\ 1 & 3 & 9 \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix} $$

All we need to do is to find bases in which the matrix representation is block identity.…
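Since the columns of this \(4\times 3\) matrix are linearly independent (the top \(3\times 3\) block is an invertible Vandermonde matrix with nodes \(0,1,2\)), \(T\) has rank \(3\), and the block identity shape we are after is
$$ \begin{pmatrix} 1&0&0\\0&1&0\\0&0&1\\0&0&0 \end{pmatrix}, $$
i.e. \(I_3\) sitting on top of a zero row.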


2.9 normal forms

\(\def\Real{\mathbb{R}}\def\Comp{\mathbb{C}}\def\Rat{\mathbb{Q}}\def\Field{\mathbb{F}}\def\Fun{\mathbf{Fun}}\def\e{\mathbf{e}}
\def\f{\mathbf{f}}\def\bv{\mathbf{v}}\def\i{\mathbf{i}}
\def\eye{\left(\begin{array}{cc}1&0\\0&1\end{array}\right)}
\def\bra#1{\langle #1|}\def\ket#1{|#1\rangle}\def\j{\mathbf{j}}\def\dim{\mathrm{dim}}
\def\ker{\mathbf{ker}}\def\im{\mathbf{im}}
\def\tr{\mathrm{tr\,}}
\def\braket#1#2{\langle #1|#2\rangle}
\)
1. We know that an operator can be represented as a matrix once a basis is fixed. Changing the basis changes the matrix. One can try to make the matrix simpler, bringing it to one of the normal forms. If the operator is given by a matrix (in some basis), then a change of basis (in the source, or in the target space) changes this matrix accordingly.
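In formulas, with the convention that the change-of-basis matrices \(C\) (in the source) and \(D\) (in the target) express the new coordinates through the old ones, the matrix in the new bases is
\[
A' = D^{-1}AC,
\]
and for an endomorphism, where the same basis change is used on both sides, \(A'=C^{-1}AC\).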

The normal forms depend on the type of the linear operator.…


solutions to 2.7

Exercise: find the product of rotations by \(\frac{\pi}{2}\) around \(x, y\) and then \(z\) axes.
Solution: \( \newcommand{\Rot}[1]{\overset{R_#1}{\longrightarrow}} \) Let \(R_x, R_y\) and \(R_z\) denote the matrices of rotation by \(\frac{\pi}{2}\) about the \(x, y\) and \(z\) axes respectively, as defined in the lecture notes: $$ R_x = \left[\begin{array}{c|cc} 1 & 0 & 0\\ \hline 0 & 0 & 1 \\ 0 & -1 & 0 \end{array}\right],~~~ R_y = \left[\begin{array}{c|c|c} 0 & 0 & -1 \\ \hline 0 & 1 & 0 \\ \hline 1 & 0& 0 \end{array}\right],~~~ R_z = \left[\begin{array}{cc|c} 0 & 1 & 0 \\ -1 & 0 & 0 \\ \hline 0 & 0 & 1 \end{array}\right] $$ Our goal is to compute \(R_zR_yR_x\), and although this is a very simple product, we can also deduce the overall rotation by pure geometrical means.…
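For reference, carrying out the multiplication with the matrices above gives
$$ R_zR_yR_x = \begin{pmatrix} 0&0&1\\ 0&-1&0\\ 1&0&0 \end{pmatrix}, $$
an orthogonal matrix with determinant \(1\) and trace \(-1\), i.e. a rotation by \(\pi\), about the axis spanned by the fixed vector \((1,0,1)\).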


2.7 functions of operators

\(\def\Real{\mathbb{R}}\def\Comp{\mathbb{C}}\def\Rat{\mathbb{Q}}\def\Field{\mathbb{F}}\def\Fun{\mathbf{Fun}}\def\e{\mathbf{e}}\def\f{\mathbf{f}}\def\bv{\mathbf{v}}\def\i{\mathbf{i}}
\def\eye{\left(\begin{array}{cc}1&0\\0&1\end{array}\right)}
\def\bra#1{\langle #1|}\def\ket#1{|#1\rangle}\def\j{\mathbf{j}}\def\dim{\mathrm{dim}}
\def\braket#1#2{\langle #1|#2\rangle}
\def\ker{\mathbf{ker}}\def\im{\mathbf{im}}
\def\e{\mathcal{E}}
\def\tr{\mathrm{tr}}
\)

1.
A linear operator \(A:U\to U\) that maps a space into itself is called an endomorphism. Such operators can be composed with impunity. In elevated language, they form an algebra. (It is a generalization of our representation of complex numbers as \(2\times 2\)-matrices).


2.
This means we can form some functions of operators. Polynomials are the easiest ones: if \(P=a_0x^n+\ldots+a_n\), then
\[
P(A)=a_0A^n+\ldots+a_n E.
\]
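A tiny example: for \(P=x^2-1\) and \(A=\left(\begin{array}{cc}0&1\\0&0\end{array}\right)\) we have \(A^2=0\), so
\[
P(A)=A^2-E=\left(\begin{array}{cc}-1&0\\0&-1\end{array}\right).
\]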

Other functions can be defined as well, if they can be approximated by polynomials: the exponential is a familiar example:
\[
\exp(A)=\sum_{k\geq 0} A^k/k!…


solutions to exercises 1.26

Exercise: Find the determinants of Jacobi matrices with \(a=1,bc=-1\) and \(a=-1,bc=-1\).
Solution: First assume \(a=1,bc=-1\), and use the recursion obtained in the lecture notes, i.e. $$j_{k+1}=aj_k-bcj_{k-1} = j_k+j_{k-1}$$

Those are Fibonacci numbers with initial conditions obtained by the first two determinants: $$ j_1 = \text{det}(a) = a = 1,~~~ ~~~ j_2 = \left| \begin{matrix} a & b \\ c & a \end{matrix} \right| = a^2-bc = 1+1 = 2, $$ and the rest follows the standard Fibonacci sequence (\(1,2,3,5,8,13,\ldots\)). For the second case, \(a=-1,bc=-1\), we similarly get $$ j_{k+1}=aj_k-bcj_{k-1} = -j_k+j_{k-1}$$ with initial conditions \(j_1 = -1\) and \(j_2 = 1+1 = 2 \), which is the alternating Fibonacci sequence (\(-1,2,-3,5,-8,13,\ldots\)).…
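As a sanity check on the first case (assuming, as the recursion and \(j_2\) above indicate, the tridiagonal Jacobi matrix with \(a\) on the diagonal, \(b\) above it and \(c\) below it), the \(3\times 3\) determinant can be computed directly:
$$ j_3 = \left| \begin{matrix} a & b & 0 \\ c & a & b \\ 0 & c & a \end{matrix} \right| = a(a^2-bc)-b\cdot ca = a^3-2abc = 1+2 = 3, $$
in agreement with \(j_3=j_2+j_1=3\) from the Fibonacci recursion.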
