# Solutions to exercises 1.19

Exercise: do the polynomials $$p_0(x)=1,~~p_1(x)=(x-1),~~p_2(x)=(x-1)^2,~~p_3(x)=(x-1)^3$$ form a basis of the vector space of cubic polynomials (with real coefficients)? If so, express $$x^3$$ in this basis.

Solution: use the binomial theorem, $$x^n = \big((x-1)+1\big)^n = \sum_{m=0}^n {n\choose m}(x-1)^m,$$ to infer that $$x^n$$ is a linear combination of the polynomials $$\{(x-1)^m\}_{m=0}^{n}$$, so the two families have the same linear span. In particular $$p_0,p_1,p_2,p_3$$ span the same space as $$1,x,x^2,x^3$$, hence form a basis. Specifically, taking $$n=3$$ in our case: $$x^3 = (x-1)^3 + 3(x-1)^2+3(x-1)+1$$.
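As a quick sanity check, we can expand the claimed combination symbolically (a sketch using sympy):

```python
# Verify that (x-1)^3 + 3(x-1)^2 + 3(x-1) + 1 expands back to x^3.
import sympy as sp

x = sp.symbols('x')
combo = (x - 1)**3 + 3*(x - 1)**2 + 3*(x - 1) + 1
print(sp.expand(combo))  # x**3
```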

Exercise: let $$V$$ be the linear space of quadratic polynomials, and define
the function $$e_k^*:V\to \mathbb{C}$$ by $$e_k^*p:=p(k),~~k=0,1,2$$.
Check that these functions are linear (as functions on polynomials). Express $$p(3)$$ as a linear combination of the $$e_k^*p$$'s.

Solution: recall that if $$p_1(x) = \sum_{k=0}^m a_k x^k$$ and $$p_2(x) = \sum_{k=0}^n b_k x^k$$ (assuming $$m\leq n$$ without loss of generality), then the polynomial $$\alpha p_1+\beta p_2$$ is defined by the linear combination of the coefficients:

$$(\alpha p_1+\beta p_2)(x) = \sum_{k=0}^n (\alpha a_k+\beta b_k) x^k = \alpha p_1(x) + \beta p_2(x)$$

where by convention $$a_k:=0$$ for all $$m<k\leq n$$.

From this we can infer that the evaluation functionals are indeed linear:

$$e_k^*(\alpha p_1+\beta p_2) = (\alpha p_1 + \beta p_2)(k-1) = \alpha p_1(k-1) + \beta p_2(k-1) = \alpha (e_k^*p_1) + \beta (e_k^* p_2)$$
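The linearity identity above can be checked numerically on sample quadratics; a minimal sketch (the polynomials `p1`, `p2` and scalars `a`, `b` below are arbitrary illustrative choices):

```python
# Check linearity of the evaluation functional e_k^* p = p(k)
# on quadratics represented by coefficient arrays [c0, c1, c2].
import numpy as np

def ev(p, k):
    # np.polyval expects highest-degree coefficient first, so reverse.
    return np.polyval(p[::-1], k)

p1 = np.array([1.0, 2.0, 3.0])   # 1 + 2x + 3x^2
p2 = np.array([4.0, 0.0, -1.0])  # 4 - x^2
a, b = 2.0, -5.0
for k in range(3):
    assert np.isclose(ev(a*p1 + b*p2, k), a*ev(p1, k) + b*ev(p2, k))
```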

The evaluation functional $$e_x^*$$ can be represented by a row covector of powers of $$x$$ acting on the column vector of coefficients:

$$e_x^* p = p(x) = \begin{bmatrix} 1 & x & x^2 \end{bmatrix} \begin{bmatrix} c_0 \\ c_1 \\ c_2 \end{bmatrix}$$

The functionals $$e_0^*,~e_1^*$$ and $$e_2^*$$ are linearly independent and can be stacked as rows of a $$3\times 3$$ matrix:

$$\begin{bmatrix} e_0^* \\ e_1^* \\ e_2^* \end{bmatrix} = \underbrace{\begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2 & 4\end{bmatrix}}_{A}$$

The functional $$e_3^* = \begin{bmatrix} 1 & 3 & 9 \end{bmatrix}$$ is linearly dependent on $$e_0^*, e_1^*, e_2^*$$. We wish to solve the linear system

$$e_3^* = c_0 e_0^* + c_1 e_1^* + c_2 e_2^* = \underbrace{\begin{bmatrix} c_0 & c_1 & c_2 \end{bmatrix}}_{c} A$$

and indeed, by inversion of $$A$$ we get $$c =\begin{bmatrix} 1 & 3 & 9\end{bmatrix}A^{-1} =\begin{bmatrix} 1 & -3 & 3\end{bmatrix}$$, or $$e_3^* = e_0^* -3e_1^* + 3e_2^*$$, i.e. $$p(3) = p(0) - 3p(1) + 3p(2)$$ for every quadratic $$p$$.
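The inversion step can be reproduced with numpy (a sketch; $$A$$ stacks the rows $$[1,\,k,\,k^2]$$ for $$k=0,1,2$$ as above):

```python
# Solve the row-vector equation c A = e_3^* for c.
import numpy as np

A = np.array([[1, 0, 0],
              [1, 1, 1],
              [1, 2, 4]], dtype=float)
e3 = np.array([1.0, 3.0, 9.0])
c = e3 @ np.linalg.inv(A)
print(c)  # [ 1. -3.  3.]
```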

Exercise: find the smallest linear subspace of the space of smooth functions that is invariant with respect to shift, and containing $$\exp(2x)-x^2$$.

Solution (option #1): note that whenever $$V$$ is a space of smooth functions that is closed under shifts, it is also closed under differentiation (see Note below).

Let $$f(x) = \exp(2x)-x^2$$ and let $$u_k(x)=f^{(k)}(x)$$. If $$f\in V$$ then $$u_k\in V$$ for all $$k\geq 0$$; let us compute a few of these derivatives:
$$u_0(x) = \exp(2x)-x^2$$
$$u_1(x) = 2\exp(2x)-2x$$
$$u_2(x) = 4\exp(2x)-2$$
$$u_3(x) = 8\exp(2x)$$
$$u_4(x) = 16\exp(2x) = 2u_3(x)$$
$$u_5(x) = 32\exp(2x) = 2u_4(x)$$.

Since $$u_0,\ldots,u_3$$ are linearly independent, the minimal shift-invariant (equivalently, derivative-invariant) space must contain all of them, i.e. $$V=\text{span}\{u_0,\ldots,u_3\}$$, which is shift invariant and consists of smooth functions (being spanned by smooth functions). The streak of linear independence breaks at $$u_4 = 2u_3$$, and in general $$u_{n+1}(x) = 2u_{n}(x)$$ for all $$n\geq 3$$.
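The derivative computations above are easy to confirm symbolically; a minimal sketch with sympy:

```python
# Check that the derivatives of f(x) = exp(2x) - x^2 satisfy
# u_{n+1} = 2 u_n from n = 3 onward, as claimed.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(2*x) - x**2
u = [sp.diff(f, x, k) for k in range(6)]  # u[0], ..., u[5]
assert sp.simplify(u[4] - 2*u[3]) == 0
assert sp.simplify(u[5] - 2*u[4]) == 0
```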

Note: recall that in class we mentioned that whenever $$V$$ is a shift-invariant space of smooth functions, there exists some differential equation $$\sum_{k=0}^d c_k f^{(k)}(x) =0$$ whose solution space is $$V$$. Differentiating this very differential equation, we get $$\sum_{k=0}^d c_k (f')^{(k)}(x)=0$$. So whenever $$f(x)\in V$$ is a solution of the differential equation, then so is its first derivative $$f'(x)\in V$$, and by induction a derivative $$f^{(n)}\in V$$ of any order. In this problem, the differential equation is $$f^{(4)}(x) -2 f^{(3)}(x) = 0$$.

Solution (option #2): we look for a linear function space $$V$$ with the following properties:

1. Shift invariant
2. Consists of smooth functions
3. Contains  $$f(x) = \exp(2x)-x^2$$
4. Has minimal dimension (i.e. does not contain a subspace of lower dimension with all the listed properties).

Since the space should contain $$f$$ and all its shifts, we first look at what algebraic expressions occur when we shift $$f(x)$$:

$$f(x+T) = e^{2T}\exp(2x) -x^2-2Tx-T^2$$

We could use the function $$f$$ itself as a basis vector of $$V$$ (and then add other functions as needed), but since $$f$$ (and any of its shifts) is a sum of a polynomial of degree $$2$$ and an exponential function, we may use those simpler terms instead as basis vectors.

Let $$u_0(x) = 1,~ u_1(x) = x,~ u_2(x) = x^2$$, and $$u_3(x) = \exp(2x)$$, and let $$V=\text{span}\{u_0,u_1,u_2,u_3\}$$.  This space has all the desired properties:

1. $$V$$ is shift invariant, as its shifted basis vectors remain in $$V$$:
$$u_0(x+T) = 1 = u_0(x) \in V$$
$$u_1(x+T) = (x+T) = u_1(x)+Tu_0(x) \in V$$
$$u_2(x+T) = (x+T)^2 = u_2(x)+2Tu_1(x) + T^2u_0(x) \in V$$
$$u_3(x+T) = \exp(2(x+T)) = e^{2T} u_3(x) \in V$$
2. $$V$$ is spanned by four smooth functions (three polynomials and one exponential), hence contains only smooth functions.
3. $$V$$ contains $$f=u_3-u_2 \in V$$ and all its possible shifts:
$$f(x+T) = e^{2T}u_3(x) - u_2(x) - 2T u_1(x) - T^2 u_0(x) = \underbrace{\begin{bmatrix} e^{2T} & -1 & -2T & -T^2 \end{bmatrix}}_{C_T}\begin{bmatrix} u_3(x) \\ u_2(x) \\ u_1(x) \\ u_0(x) \end{bmatrix}$$
4. Lastly, we should comment on the minimality of $$V$$. The dimension of $$V$$ is $$4$$, being spanned by four linearly independent functions. The question is whether a space of lower dimension ($$\leq 3$$) can satisfy all the properties (1)-(3). We will show that the four shifts $$f(x),f(x+1),f(x-1)$$ and $$f(x+2)$$ are linearly independent; their coefficient vectors (in the notation above, with $$T=0,1,-1,2$$) are
$$C_0 = \begin{bmatrix} 1 & -1 & 0 & 0 \end{bmatrix}$$
$$C_1 = \begin{bmatrix} e^2 & -1& -2& -1 \end{bmatrix}$$
$$C_{-1} = \begin{bmatrix} e^{-2} & -1& 2& -1 \end{bmatrix}$$
$$C_2 = \begin{bmatrix} e^{4} & -1& -4& -4 \end{bmatrix}$$

The four coefficient vectors above are linearly independent (easily verified by computing their determinant), hence the corresponding functions (all of which reside in $$V$$) are also linearly independent, so $$V$$ must be at least $$4$$-dimensional.
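The determinant check is immediate with numpy; a sketch stacking the four coefficient vectors (shifts $$T = 0, 1, -1, 2$$) as rows:

```python
# A nonzero determinant confirms the four shifted copies of f
# are linearly independent, so dim(V) = 4 is indeed minimal.
import numpy as np

e = np.e
M = np.array([[1,     -1,  0,  0],
              [e**2,  -1, -2, -1],
              [e**-2, -1,  2, -1],
              [e**4,  -1, -4, -4]])
print(np.linalg.det(M))  # nonzero
```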