Péter Polcz's homepage

Table of Contents

math.StackExchange

Derivation of the matrix 2-norm (source)

Eigenfunctions of Hermitian Operators are Orthogonal

Position Space and Momentum Space

This may be a trivial question, yet I was unable to find an answer:

$$\left \| A \right \| _2=\sqrt{\lambda_{\text{max}}(A^*A)}=\sigma_{\text{max}}(A)$$

where the spectral norm $\left \| A \right \| _2$ of a complex matrix $A$ is defined as $$\max \left\{ \|Ax\|_2 : \|x\|_2 = 1 \right\}$$

How does one prove the first and the second equality?

Put $B=A^*A$, which is a Hermitian matrix.

Since $B$, as a linear transformation of the Euclidean vector space $E$, is Hermitian, there exists an orthonormal basis of $E$ consisting of eigenvectors of $B$.

Let $\lambda_1,\dots,\lambda_n$ be the eigenvalues of $B$ and $\left \{ e_1,\dots,e_n \right \}$ an orthonormal basis of $E$ consisting of corresponding eigenvectors.

Let $x=a_1e_1+\dots+a_ne_n$.

We have $\left \| x \right \|=\left \langle \sum_{i=1}^{n}a_ie_i,\sum_{i=1}^{n}a_ie_i \right \rangle^{1/2} =\sqrt{\sum_{i=1}^{n}\left | a_i \right |^{2}}$,

$Bx=B\left ( \sum_{i=1}^{n}a_ie_i \right )=\sum_{i=1}^{n}a_iB(e_i)=\sum_{i=1}^{n}\lambda_ia_ie_i$

Let $\lambda_{j_{0}}$ denote the largest eigenvalue of $B$ (note that $B=A^*A$ is positive semidefinite, so every $\lambda_j \geq 0$).

Therefore,

$\left \| Ax \right \|^2=\left \langle Ax,Ax \right \rangle=\left \langle x,A^*Ax \right \rangle=\left \langle x,Bx \right \rangle=\left \langle \sum_{i=1}^{n}a_ie_i,\sum_{i=1}^{n}\lambda_ia_ie_i \right \rangle=\sum_{i=1}^{n}\lambda_i\left | a_i \right |^{2} \leq \underset{1\leq j\leq n}{\max}\left |\lambda_j \right | \times \left \| x \right \|^2$, and hence $\left \| Ax \right \| \leq \underset{1\leq j\leq n}{\max}\sqrt{\left |\lambda_j \right |} \times \left \| x \right \|$

So, if $\left \| A \right \|$ = $\max \left\{ \|Ax\| : \|x\| = 1 \right\}$, then $\left \| A \right \|\leq \underset{1\leq j\leq n}{\max}\sqrt{\left |\lambda_j \right |}$ (1)
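
As a quick numerical sanity check of this bound, here is a minimal NumPy sketch; the random complex matrix and vector are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)

B = A.conj().T @ A                        # B = A*A is Hermitian (in fact positive semidefinite)
lam_max = np.max(np.linalg.eigvalsh(B))   # largest eigenvalue of B

lhs = np.linalg.norm(A @ x) ** 2          # ||Ax||^2
mid = (x.conj() @ B @ x).real             # <x, Bx>
print(np.isclose(lhs, mid))                             # True: ||Ax||^2 = <x, Bx>
print(lhs <= lam_max * np.linalg.norm(x) ** 2 + 1e-9)   # True: the bound behind (1)
```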

Consider $x_0=e_{j_{0}}$ $\Rightarrow \left \| x_0 \right \|=1$, so that $\left \| A \right \|^2 \geq \left \| Ax_0 \right \|^2=\left \langle x_0,Bx_0 \right \rangle=\left \langle e_{j_0},B(e_{j_0}) \right \rangle=\left \langle e_{j_0},\lambda_{j_0} e_{j_0} \right \rangle=\lambda_{j_0}$, and hence $\left \| A \right \| \geq \sqrt{\left | \lambda_{j_0} \right |}$ (2)

Combining (1) and (2) gives us $\left \| A \right \|= \underset{1\leq j\leq n}{\max}\sqrt{\left | \lambda_{j} \right |}$, where the $\lambda_j$ are the eigenvalues of $B=A^*A$.

Conclusion: $$\left \| A \right \| _2=\sqrt{\lambda_{\text{max}}(A^*A)}=\sigma_{\text{max}}(A)$$
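
The conclusion is also easy to confirm numerically; a minimal NumPy sketch, with an arbitrary random rectangular test matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6)) + 1j * rng.standard_normal((4, 6))

spectral_norm = np.linalg.norm(A, 2)                                # max ||Ax||_2 over ||x||_2 = 1
sqrt_lam_max = np.sqrt(np.max(np.linalg.eigvalsh(A.conj().T @ A)))  # sqrt(lambda_max(A*A))
sigma_max = np.linalg.svd(A, compute_uv=False)[0]                   # largest singular value

print(spectral_norm, sqrt_lam_max, sigma_max)   # all three coincide up to rounding error
```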

Geometric Interpretation of Non-Involutive Distribution (source)

Suppose $\theta:\mathbb{R}^3\rightarrow\mathbb{R}$ and $\partial_3\theta=c$ is constant. I was interested in the distribution generated by the orthogonal complement of $v=(\cos(\theta),\sin(\theta),0)$.

So, if one takes $\alpha=(-\sin(\theta),\cos(\theta),0)$ and $\beta=(0,0,1)$, I can check whether $[\alpha,\beta]\in\text{span}(\{\alpha,\beta\})$. I get that: $$ [\alpha,\beta] =\sum_{i,j}\left(\alpha^i\frac{\partial \beta^j}{\partial x^i} - \beta^i\frac{\partial \alpha^j}{\partial x^i}\right)\partial_j = \partial_3\theta\,(\cos(\theta),\sin(\theta),0) =cv, $$ which, since $v$ is orthogonal to both $\alpha$ and $\beta$, is not in $\text{span}(\{\alpha,\beta\})$ whenever $c\neq 0$. So the distribution is not involutive. Therefore, there are no 2D integral submanifolds for this distribution, if I understand correctly.
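
The bracket can be reproduced symbolically; a minimal SymPy sketch, taking the concrete choice $\theta(x_1,x_2,x_3)=c\,x_3$ (so that $\partial_3\theta=c$); this particular $\theta$ is only an illustrative assumption.

```python
import sympy as sp

x1, x2, x3, c = sp.symbols('x1 x2 x3 c')
X = (x1, x2, x3)
theta = c * x3                                   # a concrete theta with d(theta)/dx3 = c

alpha = [-sp.sin(theta), sp.cos(theta), 0]
beta = [0, 0, 1]
v = sp.Matrix([sp.cos(theta), sp.sin(theta), 0])

def bracket(a, b):
    # coordinate Lie bracket: [a, b]^j = sum_i (a^i d_i b^j - b^i d_i a^j)
    return sp.Matrix([
        sum(a[i] * sp.diff(b[j], X[i]) - b[i] * sp.diff(a[j], X[i]) for i in range(3))
        for j in range(3)
    ])

print(sp.simplify(bracket(alpha, beta) - c * v))  # zero vector, i.e. [alpha, beta] = c*v
```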

Is there some intuitive explanation for where the non-involutivity lies and what it means geometrically?

A matrix and its transpose have the same set of eigenvalues (source)

I'm going to work a little bit more generally.

Let $V$ be a finite dimensional vector space over some field $K$, and let $\langle\cdot,\cdot\rangle$ be a nondegenerate bilinear form on $V$.

We then have, for every linear endomorphism $A$ of $V$, that there is a unique endomorphism $A^*$ of $V$ such that $$\langle Ax,y\rangle=\langle x,A^*y\rangle$$ for all $x$ and $y\in V$.

The existence and uniqueness of such an $A^*$ requires some explanation, but I will take it for granted.
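
For a concrete instance of such an adjoint: over $\Bbb R^n$ with $\langle x,y\rangle=x^t G y$ for an invertible matrix $G$ (so the form is nondegenerate), the map $A^*=G^{-1}A^t G$ satisfies the defining identity. A minimal NumPy sketch, with $G$ and $A$ chosen at random purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
G = rng.standard_normal((n, n))       # invertible with probability 1, so <x, y> = x^t G y is nondegenerate

A_star = np.linalg.inv(G) @ A.T @ G   # candidate adjoint with respect to <x, y> = x^t G y

x = rng.standard_normal(n)
y = rng.standard_normal(n)
print(np.isclose((A @ x) @ G @ y, x @ G @ (A_star @ y)))   # True: <Ax, y> = <x, A*y>
```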

Proposition: Given an endomorphism $A$ of a finite dimensional vector space $V$ equipped with a nondegenerate bilinear form $\langle\cdot,\cdot\rangle$, the endomorphisms $A$ and $A^*$ have the same set of eigenvalues.

Proof: Let $\lambda$ be an eigenvalue of $A$, and let $v$ be an eigenvector of $A$ corresponding to $\lambda$ (in particular, $v$ is nonzero). Let $w\in V$ be arbitrary. We then have that: $$\langle v,\lambda w\rangle=\langle\lambda v,w\rangle=\langle Av,w\rangle=\langle v,A^*w\rangle$$ This implies that $\langle v,\lambda w-A^*w\rangle =0$ for all $w\in V$. Now either $\lambda$ is an eigenvalue of $A^*$ or not. If it isn't, the operator $\lambda I -A^*$ is an automorphism of $V$, since $\lambda I-A^*$ being singular is equivalent to $\lambda$ being an eigenvalue of $A^*$. In particular, every $z\in V$ can then be written as $z=\lambda w-A^*w$ for some $w$, so $\langle v, z\rangle = 0$ for all $z\in V$. But since $\langle\cdot,\cdot\rangle$ is nondegenerate, this implies that $v=0$, a contradiction. So $\lambda$ must have been an eigenvalue of $A^*$ to begin with. Thus every eigenvalue of $A$ is an eigenvalue of $A^*$. The other inclusion can be derived similarly.

How can we use this in your case? I believe you're working over a real vector space and considering the dot product as your bilinear form. Now consider an endomorphism $T$ of $\Bbb R^n$ which is given by $T(x)=Ax$ for some $n\times n$ matrix $A$. It just so happens that for all $y\in\Bbb R^n$ we have $T^*(y)=A^t y$, since $\langle Ax,y\rangle=(Ax)^t y=x^t A^t y=\langle x,A^t y\rangle$. Since $T$ and $T^*$ have the same eigenvalues, so do $A$ and $A^t$.
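
A minimal NumPy sketch of the matrix statement, using a random test matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))

eig_A = np.sort_complex(np.linalg.eigvals(A))     # eigenvalues of A, sorted
eig_At = np.sort_complex(np.linalg.eigvals(A.T))  # eigenvalues of A^t, sorted
print(np.allclose(eig_A, eig_At))                 # True: A and A^t share the same eigenvalues
```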