1. HW 9
(1) Let $L(\mathbb{R}^n, \mathbb{R}^m)$ be the space of all linear maps from $\mathbb{R}^n$ to $\mathbb{R}^m$. For each $T \in L(\mathbb{R}^n, \mathbb{R}^m)$, we define
$$\|T\|_{\mathrm{op}} = \sup_{\|x\|_{\mathbb{R}^n} = 1} \|T(x)\|_{\mathbb{R}^m}.$$
(a) Prove that $\|\cdot\|_{\mathrm{op}}$ defines a norm on $L(\mathbb{R}^n, \mathbb{R}^m)$.
(b) Prove that $\|T(x)\|_{\mathbb{R}^m} \le \|T\|_{\mathrm{op}} \|x\|_{\mathbb{R}^n}$ for all $x \in \mathbb{R}^n$.
(c) Let $S \in L(\mathbb{R}^m, \mathbb{R}^p)$. Prove that $\|S \circ T\|_{\mathrm{op}} \le \|S\|_{\mathrm{op}} \|T\|_{\mathrm{op}}$.
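As a sanity check on the definition (not part of the exercise), here is a minimal numpy sketch with a randomly chosen matrix: it estimates the supremum by sampling unit vectors and compares the estimate with numpy's spectral norm, which for the Euclidean norms equals $\|A\|_{\mathrm{op}}$ (the largest singular value).

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))  # a linear map R^4 -> R^3

# Estimate sup_{|x|=1} |Ax| by sampling many unit vectors.
xs = rng.standard_normal((4, 100_000))
xs /= np.linalg.norm(xs, axis=0)               # normalize columns to |x| = 1
estimate = np.linalg.norm(A @ xs, axis=0).max()

# For the Euclidean norms, the operator norm is the largest singular value.
exact = np.linalg.norm(A, 2)
print(estimate, exact)   # estimate <= exact, and close to it
```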
(2) For each $A \in M_{mn}(\mathbb{R})$, we define the matrix norm of $A$ to be
$$\|A\| = \|L_A\|_{\mathrm{op}},$$
where $L_A \colon \mathbb{R}^n \to \mathbb{R}^m$ is the linear map $L_A(x) = Ax$ for any $x \in \mathbb{R}^n$. In this exercise, we assume $m = n$.
(a) Use mathematical induction to prove that $\|A^k\| \le \|A\|^k$ for any $k \ge 1$.
Remark. This implies that the sequence of numbers $(\|A^k\|^{1/k})$ is bounded above by $\|A\|$. We define the spectral radius of a square matrix $A$ to be
$$\rho(A) = \limsup_{k \to \infty} \|A^k\|^{1/k}.$$
In particular, $\rho(A) \le \|A\|$.
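For intuition (beyond what the exercise asks), one can watch $\|A^k\|^{1/k}$ settle down numerically; in fact, for matrices the limsup is a genuine limit and equals the largest absolute value of an eigenvalue (Gelfand's formula). A small numpy sketch with a hypothetical upper-triangular matrix, where $\rho(A) < \|A\|$ strictly:

```python
import numpy as np

A = np.array([[0.5, 1.0],
              [0.0, 0.4]])

# Gelfand: ||A^k||^{1/k} -> rho(A) = max |eigenvalue of A| = 0.5 here,
# while ||A|| is noticeably larger because of the off-diagonal entry.
rho = max(abs(np.linalg.eigvals(A)))
for k in (1, 5, 20, 100):
    Ak = np.linalg.matrix_power(A, k)
    print(k, np.linalg.norm(Ak, 2) ** (1 / k), rho)
```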
(b) Let $E_{ij}$ be the $n \times n$ matrix whose $ij$-th entry is $1$ and whose other entries are zero. Find $\|E_{ij}\|$ for all $1 \le i, j \le n$.
(c) Let $\lambda_1, \dots, \lambda_n$ be $n$ real numbers and let $D = \sum_{i=1}^n \lambda_i E_{ii}$, i.e. $D$ is a diagonal matrix. Prove that
$$\|D\| = \max\{|\lambda_1|, \dots, |\lambda_n|\}.$$
We denote $D$ by $\operatorname{diag}(\lambda_1, \dots, \lambda_n)$. Prove that $\rho(D) = \|D\|$.
(3) Let $R > 0$. Suppose that the power series $f(x) = \sum_{k=0}^{\infty} a_k x^k$ converges for every $x \in (-R, R)$. Here $a_k \in \mathbb{R}$ for any $k \ge 0$. Let $A$ be an $n \times n$ real matrix such that $\|A\| \in (-R + \delta, R - \delta)$, where $0 < \delta < R$.
(a) Prove that $\sum_{k=0}^{\infty} a_k A^k$ is convergent in $(M_n(\mathbb{R}), \|\cdot\|)$. In this case, we define $f(A) = \sum_{k=0}^{\infty} a_k A^k$.
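As an illustration of this definition, a short numpy/scipy sketch (with a small hypothetical matrix and $a_k = 1/k!$, i.e. $f = \exp$ and $R = \infty$) shows the partial sums converging in the matrix norm:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.1, 0.5],
              [0.2, 0.3]])

# Partial sums of sum_k a_k A^k with a_k = 1/k!  (f = exp):
S, term = np.eye(2), np.eye(2)
for k in range(1, 21):
    term = term @ A / k                   # term = A^k / k!
    S = S + term
print(np.linalg.norm(S - expm(A), 2))     # ~ 0: convergence in (M_n(R), ||.||)
```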
(b) Let $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$, where $\lambda_1, \dots, \lambda_n$ are real numbers in $(-R + \delta, R - \delta)$. Prove that $f(D) = \operatorname{diag}(f(\lambda_1), \dots, f(\lambda_n))$, i.e. $f(D) = \sum_{i=1}^n f(\lambda_i) E_{ii}$.
(c) Let $A$ be diagonalizable¹ with $A = SDS^{-1}$, where $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$. Suppose $\lambda_i \in (-R + \delta, R - \delta)$ for $1 \le i \le n$. Show that $f(A) = S f(D) S^{-1}$.
Remark. If $A$ is diagonalizable with $A = SDS^{-1}$, then $\exp(A) = S \exp(D) S^{-1}$, $\cos A = S \cos(D) S^{-1}$, and $\sin A = S \sin(D) S^{-1}$.
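A quick numerical check of this remark, assuming numpy and scipy and using a hypothetical diagonalizable (non-symmetric) matrix:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0],
              [0.0, -1.0]])        # distinct eigenvalues, hence diagonalizable

lam, S = np.linalg.eig(A)          # columns of S are eigenvectors: A = S D S^{-1}
fD = np.diag(np.exp(lam))          # f(D) = diag(f(lambda_1), ..., f(lambda_n)), f = exp
fA = S @ fD @ np.linalg.inv(S)

print(np.allclose(fA, expm(A)))    # True: f(A) = S f(D) S^{-1}
```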
(d) Let $\lambda \in (-R, R)$ and denote
$$J_3(\lambda) = \begin{pmatrix} \lambda & 1 & 0 \\ 0 & \lambda & 1 \\ 0 & 0 & \lambda \end{pmatrix}.$$
¹A matrix $A \in M_n(\mathbb{R})$ is said to be diagonalizable if there exist an invertible matrix $S$ and a diagonal matrix $D$ in $M_n(\mathbb{R})$ such that $S^{-1}AS = D$.
Compute $f(J_3(\lambda))$ in terms of $f$ and $\lambda$, and compute $\exp(t J_3(\lambda))$ for all $t, \lambda \in \mathbb{R}$. In general, compute $f(J_n(\lambda))$ and $\exp(t J_n(\lambda))$, where²
$$J_n(\lambda) = \begin{pmatrix} \lambda & 1 & 0 & \cdots & 0 \\ 0 & \lambda & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda & 1 \\ 0 & 0 & \cdots & 0 & \lambda \end{pmatrix}_{n \times n}.$$
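Since $J_3(\lambda) = \lambda I + N$ with $N^3 = 0$, the exponential series truncates after the $N^2$ term. The following sketch (assuming scipy's expm, with arbitrarily chosen $t$ and $\lambda$) checks the resulting closed form:

```python
import numpy as np
from scipy.linalg import expm

t, lam = 0.7, -1.3
J3 = np.array([[lam, 1.0, 0.0],
               [0.0, lam, 1.0],
               [0.0, 0.0, lam]])

# J3 = lam*I + N with N nilpotent (N^3 = 0), and lam*I commutes with N, so
# exp(t*J3) = e^{t*lam} * (I + t*N + (t^2/2)*N^2).
closed = np.exp(t * lam) * np.array([[1.0, t,   t**2 / 2],
                                     [0.0, 1.0, t],
                                     [0.0, 0.0, 1.0]])
print(np.allclose(expm(t * J3), closed))   # True
```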
(4) Let $A, B \colon (a, b) \to M_n(\mathbb{R})$ be matrix-valued differentiable functions³.
(a) Prove that
$$\frac{d}{dt} \operatorname{Tr}(A(t)) = \operatorname{Tr}(A'(t)) \quad \text{for any } a < t < b.$$
(b) Prove that
$$(A(t)B(t))' = A'(t)B(t) + A(t)B'(t) \quad \text{for any } a < t < b.$$
(c) Show that $A'(t) = 0$ for all $t \in (a, b)$ if and only if there exists a fixed matrix $A_0 \in M_n(\mathbb{R})$ such that $A(t) = A_0$ for any $a < t < b$.
(d) Suppose that $A(t) \in GL_n(\mathbb{R})$ for all $a < t < b$. Prove that
$$\frac{d}{dt}(A(t))^{-1} = -A(t)^{-1} A'(t) A(t)^{-1} \quad \text{for any } a < t < b. \tag{1.1}$$
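Formula (1.1) is easy to test with a central finite difference; a minimal numpy sketch with a hypothetical invertible curve $A(t)$:

```python
import numpy as np

def A(t):
    # an invertible, differentiable matrix curve (det = e^t (1 + t^2) > 0)
    return np.array([[np.exp(t), t],
                     [0.0,       1.0 + t**2]])

def dA(t):
    # the entrywise derivative A'(t)
    return np.array([[np.exp(t), 1.0],
                     [0.0,       2.0 * t]])

t, h = 0.3, 1e-6
inv = np.linalg.inv
lhs = (inv(A(t + h)) - inv(A(t - h))) / (2 * h)  # d/dt A(t)^{-1}, central difference
rhs = -inv(A(t)) @ dA(t) @ inv(A(t))             # right-hand side of (1.1)
print(np.allclose(lhs, rhs, atol=1e-6))          # True up to discretization error
```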
(e) Let $t_0 \in (a, b)$. Let $C \colon [a, b] \to M_n(\mathbb{R})$ be a continuous function. Define $F \colon [a, b] \to M_n(\mathbb{R})$ by
$$F(t) = \int_a^t C(s)\, ds, \quad t \in [a, b].$$
Prove that $F$ is differentiable with $F' = C$.
(f) Let
$$A(t) = \begin{pmatrix} e^t \cos t & e^t \sin t \\ -e^t \sin t & e^t \cos t \end{pmatrix}, \quad t \in \mathbb{R}.$$
Find $A'(t)$ and $\int_0^t A(s)\, ds$, and verify equation (1.1) using $A(t)$.
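Part (f) can also be cross-checked symbolically; a sympy sketch (the integral and the verification of (1.1) are done entrywise):

```python
import sympy as sp

t, s = sp.symbols('t s', real=True)
A = sp.Matrix([[ sp.exp(t) * sp.cos(t), sp.exp(t) * sp.sin(t)],
               [-sp.exp(t) * sp.sin(t), sp.exp(t) * sp.cos(t)]])

dA = A.diff(t)                                                    # A'(t)
F = A.subs(t, s).applyfunc(lambda e: sp.integrate(e, (s, 0, t)))  # int_0^t A(s) ds
print(sp.simplify(F))

# verify (1.1): (A^{-1})' = -A^{-1} A' A^{-1}
lhs = A.inv().diff(t)
rhs = -A.inv() * dA * A.inv()
print(sp.simplify(lhs - rhs))    # zero matrix
```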
(5) Let
$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \in M_2(\mathbb{R}).$$
(a) Let $\chi_A(\lambda) = \det(\lambda I_2 - A)$. Find the roots of $\chi_A(\lambda)$.
(b) Let $\lambda_1$ and $\lambda_2$ be the roots of $\chi_A(\lambda)$. Find unit vectors $u_1$ and $u_2$ such that $A u_i = \lambda_i u_i$ for $i = 1, 2$, and a matrix $S$ whose $i$-th column vector is $u_i$ with $\det S > 0$. More precisely, if $u_i = (x_i, y_i)$, then
$$S = \begin{pmatrix} x_1 & x_2 \\ y_1 & y_2 \end{pmatrix}.$$
(c) Prove that $AS = SD$, where $D = \operatorname{diag}(\lambda_1, \lambda_2)$, and that $A$ is diagonalizable.
(d) Use the result obtained in exercise (3) to compute $\exp A$, $\cos A$ and $\sin A$.
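For reference, a numerical cross-check of part (d), assuming numpy and scipy (whose expm, cosm, sinm compute the same power series); since $A$ is symmetric, $S$ can be taken orthogonal, so $S^{-1} = S^{T}$:

```python
import numpy as np
from scipy.linalg import expm, cosm, sinm

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# chi_A(lambda) has roots 1 and 3; eigh returns orthonormal eigenvectors.
lam, S = np.linalg.eigh(A)
for f, fm in ((np.exp, expm), (np.cos, cosm), (np.sin, sinm)):
    fA = S @ np.diag(f(lam)) @ S.T    # f(A) = S f(D) S^{-1}, with S^{-1} = S^T
    print(np.allclose(fA, fm(A)))     # True, True, True
```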
²$J_n$ is called a Jordan matrix.
³Let $(V, \|\cdot\|)$ be a normed space and $f \colon (a, b) \to V$ be a function. (Such a function is called a $V$-valued function.) Let $t_0 \in (a, b)$; we say that $f$ is differentiable at $t_0$ if
$$\lim_{t \to t_0} \frac{1}{t - t_0}\big(f(t) - f(t_0)\big)$$
exists. In this case, we denote the limit by $f'(t_0)$. If $f$ is differentiable at every point of $(a, b)$, we say that $f$ is differentiable. We can inductively define $f^{(k)}(t)$ for any $k \ge 1$. (We can also define the right derivative and the left derivative of $f$.)
(e) Solve the matrix differential equation
$$X'(t) = A X(t), \quad X(0) = I_2.$$
(f) Let $Y(t) = \cos(tA)$ and $Z(t) = \dfrac{\sin(tA)}{A}$ for $t \ge 0$. Verify that $Y(t)$ and $Z(t)$ are both solutions to the matrix differential equation
$$\Phi''(t) + A^2 \Phi(t) = 0.$$
Here we use
$$\frac{\sin\theta}{\theta} = \sum_{k=0}^{\infty} (-1)^k \frac{\theta^{2k}}{(2k+1)!} \quad \text{for any } \theta.$$