
Please help me with this math problem?

Let J=(λ, 0, 1, λ), where λ is an arbitrary real number. (2x2 matrix, λ and 0 on the left, 1 and λ on the right.)
 
(a) Find J^2, J^3, and J^4.
(b) Use an inductive argument to show that J^n=(λ^n, 0, nλ^(n-1), λ^n).
(c) Determine exp(Jt).
(d) Use exp(Jt) to solve the initial value problem x'=Jx, x(0)=x^0.

2 Answers

Let me continue Andre's answer:
 
For e^(Jt) he obtained the following:
 
e^(Jt) = Σ( (1/n!) { (λ^n, 0); (nλ^(n-1), λ^n) } t^n, n=0..infinity ) = Σ( { (λ^n t^n/n!, 0); (nλ^(n-1) t^n/n!, λ^n t^n/n!) }, n=0..infinity ) =
= [ Σ( { ((λt)^n/n!, 0); ((λt)^(n-1) t/(n-1)!, (λt)^n/n!) }, n=1..infinity ) ] + { (1,0); (0,1) } = { (e^(λt), 0); (t e^(λt), e^(λt)) }
 
So, e^(Jt) = { (e^(λt), 0); (t e^(λt), e^(λt)) }
 
The initial value problem x′=Jx, x(0)=x^0, has a solution of the form:

x = e^(Jt) x^0. Since we determined e^(Jt) in the previous part, we can multiply the matrix e^(Jt) by the vector x^0 to get the solution. I assume x^0 means the vector (1,1). Then the solution is:

x(t) = (e^(λt), (t+1)e^(λt))
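
If you want a machine check of this, here is a minimal sympy sketch (symbol names are mine; I read the tuples row by row, so J has first row (λ, 0) and second row (1, λ)) confirming that x(t) = (e^(λt), (t+1)e^(λt)) solves x' = Jx with x(0) = (1,1):

```python
import sympy as sp

lam, t = sp.symbols('lambda t')

# J with the tuple (lambda, 0, 1, lambda) read row by row.
J = sp.Matrix([[lam, 0], [1, lam]])

# Proposed solution for x(0) = (1, 1):  x(t) = (e^(lambda t), (t+1) e^(lambda t)).
x = sp.Matrix([sp.exp(lam*t), (t + 1)*sp.exp(lam*t)])

print(sp.simplify(x.diff(t) - J*x))  # Matrix([[0], [0]]), so x' = Jx
print(x.subs(t, 0))                  # Matrix([[1], [1]]), so x(0) = (1, 1)
```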
I'm assuming you know how to multiply matrices and what a Taylor series is. If not, let me know.
 
(a) J² just means the matrix J multiplied by itself, J² = J*J. Similarly, J³ = J*J*J, etc.
 
Let's do J² for your matrix:
 
J²=J*J=(λ, 0, 1, λ)*(λ, 0, 1, λ)=(λ², 0, 2λ, λ²)
 
Similarly,
 
J³=J²*J=(λ², 0, 2λ, λ²)*(λ, 0, 1, λ)=(λ³, 0, 3λ², λ³)
J⁴=J³*J=(λ³, 0, 3λ², λ³)*(λ, 0, 1, λ)=(λ⁴, 0, 4λ³, λ⁴)
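
As a quick sanity check on these hand computations, a short sympy sketch (variable names are mine; tuples read row by row) reproduces J², J³, and J⁴ for a symbolic λ:

```python
import sympy as sp

lam = sp.symbols('lambda')
J = sp.Matrix([[lam, 0], [1, lam]])  # the tuple (lambda, 0, 1, lambda) read row by row

for n in (2, 3, 4):
    print(f"J^{n} =", (J**n).applyfunc(sp.expand))
# J^2 = Matrix([[lambda**2, 0], [2*lambda, lambda**2]])
# J^3 = Matrix([[lambda**3, 0], [3*lambda**2, lambda**3]])
# J^4 = Matrix([[lambda**4, 0], [4*lambda**3, lambda**4]])
```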
 
(b) In part (a), we see a pattern emerging, so we claim
 
J^n=(λ^n, 0, nλ^(n-1), λ^n) for any positive integer n.
 
The inductive argument goes like this:
 
1. The statement is true for n=1, because
 
J^1=J=(λ, 0, 1, λ) by definition, and the claimed formula with n=1 gives (λ^1, 0, 1*λ^0, λ^1)=(λ, 0, 1, λ), the same matrix.
 
2. Let the statement be true for some arbitrary but fixed n=k:
 
J^k=(λ^k, 0, kλ^(k-1), λ^k)
 
Then for n=k+1
 
J^(k+1)=J^k*J=(λ^k, 0, kλ^(k-1), λ^k)*(λ, 0, 1, λ)=(λ^(k+1), 0, kλ^k+λ^k, λ^(k+1))=(λ^(k+1), 0, (k+1)λ^k, λ^(k+1)), which is exactly the claimed formula with n=k+1.
 
Therefore, by the principle of mathematical induction, the statement is true for all n=1,2,3,...
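
The induction above is the proof; if you also want to confirm the closed form by brute force for a few values of n, here is a small sympy sketch (my variable names, same row-by-row reading of the tuples):

```python
import sympy as sp

lam = sp.symbols('lambda')
J = sp.Matrix([[lam, 0], [1, lam]])

def closed_form(n):
    # The claimed formula J^n = (lambda^n, 0, n*lambda^(n-1), lambda^n), read row by row.
    return sp.Matrix([[lam**n, 0], [n*lam**(n - 1), lam**n]])

# Compare against repeated matrix multiplication for n = 1..10.
assert all(sp.simplify(J**n - closed_form(n)) == sp.zeros(2, 2) for n in range(1, 11))
print("closed form matches J**n for n = 1..10")
```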
 
(c) The matrix exponential exp(A) of a matrix A is defined as a Taylor series of matrices analogous to the Taylor series of the function e^t about t=0:
 
exp(A) = I + A + (1/2!) A² + (1/3!) A³ + ... = Σ( (1/n!) A^n, n=0..infinity),
 
where I is the identity matrix and A^0=I.
 
Therefore,
 
exp(Jt) = Σ( (1/n!) (Jt)^n, n=0..infinity) = Σ( (1/n!) J^n t^n, n=0..infinity)
 
For our matrix J=(λ, 0, 1, λ), we showed in part (b) that J^n=(λ^n, 0, nλ^(n-1), λ^n), so that
 
exp(Jt) = Σ( (1/n!) (λ^n, 0, nλ^(n-1), λ^n) t^n, n=0..infinity)
 
We can distribute the summation and the t^n into each entry and turn the series of matrices into a matrix of series. We get
 
exp(Jt) = (Σ (1/n!) λ^n t^n, 0, Σ (1/n!) nλ^(n-1) t^n, Σ (1/n!) λ^n t^n)
 
Now use the Taylor series of the function e^(λt) about t=0: the entries Σ (1/n!) λ^n t^n = Σ (λt)^n/n! sum to e^(λt), and the remaining entry is Σ (1/n!) nλ^(n-1) t^n = t Σ (λt)^(n-1)/(n-1)! = t e^(λt). We get
 
exp(Jt) = (e^(λt), 0, t e^(λt), e^(λt)).
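
To make the series manipulation concrete, here is a small sympy sketch (my variable names, same row-by-row reading of the tuples) that checks this closed form against a truncated partial sum of the defining series at sample numeric values; sympy's Matrix.exp method should reproduce the same closed form directly as well.

```python
import sympy as sp

lam, t = sp.symbols('lambda t')
J = sp.Matrix([[lam, 0], [1, lam]])

# Closed form from part (c), tuples read row by row.
expJt = sp.Matrix([[sp.exp(lam*t), 0], [t*sp.exp(lam*t), sp.exp(lam*t)]])

# Partial sum of the defining series: sum_{n=0}^{N} (Jt)^n / n!.
N = 20
partial = sum(((J*t)**n / sp.factorial(n) for n in range(1, N + 1)), sp.eye(2))

# At sample numeric values the difference is only the tiny series truncation error.
vals = {lam: sp.Rational(1, 2), t: sp.Rational(3, 2)}
print((expJt - partial).subs(vals).evalf(30))  # all entries essentially zero
```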
 

(d) We have the initial value problem
 
x'=Jx, with x(0)=x^0
 
I claim that the solution is
 
x = exp(Jt) x^0.
 
Actually, this statement is true for all matrices J, but the proof is especially simple for the triangular matrix J given in this problem, J=(λ, 0, 1, λ). I will show that x' and Jx are equal to the same thing.
 
First,
 
x' = (exp(Jt) x^0)' = d/dt (e^(λt), 0, t e^(λt), e^(λt)) x^0 = (λ e^(λt), 0, λt e^(λt) + e^(λt), λ e^(λt)) x^0
 
Second,
 
J x = (λ, 0, 1, λ)*(exp(Jt) x^0) = (λ, 0, 1, λ)*(e^(λt), 0, t e^(λt), e^(λt)) x^0
= (λ e^(λt), 0, λt e^(λt) + e^(λt), λ e^(λt)) x^0
 
Both sides of x'=Jx are equal to (λ e^(λt), 0, λt e^(λt) + e^(λt), λ e^(λt)) x^0, so
 
x = exp(Jt) x^0

is a solution to the equation. It also satisfies the initial condition, since x(0) = exp(J*0) x^0 = I x^0 = x^0. By the uniqueness theorem it is the only solution. QED
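
For completeness, the same verification can be carried out symbolically for a generic initial vector; here is a minimal sympy sketch (symbol names are mine, with a and b standing in for the components of x^0, tuples read row by row):

```python
import sympy as sp

lam, t, a, b = sp.symbols('lambda t a b')
J = sp.Matrix([[lam, 0], [1, lam]])

# exp(Jt) from part (c) and a generic initial vector x^0 = (a, b).
expJt = sp.Matrix([[sp.exp(lam*t), 0], [t*sp.exp(lam*t), sp.exp(lam*t)]])
x0 = sp.Matrix([a, b])

x = expJt * x0

print(sp.simplify(x.diff(t) - J*x))  # Matrix([[0], [0]]), so x' = Jx
print(x.subs(t, 0))                  # Matrix([[a], [b]]), so x(0) = x^0
```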