# Differential Equations and Linear Algebra, 6.1: Eigenvalues and Eigenvectors

From the series: Differential Equations and Linear Algebra

Gilbert Strang, Massachusetts Institute of Technology (MIT)

The eigenvectors **x** remain in the same direction when multiplied by the matrix (*A***x** = λ**x**). An *n* × *n* matrix has *n* eigenvalues.

**Published:** 27 Jan 2016

So today begins eigenvalues and eigenvectors. And the reason we want those, need those, is to solve systems of linear equations. Systems meaning more than one equation, n equations; n equals 2 in the examples here.

So an eigenvalue is a number, an eigenvector is a vector. They're both hiding in the matrix. Once we find them, we can use them. Let me show you: the reason eigenvalues were created, invented, discovered was solving differential equations, which is our purpose.

So y is now a vector. So this is a system of equations. I'll do an example in a minute. A is a matrix. So we have n equations, n components of y. And A is an n by n matrix, n rows, n columns. Good.

And now I can tell you right away where eigenvalues and eigenvectors pay off. They come into the solution. We look for solutions of that kind. When we had one equation, we looked for solutions just e to the st, and we found that number s. Now we have e to the lambda t-- we changed s to lambda, no problem-- but multiplied by a vector because our unknown is a vector. x is a vector, but it does not depend on time. That's the beauty of it. All the time dependence is in the exponential, as always. And y is just a multiple of that exponential, as you'll see.

So I look for solutions like that. I plug that into the differential equation and what happens? So here's my equation. I'm plugging in y equals e to the lambda t times x. That's A times y there. Now, the derivative of y, the time derivative, brings down a lambda. To get the derivative I include the lambda.

So do you see that substituting into the equation with this nice notation means just this has to be true. My equation changed to that form. OK. Now I cancel e to the lambda t, just the way I was always canceling e to the st. So I cancel e to the lambda t because it's never zero. And I have the big equation: Ax, the matrix times my eigenvector, is equal to lambda x-- the number, the eigenvalue, times the eigenvector. Not linear, notice. Two unknowns here that are multiplied: a number, lambda, times a vector, x.
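The substitution just described can be written out in one line, restating the lecture's algebra in symbols:

```latex
\frac{dy}{dt} = Ay, \qquad y = e^{\lambda t}x
\;\Longrightarrow\;
\lambda e^{\lambda t}x = e^{\lambda t}Ax
\;\Longrightarrow\;
Ax = \lambda x .
```

Canceling the nonzero factor e to the lambda t is exactly what turns the differential equation into the eigenvalue equation.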

So what am I looking for? I'm looking for vectors x, the eigenvectors, so that multiplying by A-- multiplying A times x gives a number times x. It's in the same direction as x just the length is changed. Well, if lambda was 1, I would have Ax equal x. That's allowed.

If lambda is 0, I would have Ax equals 0. That's all right. But I don't want x to be 0. That's useless; it's no help to know that 0 is a solution. So x should not be 0. Lambda can be any number. It can be real, it could be a complex number, as you will see. Even if the matrix is real, lambda could be complex. Anyway, Ax equals lambda x. That's the big equation. It got a box around it.

So now I'm ready to do an example. And in this example, first of all, I'm going to spot the eigenvalues and eigenvectors without a system, just go for it in the 2 by 2 case. So I'll give a 2 by 2 matrix A. We'll find the lambdas and the x's, and then we'll have the solution to the system of differential equations. Good.

There's the system. There's the first equation for y1-- prime meaning derivative, d by dt, time derivative-- is linear, a constant coefficient. Second one, linear, constant coefficient, 3 and 3. Those numbers, 5, 1, 3, 3, go into the matrix. Then that problem is exactly y prime, the vector, derivative of the vector, equal A times y. That's my problem.

Now eigenvalues and eigenvectors will solve it. So I just look at that matrix. Matrix question. What are the eigenvalues, what are the eigenvectors of that matrix? And remember, I want Ax equals lambda x.

I've spotted the first eigenvector: 1, 1. We can just check: does it work? If I multiply A by that eigenvector, 1, 1, do you see what happens? The first row gives me a 6. The second row gives me a 6. So A times that vector is 6, 6. And that is 6 times 1, 1. So there you go. Found the first eigenvalue. If I multiply A by x, I get 6 times x: the vector 6, 6.

Now, the second one. Again, I've worked in advance, produced this eigenvector, and I think it's 1 minus 3. So let's multiply by A. Try the second eigenvector. I should call this first one maybe x1 and lambda 1. And I should call this one x2 and lambda 2. And we can find out what lambda 2 is, once I find the eigenvectors of course. I just do A times x to recognize the lambda, the eigenvalue.

So 5, 1 times this is 5 minus 3 is a 2. It's a 2. So here I got a 2. And from 3, 3 it's 3 minus 9 is minus 6. That's what I got for Ax. There was the x. When I did the multiplication, Ax came out to be 2 minus 6. Good.

That output is two times the input. The eigenvalue is 2. Right? I'm looking for inputs, the eigenvector, so that the output is a number times that eigenvector, and that number is lambda, the eigenvalue. So I've now found the two. And I expect two for a 2 by 2 matrix. You will soon see why I expect two eigenvalues, and each eigenvalue should have an eigenvector.
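As a quick numerical check of both eigenpairs (not part of the lecture; a plain-Python sketch, with a small helper `matvec` introduced here for the matrix-vector product):

```python
# The 2-by-2 matrix from the example: rows (5, 1) and (3, 3).
A = [[5, 1], [3, 3]]

def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector (list)."""
    return [sum(a * b for a, b in zip(row, v)) for row in M]

# First eigenpair: A times (1, 1) should be 6 times (1, 1).
print(matvec(A, [1, 1]))    # → [6, 6]
# Second eigenpair: A times (1, -3) should be 2 times (1, -3).
print(matvec(A, [1, -3]))   # → [2, -6]
```

Both outputs are the input vector scaled by a single number, which is what being an eigenvector means.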

So here they are for this matrix. So I've got the answers now: y of t, which stands for y1 of t and y2 of t, is e to the lambda t times x. Remember, that's the picture that we're looking for.

So the first one is e to the 6t times x, which is 1, 1. If I put that into the equation, it will solve the equation. Also, I have another one: lambda 2 was 2, so it's e to the 2t times its eigenvector, 1, minus 3. That's a solution also. One solution, another solution.

And what do I do with linear equations? I take combinations. Any number c1 of that, plus any number c2 of that, is still a solution. That's superposition, adding solutions to linear equations. These are homogeneous equations; there's no force term in these equations. I'm not dealing with a force term. I'm looking for the null solutions, the solutions of the equations themselves.

And there I have two solutions, two coefficients to choose. How do I choose them? Of course, I match the initial condition at t equals 0. At t equals 0, I would have y of 0. That's my given initial condition, my y1 and y2.

So I'm setting t equals 0, so that exponential is 1, of course. When t is 0, that's 1. So I just have c1 times 1, 1. And c2-- whose exponential is 1 again at t equals 0-- times 1, minus 3. That's what determines c1 and c2. c1 and c2 come from the initial conditions, just the way they always did.

So I'm solving two first order linear constant coefficient equations, homogeneous, meaning no force term. So I get a null solution with constants to choose and, as always, those constants come from matching the initial conditions. So the initial condition here is a vector. So if, for example, y of 0 was 2, minus 2, then I would want one of each: c1 equals 1 and c2 equals 1. OK.
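To make the matching of initial conditions concrete, here is a plain-Python sketch (not from the lecture; helper names are ours) that solves for c1 and c2 when y(0) = (2, -2) and then verifies that the resulting y(t) satisfies y' = Ay, using a centered difference for the derivative:

```python
import math

# y(t) = c1 * e^{6t} * (1, 1)  +  c2 * e^{2t} * (1, -3)
# Initial condition y(0) = (2, -2) gives:
#   c1 + c2 = 2   and   c1 - 3*c2 = -2.
# Subtracting the second equation from the first: 4*c2 = 4.
c2 = (2 - (-2)) / 4   # = 1.0
c1 = 2 - c2           # = 1.0

def y(t):
    """The combination of the two eigenvector solutions."""
    return [c1 * math.exp(6 * t) + c2 * math.exp(2 * t),
            c1 * math.exp(6 * t) - 3 * c2 * math.exp(2 * t)]

# Check that y' = A y holds (approximately) at a sample time.
A = [[5, 1], [3, 3]]
t, h = 0.3, 1e-6
deriv = [(y(t + h)[i] - y(t - h)[i]) / (2 * h) for i in range(2)]
Ay = [sum(a * b for a, b in zip(row, y(t))) for row in A]
assert all(abs(d - v) < 1e-3 for d, v in zip(deriv, Ay))
print(c1, c2)   # → 1.0 1.0
```

The two constants both come out to 1, matching the "one of those and one of those" in the lecture.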

I've used eigenvalues and eigenvectors to solve a linear system, their first and primary purpose. OK. But how do I find those eigenvalues and eigenvectors? What about other properties? What's going on with eigenvalues and eigenvectors? May I spend just a couple more minutes on eigenvalues and eigenvectors? Basic facts, and then in the next video I'll come to how to find them. OK, basic facts.

Basic facts. So start from Ax equals lambda x. Let's suppose we found those. Could you tell me the eigenvalues and eigenvectors of A squared? I would like to know what the eigenvalues and eigenvectors of A squared are. Are they connected with these? So suppose I know the x and I know the lambda for A. What about for A squared?

Well, the good thing is that the eigenvectors are the same for A squared. So let me show you. I say that same x, so this is the same x, same vector, same eigenvector. The eigenvalue would be different, of course, for A squared, but the eigenvector is the same. And let's see what happens for A squared.

So that's A times Ax, right? One A, another Ax. But Ax is lambda x. Are you good with that? That's just A times Ax. So that's OK. Now lambda is a number. I like to bring it out front where I can see it. So I didn't do anything there. This number lambda was multiplying everything so I put it in front.

Now Ax. I have, again, the Ax. That's, again, the lambda x because I'm looking at the same x. Same x, so I get the same lambda. So that's a lambda x, another lambda. I have lambda squared x. That's what I wanted. A squared x is lambda squared x.

Conclusion. The eigenvectors stay the same, lambda goes to lambda squared. The eigenvalues are squared.

So if I had my example again-- oh, let me find that matrix. Suppose I had that same matrix and I was interested in A squared; then the eigenvalues would be 36 and 4, the squares. Or suppose I'm looking at the n-th power of a matrix. You may say, why look at the n-th power? But there are many reasons to look at the n-th power of a matrix, even the thousandth power.

So let's just write down the conclusion. Same reasoning: A to the n-th times x is lambda to the n-th times x. It's the same x. And every time I multiply by A, I multiply by a lambda. So I get lambda n times. So there is the handy rule.

And that really tells us something about what eigenvalues are good for. Eigenvalues are good for things that move in time. Differential equations, that is really moving in time. n equals 1 is the first step, or n equals 0 is the start. Take one step to n equals 1, take another step to n equals 2. Keep going. Every time step brings a multiplication by lambda.
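The power rule is easy to check numerically; a plain-Python sketch (the helper functions are ours, not the lecture's):

```python
A = [[5, 1], [3, 3]]

def matvec(M, v):
    """Matrix (list of rows) times vector (list)."""
    return [sum(a * b for a, b in zip(row, v)) for row in M]

def apply_n(M, v, n):
    """Apply the matrix M to the vector v, n times: M^n v."""
    for _ in range(n):
        v = matvec(M, v)
    return v

# A^5 on the eigenvector (1, 1): eigenvalue 6 becomes 6^5 = 7776.
assert apply_n(A, [1, 1], 5) == [6**5, 6**5]
# A^5 on (1, -3): eigenvalue 2 becomes 2^5 = 32.
assert apply_n(A, [1, -3], 5) == [2**5, -3 * 2**5]
```

Each of the five multiplications by A just multiplies the eigenvector by its eigenvalue once more.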

So that is a very useful rule. Another handy rule is what about A plus the identity? Suppose I add the identity matrix to my original matrix. What happens to the eigenvalues? What happens to the eigenvectors? Basic question. Or I could multiply a constant times the identity, 2 times the identity, 7 times the identity.

And I want to know its eigenvalues and eigenvectors. And the answer is: same, same x's. Same x. I show that by figuring out what I have here. This is Ax, which is lambda x. And this is c times the identity times x. The identity doesn't do anything, so that's just cx.

So what do I have now? I've seen that the eigenvalue is lambda plus c. So there are the eigenvalues. I think about this as shifting A by a multiple of the identity. Shifting A, adding 5 times the identity to it. If I add 5 times the identity to any matrix, the eigenvalues of that matrix go up by 5. And the eigenvectors stay the same.

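The shift rule can be checked the same way; a plain-Python sketch (helper names ours), shifting the example matrix by 5 times the identity:

```python
A = [[5, 1], [3, 3]]
c = 5
# B = A + 5I: add c to the diagonal entries only.
B = [[A[i][j] + (c if i == j else 0) for j in range(2)] for i in range(2)]

def matvec(M, v):
    """Matrix (list of rows) times vector (list)."""
    return [sum(a * b for a, b in zip(row, v)) for row in M]

# Same eigenvectors; eigenvalues shift from 6 and 2 to 11 and 7.
assert matvec(B, [1, 1]) == [11, 11]    # (6 + 5) * (1, 1)
assert matvec(B, [1, -3]) == [7, -21]   # (2 + 5) * (1, -3)
```

Nothing about the eigenvectors changed; only the eigenvalues moved up by the shift.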
So as long as I keep working with that one matrix A-- taking powers, adding multiples of the identity, later taking exponentials-- whatever I do, I keep the same eigenvectors and everything is easy.

If I had two matrices, A and B, with different eigenvectors, then I don't know what the eigenvectors of A plus B would be. I don't know those. I can't tell the eigenvectors of A times B because A has its own little eigenvectors and B has its eigenvectors. Unless they're the same, I can't easily combine A and B. But as always I'm staying with one A and its powers and steps like that, no problem.

OK. I'll stop there for a first look at eigenvalues and eigenvectors.
