A First Course in Linear Algebra
Authored by Mohammed Kaabar
6.0" x 9.0" (15.24 x 22.86 cm), Black & White on White paper, 130 pages
ISBN-13: 9781502901811, ISBN-10: 1502901811
A First Course in Linear Algebra
About the Author

Mohammed Kaabar is a math tutor at the Math Learning Center (MLC) at Washington State University, Pullman. He is interested in linear algebra, scientific computing, numerical analysis, differential equations, and several programming languages such as SQL, C#, Scala, C++, C, JavaScript, Python, HTML 5 and MATLAB. He is a member of the Institute of Electrical and Electronics Engineers (IEEE), the IEEE Antennas and Propagation Society, IEEE Consultants Network, IEEE Smart Grid Community, IEEE Technical Committee on RFID, IEEE Life Sciences Community, IEEE Green ICT Community, IEEE Cloud Computing Community, IEEE Internet of Things Community, IEEE Committee on Earth Observations, IEEE Electric Vehicles Community, IEEE Electron Devices Society, IEEE Communications Society, and IEEE Computer Society. He has participated in several competitions, conferences, research papers and projects. He is an online instructor of numerical analysis at Udemy Inc., San Francisco, CA. In addition, he is a Technical Program Committee (TPC) member, reviewer and presenter at CCA-2014, WSMEAP 2014, EECSI 2014, JIEEEC 2013 and WCEEENG 2012. He worked as an electrical engineering intern at Al-Arabia for Safety and Security L.L.C., and he has received several educational awards and certificates from accredited institutions. For more information about the author and his free online courses, please visit his personal website: http://www.mohammed-kaabar.net.
Table of Contents

1 Systems of Linear Equations ………………………… 1
  1.1 Row Operations Method ………………………… 1
  1.2 Basic Algebra of Matrix ………………………… 17
  1.3 Linear Combinations ……………………………… 19
  1.4 Square Matrix ……………………………………… 29
  1.5 Inverse Square Matrix …………………………… 33
  1.6 Transpose Matrix ………………………………… 42
  1.7 Determinants ……………………………………… 46
  1.8 Cramer's Rule ……………………………………… 53
  1.9 Adjoint Method …………………………………… 55
  1.10 Exercises ………………………………………… 60
2 Vector Spaces …………………………………………… 62
  2.1 Span and Vector Space …………………………… 62
  2.2 The Dimension of Vector Space ………………… 65
  2.3 Linear Independence ……………………………… 66
  2.4 Subspace and Basis ……………………………… 69
  2.5 Exercises …………………………………………… 82
3 Homogeneous Systems …………………………………… 84
  3.1 Null Space and Rank ……………………………… 84
  3.2 Linear Transformation …………………………… 91
  3.3 Kernel and Range ………………………………… 100
  3.4 Exercises …………………………………………… 105
4 Characteristic Equation of Matrix ………………… 107
  4.1 Eigenvalues and Eigenvectors ………………… 107
  4.2 Diagonalizable Matrix …………………………… 112
  4.3 Exercises …………………………………………… 115
5 Matrix Dot Product …………………………………… 116
  5.1 The Dot Product in ℝⁿ …………………………… 116
  5.2 Gram-Schmidt Orthonormalization …………… 118
  5.3 Exercises …………………………………………… 120
Answers to Odd-Numbered Exercises …………………… 121
Bibliography ……………………………………………… 125
Introduction

In this book, I wrote five chapters: Systems of Linear Equations, Vector Spaces, Homogeneous Systems, Characteristic Equation of Matrix, and Matrix Dot Product. I also added exercises at the end of each chapter so that students can practice additional sets of problems beyond the worked examples, and they can check their solutions to some of these exercises in the "Answers to Odd-Numbered Exercises" section at the end of this book. This book is very useful for college students who have studied Calculus I, and for other students who want to review some linear algebra concepts before taking a second course in linear algebra. In my experience as a math tutor at the Math Learning Center, I have noticed that some students have difficulty understanding linear algebra concepts in general, and vector space concepts in particular. Therefore, my purpose is to provide students with an interactive method that explains each concept and then provides different sets of examples related to it. If you have any comments about the contents of this book, please email them to [email protected].
I wish to express my gratitude and appreciation to my father, my mother, and my brother. I would also like to give special thanks to my mathematics professor Dr. Ayman Badawi, Professor of Mathematics & Statistics at AUS, for his brilliant efforts in revising the content of this book and for giving me permission to use his lecture notes and online resources as a guide in writing it. I would also like to thank all the math professors at Washington State University, Pullman. Ultimately, I hope this book serves as a milestone toward developing more math books that can serve our mathematical community.
Chapter 1

Systems of Linear Equations

In this chapter, we discuss how to solve n×m systems of linear equations using the Row Operations Method. Then, we give an introduction to the basic algebra of matrices, including matrix addition, matrix subtraction and matrix multiplication. In the remaining sections, we cover some important concepts of linear algebra such as linear combinations, determinants, square matrices, inverse square matrices, transpose matrices, Cramer's Rule and the Adjoint Method.

1.1 Row Operations Method

First of all, let's start with a simple example about n×m systems of linear equations.
Example 1.1.1 Solve for x and y in the following 2×2 system of linear equations:

     3x + 2y = 5
    -2x +  y = -6

Solution: Let's start by analyzing this 2×2 system. First of all, each variable in this system is raised to the power 1, which means that the system is linear. As given in the question itself, 2×2 refers to

    (Number of Equations) × (Number of Unknown Variables)

which implies that the number of equations is 2, and the number of unknown variables is also 2. The unknown variables in this question are x and y. To solve for x and y, we need to multiply the first equation 3x + 2y = 5 by 2, and we need to multiply the second equation -2x + y = -6 by 3. Hence, we obtain the following:

     6x + 4y = 10 ……(1)
    -6x + 3y = -18 ……(2)

By adding equations (1) and (2), we get 7y = -8. Therefore, y = -8/7.
Now, we need to substitute the value of y into one of the two original equations. Let's substitute y into the first equation 3x + 2y = 5 as follows: 3x + 2(-8/7) = 5, which is equivalent to 3x = 5 - 2(-8/7) = 5 + 16/7 = 51/7. Then, x = 51/21 = 17/7.
Therefore, x = 17/7 and y = -8/7. Hence, we have solved for x and y.
Example 1.1.2 Describe the following system:

    2x + 3y - 2z = 10
    5x + 2y + 6z = 13

Solution: Let's start by asking ourselves the following questions.
Question 1: How many equations do we have?
Question 2: How many unknown variables do we have?
Question 3: What is the power of each unknown variable?
If we can answer these three questions, then we can describe the above system. Let's start answering them now.
Answer to Question 1: We have 2 equations: 2x + 3y - 2z = 10 and 5x + 2y + 6z = 13.
Answer to Question 2: We have 3 unknown variables: x, y and z.
Answer to Question 3: Each unknown variable is raised to the power 1.
After answering the above three questions, we now have an idea about the system. As we remember from example 1.1.1, a system of equations is described in the following form:

    (Number of Equations) × (Number of Unknown Variables)

Since the number of equations in this example is 2 and the number of unknown variables is 3, the above system is a 2×3 system. Recall that each unknown variable is raised to the power 1. This means that it is a 2×3 linear system.

Solving for the unknown variables of a 2×2 or 2×3 system of linear equations using basic arithmetic operations such as addition, subtraction and multiplication of linear equations is very easy. But if we have, say, a 4×4 system of linear equations, using such arithmetic operations becomes very difficult and takes a long time. Therefore, we are going to use a method called the Row Operations Method to solve complicated n×m systems of linear equations. Now, let's start with an example discussing each step of the Row Operations Method for solving an n×m system of linear equations.
Example 1.1.3 Solve for x1, x2 and x3 in the following 3×3 system of linear equations:

     x1 + x2 -  x3 = 2
    2x1 - x2 +  x3 = 2
         -x2 + 2x3 = 1

Solution: In the above system, we have 3 equations and 3 unknown variables x1, x2 and x3. To solve this 3×3 system, we need to use the Row Operations Method. The following steps describe this method.
Step 1: Rewrite the system as a matrix, called the Augmented Matrix. Each row in this matrix is a linear equation.
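Before moving on, the arithmetic of Example 1.1.1 is easy to double-check with a few lines of Python. The `fractions` module keeps the numbers exact; this is only a sketch for checking the hand computation, not part of the book's method:

```python
from fractions import Fraction

# Solution of Example 1.1.1 computed by hand: x = 17/7, y = -8/7.
x = Fraction(17, 7)
y = Fraction(-8, 7)

# Substitute back into both original equations.
assert 3 * x + 2 * y == 5    # first equation:  3x + 2y = 5
assert -2 * x + y == -6      # second equation: -2x + y = -6
print(x, y)  # 17/7 -8/7
```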
     x1   x2   x3 |  C
   [  1    1   -1 |  2 ]
   [  2   -1    1 |  2 ]
   [  0   -1    2 |  1 ]

As we can see from the above augmented matrix, the first column represents the coefficients of x1 in the three linear equations, the second column represents the coefficients of x2, and the third column represents the coefficients of x3. The fourth column is not an actual column of coefficients; it represents the constants, because each linear equation equals a constant.
Step 2: Start with the first row of the matrix and make its first non-zero number equal to 1. This 1 is called a leader number. Since the leader number is already 1 in this matrix, we do not need to do anything with it. Otherwise, we would make the leader number equal to 1 by multiplying the whole row by a non-zero number.
Step 3: Eliminate all numbers that are exactly below the leader number; in other words, use the leader number to eliminate everything below it.
Step 4: Move to the second row and repeat what has been done in Step 2.

-2R1 + R2 → R2
(This means that we multiply the first row by -2 and add it to the second row; the change is only in the second row, and there is no change in the first row.)
Hence, we obtain:

   [ 1    1   -1 |   2 ]
   [ 0   -3    3 |  -2 ]
   [ 0   -1    2 |   1 ]

(-1/3)R2 → R2
(This means that we multiply the second row by -1/3; the change is only in the second row, and there is no change in the other rows.)
Hence, we obtain:

   [ 1    1   -1 |   2  ]
   [ 0    1   -1 |  2/3 ]
   [ 0   -1    2 |   1  ]

R2 + R3 → R3
(This means that we add the second row to the third row; the change is only in the third row, and there is no change in the second row.)

-R2 + R1 → R1
(This means that we multiply the second row by -1 and add it to the first row; the change is only in the first row, and there is no change in the second row.)
Hence, we obtain:

   [ 1   0    0 |  4/3 ]
   [ 0   1   -1 |  2/3 ]
   [ 0   0    1 |  5/3 ]

Step 5: Move to the third row and do the same as we did with the first and second rows of the matrix.
R3 + R2 → R2
(This means that we add the third row to the second row; the change is only in the second row, and there is no change in the third row.)
Hence, we obtain:

   [ 1   0   0 |  4/3 ]
   [ 0   1   0 |  7/3 ]
   [ 0   0   1 |  5/3 ]

Therefore, x1 = 4/3, x2 = 7/3 and x3 = 5/3. This is only one solution. Hence, the system has a unique solution (one solution).
After this example, we will get introduced to two new definitions.
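The row operations of Example 1.1.3 can also be replayed in code. The sketch below uses exact fractions, and the helper names `add_row` and `scale_row` are ours, not the book's:

```python
from fractions import Fraction as F

# Augmented matrix of Example 1.1.3: each inner list is one row [x1, x2, x3, C].
M = [[F(1), F(1), F(-1), F(2)],
     [F(2), F(-1), F(1), F(2)],
     [F(0), F(-1), F(2), F(1)]]

def add_row(M, src, dst, c):
    """dst row <- c * (src row) + (dst row); only the dst row changes."""
    M[dst] = [c * a + b for a, b in zip(M[src], M[dst])]

def scale_row(M, i, c):
    """i-th row <- c * (i-th row), c a non-zero constant."""
    M[i] = [c * a for a in M[i]]

add_row(M, 0, 1, F(-2))    # -2R1 + R2 -> R2
scale_row(M, 1, F(-1, 3))  # (-1/3)R2  -> R2
add_row(M, 1, 2, F(1))     # R2 + R3   -> R3
add_row(M, 1, 0, F(-1))    # -R2 + R1  -> R1
add_row(M, 2, 1, F(1))     # R3 + R2   -> R2

print([str(row[3]) for row in M])  # ['4/3', '7/3', '5/3']
```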
Definition 1.1.1 An n×m system of linear equations is consistent if it has a solution; otherwise, it is called inconsistent.
Definition 1.1.2 An n×m system of linear equations has a unique solution if each variable has exactly one value.
Now, let's apply what we have learned from example 1.1.3 to the following example 1.1.4.
Example 1.1.4 Solve for x1, x2, x3, x4 and x5 in the following 3×5 system of linear equations:

           x2 -  x3 +  x4 -   x5 = 1
    -2x1      +  x3      -   x5 = 0
          -x2 +  x3 + 2x4 - 10x5 = 12

Solution: In the above system, we have 3 equations and 5 unknown variables x1, x2, x3, x4 and x5. To solve this 3×5 system, we need to use the Row Operations Method. First, we need to construct the augmented matrix.

     x1   x2   x3   x4    x5 |  C
   [  0    1   -1    1    -1 |  1 ]
   [ -2    0    1    0    -1 |  0 ]
   [  0   -1    1    2   -10 | 12 ]

The leader number of the first row here is already 1. By doing the same steps as in example 1.1.3, we obtain the following:
R1 + R3 → R3

   [  0    1   -1    1    -1 |  1 ]
   [ -2    0    1    0    -1 |  0 ]
   [  0    0    0    3   -11 | 13 ]

(-1/2)R2 → R2

   [  0    1    -1    1    -1  |  1 ]
   [  1    0  -1/2    0   1/2  |  0 ]
   [  0    0     0    3   -11  | 13 ]

(1/3)R3 → R3

   [  0    1    -1    1     -1  |   1  ]
   [  1    0  -1/2    0    1/2  |   0  ]
   [  0    0     0    1  -11/3  | 13/3 ]

-R3 + R1 → R1

   [  0    1    -1    0    8/3  | -10/3 ]
   [  1    0  -1/2    0    1/2  |   0   ]
   [  0    0     0    1  -11/3  |  13/3 ]

Since we have 3 leader numbers (numbers equal to 1) in the x1, x2 and x4 columns, x1, x2 and x4 are called leading variables. All other variables are called free variables, so x3, x5 ∈ ℝ. Hence, we obtain from the above matrix:

    x2 - x3 + (8/3)x5 = -10/3
    x1 - (1/2)x3 + (1/2)x5 = 0
    x4 - (11/3)x5 = 13/3

Since we have leading and free variables, we need to write the leading variables in terms of the free variables:

    x1 = (1/2)x3 - (1/2)x5
    x2 = x3 - (8/3)x5 - 10/3
    x4 = (11/3)x5 + 13/3

Now, let x3 = 0 and x5 = 0 (you can choose any values for x3 and x5 because both of them are free variables). Therefore, we obtain:

    x1 = (1/2)(0) - (1/2)(0) = 0
    x2 = 0 - (8/3)(0) - 10/3 = -10/3
    x4 = (11/3)(0) + 13/3 = 13/3

Hence, x1 = 0, x2 = -10/3 and x4 = 13/3.
Another possible solution for example 1.1.4: now, let x3 = 1 and x5 = 0 (again, you can choose any values for x3 and x5 because both of them are free variables). Therefore, we obtain:

    x1 = (1/2)(1) - (1/2)(0) = 1/2
    x2 = 1 - (8/3)(0) - 10/3 = 1 - 10/3 = -7/3
    x4 = (11/3)(0) + 13/3 = 13/3

Hence, x1 = 1/2, x2 = -7/3 and x4 = 13/3.

Summary of the Row Operations Method for n×m Systems of Linear Equations:
Suppose α is a non-zero constant, and i and k are row numbers in the augmented matrix.
* αRi (α ≠ 0): multiply a row by a non-zero constant α.
* Ri ↔ Rk: interchange two rows.
* αRi + Rk → Rk: multiply a row by a non-zero constant α and add it to another row. Note: the change is in Rk, and there is no change in Ri.
Let's continue with other examples that will give us some important results in linear algebra.
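A good sanity check on Example 1.1.4 is that every choice of the free variables x3 and x5 must satisfy all three original equations. The short script below (the function name is ours) verifies this for several choices:

```python
from fractions import Fraction as F

def leading_variables(x3, x5):
    """Leading variables of Example 1.1.4 written in terms of the free ones."""
    x1 = F(1, 2) * x3 - F(1, 2) * x5
    x2 = x3 - F(8, 3) * x5 - F(10, 3)
    x4 = F(11, 3) * x5 + F(13, 3)
    return x1, x2, x4

for x3, x5 in [(F(0), F(0)), (F(1), F(0)), (F(2), F(-3))]:
    x1, x2, x4 = leading_variables(x3, x5)
    assert x2 - x3 + x4 - x5 == 1              # first equation
    assert -2 * x1 + x3 - x5 == 0              # second equation
    assert -x2 + x3 + 2 * x4 - 10 * x5 == 12   # third equation
print("all tested choices of the free variables work")
```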
Example 1.1.5 Solve for x1, x2 and x3 in the following 2×3 system of linear equations:

     x1 + 2x2 -  x3 = 2
    2x1 + 4x2 - 2x3 = 6

Solution: In the above system, we have 2 equations and 3 unknown variables x1, x2 and x3. To solve this 2×3 system, we need to use the Row Operations Method. First, we need to construct the augmented matrix.

     x1   x2   x3 | C
   [  1    2   -1 | 2 ]
   [  2    4   -2 | 6 ]

-2R1 + R2 → R2

   [  1    2   -1 | 2 ]
   [  0    0    0 | 2 ]

We stop here and read the above matrix as follows:
    x1 + 2x2 - x3 = 2
    0 = 2
Hence, we get introduced to a new result.
Result 1.1.1 The system is inconsistent if and only if, after reducing, one of the equations reads 0 equals a non-zero number.
Example 1.1.6 Solve for x1, x2 and x3 in the following 2×3 system of linear equations:

     x1 + 2x2 -  x3 = 2
    2x1 + 4x2 - 2x3 = 4

Solution: In the above system, we have 2 equations and 3 unknown variables x1, x2 and x3. To solve this 2×3 system, we construct the augmented matrix as we did in the previous example.

     x1   x2   x3 | C
   [  1    2   -1 | 2 ]
   [  2    4   -2 | 4 ]

-2R1 + R2 → R2

   [  1    2   -1 | 2 ]
   [  0    0    0 | 0 ]

We stop here and read the above matrix as follows:
    x1 + 2x2 - x3 = 2
    0 = 0
The solution is x1 = -2x2 + x3 + 2, where x1 is a leading variable and x2, x3 ∈ ℝ are free variables. Now, let x2 = 0 and x3 = 0 (you can choose any values for x2 and x3 because both of them are free variables). Therefore, we obtain x1 = -2(0) + (0) + 2 = 2, with x2 = 0 and x3 = 0.
Hence, we get introduced to new results.
Result 1.1.2 A consistent n×m system has a unique solution if and only if it has no free variables.
Result 1.1.3 A consistent n×m system has infinitely many solutions if and only if it has at least one free variable.
Result 1.1.4 Assume an n×m system is consistent and has more variables than equations (m > n). Then, the system has infinitely many solutions.
Example 1.1.7 Given the augmented matrix of a 3×3 system:

   [  2    5    3 | -1 ]
   [ -2    4    b |  3 ]
   [ -2   -5   -3 |  d ]

For what values of b and d will the system be consistent?
Solution: We need to multiply the first row by a non-zero constant to make the leader number (2) equal to 1:
(1/2)R1 → R1

   [  1   5/2  3/2 | -1/2 ]
   [ -2    4    b  |   3  ]
   [ -2   -5   -3  |   d  ]

2R1 + R2 → R2 and 2R1 + R3 → R3

   [  1   5/2   3/2 | -1/2 ]
   [  0    9    b+3 |   2  ]
   [  0    0     0  |  d-1 ]

The last row reads 0 = d - 1. Therefore, b ∈ ℝ and d = 1. When the system is consistent, it must have infinitely many solutions; in this case, b is free to be any real number.
Example 1.1.8 Given the augmented matrix of a 3×3 system:

   [  1   -1    2 | a ]
   [ -1    2   -2 | b ]
   [ -2    2   -4 | c ]

For what values of a, b and c will the system be consistent?
Solution: We need to use the Row Operations Method to find a, b and c as follows:
R1 + R2 → R2 and 2R1 + R3 → R3

   [ 1   -1   2 |   a  ]
   [ 0    1   0 |  a+b ]
   [ 0    0   0 | 2a+c ]

R2 + R1 → R1

   [ 1   0   2 | 2a+b ]
   [ 0   1   0 |  a+b ]
   [ 0   0   0 | 2a+c ]

We stop here and read the above matrix. Hence, we obtain:
    x1 + 2x3 = 2a + b
    x2 = a + b
    0 = 2a + c
Therefore, 2a + c = 0, which means that c = -2a. Hence, the system is consistent when a, b ∈ ℝ (free) and c = -2a.
Let's now discuss some new definitions and results of linear algebra.
Definition 1.1.3 Suppose we have the following 3×3 system of linear equations:

    2x1 - 3x2 +  x3 = 0
     x1 +  x2 - 3x3 = 0
    -x1 + 3x2 -  x3 = 0

This system is called a homogeneous system of linear equations because all the constants on the right-hand side are zeros.
Result 1.1.5 Every n×m homogeneous system is consistent.
Result 1.1.6 Since every n×m homogeneous system is consistent, x1 = 0, x2 = 0, x3 = 0, …, xm = 0 is a solution. This solution is called the Trivial Solution.
Result 1.1.7 A solution is called Non-Trivial if it has at least one value that is not zero.
Result 1.1.8 In a homogeneous system, if the number of variables is greater than the number of equations, then the system has infinitely many solutions.
Result 1.1.9 Given the following 3×6 matrix:

   [ 0  1  0  0  1  1 ]
   [ 1  0  3  2  5  7 ]
   [ 0  0  0  0  0  1 ]

This matrix is called a Reduced Matrix because all the numbers exactly below each leader number are zeros.
Result 1.1.10 Given the following 3×5 matrix:

   [ 1  5  0  2  0 ]
   [ 0  0  1  2  0 ]
   [ 0  0  0  0  1 ]

This matrix is called a Completely-Reduced Matrix because all the numbers exactly below and above each leader number are zeros.
Result 1.1.11 A matrix is called an Echelon Matrix if it is a Reduced Matrix whose leader numbers are arranged so that the leader number in the i-th row is to the right of the leader number in the previous row, and the rows that are entirely zeros come at the end of the matrix.

   [ 1  3  2  1  0 ]
   [ 0  1  5  0  0 ]
   [ 0  0  0  1  3 ]
   [ 0  0  0  0  1 ]
   [ 0  0  0  0  0 ]
   [ 0  0  0  0  0 ]
This is an Echelon Matrix.

   [ 0  1  0  0  1  1 ]
   [ 1  0  3  2  5  7 ]
   [ 0  0  0  0  0  1 ]
This is NOT an Echelon Matrix.
Example 1.1.9 Given the following reduced matrix:

   [ 0  1  0  0  1  1 ]
   [ 1  0  3  2  5  7 ]
   [ 0  0  0  0  0  1 ]

Convert this reduced matrix to an echelon matrix.
Solution: We just need to interchange two rows as follows:
R1 ↔ R2

   [ 1  0  3  2  5  7 ]
   [ 0  1  0  0  1  1 ]
   [ 0  0  0  0  0  1 ]

Now, the matrix is an echelon matrix.
Result 1.1.12 To convert a Reduced Matrix to an Echelon Matrix, we just need to interchange rows.
Result 1.1.13 To convert a Completely-Reduced Matrix to a Reduced-Echelon Matrix, we just need to interchange rows.
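Result 1.1.1's test for inconsistency — a reduced row that reads 0 = non-zero — is easy to detect mechanically. The sketch below applies the single elimination step shared by Examples 1.1.5 and 1.1.6 (the function names are ours):

```python
from fractions import Fraction as F

def eliminate(M):
    """Apply -2R1 + R2 -> R2, the one step these 2x3 systems need."""
    M[1] = [-2 * a + b for a, b in zip(M[0], M[1])]
    return M

# Example 1.1.5 (right-hand sides 2 and 6) vs Example 1.1.6 (2 and 4).
first = eliminate([[F(1), F(2), F(-1), F(2)], [F(2), F(4), F(-2), F(6)]])
second = eliminate([[F(1), F(2), F(-1), F(2)], [F(2), F(4), F(-2), F(4)]])

def reads_impossible(row):
    """True when a reduced row says 0 = non-zero (Result 1.1.1)."""
    return all(c == 0 for c in row[:-1]) and row[-1] != 0

assert reads_impossible(first[1])        # 0 = 2: inconsistent
assert not reads_impossible(second[1])   # 0 = 0: consistent
```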
1.2 Basic Algebra of Matrix

In this section, we discuss an example of basic matrix algebra, including matrix addition, subtraction and multiplication.
Example 1.2.1 Given the following three matrices:

    A = [ 2  1  3 ]      B = [ 1  2 ]      C = [ -3  2  5 ]
        [ 0  1  5 ]          [ 6  1 ]          [  1  7  3 ]
                             [ 3  1 ]

a) Find 3A.
b) Find A + B.
c) Find A - B.
d) Find 3A - 2C.
Solution: Part a: We just need to multiply 3 by matrix A as follows:

    3A = [ 3·2  3·1  3·3 ] = [ 6  3   9 ]
         [ 3·0  3·1  3·5 ]   [ 0  3  15 ]

Part b: In matrix addition, we can only add matrices of the same order. The order of matrix A is 2×3, while the order of matrix B is 3×2. Since matrix A and matrix B have different orders, we cannot find A + B. Hence, there is no answer for part b.
Part c: In matrix subtraction, we can only subtract matrices of the same order. Since matrix A is 2×3 and matrix B is 3×2, we cannot find A - B. Hence, there is also no answer for part c.
Part d: First, we need to multiply 2 by matrix C as follows:

    2C = [ 2·(-3)  2·2  2·5 ] = [ -6   4  10 ]
         [  2·1    2·7  2·3 ]   [  2  14   6 ]

Since we already have 3A from part a, we just need to find 3A - 2C. As we know, in matrix subtraction we can only subtract matrices of the same order. The order of 3A is 2×3, and the order of 2C is 2×3; both have the same order. Hence, we can find 3A - 2C as follows:

    3A - 2C = [ 6  3   9 ] - [ -6   4  10 ] = [ 12   -1  -1 ]
              [ 0  3  15 ]   [  2  14   6 ]   [ -2  -11   9 ]
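Part d of Example 1.2.1 can be mirrored with plain Python lists; the helpers `scale` and `sub` below are our own illustrative names, not standard library functions:

```python
# Matrices A and C from Example 1.2.1.
A = [[2, 1, 3], [0, 1, 5]]
C = [[-3, 2, 5], [1, 7, 3]]

def scale(k, M):
    """Multiply every entry of M by the scalar k."""
    return [[k * x for x in row] for row in M]

def sub(M, N):
    """Entrywise difference of two matrices of the same order."""
    return [[x - y for x, y in zip(r, s)] for r, s in zip(M, N)]

result = sub(scale(3, A), scale(2, C))
print(result)  # [[12, -1, -1], [-2, -11, 9]]
```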
1.3 Linear Combinations

In this section, we discuss the concept of linear combinations of either the columns or the rows of a certain matrix.
Example 1.3.1 Suppose we have 5 apples, 7 oranges and 12 bananas. Represent them as a linear combination of fruits.
Solution: To represent them as a linear combination, we need to do the following steps:
Step 1: Write each fruit individually: Apples   Oranges   Bananas
Step 2: Separate each one of them by an addition sign: Apples + Oranges + Bananas
Step 3: Put a box in front of each one: □ Apples + □ Oranges + □ Bananas
Step 4: Write the number of each fruit in its box: 5 Apples + 7 Oranges + 12 Bananas
This representation is called a Linear Combination of Fruits.
Now, we can apply what we have learned in example 1.3.1 to the rows and columns of a certain matrix.
Example 1.3.2 Given the following matrix A:

    A = [ 1  2  3 ]
        [ 0  5  1 ]
        [ 2  1  2 ]

a) Find a linear combination of the columns of matrix A.
b) Find a linear combination of the rows of matrix A.
Solution: Part a: To represent the columns of matrix A as a linear combination, we do the same steps as in example 1.3.1.
Step 1: Write each column of matrix A individually:

    [ 1 ]   [ 2 ]   [ 3 ]
    [ 0 ]   [ 5 ]   [ 1 ]
    [ 2 ]   [ 1 ]   [ 2 ]

Step 2: Separate each one of them by an addition sign.
Step 3: Put a box in front of each one.
Step 4: Write a number for each column in its box. Here we choose random numbers, because no particular numbers are mentioned in the question:

        [ 1 ]          [ 2 ]         [ 3 ]
    5 · [ 0 ] + (-3) · [ 5 ] + 2 ·  [ 1 ]
        [ 2 ]          [ 1 ]         [ 2 ]

This representation is called a Linear Combination of the Columns of Matrix A.
Part b: To represent the rows of matrix A as a linear combination, we do the same steps as in part a.
Steps 1-3: Write each row of matrix A individually, separate each one by an addition sign, and put a box in front of each one:

    □ [1 2 3] + □ [0 5 1] + □ [2 1 2]

Step 4: Write a random number for each row in its box, because no particular numbers are mentioned in the question:

    4 · [1 2 3] + (-8) · [0 5 1] + 11 · [2 1 2]

This representation is called a Linear Combination of the Rows of Matrix A.
Definition 1.3.1 Suppose matrix A has order (size) n×m and matrix B has order (size) m×k, so the number of columns of matrix A equals the number of rows of matrix B. If we let the usual product of matrix A and matrix B be a new matrix C, so that A·B = C, then matrix C has order (size) n×k. (i.e., if A has order 3×6 and B has order 6×10, then C has order 3×10.)
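The column combination chosen in part a of Example 1.3.2 can be evaluated entrywise; the weights 5, -3 and 2 are the random numbers picked in Step 4:

```python
# Columns of matrix A from Example 1.3.2 and the chosen weights.
cols = [[1, 0, 2], [2, 5, 1], [3, 1, 2]]
weights = [5, -3, 2]

# The linear combination 5*col1 + (-3)*col2 + 2*col3, entry by entry.
combo = [sum(w * col[i] for w, col in zip(weights, cols)) for i in range(3)]
print(combo)  # [5, -13, 11]
```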
Example 1.3.3 Given the following matrix A and matrix B:

    A = [ 1  2 ]        B = [ 1  1  1 ]
        [ 3  6 ]            [ 0  2  1 ]
        [ 1  1 ]

a) Find AB such that each column of AB is a linear combination of the columns of A.
b) Find AB such that each row of AB is a linear combination of the rows of B.
c) Find AB using the usual matrix multiplication.
d) If AB = C, find c23.
e) Find BA such that each column of BA is a linear combination of the columns of B.
f) Find BA such that each row of BA is a linear combination of the rows of A.
Solution: Part a: Since A has order 3×2 and B has order 2×3, AB will have order 3×3 according to Definition 1.3.1.
Step 1: 1st column of AB:

        [ 1 ]       [ 2 ]   [ 1 ]
    1 · [ 3 ] + 0 · [ 6 ] = [ 3 ]
        [ 1 ]       [ 1 ]   [ 1 ]

Step 2: 2nd column of AB:

        [ 1 ]       [ 2 ]   [  5 ]
    1 · [ 3 ] + 2 · [ 6 ] = [ 15 ]
        [ 1 ]       [ 1 ]   [  3 ]

Step 3: 3rd column of AB:

        [ 1 ]       [ 2 ]   [ 3 ]
    1 · [ 3 ] + 1 · [ 6 ] = [ 9 ]
        [ 1 ]       [ 1 ]   [ 2 ]

Hence, AB = [ 1   5  3 ]
            [ 3  15  9 ]
            [ 1   3  2 ]

Part b: As we did in part a, but with rows instead of columns.
Step 1: 1st row of AB: 1·[1 1 1] + 2·[0 2 1] = [1 5 3]
Step 2: 2nd row of AB: 3·[1 1 1] + 6·[0 2 1] = [3 15 9]
Step 3: 3rd row of AB: 1·[1 1 1] + 1·[0 2 1] = [1 3 2]
Hence, AB is the same matrix as in part a.
Part c: Here we use the usual matrix multiplication:

    AB = [ 1·1+2·0  1·1+2·2  1·1+2·1 ]   [ 1   5  3 ]
         [ 3·1+6·0  3·1+6·2  3·1+6·1 ] = [ 3  15  9 ]
         [ 1·1+1·0  1·1+1·2  1·1+1·1 ]   [ 1   3  2 ]

Part d: Since AB = C, c23 is the number located in the 2nd row and 3rd column of C:

    C = [ c11  c12  c13 ]   [ 1   5  3 ]
        [ c21  c22  c23 ] = [ 3  15  9 ]
        [ c31  c32  c33 ]   [ 1   3  2 ]

In general, cij is the number located in the i-th row and j-th column; for instance, c11 = 1, c12 = 5, c13 = 3, c21 = 3, c22 = 15, c31 = 1, c32 = 3 and c33 = 2. Hence, c23 = 9.
Part e: As we did in part a. Since B has order 2×3 and A has order 3×2, BA will have order 2×2 according to Definition 1.3.1.
Step 1: 1st column of BA:

    1 · [ 1 ] + 3 · [ 1 ] + 1 · [ 1 ] = [ 5 ]
        [ 0 ]       [ 2 ]       [ 1 ]   [ 7 ]

Step 2: 2nd column of BA:

    2 · [ 1 ] + 6 · [ 1 ] + 1 · [ 1 ] = [  9 ]
        [ 0 ]       [ 2 ]       [ 1 ]   [ 13 ]

Hence, BA = [ 5   9 ]
            [ 7  13 ]

Part f: As we did in part b, but for BA with the rows of A.
Step 1: 1st row of BA: 1·[1 2] + 1·[3 6] + 1·[1 1] = [5 9]
Step 2: 2nd row of BA: 0·[1 2] + 2·[3 6] + 1·[1 1] = [7 13]
Hence, BA = [ 5   9 ]
            [ 7  13 ]

Result 1.3.1 In general, matrix multiplication is not commutative (i.e., AB is not necessarily equal to BA).
Example 1.3.4 Given the following matrix A and matrix B:

    A = [ 1  0 ]      B = [ 0  0 ]
        [ 0  0 ]          [ 1  0 ]

a) Find AB.
b) Find BA.
Solution: Part a: Using the usual matrix multiplication, we obtain:

    AB = [ 1  0 ][ 0  0 ] = [ 0  0 ]
         [ 0  0 ][ 1  0 ]   [ 0  0 ]

Part b: Using the usual matrix multiplication, we obtain:

    BA = [ 0  0 ][ 1  0 ] = [ 0  0 ]
         [ 1  0 ][ 0  0 ]   [ 1  0 ]

Result 1.3.2 It is possible for the product of two non-zero matrices to be a zero matrix. However, this is not true for products of real numbers.
Example 1.3.5 Given the following matrix A, matrix B and matrix D:

    A = [ 1  0 ]      B = [ 0  0 ]      D = [ 0  0 ]
        [ 0  0 ]          [ 1  0 ]          [ 0  1 ]

a) Find AB.
b) Find AD.
Solution: Part a: Using the usual matrix multiplication, we obtain the zero matrix:

    AB = [ 1  0 ][ 0  0 ] = [ 0  0 ]
         [ 0  0 ][ 1  0 ]   [ 0  0 ]

Part b: Using the usual matrix multiplication, we again obtain the zero matrix:

    AD = [ 1  0 ][ 0  0 ] = [ 0  0 ]
         [ 0  0 ][ 0  1 ]   [ 0  0 ]

Result 1.3.3 In general, if AB = AD and A is not a zero matrix, it is still possible that B ≠ D.
Definition 1.3.2 A matrix A of order (size) n×m is a zero matrix if every entry of A is zero.
Example 1.3.6 Given the following system of linear equations:

     x1 + x2 +  x3 = 3
    -x1 - x2 + 2x3 = 0
          x2 + 4x3 = 5

Write the above system in matrix form.
Solution: We write the above system in matrix form as follows:
(Coefficient Matrix) · (Variable Column) = (Constant Column), that is, CX = A, where C is the coefficient matrix, X is the variable column, and A is the constant column:

    [  1   1  1 ] [ x1 ]   [ 3 ]
    [ -1  -1  2 ] [ x2 ] = [ 0 ]
    [  0   1  4 ] [ x3 ]   [ 5 ]

The above matrix form means the following:

         [  1 ]        [  1 ]        [ 1 ]   [ 3 ]
    x1 · [ -1 ] + x2 · [ -1 ] + x3 · [ 2 ] = [ 0 ]
         [  0 ]        [  1 ]        [ 4 ]   [ 5 ]

Now, let x1 = 1, x2 = 1 and x3 = 1:

    [  1   1  1 ] [ 1 ]   [ 3 ]
    [ -1  -1  2 ] [ 1 ] = [ 0 ]
    [  0   1  4 ] [ 1 ]   [ 5 ]

Result 1.3.4 Let CX = A be an n×m system of linear equations with

    X = [ x1 ]        A = [ a1 ]
        [ ⋮  ]            [ ⋮  ]
        [ xm ]            [ an ]

Then x1 = b1, x2 = b2, …, xm = bm is a solution of the system if and only if b1·C1 + b2·C2 + ⋯ + bm·Cm = A, where C1 is the 1st column of C, C2 is the 2nd column of C, …, and Cm is the m-th column of C.
Example 1.3.7 Given the following system of linear equations:

     x1 +  x2 = 2
    2x1 + 2x2 = 4

Write the above system in matrix form.
Solution: We write the above system in matrix form CX = A as follows:

    C = [ 1  1 ] ,   X = [ x1 ] ,   A = [ 2 ]
        [ 2  2 ]         [ x2 ]         [ 4 ]

Hence, CX = A means:

    x1 · [ 1 ] + x2 · [ 1 ] = [ 2 ]
         [ 2 ]        [ 2 ]   [ 4 ]

Now, we need to choose values for x1 and x2 that satisfy the above matrix form. First, let's try x1 = 3 and x2 = 4:

    3 · [ 1 ] + 4 · [ 1 ] = [  7 ] ≠ [ 2 ]
        [ 2 ]       [ 2 ]   [ 14 ]   [ 4 ]

Therefore, x1 = 3, x2 = 4 is not a solution, and our guess was wrong. Now, let's try x1 = 0 and x2 = 2:

    0 · [ 1 ] + 2 · [ 1 ] = [ 2 ]
        [ 2 ]       [ 2 ]   [ 4 ]

Therefore, x1 = 0, x2 = 2 is a solution.
Result 1.3.5 Let CX = A be a consistent system of linear equations. Then the constant column A can be written as a linear combination of the columns of C.
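Several results of this section can be verified at once with a small matrix-multiplication routine. This is a plain Python sketch, not the book's code:

```python
def matmul(A, B):
    """Usual matrix product; assumes the columns of A match the rows of B."""
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Example 1.3.3: AB and BA have different orders and different entries.
A = [[1, 2], [3, 6], [1, 1]]
B = [[1, 1, 1], [0, 2, 1]]
assert matmul(A, B) == [[1, 5, 3], [3, 15, 9], [1, 3, 2]]
assert matmul(B, A) == [[5, 9], [7, 13]]    # Result 1.3.1: AB != BA

# Example 1.3.4: two non-zero matrices whose product is the zero matrix.
P, Q = [[1, 0], [0, 0]], [[0, 0], [1, 0]]
assert matmul(P, Q) == [[0, 0], [0, 0]]     # Result 1.3.2
```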
In this section, we discuss some examples and facts of square matrix and identity matrix. Definition 1.4.1 In general, let A is a matrix, and n be a positive integer such that An is defined only if number of rows of A = number of columns of A. (i.e. Let A is a matrix with a size (order) ൈ ݉ , A2 = AήA is defined only if n = m). This matrix is called a SquareMatrix. The following is some facts regarding square-matrix and set of real numbers ԹǤ ଵ
Թ Է
Ժ
.
Fact 1.4.2 Let A is a matrix, ି୬ has a specific meaning if A is a square-matrix and A is invertible (nonsingular). We will talk about it in section 1.5. Fact 1.4.3 A set of real numbers Թ has a multiplicative identity equals to 1. (i.e. ܽ ή ͳ ൌ ͳ ή ܽ ൌ ܽ). Fact 1.4.4 A set of real numbers Թ has an additive identity equals to 0. (i.e. ܽ Ͳ ൌ Ͳ ܽ ൌ ܽ). 29
Now, we give some helpful notations: Ժ: Set of all integers. Է: Set of all rational numbers. Թ : Set of all real number. Գ : Set of all natural numbers. The following figure 1.4.1 represents three main sets: Ժǡ ԷԹǤ
1.4 Square Matrix
Fact 1.4.1 Let A is a matrix, ି୬ is not equal to
M. Kaabar
Figure 1.4.1: Representation of Three Sets of Numbers The above figure shows that the largest set is Թ , and the smallest set is Ժ. In addition, we know that a rational number is an integer divided by a non-zero integer.
Fact 1.4.6 ͵ אԹ means that 3 is an element of the setԹǤ (“ ”אis a mathematical symbol that means “belong to”). Fact 1.4.7 ξʹ אԹ means that ξʹ is an element of the setԹǤ (“ ”אis a mathematical symbol that means “belong to”). ଵ
ଵ
ଶ
ଶ
Fact 1.4.8 בԺ means that is not an element of the setԺǤ (“ ”בis a mathematical symbol that means “does not belong to”). Fact 1.4.9 אଶ ሺԹሻ means that A is a ʹ ൈ ʹ matrix. Fact 1.4.10 Թൈ ൌ ሺԹሻ = set of all ݊ ൈ ݊ matrices. (i.e. Թൈ ൌ ଷ ሺԹሻ = set of all ͵ ൈ ͵ matrices). After learning all above facts, let’s test our knowledge by asking the following question. Question 1.4.1 Does ሺԹሻ has a multiplicative identity? Solution: Yes; there is a matrix called where א ሺԹሻ such that ൌ ൌ for every א ሺԹሻ. Example 1.4.1 Givenଶ ሺԹሻ ൌ Թൈ . What is ଶ ? Solution: We need to find the multiplicative identity for all ʹ ൈ ʹ matrices. This means that ଶ ൌ ଶ ൌ for every אଶ ሺԹሻ. ͳ ଶ ൌ ቂ Ͳ
ܽଵଵ Ͳ ቃ and ൌ ቂܽ ͳ ଶଵ
ܽଵଶ ܽଶଶ ቃ
Now, let’s check if the multiplicative identity is right: ܽଵଵ ܽଵଶ ͳ Ͳ ܽଵଵ ܽଵଶ ቃ ቂ ቃ ቂ ଶ ൌ ቂܽ ൌ ܽଶଶ Ͳ ͳ ܽଶଵ ܽଶଶ ቃ ൌ ଶଵ ܽଵଵ ܽଵଶ ͳ Ͳ ܽଵଵ ܽଵଶ ቃ ቂܽ ቃ ൌ ቂܽ ଶ ൌ ቂ ܽ ܽଶଶ ቃ ൌ Ͳ ͳ ଶଵ ଶଶ ଶଵ 31
32
M. Kaabar
Hence, the multiplicative identity for all ʹ ൈ ʹ matrices ͳ is ଶ ൌ ቂ Ͳ
Ͳ ቃǤ ͳ
Example 1.4.2 Given $M_{3}(\mathbb{R}) = \mathbb{R}^{3 \times 3}$, what is $I_3$?

Solution: We need to find the multiplicative identity for all $3 \times 3$ matrices. This means that $A I_3 = I_3 A = A$ for every $A \in M_{3}(\mathbb{R})$. Let
$$I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}, \qquad A = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}.$$
Now, let's check that the multiplicative identity is right:
$$A I_3 = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = A$$
$$I_3 A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = A$$
Hence, the multiplicative identity for all $3 \times 3$ matrices is $I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$.

Result 1.4.1 Assume we have a square matrix of size $n \times n$ whose main diagonal is $a_{11}, a_{22}, a_{33}, \ldots, a_{nn}$:
$$A = \begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{n1} & \cdots & a_{nn} \end{bmatrix}$$
Then, the multiplicative identity for the above $n \times n$ matrix is
$$I_n = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & 1 \end{bmatrix}$$
with all ones on the main diagonal and zeros elsewhere.
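Result 1.4.1 can be spot-checked numerically. The following is a minimal sketch in plain Python (no libraries); the helper names `matmul` and `identity` are our own, not the book's notation.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def identity(n):
    """I_n: all ones on the main diagonal and zeros elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2], [3, 4]]   # an arbitrary 2 x 2 matrix
I2 = identity(2)
left = matmul(I2, A)   # I_2 * A, should equal A
right = matmul(A, I2)  # A * I_2, should equal A
```

Running this confirms $I_2 A = A I_2 = A$, as verified symbolically in Example 1.4.1.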
1.5 Inverse Square Matrix

In this section, we discuss the concept of the inverse of a square matrix, and we give some examples of elementary matrices.

Example 1.5.1 Given the following matrices with the following row-operation steps:
$$Z = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 0 & 1 & -1 & 2 \\ 0 & 1 & 1 & 3 \end{bmatrix}, \qquad W = \begin{bmatrix} -2 & -4 & -6 & -8 \\ 0 & 1 & -1 & 2 \\ 0 & 1 & 1 & 3 \end{bmatrix}$$
$$Z \xrightarrow{-2R_1} W \xrightarrow{2R_1 + R_3 \to R_3} \cdot \xrightarrow{-3R_1 + R_2 \to R_2} B \xrightarrow{R_3 \leftrightarrow R_1} C$$

a) Find a matrix $D$ such that $DZ = W$.
b) Find a matrix $K$ such that $KW = Z$.
c) Find elementary matrices $F_1, F_2, \ldots, F_m$ such that $F_1 F_2 \cdots F_m Z = C$.
d) Find elementary matrices $L_1, L_2, \ldots, L_n$ such that $L_1 L_2 \cdots L_n B = Z$.
e) Find a matrix $S$ such that $SC = B$.
f) Find a matrix $X$ such that $XB = C$.

Solution: Part a: Since $Z$ is a $3 \times 4$ matrix and we multiply from the left by $-2$, $D$ must be a square matrix; hence $D$ must be $3 \times 3$. We use the multiplicative identity and apply $-2R_1$ to it:
$$I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \xrightarrow{-2R_1} \begin{bmatrix} -2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = D$$
$D$ is called an Elementary Matrix of Type I. Thus, $\begin{bmatrix} -2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} Z = W$.

Part b: Since $W$ is a $3 \times 4$ matrix and we multiply from the left by $-\frac{1}{2}$ (because the inverse of $-2$ is $-\frac{1}{2}$), $K$ must be a square matrix; hence $K$ must be $3 \times 3$. We apply $-\frac{1}{2}R_1$ to the multiplicative identity:
$$I_3 \xrightarrow{-\frac{1}{2}R_1} \begin{bmatrix} -\frac{1}{2} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = K$$
$K$ is also called an Elementary Matrix of Type I. Thus, $\begin{bmatrix} -\frac{1}{2} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} W = Z$.

Result 1.5.1 Each row operation corresponds to one and only one elementary matrix.

Part c: According to the given four row-operation steps, we need to find four elementary matrices from $Z$ to $C$, which means $m = 4$ because we go 4 steps forward from $Z$ to $C$.
Step one ($-2R_1$), from part a: $I_3 \xrightarrow{-2R_1} \begin{bmatrix} -2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$.
Step two: $I_3 \xrightarrow{2R_1 + R_3 \to R_3} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 2 & 0 & 1 \end{bmatrix}$.
Step three: $I_3 \xrightarrow{-3R_1 + R_2 \to R_2} \begin{bmatrix} 1 & 0 & 0 \\ -3 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$.
Step four: $I_3 \xrightarrow{R_3 \leftrightarrow R_1} \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}$.
Hence, we obtain the following four elementary matrices:
$$\begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ -3 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 2 & 0 & 1 \end{bmatrix} \begin{bmatrix} -2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} Z = C.$$

Part d: According to the given three row-operation steps from $B$ to $Z$, we need to find three elementary matrices, which means $n = 3$ because we go 3 steps backward from $B$ to $Z$. Backward steps mean that we do the inverse operations, starting from the third step.
The inverse of $-3R_1 + R_2 \to R_2$ is $3R_1 + R_2 \to R_2$ because it is a row-addition step: $I_3 \xrightarrow{3R_1 + R_2 \to R_2} \begin{bmatrix} 1 & 0 & 0 \\ 3 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$.
The inverse of $2R_1 + R_3 \to R_3$ is $-2R_1 + R_3 \to R_3$ because it is a row-addition step: $I_3 \xrightarrow{-2R_1 + R_3 \to R_3} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ -2 & 0 & 1 \end{bmatrix}$.
The inverse of $-2R_1$ is $-\frac{1}{2}R_1$ because it is a row-multiplication step: $I_3 \xrightarrow{-\frac{1}{2}R_1} \begin{bmatrix} -\frac{1}{2} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$.
Hence, we obtain the following three elementary matrices:
$$\begin{bmatrix} -\frac{1}{2} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ -2 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 3 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} B = Z.$$

Part e: According to the given last row-operation step from $C$ back to $B$, we need one elementary matrix because we go 1 step backward from $C$ to $B$. The inverse of $R_3 \leftrightarrow R_1$ is $R_1 \leftrightarrow R_3$, which is the same as $R_3 \leftrightarrow R_1$:
$$I_3 \xrightarrow{R_3 \leftrightarrow R_1} \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}$$
Hence, $S = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}$ and $SC = B$.

Part f: According to the given last row-operation step from $B$ to $C$, we need one elementary matrix because we go 1 step forward from $B$ to $C$. Similarly,
$$X = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}, \qquad XB = C.$$
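Result 1.5.1 can be illustrated in code: performing a row operation on $I_3$ yields an elementary matrix $E$, and left-multiplying by $E$ applies that same operation. This is a plain-Python sketch; the helper name `matmul` is ours, not the book's.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Z from Example 1.5.1.
Z = [[1, 2, 3, 4],
     [0, 1, -1, 2],
     [0, 1, 1, 3]]

# D = I_3 after the operation -2R1 (scale row 1 by -2), as in part a.
D = [[-2, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]

W = matmul(D, Z)  # should equal Z with its first row scaled by -2
```

The product `W` reproduces the matrix $W$ of the example, confirming $DZ = W$.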
Example 1.5.2 Given the following matrix:
$$A = \begin{bmatrix} 2 & 1 \\ 4 & 0 \end{bmatrix}$$
Find $A^{-1}$ if possible.

Solution: Finding $A^{-1}$, if possible, means that we need to find a matrix $A^{-1}$ such that $A A^{-1} = I_2 = A^{-1} A$. To find this matrix, we do the following steps:

Step 1: Write $A$ and $I_2$ side by side in the form $(A \mid I_2)$:
$$\left(\begin{array}{cc|cc} 2 & 1 & 1 & 0 \\ 4 & 0 & 0 & 1 \end{array}\right)$$

Step 2: Do row operations until the left side becomes the Completely-Reduced Echelon matrix. If it is the identity $I_2$, then $A^{-1}$ exists and appears on the right side; otherwise, $A$ has no inverse.

Now, let's do this step until we get the Completely-Reduced Echelon matrix:
$$\left(\begin{array}{cc|cc} 2 & 1 & 1 & 0 \\ 4 & 0 & 0 & 1 \end{array}\right) \xrightarrow{\frac{1}{2}R_1} \left(\begin{array}{cc|cc} 1 & \frac{1}{2} & \frac{1}{2} & 0 \\ 4 & 0 & 0 & 1 \end{array}\right) \xrightarrow{-4R_1 + R_2 \to R_2} \left(\begin{array}{cc|cc} 1 & \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & -2 & -2 & 1 \end{array}\right)$$
$$\xrightarrow{-\frac{1}{2}R_2} \left(\begin{array}{cc|cc} 1 & \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & 1 & 1 & -\frac{1}{2} \end{array}\right) \xrightarrow{-\frac{1}{2}R_2 + R_1 \to R_1} \left(\begin{array}{cc|cc} 1 & 0 & 0 & \frac{1}{4} \\ 0 & 1 & 1 & -\frac{1}{2} \end{array}\right)$$
Since we got the Completely-Reduced Echelon matrix, which is the identity matrix $I_2$, $A$ has an inverse matrix $A^{-1}$. Hence,
$$A^{-1} = \begin{bmatrix} 0 & \frac{1}{4} \\ 1 & -\frac{1}{2} \end{bmatrix}.$$
Example 1.5.3 Given the following matrix:
$$A = \begin{bmatrix} 1 & 0 & 2 \\ 0 & 1 & 0 \\ 0 & -1 & 1 \end{bmatrix}$$
Find $A^{-1}$ if possible.

Solution: As before, finding $A^{-1}$, if possible, means finding a matrix $A^{-1}$ such that $A A^{-1} = I_3 = A^{-1} A$.

Step 1: Write $A$ and $I_3$ in the form $(A \mid I_3)$.

Step 2: Do row operations until the left side becomes the Completely-Reduced Echelon matrix:
$$\left(\begin{array}{ccc|ccc} 1 & 0 & 2 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & -1 & 1 & 0 & 0 & 1 \end{array}\right) \xrightarrow{R_2 + R_3 \to R_3} \left(\begin{array}{ccc|ccc} 1 & 0 & 2 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 1 & 1 \end{array}\right)$$
$$\xrightarrow{-2R_3 + R_1 \to R_1} \left(\begin{array}{ccc|ccc} 1 & 0 & 0 & 1 & -2 & -2 \\ 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 1 & 1 \end{array}\right)$$
Since we got the Completely-Reduced Echelon matrix, which is the identity matrix $I_3$, $A$ has an inverse matrix $A^{-1}$. Hence,
$$A^{-1} = \begin{bmatrix} 1 & -2 & -2 \\ 0 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}.$$

Result 1.5.2 Given an $n \times n$ matrix $A$ and the identity matrix $I_n$, if with some row operations we get $(A \mid I_n) \xrightarrow{\text{row operations}} (A' \mid I_n')$, then $A' = I_n' A$.

Result 1.5.3 Given an $n \times n$ matrix $A$ and the identity matrix $I_n$, if with some row operations we get $(A \mid I_n) \xrightarrow{\text{row operations}} (I_n \mid A^{-1})$, then $A A^{-1} = I_n = A^{-1} A$.

Fact 1.5.1 Given an $n \times m$ matrix $A$ and the identity matrix $I_m$, $I_m$ is called a right identity for all $n \times m$ matrices, such that $A I_m = A$ (i.e., given a $3 \times 5$ matrix $A$, then $A I_5 = A$).

Fact 1.5.2 Given an $n \times m$ matrix $A$ and the identity matrix $I_n$, $I_n$ is called a left identity for all $n \times m$ matrices, such that $I_n A = A$ (i.e., given a $3 \times 5$ matrix $A$, then $I_3 A = A$).

Result 1.5.4 Given an $n \times n$ matrix $A$, $A^{-4}$ has a meaning if and only if $A^{-1}$ exists.

Result 1.5.5 Given an $n \times n$ matrix $A$, if $A^{-1}$ exists, then $A^{-4} = A^{-1} \times A^{-1} \times A^{-1} \times A^{-1}$.
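The $(A \mid I_n) \to (I_n \mid A^{-1})$ procedure of Result 1.5.3 can be sketched as a short Gauss-Jordan routine. This is a minimal sketch using exact fractions; the function name `inverse` is ours, and it assumes $A$ is square and invertible.

```python
from fractions import Fraction

def inverse(A):
    """Gauss-Jordan inversion: row-reduce (A | I_n) to (I_n | A^-1)."""
    n = len(A)
    # Build the augmented matrix (A | I_n) with exact arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)] +
         [Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]
    for col in range(n):
        # Find a row with a non-zero pivot and swap it up (R_i <-> R_k).
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row (alpha * R_i).
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # Eliminate the column elsewhere (alpha * R_i + R_k -> R_k).
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]

# A from Example 1.5.3.
A = [[1, 0, 2], [0, 1, 0], [0, -1, 1]]
A_inv = inverse(A)
```

For the matrix of Example 1.5.3 this reproduces the inverse found by hand above.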
1.6 Transpose Matrix

In this section, we introduce the concept of the transpose of a matrix, and we discuss some examples of symmetric and skew-symmetric matrices.

Definition 1.6.1 Given an $n \times m$ matrix $A$, $A^{T}$ is called the transpose of $A$, and it is an $m \times n$ matrix. We obtain $A^{T}$ from $A$ by making the columns of $A$ rows, or by making the rows of $A$ columns.

Example 1.6.1 Given the following matrix:
$$A = \begin{bmatrix} 2 & 3 & 1 & 0 \\ 5 & 1 & 2 & 3 \\ 1 & 1 & 0 & 5 \end{bmatrix}$$
Find $A^{T}$.

Solution: According to Definition 1.6.1, $A$ is a $3 \times 4$ matrix. Thus, $A^{T}$ should be $4 \times 3$. By making the columns of $A$ rows, we obtain:
$$A^{T} = \begin{bmatrix} 2 & 5 & 1 \\ 3 & 1 & 1 \\ 1 & 2 & 0 \\ 0 & 3 & 5 \end{bmatrix}$$

Definition 1.6.2 Given an $n \times m$ matrix $A$ and its $m \times n$ transpose $A^{T}$, the product $A A^{T}$ is always defined, and it is an $n \times n$ matrix (i.e., let $B = A A^{T}$; then $A_{n \times m} \times A^{T}_{m \times n} = B_{n \times n}$).

Definition 1.6.3 Given an $n \times m$ matrix $A$ and its $m \times n$ transpose $A^{T}$, the product $A^{T} A$ is always defined, and it is an $m \times m$ matrix (i.e., let $C = A^{T} A$; then $A^{T}_{m \times n} \times A_{n \times m} = C_{m \times m}$).
Definition 1.6.4 Given an $n \times n$ matrix $A$, $A$ is symmetric if $A^{T} = A$.

Definition 1.6.5 Given an $n \times n$ matrix $A$, $A$ is skew-symmetric if $A^{T} = -A$.

Example 1.6.2 Given the following matrix:
$$A = \begin{bmatrix} 1 & 5 & 10 \\ 5 & 3 & 6 \\ 10 & 6 & 10 \end{bmatrix}$$
Show that $A$ is symmetric.

Solution: According to Definition 1.6.4, $A$ is a $3 \times 3$ matrix, so $A^{T}$ should be $3 \times 3$. By making the columns of $A$ rows, we obtain $A^{T} = \begin{bmatrix} 1 & 5 & 10 \\ 5 & 3 & 6 \\ 10 & 6 & 10 \end{bmatrix}$. Since $A^{T} = A$, $A$ is symmetric.

Example 1.6.3 Given the following matrix:
$$A = \begin{bmatrix} 0 & 2 & 5 \\ -2 & 0 & 3 \\ -5 & -3 & 0 \end{bmatrix}$$
Show that $A$ is skew-symmetric.

Solution: According to Definition 1.6.5, $A$ is a $3 \times 3$ matrix, so $A^{T}$ should be $3 \times 3$. By making the columns of $A$ rows, we obtain:
$$A^{T} = \begin{bmatrix} 0 & -2 & -5 \\ 2 & 0 & -3 \\ 5 & 3 & 0 \end{bmatrix}$$
Since $A^{T} = -A$, $A$ is skew-symmetric.

Fact 1.6.1 Given an $n \times n$ matrix $A$, if $A$ is skew-symmetric, then all numbers on the main diagonal of $A$ are always zeros.

Fact 1.6.2 Given an $n \times m$ matrix $A$, $(A^{T})^{T} = A$.

Fact 1.6.3 Given an $n \times m$ matrix $A$ and an $m \times k$ matrix $B$, $(AB)^{T} = B^{T} A^{T}$. (Warning: $(AB)^{T} \neq A^{T} B^{T}$.)

Fact 1.6.4 Given $n \times m$ matrices $A$ and $B$, $(A \pm B)^{T} = A^{T} \pm B^{T}$.

Result 1.6.1 Given a matrix $A$ and a constant $\alpha$, if $A$ is symmetric, then $\alpha A$ is symmetric, such that $(\alpha A)^{T} = \alpha A^{T}$.

Result 1.6.2 Given a matrix $A$ and a constant $\alpha$, if $A$ is skew-symmetric, then $\alpha A$ is skew-symmetric, such that $(\alpha A)^{T} = \alpha A^{T}$.

Result 1.6.3 Let $A$ be an $n \times n$ matrix. There exist a symmetric matrix $B$ and a skew-symmetric matrix $C$ such that $A$ is a linear combination of $B$ and $C$. This means that there are numbers $\alpha_1$ and $\alpha_2$ such that $A = \alpha_1 B + \alpha_2 C$.

Proof of Result 1.6.3 We will show that $A$ is a linear combination of $B$ and $C$. Let $B = A + A^{T}$. Then $B$ is symmetric because
$$B^{T} = (A + A^{T})^{T} = A^{T} + (A^{T})^{T} = A^{T} + A = B.$$
Now, let $C = A - A^{T}$. Then $C$ is skew-symmetric because
$$C^{T} = (A - A^{T})^{T} = A^{T} - (A^{T})^{T} = A^{T} - A = -(A - A^{T}) = -C.$$
By using algebra, we do the following:
$$\frac{1}{2}B + \frac{1}{2}C = \frac{1}{2}(A + A^{T}) + \frac{1}{2}(A - A^{T}) = \frac{1}{2}A + \frac{1}{2}A^{T} + \frac{1}{2}A - \frac{1}{2}A^{T} = A.$$
Thus, $A$ is a linear combination of $B$ and $C$.

Example 1.6.4 Given the following matrix:
$$A = \begin{bmatrix} 2 & 1 & 4 \\ 3 & 0 & 1 \\ 5 & 6 & 7 \end{bmatrix}$$
Find a symmetric matrix $B$ and a skew-symmetric matrix $C$ such that $A = \alpha_1 B + \alpha_2 C$ for some numbers $\alpha_1$ and $\alpha_2$.

Solution: As we did in the above proof, we do the following:
$$B = A + A^{T} = \begin{bmatrix} 2 & 1 & 4 \\ 3 & 0 & 1 \\ 5 & 6 & 7 \end{bmatrix} + \begin{bmatrix} 2 & 3 & 5 \\ 1 & 0 & 6 \\ 4 & 1 & 7 \end{bmatrix} = \begin{bmatrix} 4 & 4 & 9 \\ 4 & 0 & 7 \\ 9 & 7 & 14 \end{bmatrix}$$
$$C = A - A^{T} = \begin{bmatrix} 2 & 1 & 4 \\ 3 & 0 & 1 \\ 5 & 6 & 7 \end{bmatrix} - \begin{bmatrix} 2 & 3 & 5 \\ 1 & 0 & 6 \\ 4 & 1 & 7 \end{bmatrix} = \begin{bmatrix} 0 & -2 & -1 \\ 2 & 0 & -5 \\ 1 & 5 & 0 \end{bmatrix}$$
Let $\alpha_1 = \alpha_2 = \frac{1}{2}$. Thus,
$$A = \frac{1}{2}B + \frac{1}{2}C = \frac{1}{2}\begin{bmatrix} 4 & 4 & 9 \\ 4 & 0 & 7 \\ 9 & 7 & 14 \end{bmatrix} + \frac{1}{2}\begin{bmatrix} 0 & -2 & -1 \\ 2 & 0 & -5 \\ 1 & 5 & 0 \end{bmatrix}.$$
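The decomposition of Result 1.6.3 can be checked as follows. This is a sketch in plain Python with exact fractions; the matrix below is our reading of Example 1.6.4, and the helper name `transpose` is ours.

```python
from fractions import Fraction

def transpose(A):
    """Make the columns of A rows."""
    return [list(col) for col in zip(*A)]

A = [[2, 1, 4], [3, 0, 1], [5, 6, 7]]   # the matrix of Example 1.6.4
At = transpose(A)
n = len(A)
half = Fraction(1, 2)

# B = (A + A^T)/2 is symmetric, C = (A - A^T)/2 is skew-symmetric.
B = [[half * (A[i][j] + At[i][j]) for j in range(n)] for i in range(n)]
C = [[half * (A[i][j] - At[i][j]) for j in range(n)] for i in range(n)]
S = [[B[i][j] + C[i][j] for j in range(n)] for i in range(n)]  # B + C
```

The sum `S` recovers `A` exactly, and the symmetry/skew-symmetry conditions of Definitions 1.6.4 and 1.6.5 hold entrywise.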
1.7 Determinants

In this section, we introduce step by step how to find the determinant of a certain matrix. In addition, we discuss some important properties such as invertibility and non-invertibility, and we talk about the effect of row operations on determinants.

Definition 1.7.1 The determinant is a number associated with a square matrix. Given $M_2(\mathbb{R}) = \mathbb{R}^{2 \times 2}$, let $A \in M_2(\mathbb{R})$, where $A$ is a $2 \times 2$ matrix, $A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$. The determinant of $A$ is represented by $\det(A)$ or $|A|$. Hence, $\det(A) = |A| = a_{11}a_{22} - a_{12}a_{21} \in \mathbb{R}$. (Warning: this definition works only for $2 \times 2$ matrices.)

Example 1.7.1 Given the following matrix:
$$A = \begin{bmatrix} 3 & 2 \\ 5 & 7 \end{bmatrix}$$
Find the determinant of $A$.

Solution: Using Definition 1.7.1, we do the following: $\det(A) = |A| = (3)(7) - (2)(5) = 21 - 10 = 11$. Thus, the determinant of $A$ is 11.

Example 1.7.2 Given the following matrix:
$$A = \begin{bmatrix} 1 & 0 & 2 \\ 3 & 1 & -1 \\ 1 & 2 & 4 \end{bmatrix}$$
Find the determinant of $A$.

Solution: Since $A$ is a $3 \times 3$ matrix such that $A \in M_3(\mathbb{R}) = \mathbb{R}^{3 \times 3}$, we cannot use Definition 1.7.1 because it is valid only for $2 \times 2$ matrices. Thus, we use the following method to find the determinant of $A$.

Step 1: Choose any row or any column. It is recommended to choose the one that has the most zeros. In this example, we prefer the second column or the first row. Let's choose the second column: $a_{12} = 0$, $a_{22} = 1$ and $a_{32} = 2$.

Step 2: For each entry of the chosen column, virtually remove its row and its column, and form the cofactor term:
For $a_{12}$ (first row, second column): $(-1)^{1+2}\, a_{12} \det\begin{bmatrix} 3 & -1 \\ 1 & 4 \end{bmatrix}$.
For $a_{22}$ (second row, second column): $(-1)^{2+2}\, a_{22} \det\begin{bmatrix} 1 & 2 \\ 1 & 4 \end{bmatrix}$.
For $a_{32}$ (third row, second column): $(-1)^{3+2}\, a_{32} \det\begin{bmatrix} 1 & 2 \\ 3 & -1 \end{bmatrix}$.

Step 3: Add all of them together:
$$\det(A) = (-1)^{3}(0)\det\begin{bmatrix} 3 & -1 \\ 1 & 4 \end{bmatrix} + (-1)^{4}(1)\det\begin{bmatrix} 1 & 2 \\ 1 & 4 \end{bmatrix} + (-1)^{5}(2)\det\begin{bmatrix} 1 & 2 \\ 3 & -1 \end{bmatrix}$$
$$\det(A) = (-1)(0)(12 - (-1)) + (1)(1)(4 - 2) + (-1)(2)(-1 - 6) = 0 + 2 + 14 = 16.$$
Thus, the determinant of $A$ is 16.

Result 1.7.1 Let $A \in M_n(\mathbb{R})$. Then, $A$ is invertible (non-singular) if and only if $\det(A) \neq 0$. This result means that if $\det(A) \neq 0$, then $A$ is invertible (non-singular), and if $A$ is invertible (non-singular), then $\det(A) \neq 0$.
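The cofactor-expansion method generalizes to any order as a short recursion. This sketch expands along the first row (the book expands along a column, which gives the same value); the function name `det` is our own helper.

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: virtually remove row 1 and column j+1.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        # Sign (-1)^(1 + (j+1)) = (-1)^j in 0-based indexing.
        total += (-1) ** j * A[0][j] * det(minor)
    return total

d1 = det([[3, 2], [5, 7]])                     # Example 1.7.1
d2 = det([[1, 0, 2], [3, 1, -1], [1, 2, 4]])   # Example 1.7.2
```

Both determinants agree with the hand computations above (11 and 16).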
Example 1.7.3 Given the following matrix:
$$A = \begin{bmatrix} 2 & 3 \\ 4 & 6 \end{bmatrix}$$
Is $A$ invertible (non-singular)?

Solution: Using Result 1.7.1, we do the following: $\det(A) = |A| = (2)(6) - (3)(4) = 12 - 12 = 0$. Since the determinant of $A$ is 0, $A$ is non-invertible (singular). Thus, the answer is No because $A$ is non-invertible (singular).

Definition 1.7.2 Given $A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$, assume that $\det(A) \neq 0$, where $\det(A) = a_{11}a_{22} - a_{12}a_{21}$. To find $A^{-1}$ (the inverse of $A$), we use the following format, which applies only for $2 \times 2$ matrices:
$$A^{-1} = \frac{1}{\det(A)}\begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix} = \frac{1}{a_{11}a_{22} - a_{12}a_{21}}\begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}$$

Example 1.7.4 Given the following matrix:
$$A = \begin{bmatrix} 3 & 2 \\ -4 & 5 \end{bmatrix}$$
Is $A$ invertible (non-singular)? If yes, find $A^{-1}$.

Solution: Using Result 1.7.1, we do the following: $\det(A) = |A| = (3)(5) - (2)(-4) = 15 + 8 = 23 \neq 0$. Since the determinant of $A$ is not 0, $A$ is invertible (non-singular). Thus, the answer is Yes, and there exists $A^{-1}$ according to Definition 1.7.2:
$$A^{-1} = \frac{1}{\det(A)}\begin{bmatrix} 5 & -2 \\ 4 & 3 \end{bmatrix} = \frac{1}{23}\begin{bmatrix} 5 & -2 \\ 4 & 3 \end{bmatrix} = \begin{bmatrix} \frac{5}{23} & -\frac{2}{23} \\ \frac{4}{23} & \frac{3}{23} \end{bmatrix}$$
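The $2 \times 2$ formula of Definition 1.7.2 translates directly into code. This sketch uses exact fractions; the function name `inverse2` is ours, and the function assumes $\det(A) \neq 0$.

```python
from fractions import Fraction

def inverse2(A):
    """Inverse of a 2x2 matrix via Definition 1.7.2 (assumes det != 0)."""
    (a11, a12), (a21, a22) = A
    d = a11 * a22 - a12 * a21      # det(A), must be non-zero
    return [[Fraction(a22, d), Fraction(-a12, d)],
            [Fraction(-a21, d), Fraction(a11, d)]]

A = [[3, 2], [-4, 5]]              # Example 1.7.4, det(A) = 23
A_inv = inverse2(A)
```

The result matches the hand computation: $\frac{1}{23}\begin{bmatrix} 5 & -2 \\ 4 & 3 \end{bmatrix}$.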
Result 1.7.2 Let $A \in M_n(\mathbb{R})$ be a triangular matrix. Then, $\det(A)$ is the multiplication of the numbers on the main diagonal of $A$. There are three types of triangular matrix:

a) Upper Triangular Matrix: it has all zeros below the main diagonal of the $n \times n$ matrix (i.e., $A = \begin{bmatrix} 1 & 7 & 3 \\ 0 & 2 & 5 \\ 0 & 0 & 4 \end{bmatrix}$ is an Upper Triangular Matrix).

b) Diagonal Matrix: it has all zeros both below and above the main diagonal of the $n \times n$ matrix (i.e., $A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 4 \end{bmatrix}$ is a Diagonal Matrix).

c) Lower Triangular Matrix: it has all zeros above the main diagonal of the $n \times n$ matrix (i.e., $A = \begin{bmatrix} 1 & 0 & 0 \\ 5 & 2 & 0 \\ 1 & 9 & 4 \end{bmatrix}$ is a Lower Triangular Matrix).
Fact 1.7.1 Let $A \in M_n(\mathbb{R})$. Then, $\det(A) = \det(A^{T})$.

Fact 1.7.2 Let $A \in M_n(\mathbb{R})$. If $A$ is an invertible (non-singular) matrix, then $A^{T}$ is also an invertible (non-singular) matrix (i.e., $(A^{T})^{-1} = (A^{-1})^{T}$).

Proof of Fact 1.7.2 We will show that $(A^{T})^{-1} = (A^{-1})^{T}$. We know from previous results that $A A^{-1} = I_n$. By taking the transpose of both sides, we obtain $(A A^{-1})^{T} = I_n^{T}$, so $(A^{-1})^{T} A^{T} = I_n^{T}$. Since $I_n^{T} = I_n$, then $(A^{-1})^{T} A^{T} = I_n$. Similarly, from $A^{-1} A = I_n$ we obtain $A^{T} (A^{-1})^{T} = I_n$. Thus, $(A^{T})^{-1} = (A^{-1})^{T}$.

The effect of row operations on determinants: Suppose $\alpha$ is a non-zero constant, and $i$ and $k$ are row numbers in the matrix.

* $\alpha R_i$ ($\alpha \neq 0$; multiply a row by a non-zero constant $\alpha$): the determinant is multiplied by $\alpha$. For example,
$$A = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 4 & 1 \\ 2 & 0 & 1 \end{bmatrix} \xrightarrow{3R_2} B = \begin{bmatrix} 1 & 2 & 3 \\ 0 & 12 & 3 \\ 2 & 0 & 1 \end{bmatrix}$$
Assume that $\det(A) = \gamma$, where $\gamma$ is known; then $\det(B) = 3\gamma$. Similarly, if $\det(B) = \beta$ is known, then $\det(A) = \frac{1}{3}\beta$.

* $\alpha R_i + R_k \to R_k$ (multiply a row by a non-zero constant $\alpha$ and add it to another row): it has no effect on the determinant, i.e. if $A \xrightarrow{\alpha R_i + R_k \to R_k} B$, then $\det(B) = \det(A)$.

* $R_i \leftrightarrow R_k$ (interchange two rows): the determinant is multiplied by $-1$.

In general, the effect of column operations on determinants is the same as for row operations.

Example 1.7.5 Given a $4 \times 4$ matrix $A$ with some row operations: $2R_1 \to A_1$, then $3R_3 \to A_2$, then $-2R_4 \to A_3$. If $\det(A) = 4$, find $\det(A_3)$.

Solution: Using what we have learned from the effect of row operations on determinants: $\det(A_1) = 2 \cdot \det(A) = 2 \cdot 4 = 8$, because $A_1$ has the first row of $A$ multiplied by 2. $\det(A_2) = 3 \cdot \det(A_1) = 3 \cdot 8 = 24$, because $A_2$ has the third row of $A_1$ multiplied by 3. Similarly, $\det(A_3) = -2 \cdot \det(A_2) = -2 \cdot 24 = -48$, because $A_3$ has the fourth row of $A_2$ multiplied by $-2$.
Result 1.7.3 Assume $A$ is an $n \times n$ matrix with a given $\det(A) = \gamma$. Let $\alpha$ be a number. Then, $\det(\alpha A) = \alpha^{n} \cdot \gamma$.

Result 1.7.4 Assume $A$ and $B$ are $n \times n$ matrices. Then:
a) $\det(AB) = \det(A) \cdot \det(B)$.
b) Assume $A^{-1}$ exists and $B^{-1}$ exists. Then, $(AB)^{-1} = B^{-1} A^{-1}$.
c) $\det(AB) = \det(BA)$.
d) $\det(A) = \det(A^{T})$.
e) If $A^{-1}$ exists, then $\det(A^{-1}) = \frac{1}{\det(A)}$.

Proof of Result 1.7.4 (b) We will show that $(AB)^{-1} = B^{-1} A^{-1}$. If we multiply $(B^{-1} A^{-1})$ by $(AB)$, we obtain:
$$B^{-1}(A^{-1} A)B = B^{-1} I_n B = B^{-1} B = I_n.$$
Thus, $(AB)^{-1} = B^{-1} A^{-1}$.

Proof of Result 1.7.4 (e) We will show that $\det(A^{-1}) = \frac{1}{\det(A)}$. Since $A A^{-1} = I_n$, then $\det(A A^{-1}) = \det(I_n) = 1$. By part (a), $\det(A) \cdot \det(A^{-1}) = 1$. Thus, $\det(A^{-1}) = \frac{1}{\det(A)}$.
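Results 1.7.3 and 1.7.4(a) can be spot-checked on $2 \times 2$ matrices, where the determinant is just $ad - bc$. The helper names below are our own.

```python
def det2(M):
    """Determinant of a 2x2 matrix (Definition 1.7.1)."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul2(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, -2]]
alpha = 3
alphaA = [[alpha * x for x in row] for row in A]

lhs_prod = det2(matmul2(A, B))     # det(AB)
rhs_prod = det2(A) * det2(B)       # det(A) * det(B)   (Result 1.7.4 a)
lhs_scale = det2(alphaA)           # det(alpha * A)
rhs_scale = alpha ** 2 * det2(A)   # alpha^n * det(A), n = 2 (Result 1.7.3)
```

Both pairs of quantities agree, illustrating the two results for this choice of $A$, $B$ and $\alpha$.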
1.8 Cramer's Rule

In this section, we discuss how to use Cramer's Rule to solve systems of linear equations.

Definition 1.8.1 Given an $n \times n$ system of linear equations, let $C\mathbf{x} = \mathbf{a}$ be the matrix form of the given system, with
$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ \vdots \\ x_n \end{bmatrix}, \qquad \mathbf{a} = \begin{bmatrix} a_1 \\ a_2 \\ a_3 \\ \vdots \\ a_n \end{bmatrix}.$$
The system has a unique solution if and only if $\det(C) \neq 0$. Cramer's Rule tells us how to find $x_1, x_2, \ldots, x_n$ as follows. For example, let
$$C = \begin{bmatrix} 1 & 3 & 4 \\ 1 & 2 & 1 \\ 7 & 4 & 3 \end{bmatrix}.$$
Then, the solutions for the system of linear equations are:
$$x_1 = \frac{\det\begin{bmatrix} a_1 & 3 & 4 \\ a_2 & 2 & 1 \\ a_3 & 4 & 3 \end{bmatrix}}{\det(C)}, \qquad x_2 = \frac{\det\begin{bmatrix} 1 & a_1 & 4 \\ 1 & a_2 & 1 \\ 7 & a_3 & 3 \end{bmatrix}}{\det(C)}, \qquad x_3 = \frac{\det\begin{bmatrix} 1 & 3 & a_1 \\ 1 & 2 & a_2 \\ 7 & 4 & a_3 \end{bmatrix}}{\det(C)}.$$

Example 1.8.1 Use Cramer's Rule to solve the following system of linear equations:
$$\begin{cases} 2x_1 + 7x_2 = 13 \\ -10x_1 + 3x_2 = -4 \end{cases}$$

Solution: First of all, we write the $2 \times 2$ system in the form $C\mathbf{x} = \mathbf{a}$ according to Definition 1.8.1:
$$\begin{bmatrix} 2 & 7 \\ -10 & 3 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 13 \\ -4 \end{bmatrix}$$
Since $C = \begin{bmatrix} 2 & 7 \\ -10 & 3 \end{bmatrix}$, then $\det(C) = (2 \cdot 3) - (7 \cdot (-10)) = 6 - (-70) = 76 \neq 0$. The solutions for this system of linear equations are:
$$x_1 = \frac{\det\begin{bmatrix} 13 & 7 \\ -4 & 3 \end{bmatrix}}{\det(C)} = \frac{39 + 28}{76} = \frac{67}{76}, \qquad x_2 = \frac{\det\begin{bmatrix} 2 & 13 \\ -10 & -4 \end{bmatrix}}{\det(C)} = \frac{-8 + 130}{76} = \frac{122}{76}.$$
Thus, the solutions are $x_1 = \frac{67}{76}$ and $x_2 = \frac{122}{76}$.
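For a $2 \times 2$ system, Cramer's Rule is a three-line computation. This is a minimal sketch with exact fractions; the function name `cramer2` is ours, and it assumes $\det(C) \neq 0$.

```python
from fractions import Fraction

def cramer2(C, b):
    """Solve a 2x2 system C x = b by Cramer's Rule (assumes det(C) != 0)."""
    d = C[0][0] * C[1][1] - C[0][1] * C[1][0]   # det(C)
    d1 = b[0] * C[1][1] - C[0][1] * b[1]        # column 1 replaced by b
    d2 = C[0][0] * b[1] - b[0] * C[1][0]        # column 2 replaced by b
    return Fraction(d1, d), Fraction(d2, d)

# The system of Example 1.8.1.
C = [[2, 7], [-10, 3]]
b = [13, -4]
x1, x2 = cramer2(C, b)
```

The solution reproduces $x_1 = \frac{67}{76}$ and $x_2 = \frac{122}{76}$ from the example.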
1.9 Adjoint Method

In this section, we introduce a new mathematical method to find the inverse matrix. We also give some examples of using the Adjoint Method to find an inverse matrix and its individual entries.

Example 1.9.1 Given the following matrix:
$$A = \begin{bmatrix} 1 & 0 & 2 \\ 0 & 1 & -2 \\ 0 & 0 & 2 \end{bmatrix}$$
Find $A^{-1}$ using the Adjoint Method.

Solution: To find the inverse matrix of $A$, we need to do the following steps:

Step 1: Find the determinant of matrix $A$, and check whether $A^{-1}$ exists or not. Using the previously discussed method for $3 \times 3$ determinants, we get $\det(A) = 2 \neq 0$. Therefore, $A^{-1}$ exists.

Step 2: Calculate the cofactor matrix $C$ of $A$, where
$$C = \begin{bmatrix} c_{11} & c_{12} & c_{13} \\ c_{21} & c_{22} & c_{23} \\ c_{31} & c_{32} & c_{33} \end{bmatrix}.$$
In general, to find every element of the cofactor matrix, we use the following form:
$$c_{ij} = (-1)^{i+j} \det[\text{Remove the } i\text{th row and } j\text{th column of } A]$$
Now, using the above form, we can find all cofactors of $A$:
$$c_{11} = (-1)^{1+1}\det\begin{bmatrix} 1 & -2 \\ 0 & 2 \end{bmatrix} = 2, \quad c_{12} = (-1)^{1+2}\det\begin{bmatrix} 0 & -2 \\ 0 & 2 \end{bmatrix} = 0, \quad c_{13} = (-1)^{1+3}\det\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} = 0,$$
$$c_{21} = (-1)^{2+1}\det\begin{bmatrix} 0 & 2 \\ 0 & 2 \end{bmatrix} = 0, \quad c_{22} = (-1)^{2+2}\det\begin{bmatrix} 1 & 2 \\ 0 & 2 \end{bmatrix} = 2, \quad c_{23} = (-1)^{2+3}\det\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} = 0,$$
$$c_{31} = (-1)^{3+1}\det\begin{bmatrix} 0 & 2 \\ 1 & -2 \end{bmatrix} = -2, \quad c_{32} = (-1)^{3+2}\det\begin{bmatrix} 1 & 2 \\ 0 & -2 \end{bmatrix} = 2, \quad c_{33} = (-1)^{3+3}\det\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = 1.$$
Hence,
$$C = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ -2 & 2 & 1 \end{bmatrix}.$$

Step 3: Find the adjoint of matrix $A$. In general, $\mathrm{adj}(A) = C^{T}$. Thus,
$$\mathrm{adj}(A) = C^{T} = \begin{bmatrix} 2 & 0 & -2 \\ 0 & 2 & 2 \\ 0 & 0 & 1 \end{bmatrix}.$$
The formula $A \cdot \mathrm{adj}(A) = \det(A) \cdot I_n$ is always true, so if $\det(A) \neq 0$, then $A\left[\frac{1}{\det(A)}\mathrm{adj}(A)\right] = I_n$. Hence,
$$A^{-1} = \frac{1}{\det(A)}\mathrm{adj}(A) = \frac{1}{2}\begin{bmatrix} 2 & 0 & -2 \\ 0 & 2 & 2 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & 1 \\ 0 & 0 & \frac{1}{2} \end{bmatrix}.$$
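The three steps of the Adjoint Method can be sketched as code: compute the cofactor matrix, transpose it, and divide by the determinant. The helper names `det` and `adjoint_inverse` are ours, and the function assumes $\det(A) \neq 0$.

```python
from fractions import Fraction

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] *
               det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(n))

def adjoint_inverse(A):
    """A^-1 = adj(A) / det(A), where adj(A) is the transposed cofactor matrix."""
    n = len(A)
    d = det(A)  # assumed non-zero
    # Cofactor matrix: c_ij = (-1)^(i+j) * det(minor_ij).
    cof = [[(-1) ** (i + j) *
            det([row[:j] + row[j + 1:] for r, row in enumerate(A) if r != i])
            for j in range(n)] for i in range(n)]
    # adj(A) = cof^T, then divide every entry by det(A).
    return [[Fraction(cof[j][i], d) for j in range(n)] for i in range(n)]

A = [[1, 0, 2], [0, 1, -2], [0, 0, 2]]   # Example 1.9.1
A_inv = adjoint_inverse(A)
```

For the matrix of Example 1.9.1 this reproduces the inverse computed by hand above.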
Example 1.9.2 Given the following matrix:
$$A = \begin{bmatrix} 2 & 0 & -2 & 1 \\ -2 & 1 & 2 & 4 \\ -4 & -1 & 3 & 0 \\ 0 & 0 & 0 & 4 \end{bmatrix}$$
Find the $(2,4)$ entry of $A^{-1}$.

Solution: To find the $(2,4)$ entry of $A^{-1}$, we need to do the following steps:

Step 1: Find the determinant of matrix $A$, and check whether $A^{-1}$ exists or not. Using the row-operation method, we obtain:
$$A \xrightarrow[2R_1 + R_3 \to R_3]{R_1 + R_2 \to R_2} \begin{bmatrix} 2 & 0 & -2 & 1 \\ 0 & 1 & 0 & 5 \\ 0 & -1 & -1 & 2 \\ 0 & 0 & 0 & 4 \end{bmatrix} \xrightarrow{R_2 + R_3 \to R_3} \begin{bmatrix} 2 & 0 & -2 & 1 \\ 0 & 1 & 0 & 5 \\ 0 & 0 & -1 & 7 \\ 0 & 0 & 0 & 4 \end{bmatrix}$$
Now, using Result 1.7.2 for the determinant of this upper triangular matrix, and noting that row-addition steps have no effect on the determinant:
$$\det(A) = 2 \cdot 1 \cdot (-1) \cdot 4 = -8 \neq 0.$$
Therefore, $A^{-1}$ exists.

Step 2: Use the following general form for the $(i,k)$ entry of $A^{-1}$:
$$(i,k)\text{ entry of } A^{-1} = \frac{(-1)^{i+k}\det[\text{Remove the } k\text{th row and } i\text{th column of } A]}{\det(A)} = \frac{c_{ki}}{\det(A)}$$
Now, using the above form, we can find the $(2,4)$ entry of $A^{-1}$:
$$\frac{c_{42}}{\det(A)} = \frac{(-1)^{2+4}\det\begin{bmatrix} 2 & -2 & 1 \\ -2 & 2 & 4 \\ -4 & 3 & 0 \end{bmatrix}}{\det(A)}$$

Step 3: Find the determinant of $F = \begin{bmatrix} 2 & -2 & 1 \\ -2 & 2 & 4 \\ -4 & 3 & 0 \end{bmatrix}$ using the row-operation method:
$$F \xrightarrow[2R_1 + R_3 \to R_3]{R_1 + R_2 \to R_2} \begin{bmatrix} 2 & -2 & 1 \\ 0 & 0 & 5 \\ 0 & -1 & 2 \end{bmatrix} \xrightarrow{R_2 \leftrightarrow R_3} D = \begin{bmatrix} 2 & -2 & 1 \\ 0 & -1 & 2 \\ 0 & 0 & 5 \end{bmatrix}$$
Now, using Result 1.7.2 for this upper triangular matrix: $\det(D) = 2 \cdot (-1) \cdot 5 = -10 \neq 0$. Since one row interchange was used, $\det(F) = -\det(D) = 10 \neq 0$. Thus, the $(2,4)$ entry of $A^{-1}$ is:
$$\frac{c_{42}}{\det(A)} = \frac{(-1)^{2+4} \cdot 10}{-8} = \frac{10}{-8} = -1.25.$$

1.10 Exercises

1. Solve the following system of linear equations:
$$\begin{cases} 2x_3 - x_4 + x_6 = 10 \\ -x_1 + 3x_2 + x_5 = 0 \\ x_1 + 2x_2 + 2x_3 - x_4 + 2x_5 + x_6 = 12 \end{cases}$$

2. Given an augmented matrix of a system:
$$\left(\begin{array}{ccc|c} 1 & -1 & 2 & c \\ -1 & 1 & 4 & 3 \\ 4 & -3 & 10 & b \end{array}\right)$$
a. For what values of $c$ and $b$ will the system be consistent?
b. If the system is consistent, when do you have a unique solution?
3. Let $F = \begin{bmatrix} -1 & 1 & 0 & 1 \\ 0 & 1 & 1 & 1 \\ -2 & 1 & 3 & 0 \end{bmatrix}$ and $H = \begin{bmatrix} 2 & 2 & -1 \\ 3 & -4 & 1 \\ 1 & 1 & 1 \\ 1 & 0 & -1 \end{bmatrix}$.
a. Find the third row of $HF$.
b. Find the third column of $FH$.
c. Let $HF = A$. Find $a_{42}$.

4. Let $A = \begin{bmatrix} 1 & 2 & 4 \\ -1 & -2 & -3 \\ -2 & -3 & -8 \end{bmatrix}$. If possible, find $A^{-1}$.

5. Given that $A$ is a $2 \times 4$ matrix such that $A \xrightarrow{R_1 \leftrightarrow R_2} A_1 \xrightarrow{2R_1 + R_2 \to R_2} A_2$.
a. Find two elementary matrices, say $E_1, E_2$, such that $E_1 E_2 A = A_2$.
b. Find two elementary matrices, say $F_1, F_2$, such that $F_1 F_2 A_2 = A$.

6. Let $A = \begin{bmatrix} 1 & 2 & 4 \\ -1 & -2 & 3 \\ -2 & -3 & -6 \end{bmatrix}$. Find $\det(A)$. Is $A$ invertible? Explain.

7. Let $A = \begin{bmatrix} 4 & -2 \\ -3 & 2 \end{bmatrix}$. Is $A$ invertible? If yes, find $A^{-1}$.
8. Use Cramer's Rule to find the solution for $x_2$ in the system:
$$\begin{cases} 2x_1 + x_2 - x_3 = 2 \\ -2x_1 + 4x_2 + 2x_3 = 8 \\ -2x_1 - x_2 + 8x_3 = -2 \end{cases}$$

9. Let $A = \begin{bmatrix} 2 & -4 & -2 & 1 \\ -2 & 2 & 12 & -1 \\ 1 & 0 & 4 & 4 \\ -2 & 2 & -2 & 12 \end{bmatrix}$. Find the $(2,4)$ entry of $A^{-1}$.

10. Find a $2 \times 2$ matrix $A$ such that $A + 3I_2 = 2A + \begin{bmatrix} -2 & 5 \\ 4 & 3 \end{bmatrix}$.

11. Given $A^{-1} = \begin{bmatrix} 2 & -2 & -2 \\ -2 & 3 & 2 \\ -2 & 2 & 3 \end{bmatrix}$ and $B = \begin{bmatrix} 1 & -2 & -2 \\ 0 & -4 & 2 \\ 0 & 1 & -2 \end{bmatrix}$. Solve the system $AX = \begin{bmatrix} 0 \\ -1 \\ 1 \end{bmatrix}$.

Chapter 2
Vector Spaces

We start this chapter by reviewing some concepts of set theory, and we discuss some important concepts of vector spaces, including span and dimension. In the remaining sections we introduce the concept of linear independence. At the end of this chapter we discuss other concepts such as subspace and basis.

2.1 Span and Vector Spaces

In this section, we review some concepts of set theory, and we give an introduction to span and vector spaces, including some examples related to these concepts. Before reviewing the concepts of set theory, it is recommended to revisit Section 1.4 and read the notations of numbers and the representation of the three sets of numbers in Figure 1.4.1. Let's explain some symbols and notations of set theory:

$3 \in \mathbb{Z}$ means that 3 is an element of $\mathbb{Z}$.

$\frac{1}{2} \notin \mathbb{Z}$ means that $\frac{1}{2}$ is not an element of $\mathbb{Z}$.
{ } means that it is a set.
$\{5\}$ means a set that consists of exactly one element, 5; in particular, $\{5\}$ is a subset of $\mathbb{Z}$.

Definition 2.1.1 The span of a certain set is the set of all possible linear combinations of elements of that set.

Example 2.1.1 Find $\mathrm{Span}\{1\}$.
Solution: According to Definition 2.1.1, the span of the set $\{1\}$ is the set of all possible linear combinations of its element, which is 1. Hence, $\mathrm{Span}\{1\} = \mathbb{R}$.

Example 2.1.2 Find $\mathrm{Span}\{(1,2),(2,3)\}$.
Solution: According to Definition 2.1.1, the span of the set $\{(1,2),(2,3)\}$ is the set of all possible linear combinations of its elements, which are $(1,2)$ and $(2,3)$. Thus, the following are some possible linear combinations:
$$(1,2) = 1 \cdot (1,2) + 0 \cdot (2,3)$$
$$(2,3) = 0 \cdot (1,2) + 1 \cdot (2,3)$$
$$(5,8) = 1 \cdot (1,2) + 2 \cdot (2,3)$$
Hence, $\{(1,2),(2,3),(5,8)\} \subseteq \mathrm{Span}\{(1,2),(2,3)\}$.

Example 2.1.3 Find $\mathrm{Span}\{0\}$.
Solution: According to Definition 2.1.1, the span of the set $\{0\}$ is the set of all possible linear combinations of its element, which is 0. Hence, $\mathrm{Span}\{0\} = \{0\}$.

Example 2.1.4 Find $\mathrm{Span}\{c\}$, where $c$ is a non-zero integer.
Solution: Using Definition 2.1.1, the span of the set $\{c\}$ is the set of all possible linear combinations of its element $c \neq 0$. Thus, $\mathrm{Span}\{c\} = \mathbb{R}$.

Definition 2.1.2 $\mathbb{R}^{n} = \{(a_1, a_2, a_3, \ldots, a_n) \mid a_1, a_2, a_3, \ldots, a_n \in \mathbb{R}\}$ is the set of all points where each point has exactly $n$ coordinates.

Definition 2.1.3 $(V, +, \cdot)$ is a vector space if it satisfies the following:
a. For every $v_1, v_2 \in V$, $v_1 + v_2 \in V$.
b. For every $\alpha \in \mathbb{R}$ and $v \in V$, $\alpha v \in V$.
(i.e., given $\mathrm{Span}\{x, y\}$ of a set $\{x, y\}$, then $\sqrt{10}\,x + 2y \in \mathrm{Span}\{x, y\}$. If $v \in \mathrm{Span}\{x, y\}$, then $v = c_1 x + c_2 y$ for some numbers $c_1$ and $c_2$.)
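Membership in a span, as in Example 2.1.2, can be tested by solving for the combination coefficients. For two independent vectors in $\mathbb{R}^2$ this is a $2 \times 2$ system, solved here by Cramer's rule; the function name `combo_coeffs` is our own.

```python
from fractions import Fraction

def combo_coeffs(u, v, w):
    """Solve c1*u + c2*v = w for vectors u, v, w in R^2 (u, v independent)."""
    d = u[0] * v[1] - v[0] * u[1]   # assumed non-zero
    c1 = Fraction(w[0] * v[1] - v[0] * w[1], d)
    c2 = Fraction(u[0] * w[1] - w[0] * u[1], d)
    return c1, c2

# (5,8) in Span{(1,2),(2,3)}? Solve c1*(1,2) + c2*(2,3) = (5,8).
c1, c2 = combo_coeffs((1, 2), (2, 3), (5, 8))
```

The solution $c_1 = 1$, $c_2 = 2$ matches the combination $(5,8) = 1 \cdot (1,2) + 2 \cdot (2,3)$ written out in Example 2.1.2.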
2.2 The Dimension of Vector Space

In this section, we discuss how to find the dimension of a vector space, and how it is related to what we have learned in Section 2.1.

Definition 2.2.1 Given a vector space $V$, the dimension of $V$ is the minimum number of elements needed in $V$ so that their $\mathrm{Span}$ is equal to $V$; it is denoted by $\dim(V)$ (i.e., $\dim(\mathbb{R}) = 1$ and $\dim(\mathbb{R}^{2}) = 2$).

Result 2.2.1 $\dim(\mathbb{R}^{n}) = n$.

Proof of Result 2.2.1 We will show that $\dim(\mathbb{R}^{n}) = n$, illustrating the case $n = 2$.
Claim: $D = \mathrm{Span}\{(1,0),(0,1)\} = \mathbb{R}^{2}$.
$\alpha_1(1,0) + \alpha_2(0,1) = (\alpha_1, \alpha_2) \in \mathbb{R}^{2}$. Thus, $D$ is a subset of $\mathbb{R}^{2}$ ($D \subseteq \mathbb{R}^{2}$). Conversely, for every $x_1, y_1 \in \mathbb{R}$, $(x_1, y_1) \in \mathbb{R}^{2}$, and $(x_1, y_1) = x_1(1,0) + y_1(0,1) \in D$. We proved the claim, and hence $\dim(\mathbb{R}^{n}) = n$.

Fact 2.2.1 $\mathrm{Span}\{(3,4)\} \neq \mathbb{R}^{2}$.

Proof of Fact 2.2.1 We will show that $\mathrm{Span}\{(3,4)\} \neq \mathbb{R}^{2}$.
Claim: $F = \mathrm{Span}\{(3,4)\} \neq \mathbb{R}^{2}$, where $(6,5) \in \mathbb{R}^{2}$. We cannot find a number $\alpha$ such that $(6,5) = \alpha(3,4)$. We proved the claim, and hence $\mathrm{Span}\{(3,4)\} \neq \mathbb{R}^{2}$.

Fact 2.2.2 $\mathrm{Span}\{(1,0),(0,1)\} = \mathbb{R}^{2}$.

Fact 2.2.3 $\mathrm{Span}\{(2,1),(1,0.5)\} \neq \mathbb{R}^{2}$.

2.3 Linear Independence

In this section, we learn how to determine whether vectors are linearly independent or not.

Definition 2.3.1 Given a vector space $(V, +, \cdot)$, we say $v_1, v_2, \ldots, v_n \in V$ are linearly independent if none of them is a linear combination of the remaining $v_i$'s (i.e., $(3,4), (2,0) \in \mathbb{R}^{2}$ are linearly independent because we cannot write them as linear combinations of each other; in other words, we cannot find numbers $\alpha_1, \alpha_2$ such that $(3,4) = \alpha_1(2,0)$ and $(2,0) = \alpha_2(3,4)$).

Definition 2.3.2 Given a vector space $(V, +, \cdot)$, we say $v_1, v_2, \ldots, v_n \in V$ are linearly dependent if at least one of the $v_i$'s is a linear combination of the others.

Example 2.3.1 Assume $v_1$ and $v_2$ are linearly independent. Show that $v_1$ and $3v_1 + v_2$ are linearly independent.
Solution: We will show that $v_1$ and $3v_1 + v_2$ are linearly independent. Using proof by contradiction, we assume that $v_1$ and $3v_1 + v_2$ are linearly dependent. Then, for some non-zero number $c_1$, $v_1 = c_1(3v_1 + v_2)$. Using the distributive property and algebra, we obtain:
$$v_1 = 3c_1 v_1 + c_1 v_2$$
$$v_1 - 3c_1 v_1 = c_1 v_2$$
$$(1 - 3c_1)v_1 = c_1 v_2$$
$$\frac{(1 - 3c_1)}{c_1}v_1 = v_2$$
Thus, $v_2$ is a linear combination of $v_1$, which contradicts the assumption that $v_1$ and $v_2$ are linearly independent. Therefore, our assumption that $v_1$ and $3v_1 + v_2$ are linearly dependent is false. Hence, $v_1$ and $3v_1 + v_2$ are linearly independent.

Example 2.3.2 Given the following vectors:
$$v_1 = (1, 0, -2), \quad v_2 = (-2, 2, 1), \quad v_3 = (-1, 0, 5)$$
Are these vectors independent elements?

Solution: First of all, to determine whether these vectors are independent elements or not, we need to write these vectors as the rows of a matrix:
$$\begin{bmatrix} 1 & 0 & -2 \\ -2 & 2 & 1 \\ -1 & 0 & 5 \end{bmatrix}$$
Each row is one of the given points. We need to reduce this matrix to a Semi-Reduced Matrix.

Definition 2.3.3 A Semi-Reduced Matrix is a reduced matrix whose leading entries can be any non-zero numbers.

Now, we apply the row-reduction method to get the Semi-Reduced Matrix:
$$\begin{bmatrix} 1 & 0 & -2 \\ -2 & 2 & 1 \\ -1 & 0 & 5 \end{bmatrix} \xrightarrow[R_1 + R_3 \to R_3]{2R_1 + R_2 \to R_2} \begin{bmatrix} 1 & 0 & -2 \\ 0 & 2 & -3 \\ 0 & 0 & 3 \end{bmatrix} \quad \text{This is a Semi-Reduced Matrix.}$$
Since none of the rows in the Semi-Reduced Matrix became a zero row, the elements are independent, because we cannot write at least one of them as a linear combination of the others.

Example 2.3.3 Given the following vectors:
$$v_1 = (1, -2, 4, 6), \quad v_2 = (-1, 2, 0, 2), \quad v_3 = (1, -2, 8, 14)$$
Are these vectors independent elements?

Solution: First of all, to determine whether these vectors are independent elements or not, we need to write these vectors as the rows of a matrix:
$$\begin{bmatrix} 1 & -2 & 4 & 6 \\ -1 & 2 & 0 & 2 \\ 1 & -2 & 8 & 14 \end{bmatrix}$$
Each row is one of the given points. We need to reduce this matrix to a Semi-Reduced Matrix.
Now, we apply the row-reduction method to get the Semi-Reduced Matrix:
$$\begin{bmatrix} 1 & -2 & 4 & 6 \\ -1 & 2 & 0 & 2 \\ 1 & -2 & 8 & 14 \end{bmatrix} \xrightarrow[-R_1 + R_3 \to R_3]{R_1 + R_2 \to R_2} \begin{bmatrix} 1 & -2 & 4 & 6 \\ 0 & 0 & 4 & 8 \\ 0 & 0 & 4 & 8 \end{bmatrix} \xrightarrow{-R_2 + R_3 \to R_3} \begin{bmatrix} 1 & -2 & 4 & 6 \\ 0 & 0 & 4 & 8 \\ 0 & 0 & 0 & 0 \end{bmatrix}$$
This is a Semi-Reduced Matrix. Since there is a zero row in the Semi-Reduced Matrix, the elements are dependent, because we can write at least one of them as a linear combination of the others.
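The row-reduction test of Examples 2.3.2 and 2.3.3 can be sketched as a small routine: reduce the matrix of vectors and count the non-zero rows. The helper name `is_independent` is our own (exact fractions, pivot search per column).

```python
from fractions import Fraction

def is_independent(vectors):
    """Vectors are independent iff no zero row appears after reduction."""
    M = [[Fraction(x) for x in v] for v in vectors]
    rank = 0
    for col in range(len(M[0])):
        # Find a row at or below position `rank` with a non-zero pivot.
        pivot = next((r for r in range(rank, len(M)) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        # Eliminate this column in the rows below (alpha*R_i + R_k -> R_k).
        for r in range(rank + 1, len(M)):
            f = M[r][col] / M[rank][col]
            M[r] = [a - f * b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank == len(vectors)

ind = is_independent([(1, 0, -2), (-2, 2, 1), (-1, 0, 5)])            # Example 2.3.2
dep = is_independent([(1, -2, 4, 6), (-1, 2, 0, 2), (1, -2, 8, 14)])  # Example 2.3.3
```

The first set comes out independent and the second dependent, matching the two worked examples.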
2.4 Subspace and Basis

In this section, we discuss one of the most important concepts in linear algebra, known as subspace. In addition, we give some examples explaining how to find the basis for a subspace.

Definition 2.4.1 A subspace is a vector space; we call it a subspace because it lives inside a bigger vector space (i.e., given vector spaces $V$ and $D$, then according to Figure 2.4.1, $D$ is called a subspace of $V$).

Figure 2.4.1: Subspace $D$ of $V$

Fact 2.4.1 Every vector space is a subspace of itself.

Example 2.4.1 Given a vector space $L = \{(c, 3c) \mid c \in \mathbb{R}\}$.
a. Does $L$ live in $\mathbb{R}^{2}$?
b. Does $L$ equal $\mathbb{R}^{2}$?
c. Is $L$ a subspace of $\mathbb{R}^{2}$?
d. Does $L$ equal $\mathrm{Span}\{(0,3)\}$?
e. Does $L$ equal $\mathrm{Span}\{(1,3),(2,6)\}$?

Solution: To answer all these questions, we need first to draw an equation from this vector space, say $y = 3x$. Figure 2.4.2 represents the graph of the above equation; it passes through the point $(1,3)$.

Figure 2.4.2: Graph of $y = 3x$

Now, we can answer the given questions as follows:
Part a: Yes; $L$ lives in $\mathbb{R}^{2}$.
Part b: No; $L$ does not equal $\mathbb{R}^{2}$. To show this, we prove the following claim:
Claim: $L = \mathrm{Span}\{(5,15)\} \neq \mathbb{R}^{2}$, where $(5,15) \in \mathbb{R}^{2}$.
Every element of $L$ is of the form $\alpha(5,15)$; for example, $(20,60) = 4(5,15) \in L$. However, a point of $\mathbb{R}^{2}$ that does not lie on the line $y = 3x$, such as $(1,0)$, cannot be written as $\alpha(5,15)$ for any number $\alpha$. We proved the claim, and thus $L$ does not equal $\mathbb{R}^{2}$.
Part c: Yes; $L$ is a subspace of $\mathbb{R}^{2}$, because $L$ lives inside the bigger vector space $\mathbb{R}^{2}$.
Part d: No; according to the graph in Figure 2.4.2, $(0,3)$ does not belong to $L$.
Part e: Yes; because we can write $(1,3)$ and $(2,6)$ as linear combinations of each other:
$$\alpha_1(1,3) + \alpha_2(2,6) = ((\alpha_1 + 2\alpha_2),\, 3(\alpha_1 + 2\alpha_2))$$
Assume $c = (\alpha_1 + 2\alpha_2)$; then we obtain $\alpha_1(1,3) + \alpha_2(2,6) = \{(c, 3c) \mid c \in \mathbb{R}\} = L$. Thus, $L = \mathrm{Span}\{(1,3),(2,6)\}$.

Result 2.4.1 $L$ is a subspace of $\mathbb{R}^{2}$ if it satisfies the following:
a. $L$ lives inside $\mathbb{R}^{2}$.
b. $L$ contains only lines through the origin $(0,0)$.

Example 2.4.2 Given a vector space $D = \{(a, b, 1) \mid a, b \in \mathbb{R}\}$.
a. Does $D$ live in $\mathbb{R}^{3}$?
b. Is $D$ a subspace of $\mathbb{R}^{3}$?

Solution: Since the equation of the above vector space is a three-dimensional equation, there is no need to draw it, because it is difficult to draw exactly. Thus, we can answer the above questions immediately.
Part a: Yes; $D$ lives inside $\mathbb{R}^{3}$.
Part b: No; since $(0,0,0) \notin D$, $D$ is not a subspace of $\mathbb{R}^{3}$.

Fact 2.4.2 Assume $D$ lives inside $\mathbb{R}^{n}$. If we can write $D$ as a $\mathrm{Span}$, then it is a subspace of $\mathbb{R}^{n}$.
Fact 2.4.3 Assume $D$ lives inside $\mathbb{R}^{n}$. If we cannot write $D$ as a $\mathrm{Span}$, then it is not a subspace of $\mathbb{R}^{n}$.
Fact 2.4.4 Assume $D$ lives inside $\mathbb{R}^{n}$. If $(0,0,0,\ldots,0)$ is in $D$, then $D$ is a subspace of $\mathbb{R}^{n}$.
Fact 2.4.5 Assume $D$ lives inside $\mathbb{R}^{n}$. If $(0,0,0,\ldots,0)$ is not in $D$, then $D$ is not a subspace of $\mathbb{R}^{n}$.

Now, we list the main results on $\mathbb{R}^{n}$:
Result 2.4.2 The maximum number of independent points is $n$.
Result 2.4.3 Choosing any $n$ independent points in $\mathbb{R}^{n}$, say $Q_1, Q_2, \ldots, Q_n$, then $\mathbb{R}^{n} = \mathrm{Span}\{Q_1, Q_2, \ldots, Q_n\}$.
Result 2.4.4 $\dim(\mathbb{R}^{n}) = n$.
Results 2.4.3 and 2.4.4 tell us the following: in order to get all of $\mathbb{R}^{n}$, we need exactly $n$ independent points.
Result 2.4.5 Assume $\mathbb{R}^{n} = \mathrm{Span}\{Q_1, Q_2, \ldots, Q_k\}$; then $k \geq n$ ($n$ points of the $Q_i$'s are independent).

Definition 2.4.2 A basis is the set of points that is needed to $\mathrm{Span}$ the vector space.

Example 2.4.3 Let $D = \mathrm{Span}\{(1,-1,0),(2,2,1),(0,4,1)\}$.
a. Find $\dim(D)$.
b. Find a basis for $D$.

Solution: First of all, we have an infinite set of points, and $D$ lives inside $\mathbb{R}^{3}$. Let's assume the following:
$$v_1 = (1,-1,0), \quad v_2 = (2,2,1), \quad v_3 = (0,4,1)$$
Part a: To find $\dim(D)$, we check whether $v_1, v_2$ and $v_3$ are dependent elements or not, using what we have learned so far from Section 2.3. We need to write these vectors as the rows of a matrix:
$$\begin{bmatrix} 1 & -1 & 0 \\ 2 & 2 & 1 \\ 0 & 4 & 1 \end{bmatrix}$$
Each row is one of the given points. We need to reduce this matrix to a Semi-Reduced Matrix.
ͳ ʹ Ͳ
െͳ ʹ Ͷ
ͳ െͳ Ͳ ͳ൩െʹܴଵ ܴଶ ՜ ܴଶ Ͳ Ͷ Ͳ Ͷ ͳ
Ͳ ͳ൩ െܴଶ ܴଷ ՜ ܴଷ ͳ
ͳ Ͳ Ͳ
െͳ Ͷ Ͳ
Ͳ ͳ൩ This is a Semi-Reduced Matrix. Ͳ
Since there is a zero-row in the Semi-Reduced Matrix, then these elements are dependent because we can write at least one of them as a linear combination of
the others. Only two points survived in the SemiReduced Matrix. Thus, ሺܦሻ ൌ ʹ.
Part b: D is a plane that passes through the origin (0,0,0). Since dim(D) = 2, any two independent points in D will form a basis for D. Hence, the following are some possible bases:
A basis for D is {(1,-1,0),(2,2,1)}. Another basis for D is {(1,-1,0),(0,4,1)}.
Result 2.4.6 It is always true that |Basis| = dim(D).
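A computation like Example 2.4.3 can be double-checked in a few lines of Python. This is an illustrative sketch using SymPy's exact row reduction (the matrix entries come from the example; `rank` and `rowspace` are SymPy methods):

```python
from sympy import Matrix

# Rows are the spanning points of D from Example 2.4.3.
A = Matrix([[1, -1, 0],
            [2,  2, 1],
            [0,  4, 1]])

# The number of independent rows is dim(D).
print(A.rank())        # -> 2

# The independent rows returned by rowspace() form a basis for D.
print(A.rowspace())
```

Any set of two independent rows of the Semi-Reduced Matrix gives an equally valid basis, matching Result 2.4.6.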
Example 2.4.4 Given the following: M = Span{(-1,2,0,0),(1,-2,3,0),(-2,0,3,0)}. Find a basis for M.
Solution: We have an infinite set of points, and M lives inside R^4. Let's assume the following:
v1 = (-1,2,0,0)
v2 = (1,-2,3,0)
v3 = (-2,0,3,0)
We check whether v1, v2 and v3 are dependent elements. Using what we have learned so far from section 2.3 and example 2.4.3, we write these vectors as the rows of a matrix and reduce it to a Semi-Reduced Matrix with the Row-Reduction Method:

[ -1  2 0 0 ]  R1 + R2 -> R2    [ -1  2 0 0 ]                  [ -1  2 0 0 ]
[  1 -2 3 0 ]  -2R1 + R3 -> R3  [  0  0 3 0 ]  -R2 + R3 -> R3  [  0  0 3 0 ]
[ -2  0 3 0 ]                   [  0 -4 3 0 ]                  [  0 -4 0 0 ]

The last matrix is a Semi-Reduced Matrix. Since there is no zero row, these elements are independent: all three points survived. Thus, dim(M) = 3. Since dim(M) = 3, any three independent points in M from the above matrices will form a basis for M. Hence, the following are some possible bases:
A basis for M is {(-1,2,0,0),(0,0,3,0),(0,-4,0,0)}.
Another basis for M is {(-1,2,0,0),(0,0,3,0),(0,-4,3,0)}.
Another basis for M is {(-1,2,0,0),(1,-2,3,0),(-2,0,3,0)}.
Example 2.4.5 Given the following: W = Span{(a, -2a+b, -a) | a,b ∈ R}.
a. Show that W is a subspace of R^3.
b. Find a basis for W.
c. Rewrite W as a Span of specific points.
Solution: We have an infinite set of points, and W lives inside R^3.
Part a: We write each coordinate of W as a linear combination of the free variables a and b:
a = 1·a + 0·b
-2a + b = -2·a + 1·b
-a = -1·a + 0·b
Since it is possible to write each coordinate of W as a linear combination of the free variables a and b, we conclude that W is a subspace of R^3.
Part b: To find a basis for W, we first need to find dim(W). To find dim(W), let's play a game called (ON-OFF GAME) with the free variables a and b:

a  b  Point
1  0  (1,-2,-1)
0  1  (0,1,0)

Now, we check for independence. We already have a Semi-Reduced Matrix:
[ 1 -2 -1 ]
[ 0  1  0 ]
Thus, dim(W) = 2. Hence, the basis for W is {(1,-2,-1),(0,1,0)}.
Part c: Since we found the basis for W, it is easy to rewrite W as a Span: W = Span{(1,-2,-1),(0,1,0)}.
Fact 2.4.6 dim(W) = Number of Free Variables.
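The (ON-OFF GAME) is easy to automate: evaluate the parametrization with one free variable switched on at a time, then check independence. A small sketch (the helper `point` is illustrative, not from the text):

```python
import numpy as np

def point(a, b):
    # Parametrization of W from Example 2.4.5.
    return np.array([a, -2*a + b, -a])

# ON-OFF game: (a, b) = (1, 0) and (0, 1).
rows = np.array([point(1, 0), point(0, 1)])
print(rows)                          # rows (1,-2,-1) and (0,1,0)
print(np.linalg.matrix_rank(rows))   # -> 2, so dim(W) = 2
```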
Example 2.4.6 Given the following: H = Span{(a², 3b + a, -2c, a + b + c) | a,b,c ∈ R}. Is H a subspace of R^4?
Solution: We have an infinite set of points, and H lives inside R^4. We try to write each coordinate of H as a linear combination of the free variables a, b and c:
a² = FixedNumber·a + FixedNumber·b + FixedNumber·c
is impossible: a² is not a linear combination of a, b and c. Concretely, take a = 1 and b = c = 0, so that w = (1,1,0,1) ∈ H. If α = -2, then α·w = -2·(1,1,0,1) = (-2,-2,0,-2) ∉ H, because no real a gives a² = -2. Since it is impossible to write each coordinate of H as a linear combination of the free variables a, b and c, we conclude that H is not a subspace of R^4.
Example 2.4.7 Form a basis for R^4.
Solution: We just need to select any four random independent points, and then we form a 4 × 4 matrix with four independent rows. Let's assume the following:
v1 = (2,3,0,4)
v2 = (0,5,1,1)
v3 = (0,0,2,3)
v4 = (0,0,0,π)   Note: π is a number.

[ 2 3 0 4 ]
[ 0 5 1 1 ]
[ 0 0 2 3 ]
[ 0 0 0 π ]

Thus, the basis for R^4 is {v1, v2, v3, v4}, and Span{v1, v2, v3, v4} = R^4.
Example 2.4.8 Form a basis for R^4 that contains the following two independent points: (0,2,1,4) and (0,-2,3,-10).
Solution: We need to add two more points to the given ones so that all four points are independent. Let's assume the following:
v1 = (0,2,1,4)
v2 = (0,-2,3,-10)
v3 = (3,0,5,30)    This is a random point.
v4 = (0,0,0,1000)  This is a random point.
Then, we need to write these vectors as the rows of a matrix (each point is a row) and reduce it to a Semi-Reduced Matrix with the Row-Reduction Method:

[ 0  2 1    4 ]                 [ 0  2 1    4 ]
[ 0 -2 3  -10 ]  R1 + R2 -> R2  [ 0  0 4   -6 ]
[ 3  0 5   30 ]                 [ 3  0 5   30 ]
[ 0  0 0 1000 ]                 [ 0  0 0 1000 ]

The last matrix is a Semi-Reduced Matrix with no zero row, so the four points are independent. Thus, the basis for R^4 is
{(0,2,1,4),(0,-2,3,-10),(3,0,5,30),(0,0,0,1000)}.
Example 2.4.9 Given the following:
D = Span{(1,1,1,1),(-1,-1,0,0),(0,0,1,1)}
Is (1,1,2,2) ∈ D?
Solution: We have an infinite set of points, and D lives inside R^4. There are two different ways to solve this example:
The First Way: Let's assume the following:
v1 = (1,1,1,1)
v2 = (-1,-1,0,0)
v3 = (0,0,1,1)
We start by asking ourselves the following question:
Question: Can we find α1, α2 and α3 such that (1,1,2,2) = α1·v1 + α2·v2 + α3·v3?
Answer: Yes, but we need to solve the following system of linear equations:
1 = α1 - α2 + 0·α3
1 = α1 - α2 + 0·α3
2 = α1 + 0·α2 + α3
2 = α1 + 0·α2 + α3
Using what we have learned from chapter 1 to solve the above system of linear equations, we obtain, for example, α1 = 2, α2 = 1, α3 = 0, since 2(1,1,1,1) + 1(-1,-1,0,0) = (1,1,2,2). Hence, Yes: (1,1,2,2) ∈ D.
The Second Way (Recommended): We first need to find dim(D), and then a basis for D. We write v1, v2 and v3 as the rows of a matrix and reduce it to a Semi-Reduced Matrix with the Row-Reduction Method:

[  1  1 1 1 ]                 [ 1 1 1 1 ]                  [ 1 1 1 1 ]
[ -1 -1 0 0 ]  R1 + R2 -> R2  [ 0 0 1 1 ]  -R2 + R3 -> R3  [ 0 0 1 1 ]
[  0  0 1 1 ]                 [ 0 0 1 1 ]                  [ 0 0 0 0 ]

The last matrix is a Semi-Reduced Matrix. Since there is a zero row, these elements are dependent. Thus, dim(D) = 2, a basis for D is {(1,1,1,1),(0,0,1,1)}, and
D = Span{(1,1,1,1),(0,0,1,1)}.
Now, we ask ourselves the following question:
Question: Can we find α1 and α2 such that (1,1,2,2) = α1·(1,1,1,1) + α2·(0,0,1,1)?
Answer: Yes:
1 = α1
1 = α1
2 = α1 + α2
2 = α1 + α2
Thus, α1 = α2 = 1. Hence, Yes: (1,1,2,2) ∈ D.
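Membership questions like Example 2.4.9 reduce to solving a linear system, which can be checked numerically. A sketch with NumPy (least squares finds coefficients if they exist; a zero residual means the point lies in the span):

```python
import numpy as np

# Spanning points of D from Example 2.4.9, placed as matrix columns.
V = np.array([[1, 1, 1, 1],
              [-1, -1, 0, 0],
              [0, 0, 1, 1]], dtype=float).T
target = np.array([1, 1, 2, 2], dtype=float)

coeffs, *_ = np.linalg.lstsq(V, target, rcond=None)
print(np.allclose(V @ coeffs, target))   # True -> (1,1,2,2) is in D
```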
2.5 Exercises
1. Let M = Span{(1,-1,1),(-1,0,1),(0,-1,2)}.
a. Find a basis for M.
b. Is (1,-3,5) ∈ M? Why?
2. Let D = {(x,y,z,t) ∈ R^4 | x + 2z + 3t = 0 and y - z + t = 0}.
a. Show that D is a subspace of R^4.
b. Find a basis for D.
c. Write D as a Span.
3. Let E = {(m, m+w, w) | m,w ∈ R}.
a. Show that E is a subspace of R^3.
b. Find a basis for E.
c. Write E as a Span.
4. Find a basis for the subspace F where
F = { [ w-s+3u    4w+3s-9u  2w ]
      [ 8w+2s-6u  5w         0 ]  | w,s,u ∈ R }.
5. Determine the value(s) of x such that the points (1,0,5),(1,2,4),(1,4,x) are dependent.
6. Let D = Span{ [2 -1; 0 1], [1 -1; 0 0.5], [5 -3; 0 2.5] } (2 × 2 matrices written row by row).
a. Find dim(D).
b. Find a basis A for D.
c. Is C = [-2 0; 0 -1] ∈ D? Why?
d. If the answer to part c is yes, write C as a linear combination of the elements in A. Otherwise, write the basis A as a Span.
7. Let K = Span{(1,-1,0),(2,-1,0),(1,0,0)}. Find dim(K).
8. Find a basis for the subspace of R^4 spanned by {(2,9,-2,53),(-3,2,3,-2),(8,-3,-8,17),(0,-3,0,15)}.
9. Does Span{(-2,1,2),(2,1,-1),(2,3,0)} equal R^3?

Chapter 3
Homogeneous Systems
In this chapter, we introduce homogeneous systems, and we discuss how they are related to what we have learned in chapter 2. We start with an introduction to null space and rank. Then, we study one of the most important topics in linear algebra, which is linear transformation. At the end of this chapter we discuss how to find range and kernel, and their relation to sections 3.1 and 3.2.

3.1 Null Space and Rank
In this section, we first give an introduction to homogeneous systems, and we discuss how to find the null space and rank of a homogeneous system. In addition, we explain how to find row space and column space.
Definition 3.1.1 A Homogeneous System is an m × n system of linear equations that has all zero constants. For example, the following is a homogeneous system:
2x1 + x2 - x3 + x4 = 0
3x1 + 5x2 + 3x3 + 4x4 = 0
-x2 + x3 - x4 = 0
Imagine we have the following solution to the homogeneous system: x1 = x2 = x3 = x4 = 0. Then, this solution can be viewed as a point of R^n (here R^4): (0,0,0,0).
Result 3.1.1 The solution set of a homogeneous m × n system can be written in the form {(a1,a2,a3,...,an) | a1,a2,a3,...,an ∈ R}.
Result 3.1.2 All solutions of a homogeneous m × n system form a subset of R^n, where n equals the number of variables.
Result 3.1.3 Given a homogeneous m × n system, we write it in matrix form:
C [x1; x2; x3; ...; xn] = [0; 0; 0; ...; 0]
where C is the coefficient matrix. Then, the set of all solutions of this system is a subspace of R^n.
Proof of Result 3.1.3 We assume that M = (m1,m2,...,mn) and W = (w1,w2,...,wn) are two solutions of the above system, so that
C [m1; m2; ...; mn] = [0; ...; 0] and C [w1; w2; ...; wn] = [0; ...; 0].
We show that M + W is a solution. Using algebra and taking C as a common factor, we obtain:
C [m1+w1; m2+w2; ...; mn+wn] = C [m1; ...; mn] + C [w1; ...; wn] = [0; ...; 0].
Thus, M + W is a solution.
Fact 3.1.1 If M = (m1,m2,...,mn) is a solution and α ∈ R, then αM = (αm1,αm2,...,αmn) is a solution.
Fact 3.1.2 The only system whose solutions form a vector space is the homogeneous system.
Definition 3.1.2 The Null Space of a matrix A is the set of all solutions of the homogeneous system A·x = 0, and it is denoted by Null(A) or N(A).
Definition 3.1.3 The Rank of a matrix A is the number of independent rows or columns of A, and it is denoted by Rank(A).
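Result 3.1.3 and Fact 3.1.1 can be illustrated numerically with the homogeneous system from Definition 3.1.1. A sketch (the particular solution used below was found by hand for this system):

```python
import numpy as np

# Coefficient matrix of the homogeneous system in Definition 3.1.1.
C = np.array([[2, 1, -1, 1],
              [3, 5, 3, 4],
              [0, -1, 1, -1]])

# One solution of C x = 0, and a second one built from it.
M = np.array([0, -7, 1, 8])
W = 3 * M

print(np.allclose(C @ M, 0), np.allclose(C @ W, 0))  # True True
# Closure: the sum and any scalar multiple are again solutions.
print(np.allclose(C @ (M + W), 0))                   # True
print(np.allclose(C @ (-2.5 * M), 0))                # True
```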
Definition 3.1.4 The Row Space of a matrix A is the Span of the independent rows of A, and it is denoted by Row(A).
Definition 3.1.5 The Column Space of a matrix A is the Span of the independent columns of A, and it is denoted by Column(A).
Example 3.1.1 Given the following 3 × 5 matrix:
A = [ 1 -1 2 0 -1 ]
    [ 0  1 2 0  2 ]
    [ 0  0 0 1  0 ]
a. Find Null(A).
b. Find dim(Null(A)).
c. Rewrite Null(A) as a Span.
d. Find Rank(A).
e. Find Row(A).
Solution: Part a: To find the null space of A, we need to solve A·x = 0 as follows:
Step 1: Write the above matrix as an augmented matrix, making all constant terms zeros:
[ 1 -1 2 0 -1 | 0 ]
[ 0  1 2 0  2 | 0 ]
[ 0  0 0 1  0 | 0 ]
Step 2: Apply what we have learned from chapter 1 about solving systems of linear equations using the Row-Operation Method: R2 + R1 -> R1:
[ 1 0 4 0 1 | 0 ]
[ 0 1 2 0 2 | 0 ]
[ 0 0 0 1 0 | 0 ]
This is a Completely-Reduced Matrix.
Step 3: Read the solution for the above system of linear equations after using the Row-Operation Method:
x1 + 4x3 + x5 = 0
x2 + 2x3 + 2x5 = 0
x4 = 0
The free variables are x3 and x5. Assuming x3, x5 ∈ R, the solution of the above homogeneous system is as follows:
x1 = -4x3 - x5
x2 = -2x3 - 2x5
x4 = 0
Thus, according to definition 3.1.2,
Null(A) = {(-4x3 - x5, -2x3 - 2x5, x3, 0, x5) | x3, x5 ∈ R}.
Part b: It is always true that
dim(Null(A)) = dim(N(A)) = The Number of Free Variables.
Here, dim(Null(A)) = 2.
Definition 3.1.6 The Nullity of a matrix A is the dimension of the null space of A, and it is denoted by dim(Null(A)) or dim(N(A)).
Part c: We first need to find a basis for Null(A). To find one, we play a game called (ON-OFF GAME) with the free variables x3 and x5:

x3 x5 Point
1  0  (-4,-2,1,0,0)
0  1  (-1,-2,0,0,1)

The basis for Null(A) is {(-4,-2,1,0,0),(-1,-2,0,0,1)}. Thus,
Null(A) = Span{(-4,-2,1,0,0),(-1,-2,0,0,1)}.
Part d: To find the rank of matrix A, we just need the Semi-Reduced Matrix of A, which we already found in part a; it has three nonzero rows. Thus, Rank(A) = 3.
Part e: To find the row space of matrix A, we just need to write the Span of the independent rows. Thus,
Row(A) = Span{(1,-1,2,0,-1),(0,1,2,0,2),(0,0,0,1,0)}.
It is also a subspace of R^5.
Result 3.1.4 Let A be an m × n matrix. Then, Rank(A) + dim(N(A)) = n = Number of Columns of A.
Result 3.1.5 Let A be an m × n matrix. The geometric meaning of Row(A) = Span{Independent Rows} is that it "lives" inside R^n.
Result 3.1.6 Let A be an m × n matrix. The geometric meaning of Column(A) = Span{Independent Columns} is that it "lives" inside R^m.
Result 3.1.7 Let A be an m × n matrix. Then, Rank(A) = dim(Row(A)) = dim(Column(A)).
Example 3.1.2 Given the following 3 × 5 matrix:
B = [  1  1  1 1 1 ]
    [ -1 -1 -1 0 2 ]
    [  0  0  0 0 0 ]
a. Find Row(B).
b. Find Column(B).
c. Find Rank(B).
Solution: Part a: To find the row space of B, we need to change matrix B to the Semi-Reduced Matrix: R1 + R2 -> R2:
[ 1 1 1 1 1 ]
[ 0 0 0 1 3 ]
[ 0 0 0 0 0 ]
This is a Semi-Reduced Matrix. To find the row space of matrix B, we just need to write the Span of the independent rows. Thus,
Row(B) = Span{(1,1,1,1,1),(0,0,0,1,3)}.
Part b: To find the column space of B, we use the Semi-Reduced Matrix from part a. We need to locate the columns in the Semi-Reduced Matrix of B that contain the leaders (here, the first and fourth columns), and then locate the same columns in the original matrix B:
Semi-Reduced Matrix:      Matrix B:
[ 1 1 1 1 1 ]             [  1  1  1 1 1 ]
[ 0 0 0 1 3 ]             [ -1 -1 -1 0 2 ]
[ 0 0 0 0 0 ]             [  0  0  0 0 0 ]
Each remaining column is a linear combination of the first and fourth columns.
Thus, Column(B) = Span{(1,-1,0),(1,0,0)}.
Part c: To find the rank of matrix B, we just use the Semi-Reduced Matrix from part a. Thus,
Rank(B) = dim(Row(B)) = dim(Column(B)) = 2.
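The null space, rank, and basis computations done by hand in Examples 3.1.1 and 3.1.2 can be verified with SymPy's exact routines; an illustrative sketch for the matrix A of Example 3.1.1:

```python
from sympy import Matrix

A = Matrix([[1, -1, 2, 0, -1],
            [0,  1, 2, 0,  2],
            [0,  0, 0, 1,  0]])

print(A.rank())            # -> 3
nullbasis = A.nullspace()  # basis vectors of Null(A)
print(len(nullbasis))      # -> 2 (Result 3.1.4: 3 + 2 = 5 columns)
for v in nullbasis:
    print(list(v))         # (-4,-2,1,0,0) and (-1,-2,0,0,1)
```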
3.2 Linear Transformation
We start this section with an introduction to polynomials, and we explain how they are similar to R^n as vector spaces. At the end of this section we discuss a new concept called linear transformation.
Before discussing polynomials, we need to know the following mathematical facts:
Fact 3.2.1 R^{m×n} = M_{m×n}(R), the set of all m × n matrices with real entries, is a vector space.
Fact 3.2.2 R^{2×3} is equivalent to R^6 as a vector space. (i.e. [1 2 3; 0 1 1] ↔ (1,2,3,0,1,1)).
Fact 3.2.3 R^{3×2} is equivalent to R^6 as a vector space. (i.e. [1 2; 3 0; 1 1] ↔ (1,2,3,0,1,1)).
After knowing the above facts, we introduce polynomials as follows:
Pn = Set of all polynomials of degree < n.
The algebraic expression of a polynomial has the following form:
an x^n + a(n-1) x^(n-1) + ... + a1 x + a0
where:
- an, a(n-1), ..., a1 are coefficients;
- n, n-1, ... are exponents, which must be non-negative whole numbers;
- a0 is the constant term.
The degree of a polynomial is determined by its highest power (exponent). We list the following examples of polynomials:
- P2 = Set of all polynomials of degree < 2 (i.e. 3x + 2 ∈ P2, 0 ∈ P2, 10 ∈ P2, √3 ∈ P2, but √3·√x ∉ P2).
- P4 = Set of all polynomials of degree < 4 (i.e. 31x² + 4 ∈ P4).
- If P(x) = 3, then deg(P(x)) = 0.
- √x + 3 is not a polynomial.
Result 3.2.1 Pn is a vector space.
Fact 3.2.4 R^{2×3} = M_{2×3}(R) is, as a vector space, the same as R^6.
Result 3.2.2 Pn is a vector space, and it is the same as R^n:
a0 + a1 x + ... + a(n-1) x^(n-1) ↔ (a0, a1, ..., a(n-1)).
Note: The above form is in ascending order.
Result 3.2.3 dim(Pn) = n.
Fact 3.2.5 P3 = Span{3 Independent Polynomials, Each of Degree < 3} (i.e. P3 = Span{1, x, x²}).
Example 3.2.1 Given the following polynomials: 3x² - 2, -5x, 6x² - 10x - 4.
a. Are these polynomials independent?
b. Let D = Span{3x² - 2, -5x, 6x² - 10x - 4}. Find a basis for D.
Solution: Part a: We know that these polynomials live in P3, and as a vector space P3 is the same as R^3. According to result 3.2.2, we make each polynomial equivalent to a point of R^3 as follows:
3x² - 2 = -2 + 0x + 3x² ↔ (-2,0,3)
-5x = 0 - 5x + 0x² ↔ (0,-5,0)
6x² - 10x - 4 = -4 - 10x + 6x² ↔ (-4,-10,6)
Now, we need to write these vectors as the rows of a matrix (each point is a row) and reduce it to a Semi-Reduced Matrix with the Row-Reduction Method:

[ -2   0 3 ]                   [ -2   0 3 ]                   [ -2  0 3 ]
[  0  -5 0 ]  -2R1 + R3 -> R3  [  0  -5 0 ]  -2R2 + R3 -> R3  [  0 -5 0 ]
[ -4 -10 6 ]                   [  0 -10 0 ]                   [  0  0 0 ]

The last matrix is a Semi-Reduced Matrix. Since there is a zero row, these elements are dependent. Thus, the answer to this question is NO.
Part b: Since only 2 vectors survived after checking for dependency in part a, the basis for D is {(-2,0,3),(0,-5,0)} ↔ {3x² - 2, -5x}.
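Result 3.2.2's correspondence turns polynomial questions into matrix questions, so Example 3.2.1 can be checked numerically. A sketch using coefficient vectors in ascending order:

```python
import numpy as np

# 3x^2 - 2, -5x, 6x^2 - 10x - 4 as (a0, a1, a2) coefficient vectors.
polys = np.array([[-2, 0, 3],
                  [0, -5, 0],
                  [-4, -10, 6]])

rank = np.linalg.matrix_rank(polys)
print(rank)        # -> 2: only two independent polynomials
print(rank < 3)    # True -> the three are dependent
```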
Result 3.2.4 Given points v1, v2, ..., vk in R^n where k < n, choose one particular point, say Q, such that Q = c1 v1 + c2 v2 + ... + ck vk, where c1, c2, ..., ck are constants. If c1, c2, ..., ck are unique, then v1, v2, ..., vk are independent.
Note: The word "unique" in result 3.2.4 means that there is only one value for each of c1, c2, ..., ck.
Proof of Result 3.2.4 By proof by contradiction, we assume that v1, v2, ..., vk are dependent, say v1 = α2 v2 + α3 v3 + ... + αk vk, where α2, α3, ..., αk are constants. Using algebra, we obtain:
Q = c1 α2 v2 + c1 α3 v3 + ... + c1 αk vk + c2 v2 + ... + ck vk
Q = (c1 α2 + c2) v2 + (c1 α3 + c3) v3 + ... + (c1 αk + ck) vk + 0·v1.
This is a second way of writing Q as a linear combination of v1, ..., vk, in which the coefficient of v1 is 0 rather than c1. This contradicts the uniqueness of c1, ..., ck. Therefore, our assumption that v1, v2, ..., vk were linearly dependent is false. Hence, v1, v2, ..., vk are linearly independent.
Result 3.2.5 Assume v1, v2, ..., vk are independent and Q ∈ Span{v1, v2, ..., vk}. Then, there exist unique numbers c1, c2, ..., ck such that Q = c1 v1 + c2 v2 + ... + ck vk.
Linear Transformation:
Definition 3.2.1 T: V -> W, where V is a domain and W is a co-domain. T is a linear transformation if for every v1, v2 ∈ V and α ∈ R, we have the following: T(α v1 + v2) = α T(v1) + T(v2).
Example 3.2.2 Given T: R^2 -> R^3, where R^2 is a domain and R^3 is a co-domain, with T((a1,a2)) = (3a1 + a2, a2, -a1).
a. Find T((1,1)).
b. Find T((1,0)).
c. Show that T is a linear transformation.
Solution: Part a: Since T((a1,a2)) = (3a1 + a2, a2, -a1) and a1 = a2 = 1, we get T((1,1)) = (3(1) + 1, 1, -1) = (4,1,-1).
Part b: Here a1 = 1 and a2 = 0. Thus, T((1,0)) = (3(1) + 0, 0, -1) = (3,0,-1).
Part c: Proof: We assume that v1 = (a1,a2), v2 = (b1,b2), and α ∈ R. We will show that T is a linear transformation. Using algebra, we start from the Left-Hand-Side (LHS):
α v1 + v2 = (α a1 + b1, α a2 + b2)
T(α v1 + v2) = T((α a1 + b1, α a2 + b2)) = (3α a1 + 3b1 + α a2 + b2, α a2 + b2, -α a1 - b1)
Now, we start from the Right-Hand-Side (RHS):
α T(v1) + T(v2) = α(3a1 + a2, a2, -a1) + (3b1 + b2, b2, -b1)
= (3α a1 + α a2, α a2, -α a1) + (3b1 + b2, b2, -b1)
= (3α a1 + α a2 + 3b1 + b2, α a2 + b2, -α a1 - b1)
The two sides are equal. Thus, T is a linear transformation.
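Definition 3.2.1 can be spot-checked numerically before attempting a proof. A sketch for the T of Example 3.2.2 (random test vectors; a failed check would disprove linearity, while passed checks only support it):

```python
import numpy as np

def T(v):
    # T from Example 3.2.2: T(a1, a2) = (3a1 + a2, a2, -a1).
    a1, a2 = v
    return np.array([3*a1 + a2, a2, -a1])

rng = np.random.default_rng(0)
v1, v2 = rng.normal(size=2), rng.normal(size=2)
alpha = 2.5

# Linearity: T(alpha*v1 + v2) must equal alpha*T(v1) + T(v2).
print(np.allclose(T(alpha*v1 + v2), alpha*T(v1) + T(v2)))  # True
print(T(np.array([1, 1])))                                 # [ 4  1 -1]
```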
Result 3.2.6 Given T: R^n -> R^m, T is a linear transformation exactly when every coordinate of T((a1,a2,a3,...,an)) is a linear combination of the ai's.
Example 3.2.3 Given T: R^3 -> R^4, where R^3 is a domain and R^4 is a co-domain.
a. If T((x1,x2,x3)) = (-3x3 + x1, -10x2, 13, -x3), is T a linear transformation?
b. If T((x1,x2,x3)) = (-3x3 + x1, -10x2, 0, -x3), is T a linear transformation?
Solution: Part a: Since 13 is not a linear combination of x1, x2 and x3, T is not a linear transformation.
Part b: Since 0 is a linear combination of x1, x2 and x3, T is a linear transformation.
Example 3.2.4 Given T: R^2 -> R^2. If T((a1,a2)) = (a1² + a2, -a2), is T a linear transformation?
Solution: Since a1² + a2 is not a linear combination of a1 and a2, T is not a linear transformation.
Example 3.2.5 Given T: R -> R. If T(x) = 10x, is T a linear transformation?
Solution: Since 10x is a linear combination of x, T is a linear transformation.
Example 3.2.6 Find the standard basis for R^2.
Solution: The standard basis for R^2 is the rows of I2. Since I2 = [1 0; 0 1], the standard basis for R^2 is {(1,0),(0,1)}.
Example 3.2.7 Find the standard basis for R^3.
Solution: The standard basis for R^3 is the rows of I3 = [1 0 0; 0 1 0; 0 0 1], i.e. {(1,0,0),(0,1,0),(0,0,1)}.
Example 3.2.8 Find the standard basis for P3.
Solution: The standard basis for P3 is {1, x, x²}.
Example 3.2.9 Find the standard basis for P4.
Solution: The standard basis for P4 is {1, x, x², x³}.
Example 3.2.10 Find the standard basis for R^{2×2} = M_{2×2}(R).
Solution: The standard basis for R^{2×2} = M_{2×2}(R) is
{ [1 0; 0 0], [0 1; 0 0], [0 0; 1 0], [0 0; 0 1] }
because R^{2×2} = M_{2×2}(R) = R^4 as a vector space, where the standard basis for R^4 is the rows of I4, represented here by 2 × 2 matrices.
Example 3.2.11 Let T: R^2 -> R^3 be a linear transformation such that
T(2,0) = (0,1,4)
T(-1,1) = (2,1,5)
Find T(3,5).
Solution: The given points are (2,0) and (-1,1). These two points are independent (neither is a scalar multiple of the other), so every point in R^2 is a linear combination of (2,0) and (-1,1). There exist unique numbers c1 and c2 such that (3,5) = c1(2,0) + c2(-1,1):
3 = 2c1 - c2
5 = c2
Substituting c2 = 5 into 3 = 2c1 - c2, we obtain 3 = 2c1 - 5, so c1 = 4.
Hence, (3,5) = 4(2,0) + 5(-1,1), and
T(3,5) = T(4(2,0) + 5(-1,1)) = 4T(2,0) + 5T(-1,1) = 4(0,1,4) + 5(2,1,5) = (10,9,41).
Thus, T(3,5) = (10,9,41).
Example 3.2.12 Let T: R -> R be a linear transformation such that T(1) = 3. Find T(5).
Solution: Since T is a linear transformation, T(5) = T(5·1) = 5T(1) = 5(3) = 15. If T were not a linear transformation, it would be impossible to find T(5).
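Example 3.2.11's technique — express the input in the given basis, then reuse the coefficients on the images — can be sketched as:

```python
import numpy as np

# Given data from Example 3.2.11.
basis = np.array([[2.0, 0.0], [-1.0, 1.0]])       # rows: (2,0), (-1,1)
images = np.array([[0.0, 1.0, 4.0], [2.0, 1.0, 5.0]])

def T(v):
    # Solve basis^T c = v for the coordinates c, then combine
    # the images of the basis with the same coordinates.
    c = np.linalg.solve(basis.T, v)
    return c @ images

print(T(np.array([3.0, 5.0])))   # [10.  9. 41.]
```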
3.3 Kernel and Range
In this section, we discuss how to find the standard matrix representation, and we give examples of how to find kernel and range.
Definition 3.3.1 Given T: R^n -> R^m, where R^n is a domain and R^m is a co-domain. The Standard Matrix Representation of T is an m × n matrix. This means that it is a dim(Co-Domain) × dim(Domain) matrix.
Definition 3.3.2 Given T: R^n -> R^m. The Kernel of T is the set of all points in the domain whose image equals the origin point, and it is denoted by Ker(T). This means that Ker(T) = Null Space of T.
Definition 3.3.3 The Range of T is the column space of the standard matrix representation, and it is denoted by Range(T).
Example 3.3.1 Given T: R^3 -> R^4, where R^3 is a domain and R^4 is a co-domain, with
T((x1,x2,x3)) = (-5x1, 2x2 + x3, -x1, 0).
a. Find the Standard Matrix Representation.
b. Find T((3,2,1)).
c. Find Ker(T).
d. Find Range(T).
Solution: Part a: According to definition 3.3.1, the Standard Matrix Representation, let's call it M, is 4 × 3 here. We know from section 3.2 that the standard basis for the domain (here R^3) is {(1,0,0),(0,1,0),(0,0,1)}. We assume the following:
v1 = (1,0,0)
v2 = (0,1,0)
v3 = (0,0,1)
Now, we substitute each point of the standard basis for the domain into T((x1,x2,x3)) = (-5x1, 2x2 + x3, -x1, 0) as follows:
T((1,0,0)) = (-5,0,-1,0)
T((0,1,0)) = (0,2,0,0)
T((0,0,1)) = (0,1,0,0)
Our goal is to find M so that T((x1,x2,x3)) = M [x1; x2; x3]:
M = [ -5 0 0 ]
    [  0 2 1 ]
    [ -1 0 0 ]
    [  0 0 0 ]
This is the Standard Matrix Representation. The first, second and third columns represent T(v1), T(v2) and T(v3).
Part b: Since T((x1,x2,x3)) = M [x1; x2; x3],
T((3,2,1)) = 3·(-5,0,-1,0) + 2·(0,2,0,0) + 1·(0,1,0,0) = (-15,5,-3,0).
This lives in the co-domain. Thus, T((3,2,1)) = (-15,5,-3,0).
Part c: According to definition 3.3.2, Ker(T) is the set of all points in the domain whose image is (0,0,0,0). Hence, T((x1,x2,x3)) = (0,0,0,0), i.e. M [x1; x2; x3] = [0; 0; 0; 0].
Since Ker(T) = Null(M), we need to find N(M) as follows:
[ -5 0 0 | 0 ]  -1/5 R1         [ 1 0 0 | 0 ]            [ 1 0 0   | 0 ]
[  0 2 1 | 0 ]  R1 + R3 -> R3   [ 0 2 1 | 0 ]  1/2 R2    [ 0 1 0.5 | 0 ]
[ -1 0 0 | 0 ]                  [ 0 0 0 | 0 ]            [ 0 0 0   | 0 ]
[  0 0 0 | 0 ]                  [ 0 0 0 | 0 ]            [ 0 0 0   | 0 ]
The last matrix is a Completely-Reduced Matrix. Now, we read it as follows:
x1 = 0
x2 + (1/2) x3 = 0
To write the solution, we assume that x3 ∈ R (Free-Variable). Hence, x1 = 0 and x2 = -(1/2) x3:
N(M) = {(0, -(1/2) x3, x3) | x3 ∈ R}.
Nullity(M) = Number of Free Variables = 1, and by letting x3 = 1, we obtain:
Basis = {(0, -1/2, 1)}
Thus, Ker(T) = N(M) = Span{(0, -1/2, 1)}.
Part d: According to definition 3.3.3, Range(T) is the column space of M. We need to locate the columns in the Completely-Reduced Matrix from part c that contain the leaders (the first and second columns), and then locate the same columns in the original matrix M.
Thus, Range(T) = Span{(-5,0,-1,0),(0,2,0,0)}.
Result 3.3.1 Given T: R^n -> R^m with standard matrix representation M. Then, Range(T) = Span{Independent Columns of M}.
Result 3.3.2 Given T: R^n -> R^m with standard matrix representation M. Then, dim(Range(T)) = Rank(M) = Number of Independent Columns.
Result 3.3.3 Given T: R^n -> R^m with standard matrix representation M. Then, dim(Ker(T)) = Nullity(M).
Result 3.3.4 Given T: R^n -> R^m with standard matrix representation M. Then, dim(Range(T)) + dim(Ker(T)) = dim(Domain).
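The standard matrix representation of Example 3.3.1 — columns are the images of the standard basis — and the bookkeeping of Result 3.3.4 can be sketched as:

```python
import numpy as np

def T(x):
    # T from Example 3.3.1.
    x1, x2, x3 = x
    return np.array([-5*x1, 2*x2 + x3, -x1, 0])

# Standard matrix representation M: the image of each standard basis
# vector of the domain R^3, placed as a column.
M = np.column_stack([T(e) for e in np.eye(3)])
print(np.allclose(M @ np.array([3, 2, 1]), T([3, 2, 1])))  # True

# Rank-nullity (Result 3.3.4): rank(M) + nullity(M) = dim(domain) = 3.
rank = np.linalg.matrix_rank(M)
print(rank, 3 - rank)   # dim(Range(T)) = 2, dim(Ker(T)) = 1
```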
Example 3.3.2 Given T: P2 -> R with T(f(x)) = ∫ from 0 to 1 of f(x) dx; this is a linear transformation.
a. Find T(2x - 1).
b. Find Ker(T).
c. Find Range(T).
Solution:
Part a: T(2x - 1) = ∫ from 0 to 1 of (2x - 1) dx = [x² - x] evaluated from x = 0 to x = 1 = 0.
Part b: To find Ker(T), we set the equation T = 0, with f(x) = a0 + a1 x ∈ P2:
T(f(x)) = ∫ from 0 to 1 of (a0 + a1 x) dx = [a0 x + (a1/2) x²] from x = 0 to x = 1 = a0 + a1/2 = 0
Hence a0 = -(a1/2), and Ker(T) = {-(a1/2) + a1 x | a1 ∈ R}. We also know that dim(Ker(T)) = 1, because there is one free variable. In addition, we can find a basis by letting a1 be any real number not equal to zero, say a1 = 1:
Basis = {-1/2 + x}
Thus, Ker(T) = Span{-1/2 + x}.
Part c: It is very easy to find the range here: Range(T) = R, because T maps the polynomials in P2 onto the real numbers. (For comparison, a linear transformation from third-degree polynomials onto second-degree polynomials would have range P2.)
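Example 3.3.2's kernel can be confirmed symbolically; a sketch with SymPy's `integrate`:

```python
from sympy import Rational, integrate, symbols

x, a1 = symbols('x a1')

def T(f):
    # T from Example 3.3.2: integrate f over [0, 1].
    return integrate(f, (x, 0, 1))

print(T(2*x - 1))                     # -> 0, as in part a
# Every element of Ker(T) = Span{-1/2 + x} integrates to zero:
print(T(a1 * (Rational(-1, 2) + x)))  # -> 0
```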
3.4 Exercises
1. Given T: R^3 -> R^{2×2} such that T((x1,x2,x3)) = [x1 x1; x3 x2] is a linear transformation.
a. Find the standard matrix representation of T.
b. Find Ker(T).
c. Find a basis for Range(T) and write Range(T) as a Span.
2. Let T: R^2 -> R be a linear transformation such that T(1,0) = 4 and T(2,-2) = 2. Find the standard matrix representation of T.
3. Given T: P3 -> R such that T(a + bx + cx²) = ∫ from 0 to 1 of (a + bx + cx²) dx. Find Ker(T).
4. Given T: R^4 -> R^3 is a linear transformation such that T(x,y,z,w) = (x + y + z - 2w, -2w, w).
a. Find the standard matrix representation of T.
b. Find dim(Ker(T)).
c. Find Range(T).
5. Given T: P4 -> R^{2×2} such that T(g(x)) = [g(-1) g(0); g(-1) g(0)] is a linear transformation.
a. Find the standard matrix representation of T.
b. Find Ker(T) and write Ker(T) as a Span.
c. Find a basis for Range(T) and write Range(T) as a Span.
6. Given T: P3 -> R is a linear transformation such that T(1) = 6, T(x² + x) = -5, and T(x² + 2x + 1) = 4.
a. Find T(x), T(x²) and T(5x² + 3x + 8).
b. Find the standard matrix representation of T.
c. Find Ker(T) and write Ker(T) as a Span.
Chapter 4
Characteristic Equation of Matrix
In this chapter, we discuss how to find eigenvalues and eigenvectors of a matrix. Then, we introduce the diagonalizable matrix, and we explain how to determine whether a matrix is diagonalizable or not. At the end of this chapter we discuss how to find the diagonal matrix and the invertible matrix.

4.1 Eigenvalues and Eigenvectors
In this section, we give an example explaining the steps for finding eigenvalues and eigenvectors.
Example 4.1.1 Given an n × n matrix, say A, let α ∈ R. Can we find a point Q in R^n, Q = (a1,a2,...,an), such that Q ≠ (0,0,0,...,0) and
A × Q = A [a1; a2; a3; ...; an] = α [a1; a2; a3; ...; an]?
Solution: If such an α and such a Q exist, we say α is an eigenvalue of A, and we say Q is an eigenvector of A corresponding to the eigenvalue α. Rearranging:
A [a1; a2; ...; an] - α [a1; a2; ...; an] = [0; 0; ...; 0]
[A - α In] [a1; a2; ...; an] = [0; 0; ...; 0]
We conclude that such an α and such a Q ≠ 0 exist if and only if det(A - α In) = 0.
Note: det(A - α In) is called the characteristic polynomial of A.
Example 4.1.2 Given the following 3 × 3 matrix:
A = [ 2 0  1 ]
    [ 0 1 -2 ]
    [ 0 0 -1 ]
a. Find all eigenvalues of A.
b. For each eigenvalue, find the corresponding eigenspace.
Solution: Part a: To find all eigenvalues, we set the characteristic polynomial of A equal to 0. This means that det(A - α I3) = 0. Then, we do the following:
det(A - α I3) = det( [ 2-α  0    1    ]
                     [ 0    1-α  -2   ]
                     [ 0    0    -1-α ] ) = (2-α)·(1-α)·(-1-α) = 0
Thus, the eigenvalues of A are α = 2, α = 1 and α = -1.
Part b: Since the first eigenvalue is 2, the eigenspace of A corresponding to the eigenvalue 2 equals the set of all points in R^3 such that A·(Point) = 2·(Point), i.e. the set of all eigenvectors of A corresponding to α = 2, together with the origin. To find the eigenspace corresponding to α = 2, say E2, we need to find E2 = N(A - 2I3). Note: N here represents Null. Now, we do the following:
E2 = N(A - 2I3) = N( [ 0  0  1 ]
                     [ 0 -1 -2 ]
                     [ 0  0 -3 ] )
Thus, we form the augmented matrix for the homogeneous system as follows:
[ 0  0  1 | 0 ]
[ 0 -1 -2 | 0 ]
[ 0  0 -3 | 0 ]
It is very easy to solve: a3 = 0, -a2 - 2a3 = 0 and -3a3 = 0, so a1 ∈ R (Free-Variable) and a2 = a3 = 0.
E2 = {(a1,0,0) | a1 ∈ R}. Now, we can select any value a1 ≠ 0; let's select a1 = 1. We can also find dim(E2) = 1. Hence, E2 = Span{(1,0,0)}.
Similarly, we can find the eigenspaces corresponding to the other eigenvalues α = 1 and α = -1. To find the eigenspace corresponding to α = 1, say E1, we need to find E1 = N(A - 1·I3):
E1 = N(A - I3) = N( [ 1 0  1 ]
                    [ 0 0 -2 ]
                    [ 0 0 -2 ] )
Thus, as we did previously, we form the augmented matrix for the homogeneous system:
[ 1 0  1 | 0 ]
[ 0 0 -2 | 0 ]
[ 0 0 -2 | 0 ]
It is very easy to solve: a1 = -a3 and a3 = 0, so a2 ∈ R (Free-Variable) and a1 = a3 = 0.
E1 = {(0,a2,0) | a2 ∈ R}. Selecting a2 = 1 (any a2 ≠ 0 works) and noting dim(E1) = 1, we get E1 = Span{(0,1,0)}.
Finally, to find the eigenspace corresponding to α = -1, say E-1, we need to find E-1 = N(A + 1·I3):
E-1 = N(A + I3) = N( [ 3 0  1 ]
                     [ 0 2 -2 ]
                     [ 0 0  0 ] )
Thus, as we did previously, we form the augmented matrix for the homogeneous system:
[ 3 0  1 | 0 ]
[ 0 2 -2 | 0 ]
[ 0 0  0 | 0 ]
It is very easy to solve: a1 = -(1/3) a3 and a2 = a3, with a3 ∈ R (Free-Variable).
E-1 = {(-(1/3) a3, a3, a3) | a3 ∈ R}. Selecting a3 = 1 (any a3 ≠ 0 works) and noting dim(E-1) = 1, we get E-1 = Span{(-1/3, 1, 1)}.
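NumPy can confirm the eigenvalues and eigenvectors of Example 4.1.2 (eigenvectors are returned normalized, so compare directions rather than exact entries):

```python
import numpy as np

# Matrix A from Example 4.1.2.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, -2.0],
              [0.0, 0.0, -1.0]])

values, vectors = np.linalg.eig(A)
print(sorted(values))   # eigenvalues -1, 1, 2 (in ascending order)

# Each column of `vectors` satisfies A v = alpha v.
for alpha, v in zip(values, vectors.T):
    print(np.allclose(A @ v, alpha * v))   # True, three times
```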
M. Kaabar
4.2 Diagonalizable Matrix In
this
section,
we
explain
the
concept
diagonalization, and we give some examples explaining it, and how to find the diagonal and invertible matrices. Definition 4.2.1 ܣis diagonalizable if there exists an invertible matrix ܮ, and a diagonal matrix ܦsuch that ܣൌ ିܮܦܮଵ . Result 4.2.1 ܣis ݊ ൈ ݊diagonalizable matrix if and only if ݀݁ݐሺ ܣെ ߙܫ ሻ is written as multiplication of linear equation, say ݀݁ݐሺ ܣെ ߙܫ ሻ = some constants ሺܿଵ െ ߙ ሻ ή ሺܿଶ െ ߙ ሻ ή ǥ ή ሺܿ െ ߙሻ, and ݀݅݉൫ܧ ൯ ൌ ݊ (Multiplicity of the Eigenvalues) for ݅ ݅ ݇. Example 4.2.1 Assume ܣis ͷ ൈ ͷ matrix, and ݀݁ݐሺ ܣെ ߙܫହ ሻ ൌ ሺ͵ െ ߙ ሻଶ ή ሺെʹ െ ߙ ሻ ή ሺͶ െ ߙ ሻ, and the ߙൌ͵ eigenvalues of ܣare ߙ ൌ െʹ ߙൌͶ
ିܧଵ ൌ ሼሺെ ܽଷ ǡ ܽଷ ǡ ܽଷ ሻȁܽଷ אԹሽ. Now, we can select any
Given ݀݅݉ሺܧଷ ሻ ൌ ͳǡ ݀݅݉ሺିܧଶ ሻ ൌ ͳ ݀݅݉ሺܧସ ሻ ൌ ʹ.
value for ܽଷ ് Ͳ. Let’s select ܽଷ ൌ ͳ. We can also find ݀݅݉ሺିܧଵ ሻ ൌ ͳ.
Is ܣdiagonalizable matrix?
ଵ ଷ
ଵ
Hence, ିܧଵ ൌ ܵ݊ܽሼሺെ ǡ ͳǡͳሻሽ. ଷ
111
of
Solution: It is not a diagonalizable matrix because ݀݅݉ሺܧଷ ሻ ൌ ͳ, and it must be equal to 2 instead of 1.
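As an aside (not part of the original text), the eigenvalue and eigenspace-dimension computations from Example 4.1.2 can be cross-checked numerically; a minimal NumPy sketch:

```python
import numpy as np

# Matrix from Example 4.1.2; it is upper triangular, so its
# eigenvalues 2, 1, -1 sit on the main diagonal.
A = np.array([[2.0, 0.0,  1.0],
              [0.0, 1.0, -2.0],
              [0.0, 0.0, -1.0]])

eigenvalues = np.sort(np.linalg.eigvals(A))
print(eigenvalues)  # -> approximately [-1, 1, 2]

# dim(E_alpha) = 3 - rank(A - alpha*I).  Each eigenspace turns out to be
# one-dimensional, matching each eigenvalue's multiplicity, so A is
# diagonalizable by Result 4.2.1.
for alpha in (2.0, 1.0, -1.0):
    dim_E = 3 - np.linalg.matrix_rank(A - alpha * np.eye(3))
    print(alpha, dim_E)
```

Comparing the geometric multiplicity (the rank computation) against the algebraic multiplicity read off the characteristic polynomial is exactly the test stated in Result 4.2.1.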
Example 4.2.2 Assume A is a 4 × 4 matrix with det(A − αI₄) = (2 − α)³·(3 − α), and given dim(E₂) = 3 and dim(E₃) = 1. Is A a diagonalizable matrix?

Solution: According to Result 4.2.1, it is a diagonalizable matrix.

Example 4.2.3 Given the following 3 × 3 matrix:

A = [2  0   1]
    [0  1  −2]
    [0  0  −1]

Use Example 4.1.2 from Section 4.1 to answer the following questions:
a. Is A a diagonalizable matrix? If yes, find a diagonal matrix D and an invertible matrix L such that A = LDL⁻¹.
b. Find A⁶ given that A = LDL⁻¹.
c. Find A³ given that A = LDL⁻¹.

Solution: Part a: From Example 4.1.2, we found the following:

E₂ = Span{(1, 0, 0)}
E₁ = Span{(0, 1, 0)}
E₋₁ = Span{(−1/3, 1, 1)}

According to Result 4.2.1, A is a diagonalizable matrix. Now, we need to find a diagonal matrix D and an invertible matrix L such that A = LDL⁻¹. To find a diagonal matrix D, we create a 3 × 3 matrix, and we put the eigenvalues on the main diagonal with
repetition if there is a repetition, and all other entries are zeros. Hence, the diagonal matrix D is as follows:

D = [2   0  0]
    [0  −1  0]
    [0   0  1]

To find an invertible matrix L, we create a 3 × 3 matrix like the one above, but each eigenvalue on the diagonal is represented by a column taken from the eigenspace that corresponds to that eigenvalue, as follows:

L = [1  −1/3  0]
    [0   1    1]
    [0   1    0]

Thus,

A = [2 0 1; 0 1 −2; 0 0 −1] = [1 −1/3 0; 0 1 1; 0 1 0] · [2 0 0; 0 −1 0; 0 0 1] · [1 −1/3 0; 0 1 1; 0 1 0]⁻¹ = LDL⁻¹

Part b: To find A⁶ given that A = LDL⁻¹, we do the following steps:

A⁶ = (LDL⁻¹)·(LDL⁻¹)·…·(LDL⁻¹)   (six factors)
A⁶ = (LD²L⁻¹)·…·(LDL⁻¹)   (each interior L⁻¹L cancels)
A⁶ = LD⁶L⁻¹

Thus,

A⁶ = LD⁶L⁻¹ = L · [2 0 0; 0 −1 0; 0 0 1]⁶ · L⁻¹ = L · [2⁶ 0 0; 0 (−1)⁶ 0; 0 0 1⁶] · L⁻¹ = L · [64 0 0; 0 1 0; 0 0 1] · L⁻¹

Part c: To find A³ given that A = LDL⁻¹, we do the following steps:

A³ = (LDL⁻¹)·(LDL⁻¹)·(LDL⁻¹)
A³ = (LD²L⁻¹)·(LDL⁻¹)
A³ = LD³L⁻¹

Thus,

A³ = LD³L⁻¹ = L · [2 0 0; 0 −1 0; 0 0 1]³ · L⁻¹ = L · [2³ 0 0; 0 (−1)³ 0; 0 0 1³] · L⁻¹ = L · [8 0 0; 0 −1 0; 0 0 1] · L⁻¹
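These matrix powers can also be checked numerically; a short NumPy sketch (again an aside, not part of the book's text), using the D and L found in Part a:

```python
import numpy as np

A = np.array([[2.0, 0.0,  1.0],
              [0.0, 1.0, -2.0],
              [0.0, 0.0, -1.0]])
D = np.diag([2.0, -1.0, 1.0])
L = np.array([[1.0, -1.0 / 3.0, 0.0],
              [0.0,  1.0,       1.0],
              [0.0,  1.0,       0.0]])
L_inv = np.linalg.inv(L)

# Part a: A factors as L D L^(-1).
assert np.allclose(A, L @ D @ L_inv)

# Parts b and c: A^n = L D^n L^(-1), and the power of a diagonal
# matrix is just the entrywise power of its diagonal.
A6 = L @ np.diag([2.0**6, (-1.0)**6, 1.0**6]) @ L_inv
A3 = L @ np.diag([2.0**3, (-1.0)**3, 1.0**3]) @ L_inv
assert np.allclose(A6, np.linalg.matrix_power(A, 6))
assert np.allclose(A3, np.linalg.matrix_power(A, 3))
```

The point of the factorization is visible here: once L and D are known, any power Aⁿ costs only one diagonal power and two matrix multiplications.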
4.3 Exercises

1. Given the following 4 × 4 matrix:

C = [1  0   0   0]
    [0  1   1   1]
    [0  0  −1   1]
    [0  0   0  −1]

Is C a diagonalizable matrix?

2. Assume that A is a 5 × 5 diagonalizable matrix, and given the following: E₃ = Span{(2, 1, 0, 0, 1), (0, 1, 0, 1, 1), (0, 0, 2, 2, 0)} with dim(E₃) = 3, and E₂ = Span{(0, 0, 0, 1, 1), (0, 0, 0, 0, 10)}.
a. Find the characteristic polynomial of A.
b. Find a diagonal matrix D such that A = LDL⁻¹.
c. Find an invertible matrix L such that A = LDL⁻¹.

3. Assume that W is a 3 × 3 matrix, and det(W − αI₃) = (1 − α)(2 − α)². Given N(W − I) = Span{(1, 2, 0)} and N(W − 2I) = Span{(2, 0, 3)}, is W a diagonalizable matrix? Explain.

Chapter 5

Matrix Dot Product

In this chapter, we discuss the dot product only in ℝⁿ. In addition, we give some results about the dot product in ℝⁿ. At the end of this chapter, we introduce a concept called “Gram-Schmidt Orthonormalization”.
5.1 The Dot Product in ℝⁿ

In this section, we first give an example of the dot product in ℝⁿ, and then we give four important results related to this concept.

Example 5.1.1 Assume that A = (2, 4, 1, 3) and B = (0, 1, 2, 5), where A, B ∈ ℝ⁴. Find A · B.

Solution: To find A · B, we need to do a simple vector dot product as follows:
A · B = (2, 4, 1, 3) · (0, 1, 2, 5)
A · B = 2·0 + 4·1 + 1·2 + 3·5
A · B = 0 + 4 + 2 + 15
Thus, A · B = 21.

Result 5.1.1 If W₁ and W₂ are in ℝⁿ with W₁ ≠ (0, 0, …, 0) and W₂ ≠ (0, 0, …, 0), and W₁ · W₂ = 0, then W₁ and W₂ are independent.

Result 5.1.2 If W₁ and W₂ in ℝⁿ are independent, it does not necessarily follow that W₁ · W₂ = 0. (For example, assume that W₁ = (1, 1, 1, 1) and W₂ = (0, 1, 1, 1); they are independent, yet W₁ · W₂ = 3.)

Result 5.1.3 If W₁, W₂, W₃, and W₄ are in ℝⁿ and none of them is (0, 0, …, 0), then we say that W₁, W₂, W₃, and W₄ are orthogonal if they satisfy the following conditions:
W₁ · W₂ = 0    W₁ · W₃ = 0    W₁ · W₄ = 0
W₂ · W₃ = 0    W₂ · W₄ = 0    W₃ · W₄ = 0

Result 5.1.4 Assume that W = (x₁, x₂, x₃, …, xₙ); then the squared norm of W is written as follows: ||W||² = x₁² + x₂² + x₃² + ⋯ + xₙ² = W · W.

5.2 Gram-Schmidt Orthonormalization

In this section, we give one example that explains the concept of Gram-Schmidt Orthonormalization, and how it is related to what we have learned in Chapters 2 and 3.

Example 5.2.1 Given the following: A = Span{(1, 0, 1, 1), (0, 1, 0, 1), (0, 1, 1, 1)}. Find the orthogonal basis for A. Hint: finding the orthogonal basis here means applying Gram-Schmidt Orthonormalization.
Solution: To find the orthogonal basis for A, we need to do the following steps:

Step 1: Find a basis for A.

[1  0  1  1]                     [1  0  1  1]
[0  1  0  1]   −R₂ + R₃ → R₃    [0  1  0  1]
[0  1  1  1]                     [0  0  1  0]

This is the Semi-Reduced Matrix. Since we do not have a zero row, dim(A) = 3. To write a basis for A, it is recommended to choose the rows of the above Semi-Reduced Matrix. Thus, a basis for A is {(1, 0, 1, 1), (0, 1, 0, 1), (0, 0, 1, 0)}.

Step 2: Vector space assumptions. Since the basis for A is {(1, 0, 1, 1), (0, 1, 0, 1), (0, 0, 1, 0)}, we assume the following:

v₁ = (1, 0, 1, 1)
v₂ = (0, 1, 0, 1)
v₃ = (0, 0, 1, 0)

Step 3: Use the Gram-Schmidt Orthonormalization method. Let's assume that the orthogonal basis for A is B_orthogonal = {W₁, W₂, W₃}. Now, we need to find W₁, W₂, and W₃. To find them, we do the following:

W₁ = v₁ = (1, 0, 1, 1)

W₂ = v₂ − α₁·W₁ (the previous W), where α₁ = (v₂ · W₁)/||W₁||². Thus,

W₂ = (0, 1, 0, 1) − ( (0, 1, 0, 1) · (1, 0, 1, 1) / ||(1, 0, 1, 1)||² ) · (1, 0, 1, 1)
W₂ = (0, 1, 0, 1) − (1/3)·(1, 0, 1, 1)
W₂ = (0, 1, 0, 1) − (1/3, 0, 1/3, 1/3) = (−1/3, 1, −1/3, 2/3)

W₃ = v₃ − α₂·W₂ − α₁·W₁, where α₁ = (v₃ · W₁)/||W₁||² and α₂ = (v₃ · W₂)/||W₂||². Thus,

W₃ = (0, 0, 1, 0) − ( (0, 0, 1, 0) · (−1/3, 1, −1/3, 2/3) / ||(−1/3, 1, −1/3, 2/3)||² ) · (−1/3, 1, −1/3, 2/3) − ( (0, 0, 1, 0) · (1, 0, 1, 1) / ||(1, 0, 1, 1)||² ) · (1, 0, 1, 1)

Here v₃ · W₂ = −1/3 and ||W₂||² = 5/3, so α₂ = −1/5, while v₃ · W₁ = 1 and ||W₁||² = 3, so α₁ = 1/3. Hence,

W₃ = (0, 0, 1, 0) + (1/5)·(−1/3, 1, −1/3, 2/3) − (1/3)·(1, 0, 1, 1)

Thus, W₃ = (−2/5, 1/5, 3/5, −1/5).

Hence, the orthogonal basis for A (Gram-Schmidt Orthonormalization) is
{(1, 0, 1, 1), (−1/3, 1, −1/3, 2/3), (−2/5, 1/5, 3/5, −1/5)}.

5.3 Exercises

1. Let D = Span{(0, 0, 1, 1), (1, 0, 1, 1), (1, −1, 1, 0)}. Find the orthogonal basis for D.
2. Let E = Span{(1, 1, −1, 0), (0, 1, 1, 1), (3, 5, −1, 2)}. Find the orthogonal basis for E.
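Answers to exercises like these can be checked numerically. The following NumPy sketch of the projection-and-subtract loop from Example 5.2.1 is an editorial aside, not part of the book's text; the helper name `gram_schmidt` is our own:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthogonal (not normalized) basis for span(vectors)."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for b in basis:
            # Subtract the projection onto each earlier basis vector
            # (modified Gram-Schmidt: project the running remainder w).
            w -= (w @ b) / (b @ b) * b
        if not np.allclose(w, 0):  # skip vectors dependent on earlier ones
            basis.append(w)
    return basis

# Basis found in Step 1 of Example 5.2.1.
W = gram_schmidt([(1, 0, 1, 1), (0, 1, 0, 1), (0, 0, 1, 0)])
print(W[2])  # -> approximately [-0.4, 0.2, 0.6, -0.2], i.e. (-2/5, 1/5, 3/5, -1/5)

# All pairwise dot products should vanish (Result 5.1.3).
for i in range(len(W)):
    for j in range(i + 1, len(W)):
        assert abs(W[i] @ W[j]) < 1e-12
```

The dependent-vector check also makes the sketch usable on a spanning set that is not yet a basis, since redundant vectors project to (numerically) zero.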
Answers to Odd-Numbered Exercises

1.10 Exercises

1. x₄, x₅, x₆ ∈ ℝ (free variables), and
x₁ = (3/2)x₄ + (3/4)x₅ − (3/2)x₆ + 15/2
x₂ = −(3/2)x₄ + (13/4)x₅ + x₆ − 1
x₃ = (1/2)x₄ − (1/2)x₆ + 5

3. a. [−3  3  −1  5]
   b. [4]
      [0]
   c. −2

5. a. [1 0; 2 1] · [2 0; 0 1] · A = A₂
   b. [1/2 0; 0 1] · [1 0; −2 1] · A₂ = A

7. A is invertible (non-singular), and A⁻¹ = [3/2  2; 1/4  −1].

9. (−1)^(2+4) · det(⋯) / det(A) = −28/1144 = −7/286

2.5 Exercises

1. a. Basis for M is {(1, −1, 1), (−1, 0, 1)}.
   b. (1, −3, 5) ∈ M because there is a zero row that corresponds to (1, −3, 5), which means that (1, −3, 5) is dependent and can be written as a linear combination of the basis vectors.

3. a. Let v₁ = (x, x + y, y) ∈ E and v₂ = (a, a + z, z) ∈ E; then
   v₁ + v₂ = (x + a, (x + a) + (y + z), y + z) ∈ E.
   For α ∈ ℝ and v₁ = (x, x + y, y) ∈ E,
   αv₁ = (αx, α(x + y), αy) = (αx, αx + αy, αy) ∈ E.
   Thus, E is a subspace of ℝ³.
   b. Basis for E is {(1, 1, 0), (0, 1, 1)}.
   c. E = Span{(1, 1, 0), (0, 1, 1)}.

5. x = 3

7. dim(K) = 2

9. No; since the determinant of

[−2   2  2]
[ 1   1  3]
[ 2  −1  0]

equals zero, the elements {(−2, 1, 2), (2, 1, −1), (2, 3, 0)} do not span ℝ³.

11. x₁ = 0, x₂ = −3, x₃ = 1

3.4 Exercises

1. a. The standard matrix representation of T is
[1  0  0  1]
[0  0  0  0]
[1  0  1  0]
   b. Basis for Ker(T) is {(1, 1, 0, 0), (0, 0, 0, 1), (0, 0, 1, 0)}. Thus, Ker(T) = Span{(1, 1, 0, 0), (0, 0, 0, 1), (0, 0, 1, 0)}.
   c. Basis for Range(T) is {[1 1; 0 0], [0 0; 0 1], [0 0; 1 0]}. Thus, Range(T) = Span{[1 1; 0 0], [0 0; 0 1], [0 0; 1 0]}.

3. Ker(T) = Span{−0.5 + x, −(1/3)x + x²}.

5. a. The standard matrix representation of T is
[1  −1  1  −1]
[1   0  0   0]
[1  −1  1  −1]
[1   0  0   0]
   b. Basis for Ker(T) is {x + x², −x + x³}. Thus, Ker(T) = Span{x + x², −x + x³}.
   c. Basis for Range(T) is {[1 1; 1 1], [−1 0; −1 0]}. Thus, Range(T) = Span{[1 1; 1 1], [−1 0; −1 0]}.

4.3 Exercises

1. Since dim(E₋₁) ≠ 2, C is not diagonalizable.

3. W is not diagonalizable because dim(E₂) must be 2, not 1.

5.3 Exercises

1. The orthogonal basis for D (Gram-Schmidt Orthonormalization) is
{(0, 0, 1, 1), (1, 0, 0, 0), (0, −1, 1/2, −1/2)}.