|| Linear Algebra || books



Matrices | Linear Algebra

01. Linear Algebra and Its Applications, Wayne State University | By Gilbert Strang



Chapter 1: Matrices and Gaussian Elimination

Chapter 2: Vector Spaces

Chapter 3: Orthogonality

Chapter 4: Determinants

Chapter 5: Eigenvalues and Eigenvectors

Chapter 6: Positive Definite Matrices

Chapter 7: Computations with Matrices

Chapter 8: Linear Programming and Game Theory

Appendix A: Intersection, Sum, and Product of Spaces

Appendix B: The Jordan Form


02. Linear Algebra and Its Applications, University of Maryland, College Park | By David C. Lay



Chapter 1: LINEAR EQUATIONS IN LINEAR ALGEBRA


INTRODUCTORY EXAMPLE : Linear Models in Economics and Engineering


1.1 Systems of Linear Equations (2)

1.2 Row Reduction and Echelon Forms (12)

1.3 Vector Equations (24)

1.4 The Matrix Equation Ax = b (35)

1.5 Solution Sets of Linear Systems (43)

1.6 Applications of Linear Systems (50)

1.7 Linear Independence (56)

1.8 Introduction to Linear Transformations (63)

1.9 The Matrix of a Linear Transformation (71)

1.10 Linear Models in Business, Science, and Engineering (81)

Supplementary Exercises (89)


Chapter 2: MATRIX ALGEBRA (93)


INTRODUCTORY EXAMPLE : Computer Models in Aircraft Design

2.1 Matrix Operations (94)

2.2 The Inverse of a Matrix (104)

2.3 Characterization of Invertible Matrices (113)

2.4 Partitioned Matrices (119)

2.5 Matrix Factorizations (125)

2.6 The Leontief Input-Output Model (134)

2.7 Applications to Computer Graphics (140)

2.8 Subspaces of R^n (148)

2.9 Dimension and Rank (155)

Supplementary Exercises


Chapter 3: DETERMINANTS (165)


INTRODUCTORY EXAMPLE : Random Paths and Distortion

3.1 Introduction to Determinants (166)

3.2 Properties of Determinants (171)

3.3 Cramer's Rule, Volume, and Linear Transformations (179)

Supplementary Exercises (188)


Chapter 4: VECTOR SPACES (191)


INTRODUCTORY EXAMPLE : Space Flight and Control Systems

4.1 Vector Spaces and Subspaces (192)

4.2 Null Spaces, Column Spaces, and Linear Transformations (200)

4.3 Linearly Independent Sets, Bases (210)

4.4 Coordinate Systems (218)

4.5 The Dimension of a Vector Space (227)

4.6 Rank (232)

4.7 Change of Basis (241)

4.8 Applications to Difference Equations (246)

4.9 Applications to Markov Chains (255)

Supplementary Exercises (264)


Chapter 5: EIGENVALUES AND EIGENVECTORS (267)


INTRODUCTORY EXAMPLE : Dynamical Systems and Spotted Owls (267)

5.1 Eigenvalues and Eigenvectors (268)

5.2 The Characteristic Equation (276)

5.3 Diagonalization (283)

5.4 Eigenvectors and Linear Transformations (290)

5.5 Complex Eigenvalues (297)

5.6 Discrete Dynamical Systems (303)

5.7 Applications to Differential Equations (313)

5.8 Iterative Estimates for Eigenvalues (321)

Supplementary Exercises (328)


Chapter 6: ORTHOGONALITY AND LEAST SQUARES (331)


INTRODUCTORY EXAMPLE : The North American Datum and GPS Navigation (331)

6.1 Inner Product, Length, and Orthogonality (332)

6.2 Orthogonal Sets (340)

6.3 Orthogonal Projections (349)

6.4 The Gram-Schmidt Process (356)

6.5 Least-Squares Problems (362)

6.6 Applications to Linear Models (370)

6.7 Inner Product Spaces (378)

6.8 Applications of Inner Product Spaces (385)

Supplementary Exercises (392)


Chapter 7: SYMMETRIC MATRICES AND QUADRATIC FORMS (395)


INTRODUCTORY EXAMPLE : Multichannel Image Processing (395)

7.1 Diagonalization of Symmetric Matrices (397)

7.2 Quadratic Forms (403)

7.3 Constrained Optimization (410)

7.4 The Singular Value Decomposition (416)

7.5 Applications to Image Processing and Statistics (426)

Supplementary Exercises (434)

Chapter 8: THE GEOMETRY OF VECTOR SPACES (437)


INTRODUCTORY EXAMPLE : The Platonic Solids (437)

8.1 Affine Combinations (438)

8.2 Affine Independence (446)

8.3 Convex Combinations (456)

8.4 Hyperplanes (463)

8.5 Polytopes (471)

Supplementary Exercises (483)


Chapter 9: OPTIMIZATION (ONLINE)


INTRODUCTORY EXAMPLE : The Berlin Airlift

9.1 Matrix Games

9.2 Linear Programming -- Geometric Method

9.3 Linear Programming -- Simplex Method

9.4 Duality


Chapter 10: FINITE-STATE MARKOV CHAINS (ONLINE)


INTRODUCTORY EXAMPLE : Googling Markov Chains

10.1 Introduction and Examples

10.2 The Steady-State Vector and Google's PageRank

10.3 Communication Classes

10.4 Classification of States

10.5 Fundamental Matrix

10.6 Markov Chains and Baseball Statistics



\[\begin{bmatrix} 1+3\mathrm{i} & 2+\mathrm{i} & 10\\ 4-3\mathrm{i} & 5 & -2 \end{bmatrix}\]

\[A = \begin{bmatrix} 6 & -2 & -1\\ -2 & 6 & -1\\ -1 & -1 & 5 \end{bmatrix}\]

\[x^{n} + y^{n} = z^{n}\]
\[M = \begin{pmatrix} x_{11} & \cdots & x_{1j} \\ \vdots & \ddots & \vdots \\ x_{i1} & \cdots & x_{ij} \end{pmatrix}\]

\begingroup
\renewcommand*{\arraystretch}{1.5}
\[\left[ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]\]
\endgroup
\[\int_{0}^{1}\frac{x^{4}\left(1-x\right)^{4}}{1+x^{2}}\,dx = \frac{22}{7}-\pi\]
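
A quick numerical check of this identity, written as a minimal Python sketch (SciPy's quad routine is assumed to be available):

from scipy.integrate import quad
import math

# Integrate x^4 (1 - x)^4 / (1 + x^2) over [0, 1] and compare with 22/7 - pi.
value, _ = quad(lambda x: x**4 * (1 - x)**4 / (1 + x**2), 0, 1)
print(value)             # approximately 0.001264
print(22 / 7 - math.pi)  # the same value, showing that 22/7 slightly overestimates pi
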
The characteristic equation of A (the \(3\times 3\) matrix above) is \[\det(A-\lambda I) = -\lambda^{3} + 17\lambda^{2} - 90\lambda + 144 = -(\lambda-3)(\lambda-6)(\lambda-8) = 0,\] so the eigenvalues of A are 3, 6, and 8.
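
A short NumPy sketch that confirms this: np.poly returns the coefficients of the characteristic polynomial \(\det(\lambda I - A)\), and eigvalsh handles the symmetric matrix A.

import numpy as np

A = np.array([[ 6, -2, -1],
              [-2,  6, -1],
              [-1, -1,  5]])

print(np.poly(A))             # [1, -17, 90, -144]: lambda^3 - 17 lambda^2 + 90 lambda - 144
print(np.linalg.eigvalsh(A))  # [3., 6., 8.]
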
\[\begin{matrix} 1a & 1b \\ 2a & 2b \end{matrix} \quad \begin{pmatrix} 1a & 1b \\ 2a & 2b \end{pmatrix} \quad \begin{bmatrix} 1a & 1b \\ 2a & 2b \end{bmatrix} \quad \begin{Bmatrix} 1a & 1b \\ 2a & 2b \end{Bmatrix} \quad \begin{vmatrix} 1a & 1b \\ 2a & 2b \end{vmatrix} \quad \begin{Vmatrix} 1a & 1b \\ 2a & 2b \end{Vmatrix}\]

\[\bigl(\begin{smallmatrix} a & b \\ c & d \end{smallmatrix}\bigr) \quad \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\]

\[\left( \begin{matrix} a_1 & b_1 \\ a_2 & b_2 \\ a_3 & b_3 \end{matrix} \left| \begin{matrix} c_1 \\ c_2 \\ c_3 \end{matrix} \right. \right)\]
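
As a rough companion to the augmented matrix \((A \mid c)\) above, here is a Python sketch of row-reducing such a system; the concrete numbers standing in for \(a_i, b_i, c_i\) are made up for illustration, and SymPy's Matrix.rref is assumed.

import sympy as sp

# Hypothetical values for a1..a3, b1..b3, c1..c3 in the augmented matrix [A | c].
aug = sp.Matrix([[1, 2, 5],
                 [3, 4, 6],
                 [5, 6, 7]])

rref, pivot_cols = aug.rref()  # reduced row echelon form via Gaussian elimination
print(rref)                    # Matrix([[1, 0, -4], [0, 1, 9/2], [0, 0, 0]])
print(pivot_cols)              # (0, 1): both unknowns are pivot variables
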
