1. Matrix and Types of Matrix
Definition: A matrix is a rectangular array of numbers (or functions) arranged in m rows and n columns. It is an m × n ("m by n") matrix.
Types of Matrix:
| Type | Description | Example |
|---|---|---|
| Row Matrix (or Row Vector) | A matrix with only one row (1 × n). | [ 1 5 -2 ] |
| Column Matrix (or Column Vector) | A matrix with only one column (m × 1). | [ 3 ] [ 0 ] [ 7 ] |
| Square Matrix | Number of rows equals number of columns (n × n). | [ 1 2 ] [ 3 4 ] |
| Null (or Zero) Matrix | All elements are zero. | [ 0 0 ] [ 0 0 ] |
| Identity (or Unit) Matrix (I) | A square matrix with 1s on the main diagonal (top-left to bottom-right) and 0s elsewhere. | [ 1 0 ] [ 0 1 ] |
| Diagonal Matrix | A square matrix where all non-diagonal elements are zero. | [ 5 0 0 ] [ 0 -2 0 ] [ 0 0 1 ] |
| Symmetric Matrix | A square matrix where A = Aᵀ (aᵢⱼ = aⱼᵢ). | [ 1 5 3 ] [ 5 2 7 ] [ 3 7 4 ] |
2. Matrix Operations
A. Addition and Subtraction
You can only add or subtract matrices that have the same dimensions (same number of rows and columns). The operation is performed element-wise.
B. Scalar Multiplication
To multiply a matrix by a scalar (a single number), you multiply every element in the matrix by that scalar.
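The two element-wise operations above can be sketched in plain Python, representing a matrix as a list of rows (the values here are purely illustrative):

```python
def mat_add(A, B):
    """Add two matrices of identical dimensions, element by element."""
    assert len(A) == len(B) and len(A[0]) == len(B[0]), "dimensions must match"
    return [[a + b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(A, B)]

def scalar_mul(k, A):
    """Multiply every element of A by the scalar k."""
    return [[k * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))     # [[6, 8], [10, 12]]
print(scalar_mul(2, A))  # [[2, 4], [6, 8]]
```

Subtraction follows the same pattern: `mat_add(A, scalar_mul(-1, B))`.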
C. Matrix Multiplication (A × B)
This is the most complex operation. To multiply matrix A (dimensions m × n) by matrix B (dimensions n × p):
- The number of columns in A (n) must equal the number of rows in B; otherwise the product is undefined.
- The resulting matrix C then has dimensions m × p.
Each element cij in the new matrix is found by taking the dot product of the i-th row of A and the j-th column of B.
In general, A × B ≠ B × A.
In fact, B × A might not even be possible if the dimensions don't align.
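The row-by-column dot-product rule can be sketched as follows; the printed results also illustrate that A × B ≠ B × A (the matrices here are illustrative):

```python
def mat_mul(A, B):
    """Multiply an m×n matrix A by an n×p matrix B, giving an m×p matrix."""
    assert len(A[0]) == len(B), "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))  # dot product of
             for j in range(len(B[0]))]                     # row i and column j
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_mul(A, B))  # [[19, 22], [43, 50]]
print(mat_mul(B, A))  # [[23, 34], [31, 46]]  -> not the same as A × B
```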
3. Determinants and their properties
The determinant is a special scalar value that can be computed from a square matrix. It is denoted as det(A) or |A|.
Calculating Determinants:
- For a 2×2 Matrix:
  A = [ a b ]
      [ c d ]
  |A| = ad - bc
- For a 3×3 Matrix (Sarrus's Rule or Cofactor Expansion):
Cofactor expansion is the general method. For any row i or column j, the determinant is the sum of each element multiplied by its cofactor.
|A| = a₁₁C₁₁ + a₁₂C₁₂ + a₁₃C₁₃
Where the cofactor Cᵢⱼ = (-1)ⁱ⁺ʲ × Mᵢⱼ (Mᵢⱼ is the "minor": the determinant of the 2×2 matrix left after removing row i and column j).
A square matrix A is invertible (has an inverse) if and only if its determinant is non-zero (|A| ≠ 0).
If |A| = 0, the matrix is called singular.
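The cofactor expansion above translates naturally into a recursive function; this is a minimal plain-Python sketch (fine for small matrices, though inefficient for large ones):

```python
def det(A):
    """Determinant via cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    if n == 2:
        return A[0][0] * A[1][1] - A[0][1] * A[1][0]  # ad - bc
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then recurse.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)  # sign (-1)^(1+j) with 1-based indices
    return total

print(det([[1, 2], [3, 4]]))                      # -2
print(det([[2, 0, 1], [1, 3, -1], [0, 5, 2]]))    # 27
```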
Properties of Determinants:
- If you swap two rows (or columns), the sign of the determinant flips.
- If a matrix has a row (or column) of all zeros, |A| = 0.
- If a matrix has two identical rows (or columns), |A| = 0.
- |Aᵀ| = |A| (Transpose has the same determinant).
- |A × B| = |A| × |B|
4. Transpose of a matrix
Definition: The transpose of a matrix A, denoted Aᵀ (or A'), is the matrix obtained by swapping its rows and columns.
If A is m × n, then Aᵀ is n × m.
Example:
If A = [ 2 3 ]
       [ 5 6 ]
then Aᵀ = [ 2 5 ]
          [ 3 6 ]
Properties of Transpose:
- (Aᵀ)ᵀ = A
- (A + B)ᵀ = Aᵀ + Bᵀ
- (k × A)ᵀ = k × Aᵀ (where k is a scalar)
- (A × B)ᵀ = Bᵀ × Aᵀ (Note the reversal of order!)
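A transpose sketch in plain Python, including a numeric check of the order-reversal property (A × B)ᵀ = Bᵀ × Aᵀ on illustrative matrices:

```python
def transpose(A):
    """Swap rows and columns: element (i, j) moves to (j, i)."""
    return [list(col) for col in zip(*A)]

def mat_mul(A, B):
    """Row-by-column matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2, 3], [4, 5, 6]]    # 2×3
B = [[1, 0], [0, 1], [2, 2]]  # 3×2
print(transpose(A))           # [[1, 4], [2, 5], [3, 6]]

# (A × B)ᵀ equals Bᵀ × Aᵀ -- note the reversed order on the right.
assert transpose(mat_mul(A, B)) == mat_mul(transpose(B), transpose(A))
```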
5. Scalar products, norms, orthogonality (Vectors)
A vector is a matrix with only one row (row vector) or one column (column vector). We'll consider two vectors, u and v.
Let u = (u₁, u₂, ..., uₙ) and v = (v₁, v₂, ..., vₙ)
- Scalar (Dot) Product: The dot product u · v is a single scalar value. u · v = u₁v₁ + u₂v₂ + ... + uₙvₙ
- Norm (Length or Magnitude): The norm of a vector u, written ||u||, is its length.
||u|| = √(u₁² + u₂² + ... + uₙ²)
Note: ||u||² = u · u
- Orthogonality: Two vectors u and v are orthogonal (perpendicular) if their dot product is zero. u · v = 0
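These three definitions can be sketched in a few lines of plain Python (the vectors here are illustrative):

```python
import math

def dot(u, v):
    """Scalar (dot) product of two equal-length vectors."""
    assert len(u) == len(v), "vectors must have the same length"
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    """Length of u, using the identity ||u||² = u · u."""
    return math.sqrt(dot(u, u))

u = [3, 4]
v = [-4, 3]
print(dot(u, v))  # 0   -> u and v are orthogonal
print(norm(u))    # 5.0
```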
6. Linear transformations
A transformation T is a function that maps a vector from one space to another (e.g., T(v) = w).
A transformation T is linear if it satisfies two properties for all vectors u, v and any scalar c:
- Additivity: T(u + v) = T(u) + T(v)
- Homogeneity: T(cu) = cT(u)
Matrix Representation: Every linear transformation T can be represented by matrix multiplication. T(x) = Ax, where A is the "transformation matrix".
Elementary Operations: These are operations on the rows of a matrix (used in Gaussian elimination to solve systems of equations).
- Swapping two rows.
- Multiplying a row by a non-zero scalar.
- Adding a multiple of one row to another row.
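The three elementary row operations can be sketched as in-place functions on a list-of-rows matrix; the final line shows the typical Gaussian-elimination step of clearing an entry below a pivot (the matrix is illustrative):

```python
def swap_rows(A, i, j):
    """Swap rows i and j."""
    A[i], A[j] = A[j], A[i]

def scale_row(A, i, k):
    """Multiply row i by a non-zero scalar k."""
    assert k != 0, "scalar must be non-zero"
    A[i] = [k * x for x in A[i]]

def add_multiple(A, i, j, k):
    """Add k times row j to row i (row_i <- row_i + k * row_j)."""
    A[i] = [a + k * b for a, b in zip(A[i], A[j])]

M = [[2, 4], [1, 3]]
add_multiple(M, 1, 0, -0.5)  # clear the entry below the pivot M[0][0]
print(M)                     # [[2, 4], [0.0, 1.0]]
```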
7. Solution of simultaneous linear equations
A system of n linear equations with n variables can be written in matrix form as:
AX = B
- A is the n × n coefficient matrix.
- X is the n × 1 vector of variables.
- B is the n × 1 vector of constants.
Method 1: Matrix Inverse Method
If the coefficient matrix A is non-singular (i.e., |A| ≠ 0), it has an inverse, A⁻¹.
- Start with AX = B
- Multiply both sides by A⁻¹ (on the left): A⁻¹(AX) = A⁻¹B
- Since A⁻¹A = I (the identity matrix): (A⁻¹A)X = A⁻¹B ⇒ IX = A⁻¹B
- The solution is: X = A⁻¹B
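For a 2×2 system the inverse has the closed form (1/|A|)·[ d −b ; −c a ], so the inverse method fits in a few lines; the system below is illustrative:

```python
def solve_2x2(A, B):
    """Solve AX = B for a 2×2 matrix A via X = A⁻¹B."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "singular matrix: no unique solution"
    # Closed-form 2×2 inverse: (1/det) [ d -b ; -c a ].
    inv = [[d / det, -b / det], [-c / det, a / det]]
    return [inv[0][0] * B[0] + inv[0][1] * B[1],
            inv[1][0] * B[0] + inv[1][1] * B[1]]

# 2x + y = 5,  x + 3y = 10  ->  x = 1, y = 3
print(solve_2x2([[2, 1], [1, 3]], [5, 10]))
```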
Method 2: Cramer's Rule
This rule gives a direct formula for each variable in the solution vector X, but is only practical for 2x2 or 3x3 systems.
The solution for a variable xᵢ is:
xᵢ = |Aᵢ| / |A|
- |A| is the determinant of the main coefficient matrix A.
- |Aᵢ| is the determinant of the matrix Aᵢ formed by replacing the i-th column of A with the constant vector B.
System:
ax + by = e
cx + dy = f
Matrix A = [a b; c d], Vector B = [e; f]
|A| = ad - bc
|Ax| = |[e b; f d]| = ed - bf
|Ay| = |[a e; c f]| = af - ec
Solution:
x = (ed - bf) / (ad - bc)
y = (af - ec) / (ad - bc)
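The 2×2 formulas above can be sketched directly in plain Python; the numeric system is illustrative:

```python
def cramer_2x2(a, b, c, d, e, f):
    """Solve ax + by = e, cx + dy = f by Cramer's rule."""
    det_A = a * d - b * c
    assert det_A != 0, "|A| = 0: Cramer's rule does not apply"
    x = (e * d - b * f) / det_A  # |Ax| = ed - bf
    y = (a * f - e * c) / det_A  # |Ay| = af - ec
    return x, y

# 3x + 2y = 12,  x + 4y = 14  ->  x = 2, y = 3
print(cramer_2x2(3, 2, 1, 4, 12, 14))  # (2.0, 3.0)
```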
8. Economic applications of matrix algebra
A. Solving Market Equilibrium
Consider a 2-good market (e.g., apples and bananas). The demand and supply for each good depends on the price of both goods. We can set Qd = Qs for both markets to get a system of linear equations in the prices Pa and Pb. We can then use Cramer's Rule or the Inverse Method to find the equilibrium prices.
B. Leontief Input-Output Model
This is a major application of linear algebra in economics, showing how the output of one industry is an input for another.
- X = Total Output vector (What each industry produces).
- D = Final Demand vector (What consumers want).
- A = Technology Matrix (or Input-Output Coefficient matrix). An element aᵢⱼ shows how much input from industry i is needed to produce 1 unit of output for industry j.
The fundamental equation is:
Total Output = Intermediate Demand + Final Demand
X = AX + D
To find the Total Output (X) needed to satisfy a given Final Demand (D):
- X - AX = D
- (I - A)X = D (where I is the identity matrix)
- X = (I − A)⁻¹D
The matrix (I − A)⁻¹ is called the Leontief Inverse Matrix. Each element tells you the total output from industry i required to deliver 1 unit of final output to industry j, accounting for all knock-on effects.
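A minimal two-industry sketch of the model in plain Python; the technology matrix A and final-demand vector D below are made-up numbers chosen purely for illustration:

```python
A = [[0.2, 0.3],  # a_ij: input from industry i per unit of output of industry j
     [0.4, 0.1]]
D = [100, 50]     # final demand for each industry's output

# Form I - A, then invert it with the closed-form 2×2 inverse.
(a, b), (c, d) = [[1 - A[0][0], -A[0][1]],
                  [-A[1][0], 1 - A[1][1]]]
det = a * d - b * c
inv = [[d / det, -b / det], [-c / det, a / det]]  # the Leontief inverse

# Total output: X = (I - A)⁻¹ D
X = [inv[0][0] * D[0] + inv[0][1] * D[1],
     inv[1][0] * D[0] + inv[1][1] * D[1]]
print(X)  # total output each industry must produce, intermediate use included
```

As a sanity check, X satisfies the fundamental equation X = AX + D: each industry's output covers intermediate demand plus final demand.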