Why Linear Algebra Matters¶
Linear algebra is the backbone of modern computing. It powers:
- Machine learning and AI
- Computer graphics and game engines
- Financial modeling (portfolio optimization, risk)
- Signal processing
- Quantum computing
Vectors¶
A vector is an ordered list of numbers. In programming, think of it as an array with mathematical properties.
Vector Operations in Python¶
```python
import numpy as np

# Creating vectors
v1 = np.array([1, 2, 3])
v2 = np.array([4, 5, 6])

# Addition
v_sum = v1 + v2          # [5, 7, 9]

# Scalar multiplication
v_scaled = 3 * v1        # [3, 6, 9]

# Dot product
dot = np.dot(v1, v2)     # 1*4 + 2*5 + 3*6 = 32

# Magnitude (length)
magnitude = np.linalg.norm(v1)  # sqrt(1 + 4 + 9) ≈ 3.74

# Unit vector (normalized)
unit = v1 / np.linalg.norm(v1)
```
Dot Product Applications¶
The sign of the dot product tells you how two vectors are oriented relative to each other:

| Dot Product | Meaning |
|---|---|
| > 0 | Vectors point in a similar direction (angle < 90°) |
| = 0 | Vectors are perpendicular (orthogonal) |
| < 0 | Vectors point in opposite directions (angle > 90°) |
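A quick sketch of the table above. The `cos_angle` helper is illustrative (not part of NumPy); it rescales the dot product into the cosine of the angle between two vectors:

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
c = np.array([-1.0, 0.0])

print(np.dot(a, b))  # 0.0  -> perpendicular
print(np.dot(a, a))  # 1.0  -> same direction
print(np.dot(a, c))  # -1.0 -> opposite directions

# Cosine of the angle between two vectors (hypothetical helper)
def cos_angle(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
```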
Matrices¶
A matrix is a 2D array of numbers. Think of it as a transformation machine: multiplying a vector by a matrix maps it to a new vector.
Matrix Operations¶
```python
# Creating matrices
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Matrix multiplication
C = A @ B  # or np.matmul(A, B)
# [[19, 22],
#  [43, 50]]

# Transpose
A_T = A.T
# [[1, 3],
#  [2, 4]]

# Determinant
det = np.linalg.det(A)  # 1*4 - 2*3 = -2

# Inverse
A_inv = np.linalg.inv(A)
# Verify: A @ A_inv ≈ identity matrix
```
Key Matrix Properties¶
- **Identity matrix (I):** `A @ I = A`; the "1" of matrix multiplication
- **Inverse (A⁻¹):** `A @ A⁻¹ = I`; exists only if `det(A) != 0`
- **Symmetric:** `A = Aᵀ`; important for covariance matrices
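All three properties can be checked directly in NumPy. The matrix below is an arbitrary example chosen to be both symmetric and invertible:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # symmetric, det = 5, so invertible
I = np.eye(2)

# Identity: multiplying by I leaves A unchanged
assert np.allclose(A @ I, A)

# Inverse exists because det(A) != 0
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, I)

# Symmetric: A equals its transpose
assert np.allclose(A, A.T)
```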
Systems of Linear Equations¶
Linear algebra excels at solving systems of equations:
```
2x + 3y = 8
4x + 1y = 6
```
This can be written as Ax = b:
```python
A = np.array([[2, 3],
              [4, 1]])
b = np.array([8, 6])

# Solve for x
x = np.linalg.solve(A, b)
print(x)  # [1. 2.] -> x=1, y=2
```
Eigenvalues and Eigenvectors¶
An eigenvector of a matrix is a vector that, when multiplied by the matrix, only gets scaled (not rotated).
$$Av = \lambda v$$
Where v is the eigenvector and λ (lambda) is the eigenvalue.
```python
A = np.array([[4, 2],
              [1, 3]])
eigenvalues, eigenvectors = np.linalg.eig(A)
print(f"Eigenvalues: {eigenvalues}")  # [5. 2.]
print(f"Eigenvectors:\n{eigenvectors}")
```
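We can verify the defining equation $Av = \lambda v$ directly: `np.linalg.eig` returns the eigenvectors as columns, with column `i` paired with `eigenvalues[i]`:

```python
import numpy as np

A = np.array([[4, 2],
              [1, 3]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v of `eigenvectors` satisfies A @ v == lambda * v
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, eigenvalues[i] * v)
```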
Applications¶
- PCA (Principal Component Analysis): Dimensionality reduction
- Google PageRank: Web page ranking
- Quantum Mechanics: Observable states
- Stability Analysis: Dynamic systems
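As a sketch of the PCA item above (synthetic data, illustrative variable names): the principal components of a centered dataset are the eigenvectors of its covariance matrix, ordered by decreasing eigenvalue (variance):

```python
import numpy as np

rng = np.random.default_rng(0)
# 200 correlated 2-D points (arbitrary mixing matrix)
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.0],
                                          [1.2, 0.5]])
X = X - X.mean(axis=0)  # PCA assumes centered data

cov = np.cov(X, rowvar=False)           # 2x2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # symmetric matrix -> eigh

# Sort by decreasing variance; first column = first principal component
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]

# Project onto the top component: 2-D data reduced to 1-D
X_projected = X @ components[:, :1]
```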
Application: Portfolio Optimization¶
In finance, linear algebra is used for Markowitz portfolio optimization:
```python
# Expected returns for 3 assets
returns = np.array([0.12, 0.10, 0.07])

# Covariance matrix
cov_matrix = np.array([
    [0.04,  0.006, 0.002],
    [0.006, 0.025, 0.004],
    [0.002, 0.004, 0.01]
])

# Equal-weights portfolio
weights = np.array([1/3, 1/3, 1/3])

# Portfolio return
port_return = np.dot(weights, returns)

# Portfolio variance: wᵀ Σ w
port_variance = weights @ cov_matrix @ weights

# Portfolio volatility (standard deviation)
port_vol = np.sqrt(port_variance)

print(f"Return: {port_return:.2%}")
print(f"Volatility: {port_vol:.2%}")
```
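One step further in the Markowitz direction, as a sketch that ignores real-world constraints such as no short-selling: the unconstrained minimum-variance weights are proportional to Σ⁻¹·1, which we can get with a single linear solve:

```python
import numpy as np

cov_matrix = np.array([
    [0.04,  0.006, 0.002],
    [0.006, 0.025, 0.004],
    [0.002, 0.004, 0.01]
])

# Minimum-variance portfolio: w ∝ Σ⁻¹ 1, then normalize so weights sum to 1
ones = np.ones(3)
raw = np.linalg.solve(cov_matrix, ones)
w_min = raw / raw.sum()

min_vol = np.sqrt(w_min @ cov_matrix @ w_min)
print(w_min)                      # tilted toward the lowest-variance asset
print(f"Volatility: {min_vol:.2%}")
```

Its volatility is at most that of the equal-weights portfolio, since equal weights are one feasible choice among all weight vectors summing to 1.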
Key Takeaways¶
- Vectors and matrices are the fundamental building blocks
- Matrix multiplication represents transformations and compositions
- Eigenvalues/eigenvectors reveal the fundamental nature of transformations
- Linear algebra has practical applications everywhere in programming and finance
- `numpy` makes linear algebra in Python fast and intuitive
Further Reading¶
- Linear Algebra and Its Applications by Gilbert Strang
- Introduction to Linear Algebra (MIT OpenCourseWare 18.06)
- Mathematics for Machine Learning by Deisenroth, Faisal, Ong