EE 503 Lectures (Fall 2020/21)

Lec. #1

00:00 Introduction
01:55 Communication example (motivation)
09:45 Loss functions (communication example)
13:41 Risk/Cost (communication example)
18:38 Restricting the estimator to linear estimation (communication example)
20:10 LMMSE Problem (communication example; see the sketch after this lecture's notes)
26:45 Connection with practice through law of large numbers (communication example)
27:13 Law of large numbers (communication example)
33:20 Average behavior and one-time events
38:57 P implies Q (Mathematical Reasoning)
39:50 Necessary, Sufficient conditions
42:00 Truth table (P implies Q)
48:40 Direct Proof Approach (P implies Q)
51:08 Proof by Contraposition (P implies Q)

Correction: @47:30 and around: "P: True and Q: False" should be "P: False and Q: True" (Ugur Berk S.)
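
A minimal NumPy sketch of the scalar LMMSE problem above, with the empirical risk computed over many trials in the spirit of the law of large numbers. The additive-noise model Y = X + N and the variances below are illustrative assumptions, not the lecture's exact numbers:

    import numpy as np

    rng = np.random.default_rng(0)
    sigma_x, sigma_n = 1.0, 0.5        # assumed standard deviations (illustrative)
    m = 1_000_000                      # many trials: law of large numbers

    x = sigma_x * rng.standard_normal(m)
    y = x + sigma_n * rng.standard_normal(m)      # received = transmitted + noise

    # Linear estimator x_hat = alpha * y; the MSE-optimal (LMMSE) gain for
    # zero-mean x, n is alpha = E{xy} / E{y^2} = sigma_x^2 / (sigma_x^2 + sigma_n^2)
    alpha = sigma_x**2 / (sigma_x**2 + sigma_n**2)
    print("empirical risk :", np.mean((alpha * y - x)**2))   # average squared loss
    print("theoretical MSE:", sigma_x**2 * sigma_n**2 / (sigma_x**2 + sigma_n**2))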

Lec. #2

00:00 - Proof by Contradiction (P implies Q)
05:32 - "if and only if" statement
10:45 - Proving "if and only if" statements
13:05 - Comments: Q as an indicator in P implies Q
14:25 - Comments: Many necessary conditions leading to necessary & sufficient cond.
19:00 - Example: reasoning in a court case (Murder in Kizilay)
25:55 - Linear Algebra Review
28:30 - Matrix multiplication
30:38 - Linear combination
31:46 - Range space of A matrix
33:45 - Null space of A matrix
36:33 - Importance of null space in linear equation solutions
40:10 - Unique solution condition (if there is a solution)
41:29 - Checking the dimension of Null space (see the sketch after this lecture's notes)
43:18 - Checking linear independence
43:55 - Column Rank of A matrix
47:41 - Projection Matrices
51:58 - Norm

Corrections:
42:30 Additional note: the equation system Ax = 0 is assumed to have a non-trivial solution, i.e. x \neq 0
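
A small NumPy sketch of the null-space and rank checks above (the matrix A is a made-up example with linearly dependent columns):

    import numpy as np

    # Made-up example: third column = first column + second column
    A = np.array([[1., 0., 1.],
                  [0., 1., 1.],
                  [1., 1., 2.]])

    print("column rank:", np.linalg.matrix_rank(A))           # 2, not 3

    # Null space basis via SVD: right singular vectors of (near-)zero singular values
    U, sing, Vt = np.linalg.svd(A)
    null_basis = Vt[sing < 1e-10].T
    print("dim Null(A):", null_basis.shape[1])                # 1
    print("A @ null_basis ~ 0:", np.allclose(A @ null_basis, 0))

    # Since Null(A) != {0}, a solution of Ax = b (if one exists) is not unique.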

Lec. #3

0:00 - Projection Problem (reminder of last lecture)
3:10 - Distance Metric
4:05 - Distance Metric Axioms
6:10 - Norm Function
9:44 - Metric induced from norm function
14:00 - Projection to Range(A) (problem definition)
15:22 - Ever-present engineering questions on existence, uniqueness, and a feasible method for solution
19:09 - Projection to plane (3D case)
26:08 - Projection to circle example
33:40 - Projection to a convex set example (see the sketch after this lecture's notes)
36:22 - Optimality condition for projection (wide angle condition)
41:22 - Inner product
41:43 - Inner product axioms
47:29 - Norm induced by inner product
51:08 - Cauchy-Schwarz inequality (statement)
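
A short sketch of projection onto a convex set, here the disk { y : ||y|| <= r } (a made-up instance of the circle example), together with a numerical check of the wide-angle optimality condition <x - p, y - p> <= 0 for points y in the set:

    import numpy as np

    def project_to_disk(x, r=1.0):
        """Euclidean projection of x onto the disk { y : ||y|| <= r }."""
        nx = np.linalg.norm(x)
        return x if nx <= r else (r / nx) * x

    x = np.array([3.0, 4.0])
    p = project_to_disk(x)
    print("projection:", p)            # [0.6, 0.8], on the boundary

    # Wide-angle condition: the angle at p between x and any y in the set
    # is at least 90 degrees, i.e. <x - p, y - p> <= 0
    for y in [np.zeros(2), np.array([0.0, 1.0]), np.array([-0.5, 0.5])]:
        print(np.dot(x - p, y - p) <= 1e-12)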

Lec. #4

0:00 - Inner Product / Norm Axiom (reminder)
2:00 - Cauchy-Schwarz Inequality (proof)
8:40 - Cauchy-Schwarz Equality case
13:00 - "angle" between vectors
14:40 - Example: Cosine theorem of 2D/3D Euclidean geometry
20:00 - Orthogonality of vectors, aligned vectors
24:30 - Triangle inequality for the induced norm (proof)
27:40 - Vector spaces, normed spaces, inner product spaces
30:30 - Projection matrices (problem statement, again!)
34:27 - Orthogonality conditions for the projection point (projectors)
41:03 - Orthogonality condition in matrix-vector form (projectors)
42:59 - Solving the projection point (projectors)
43:50 - Case of invertible Gram matrix (Solving the projection point)
45:30 - Projection matrix expression (finally! see the sketch after this lecture's notes)

Corrections:
11:05 x = \lambda_x y should be x = - \lambda_x y (Ugur Berk S.)
15:30 (x,y) inner product should be (x,y) = x1 \times y1 + x2 \times y2 (Ada G.)
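
A minimal numerical check of the projection matrix expression P_A = A (A^T A)^{-1} A^T derived above (the matrix A is random, hence full column rank almost surely):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 2))     # tall, full column rank (almost surely)

    # Valid when the Gram matrix A^T A is invertible
    P = A @ np.linalg.inv(A.T @ A) @ A.T

    b = rng.standard_normal(5)
    p = P @ b                           # projection of b onto Range(A)

    print("idempotent:", np.allclose(P @ P, P))
    print("symmetric :", np.allclose(P, P.T))
    print("residual orthogonal to Range(A):", np.allclose(A.T @ (b - p), 0))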

Lec. #5

0:00 - Case of non-invertible Gram matrix (extra!)
2:55 - Gram matrix definition and linear independence of vectors
4:40 - Range(A^T A) = Range(A^T) proof by SVD (extra!)
6:45 - Mini talk about SVD (extra!)
9:57 - Orthogonal Projectors (definition)
13:10 - Symmetric/Hermitian symmetric matrices (definition)
14:40 - Showing P_A is an orthogonal projector
17:23 - Transpose operation { (ABC)^T = C^T B^T A^T }
19:20 - Eigenvectors of symmetric/Hermitian symmetric matrices are orthogonal (statement only!)
20:00 - More general fact: Eigenvectors of normal matrices (statement only!)
21:30 - Orthogonal matrices (definition)
23:38 - Gram matrix as matrix of inner products (orthogonal matrices)
26:50 - Eigenspace of P_A
27:40 - Eigenvalues of P_A (eigenspace P_A)
31:25 - Eigenvectors of P_A (eigenspace P_A)
32:00 - Eigendecomposition expression (eigenspace P_A)
39:10 - Multiplication of matrices A and B via the columns of A and rows of B
43:42 - Eigendecomposition expression (eigenspace P_A, finally!)
46:25 - On the representation basis and P_A matrix (uniqueness of P_A)
53:27 - Complementary Projector (definition)
56:40 - Orthogonality of P_A and P_A^\perp matrices
58:35 - Eigendecomposition of complementary projector
1:02:40 - Decomposition of vector b into Range(A) and its complementary space (see the sketch after this lecture's notes)

Corrections:
55:55 - Left side of the board, point 2: ( P_A^\perp )^T = ( P_A^\perp )^T should be ( P_A^\perp )^T = P_A^\perp
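
A numerical companion to the eigenspace and complementary-projector discussion above (random A, illustrative dimensions):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 2))
    P = A @ np.linalg.inv(A.T @ A) @ A.T
    P_perp = np.eye(5) - P                       # complementary projector

    # Eigenvalues of an orthogonal projector are 0 or 1;
    # the eigenvalue-1 eigenspace is Range(A), here of dimension 2
    print("eigenvalues:", np.round(np.sort(np.linalg.eigvalsh(P)), 10))

    print("P_A P_A^perp = 0:", np.allclose(P @ P_perp, 0))

    b = rng.standard_normal(5)
    print("b = P_A b + P_A^perp b:", np.allclose(P @ b + P_perp @ b, b))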

Lec. #6

0:00 - Orthogonal Basis Representations
7:10 - Projection operator with orthonormal basis
15:52 - Example 1: Canonical Basis
17:15 - Example 2: DFT Basis
22:30 - Inner product definition for complex-valued vectors
32:30 - Gram-Schmidt Orthogonalization (see the sketch after this lecture's notes)
44:30 - QR decomposition and Gram-Schmidt relation

Corrections:
19:30 - The N'th row, N'th column entry of the F matrix should be W^{(N-1)\times(N-1)} (Gulin T.)
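
A minimal sketch of classical Gram-Schmidt and its relation to QR, checked against numpy's QR routine (columns can differ by a sign):

    import numpy as np

    def gram_schmidt(A):
        """Classical Gram-Schmidt: orthonormal basis for the columns of A."""
        Q = np.zeros_like(A, dtype=float)
        for k in range(A.shape[1]):
            v = A[:, k].copy()
            for j in range(k):                   # remove components along earlier q_j
                v -= (Q[:, j] @ A[:, k]) * Q[:, j]
            Q[:, k] = v / np.linalg.norm(v)
        return Q

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 3))
    Q = gram_schmidt(A)
    print("Q^T Q = I:", np.allclose(Q.T @ Q, np.eye(3)))

    Q_np, R = np.linalg.qr(A)
    print("matches numpy QR up to column signs:", np.allclose(np.abs(Q), np.abs(Q_np)))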

Lec. #7

0:00 - Introduction
1:00 - Quadratics (1 independent variable)
12:00 - Quadratics (2 independent variables)
13:00 - Level curves of J(x,y)
17:50 - Quadratic form: J(x) = x' A x + b' x + c
20:30 - Discussion on x' A x term
24:45 - Gradient of J(x): \nabla J(x) = (A + A')x + b
33:10 - J_u(u) = J_x(u + x_opt)
41:00 - Positive Definite Matrices (definition)
45:00 - Relation between positive definite matrices and eigenvalues
55:10 - Indefinite matrices (definition)
57:00 - Positive (semi) definiteness checks (see the sketch after this lecture's notes)

Corrections:
30:00 - 2a_{11} + a_{12}y + ... should be 2a_{11}x + a_{12}y + ... (similarly for the line below) (Gulin T.)

45:30 - A = [2 4; 4 2] should be A = [1 2; 2 1]. ([2 4; 4 2] is the Hessian matrix of J(x,y), which is 2 \times [1 2; 2 1], so that explanation is also fine; but the aim of this discussion is to investigate the nature of the quadratic term u^T A u, and switching to 2 \times A instead of A = [1 2; 2 1] without comment is possibly confusing.) (Gulin T.)

50:17 - The vector [1 -1] points in the direction opposite to the one shown; no harm in this presentation, since -1 \times [1 -1] = [-1 1] is also an eigenvector, and the values of the function are considered on the eigenvector \times t for t in (-\infty, \infty). (Onur Selim K.)
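
A small sketch of the positive-definiteness checks above; the second test matrix is the [1 2; 2 1] discussed in the 45:30 correction (eigenvalues 3 and -1, hence indefinite):

    import numpy as np

    def is_positive_definite(A):
        """Check x' A x > 0 for all x != 0 via the symmetric part's eigenvalues."""
        S = (A + A.T) / 2          # only the symmetric part contributes to x' A x
        return bool(np.all(np.linalg.eigvalsh(S) > 0))

    print(is_positive_definite(np.array([[2., 0.], [0., 3.]])))   # True
    print(is_positive_definite(np.array([[1., 2.], [2., 1.]])))   # False

    # Alternative check: Cholesky succeeds iff a symmetric matrix is positive definite
    try:
        np.linalg.cholesky(np.array([[1., 2.], [2., 1.]]))
    except np.linalg.LinAlgError:
        print("Cholesky failed => not positive definite")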

Lec. #8

0:00 - Positive Definite Matrices (case of non-symmetric matrices)
7:10 - Over-determined equation systems
9:00 - Tall matrices / Fat and Short matrices
10:06 - Minimizing ||Ax - b||^2
15:15 - Solving A^T A x = A^T b (Normal Equation; see the sketch after this lecture's notes)
18:17 - Positive Definiteness of A^T A
23:15 - Review of some DSP Topics
23:34 - Analog / Continuous Time SP
24:30 - LTI systems
25:30 - Example: RC circuit (Low-pass filter)
27:46 - Discrete-time processing of analog signals (block diagram)
31:02 - D/C Conversion
32:17 - D/C Conversion (zero-order hold interpolation)
35:30 - D/C Conversion (linear interpolation)
39:20 - D/C Conversion (sinc interpolation)
43:58 - sinc(x) (definition)
47:00 - sinc interpolation (further explained)
52:25 - Discrete-time processing of analog signals (block diagram)
53:50 - Discrete-time processing of analog signals (bandlimited signals)
55:00 - Some problems of analog signal processing (discussion)
56:50 - Exact discrete-time implementation of analog systems via impulse invariance
1:06:33 - Fourier Transform (definition)
1:08:20 - Fourier Transform of rect(t/T)
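
A minimal sketch of the over-determined least-squares problem above, solving the normal equations directly and comparing with numpy's least-squares solver (random data for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 3))     # tall matrix: over-determined system
    b = rng.standard_normal(20)

    # Normal equations A^T A x = A^T b; A^T A is positive definite for full-rank A
    x_ne = np.linalg.solve(A.T @ A, A.T @ b)

    x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("solutions agree:", np.allclose(x_ne, x_ls))
    print("residual orthogonal to Range(A):", np.allclose(A.T @ (A @ x_ne - b), 0))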

Lec. #9

0:00 - Review of Probability Concepts
1:00 - Sample Space, Outcome, Event, Probability
6:20 - Kolmogorov's Probability Axioms
9:42 - De Morgan's Laws
13:50 - Brief Note on sigma algebra
18:50 - Random variable
21:03 - Random vector
26:15 - X = x, X: random variable, x: its value
27:10 - c.d.f. (cumulative distribution function)
30:20 - p.d.f. (probability density function)
34:02 - Probability value as "area under density"
39:43 - Units of density function and probability
41:57 - Probability mass functions
43:22 - Independence of events A and B
44:25 - Independence of random variables X and Y
47:25 - c.d.f/p.d.f for joint representation of X and Y
50:50 - Conditional Probability
59:30 - Conditional random variables
1:01:40 - Bayes' Theorem (see the sketch after this lecture's notes)
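
A Monte Carlo check of conditional probability and Bayes' theorem on a made-up fair-die example (A: even outcome, B: outcome greater than 3; the exact value of P(A|B) is 2/3):

    import numpy as np

    rng = np.random.default_rng(0)
    die = rng.integers(1, 7, size=1_000_000)     # fair six-sided die

    A = (die % 2 == 0)                           # event A: outcome is even
    B = (die > 3)                                # event B: outcome exceeds 3

    p_A, p_B = A.mean(), B.mean()
    p_A_given_B = (A & B).mean() / p_B           # P(A|B) = P(A and B) / P(B)
    p_B_given_A = (A & B).mean() / p_A

    # Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
    print(p_A_given_B, p_B_given_A * p_A / p_B)  # both approximately 2/3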

Lec. #10

0:00 - Conditional Probability (review)
02:30 - Bayes' Theorem (review)
03:58 - Marginalization operation (two random variables)
12:20 - Total Probability Theorem
15:54 - Example: X is 1 if Heads, X is Unif([0,2]) if Tails (see the sketch after this lecture's notes)
33:03 - Expectation Operation
34:45 - Expectation (law of large numbers interpretation)
39:51 - E{ g(X) }
40:50 - Moments, E{ X^k }
41:30 - Central Moments, E{ (X - Xbar)^k }
44:28 - Moment generating function
52:07 - Moments as partial description of a r.v.
53:34 - Conditional Expectation
57:52 - Iterated Expectation
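
A simulation of the 15:54 example (assuming a fair coin), checking E{X} via iterated expectation and the jump of the c.d.f. at x = 1:

    import numpy as np

    rng = np.random.default_rng(0)
    m = 1_000_000

    heads = rng.random(m) < 0.5                    # fair coin (assumption)
    x = np.where(heads, 1.0, rng.uniform(0, 2, size=m))

    # Iterated expectation: E{X} = E{ E{X | coin} }
    #   = P(H) * 1 + P(T) * E{Unif([0,2])} = 0.5 * 1 + 0.5 * 1 = 1
    print("empirical E{X}:", x.mean())

    # Mixed r.v.: the c.d.f. jumps by 0.5 at x = 1 (the discrete part)
    print("P(X <= 0.999):", (x <= 0.999).mean())   # about 0.25
    print("P(X <= 1.0)  :", (x <= 1.0).mean())     # about 0.75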

EE 503 Statistical Signal Processing and Modeling
(Fall 2019–2020)

Short Description:

This course is the first course on statistical signal processing in the graduate curriculum of the Department of Electrical and Electronics Engineering, Middle East Technical University (METU). Topics covered in this course are random vectors, random processes, stationarity, wide-sense stationary (WSS) processes, and the processing of WSS processes with LTI systems, with applications in optimal filtering, smoothing, and prediction. A major goal is to introduce the concept of mean square error (MSE) optimal processing of random signals by LTI systems.

For the processing of the random signals, it is assumed that some statistical information about the signal of interest and the distortion is known. By utilizing this information, MSE-optimal LTI filters (Wiener filters) are designed; this forms the processing part of the course. The estimation of the statistical information needed to construct Wiener filters forms the modeling part of the course. In the modeling part, we examine AR, MA, and ARMA models for random signals and give a brief discussion of the Pade and Prony methods for deterministic modeling. Other topics of importance are decorrelating transforms (whitening), spectral factorization, and the Karhunen-Loeve transform.
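
As a preview of the processing part, here is a minimal sketch of an FIR Wiener filter, assuming a made-up AR(1) signal in additive white noise with known second-order statistics; the filter weights solve the Wiener-Hopf (normal) equations R_x w = r_sx:

    import numpy as np

    rng = np.random.default_rng(0)
    a, sigma_w, sigma_v, p = 0.9, 1.0, 1.0, 8      # assumed model parameters

    # AR(1) signal s and noisy observation x = s + v
    n = 100_000
    s = np.zeros(n)
    for k in range(1, n):
        s[k] = a * s[k-1] + sigma_w * rng.standard_normal()
    x = s + sigma_v * rng.standard_normal(n)

    # Known statistics for AR(1): r_s[k] = sigma_s^2 a^|k|; R_x = R_s + sigma_v^2 I
    sigma_s2 = sigma_w**2 / (1 - a**2)
    r_sx = sigma_s2 * a ** np.arange(p)
    R_x = np.array([[sigma_s2 * a**abs(i - j) for j in range(p)] for i in range(p)]) \
          + sigma_v**2 * np.eye(p)
    w = np.linalg.solve(R_x, r_sx)                 # Wiener-Hopf equations

    s_hat = np.convolve(x, w)[:n]                  # FIR filtering of the observation
    print("noisy  MSE:", np.mean((x - s)**2))      # = sigma_v^2
    print("Wiener MSE:", np.mean((s_hat - s)**2))  # noticeably smaller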

This course is a natural prerequisite (not a formal one) to EE 5506 Advanced Statistical Signal Processing. The estimation theory topics in EE 503 are mostly limited to the moment description of random processes, which forms a special, but the most important, case of EE 5506.

Outline of Topics:

  1. Review
    1. Basics of Mathematical Deduction
      1. Necessary, Sufficient Conditions
      2. Proofs via contradiction, contraposition
    2. Basics of Linear Algebra
      1. Linear independence of vectors (points in linear space)
      2. Range and Null space of the combination process
      3. Projection to Range/Null Space (orthogonality principle)
      4. Positive Definite Matrices
    3. Basics of Probability
      1. Probability as a mapping, axioms, conditional probability
      2. Expectation, law of large numbers
      3. Moments, moment generating function

  2. Random Processes
    1. Random variables, random vectors (or a sequence of random variables), moment descriptions (mean, variance, correlation), decorrelating transforms
    2. Random processes, stationarity, wide-sense stationarity (WSS), power spectral density, spectral factorization, linear time-invariant processing of WSS random processes, ergodicity

    Ref: Therrien, Hayes, Papoulis, Ross
     
  3. Signal Modeling
    1. LS methods, Pade, Prony (Deterministic methods)
    2. AR, MA, ARMA Processes (Stochastic approach), Yule-Walker Equations, Non-linear set of equations for MA system fit (see the Yule-Walker sketch after this outline)
    3. Harmonic Processes

    Ref: Hayes, Papoulis
     
  4. Estimation Theory Topics
    1. Random parameter estimation
      1. Cost function, loss function, square error, absolute error
      2. Conditional mean (regression line) as the minimum mean square error (MSE) estimator, orthogonality properties
      3. Linear minimum mean square error (LMMSE) estimators, orthogonality principle
      4. Regression line, orthogonality
      5. FIR, IIR, Causal–IIR Wiener filters
      6. Linear Prediction, backward prediction
      7. Random vector LMMSE estimation (multiple parameter)
    2. Non-random parameter estimation
      1. Maximum likelihood method
      2. Best Linear Unbiased Estimator (BLUE)
      3. Discussion of linear estimators for the linear observation model y=Ax+n
    3. Karhunen-Loeve Transform

    Ref: Therrien, Hayes
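
A minimal sketch of AR model fitting via the Yule-Walker equations, referenced in item 3.2 of the outline (a made-up AR(2) process; the sample autocorrelations stand in for the true ones):

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthesize an AR(2) process: s[n] = a1 s[n-1] + a2 s[n-2] + w[n]
    a1, a2 = 1.5, -0.7
    n = 200_000
    s = np.zeros(n)
    for k in range(2, n):
        s[k] = a1 * s[k-1] + a2 * s[k-2] + rng.standard_normal()

    # Sample autocorrelations r[k] ~ E{ s[n] s[n-k] } for k = 0..p
    p = 2
    r = np.array([np.mean(s[k:] * s[:n-k]) for k in range(p + 1)])

    # Yule-Walker equations: R a = [r[1], ..., r[p]], R Toeplitz from r[0..p-1]
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a_hat = np.linalg.solve(R, r[1:])
    print("true (a1, a2):", (a1, a2), " estimated:", a_hat)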
     
References:

[Hayes]: M. H. Hayes, Statistical Signal Processing and Modeling, Wiley, New York, NY, 1996.

[Therrien]: C. W. Therrien, Discrete Random Signals and Statistical Signal Processing, Prentice Hall, 1992.

[Papoulis]: A. Papoulis, Probability, Random Variables, and Stochastic Processes, 3rd ed., McGraw-Hill, 1991.

[Ross]: S. M. Ross, Introduction to Probability Models, 7th ed., Harcourt Academic Press, 2000.