EE 503 Lectures (Fall 2020/21)

Lec. #21

0:00 - Auto-correlation calculation (LTI processing of WSS processes)
0:58 - Step 1: Cross-correlation calculation (auto-correlation calculation)
07:34 - Step 2: Auto-correlation calculation (auto-correlation calculation)
11:54 - Power spectral density
14:46 - Evaluation of output auto-correlation and input-output cross-correlation with Fourier transform
15:50 - DTFT of conj( h[ -n ] )
19:53 - Output power spectral density relations in Fourier-domain
22:30 - Z-transform of conj( h[ -n ] )
27:00 - Output power spectral density relations in z-domain
30:28 - H(z) and H^*(1/z^*); conjugate reciprocal pole/zero pairs
40:57 - Power Spectral Density (properties)
41:00 - Property 1: Power spectral density is real valued
44:25 - Property 2: Power spectral density is non-negative (proved later)
44:48 - Property 3: Area under power spectral density is r_x[0] = E{ | x[n] |^2 }
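
A quick numeric check of the relations above may be useful (a minimal NumPy sketch, not from the lecture; the FIR filter h below is an arbitrary choice). For a unit-variance white input, r_y[k] = sum_n h[n] h^*[n+k] is the deterministic auto-correlation of h, and its DTFT should equal |H(e^{j\omega})|^2:

    import numpy as np

    # Arbitrary real FIR filter; input x[n]: white with sigma^2 = 1, so S_x(e^{jw}) = 1
    h = np.array([1.0, -0.5, 0.25])
    r_y = np.correlate(h, h, mode='full')        # r_y[k] = sum_n h[n+k] h[n], lags k = -2..2
    k = np.arange(-2, 3)
    n = np.arange(len(h))
    for w in np.linspace(0.0, np.pi, 5):
        S_y = np.sum(r_y * np.exp(-1j * w * k))  # DTFT of r_y at w (real, since r_y is even)
        H = np.sum(h * np.exp(-1j * w * n))      # H(e^{jw})
        print(round(S_y.real, 6), round(abs(H) ** 2, 6))   # the two columns match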

Lec. #22

0:00 - Processing of WSS with LTI systems (review)
8:27 - Example: x[n] = A exp( j(\omega n + \phi) ), A, \phi: ind. r.v.'s. Find S_x( e^{j\omega} )
13:59 - Proof of S_x( e^{j\omega} ) is non-negative
28:30 - Conditions for valid auto-correlation function
35:55 - Q: Can we generate a process whose PSD is an arbitrary non-negative valued function?
36:54 - Example (Papoulis p.324): s(t) = A exp(j \omega( t - r(t)/c ) ) (doppler spread example)

Corrections:
48:00 - On the right side of the board, the inverse Fourier transform should read 2\pi \times (inverse F.T.) (this does not affect any results in the upcoming calculations). (Thanks to Gulin T.)
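
For the 8:27 example, a short worked calculation (assuming \phi is uniform on [0, 2\pi) and independent of A, so that E{ e^{j\phi} } = 0 and the process is zero mean): the random phase cancels in the product x[n+k] x^*[n], so

    r_x[k] = E{ x[n+k] x^*[n] } = E{ |A|^2 e^{j\omega(n+k) + j\phi} e^{-j\omega n - j\phi} } = E{ |A|^2 } e^{j\omega k}

and the PSD is a single spectral line, S_x( e^{j\omega'} ) = 2\pi E{ |A|^2 } \delta( \omega' - \omega ).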

Lec. #23a

0:00 - Example (Papoulis p.324): s(t) = A exp(j \omega( t - r(t)/c ) ) (doppler spread example, cont'd)
9:16 - Example (Hayes 3.4.1): H(z) = 1 / (1 - 0.25 z^-1), x[n]: white noise. Find r_y[k].
10:20 - Z-transform, ROC, causal/anti-causal sequences, etc. (review)
17:24 - r_y[k] calculation by inverse Z-transform (example)
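
The Hayes 3.4.1 example can be cross-checked by simulation (a NumPy/SciPy sketch, assuming unit-variance white noise): the closed form is r_y[k] = (0.25)^{|k|} / (1 - 0.25^2).

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(0)
    N, a = 200_000, 0.25
    x = rng.standard_normal(N)                    # white noise, sigma^2 = 1
    y = lfilter([1.0], [1.0, -a], x)              # y[n] = 0.25 y[n-1] + x[n]
    for k in range(4):
        r_hat = np.dot(y[:N - k], y[k:]) / N      # sample estimate of r_y[k]
        print(k, round(r_hat, 4), round(a ** k / (1 - a ** 2), 4))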

Lec. #23b

0:00 - A 2nd Characterization for Power Spectral Density (windowed F.T. interpretation)
7:28 - Relation between WSS and Fourier Transform (F.T. decorrelates WSS processes)
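
The windowed-F.T. characterization, S_x( e^{j\omega} ) = lim_{N -> inf} (1/N) E{ | sum_{n=0}^{N-1} x[n] e^{-j\omega n} |^2 }, can be probed numerically (a sketch on assumed AR(1) data; periodogram bias and averaging noise keep the match approximate):

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(1)
    a, N, K = 0.5, 512, 2000
    S = np.zeros(N)
    for _ in range(K):
        x = lfilter([1.0], [1.0, -a], rng.standard_normal(N))
        S += np.abs(np.fft.fft(x)) ** 2 / N       # (1/N) |windowed F.T.|^2
    S /= K                                        # sample average approximates E{ . }
    w = 2 * np.pi * np.arange(N) / N
    S_true = 1.0 / np.abs(1 - a * np.exp(-1j * w)) ** 2
    print(np.max(np.abs(S - S_true)))             # small: O(1/N) bias + averaging noise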

Lec. #24

0:00 - Moving Average (MA) Processes
2:27 - Moving Average filter in time domain
8:40 - Pole-zero diagram for all-zero (MA) filters
12:40 - Interpretation of frequency response from pole-zero diagram
22:26 - Auto-correlation calculation for MA processes
27:25 - Deterministic auto-correlation
38:36 - Auto-regressive (AR) Processes
41:00 - Pole-zero diagram for all-pole (AR) filters
43:10 - Interpretation of frequency response from pole-zero diagram
48:57 - Auto-correlation calculation for AR processes
1:07:30 - Yule-Walker equations in recursion form
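
For the MA part, a small simulation sketch (the MA(2) coefficients are an arbitrary choice; the input is unit-variance white noise): r_y[k] should equal the deterministic auto-correlation of the taps and vanish for |k| > q.

    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(2)
    b = np.array([1.0, 0.8, -0.3])        # MA(2): y[n] = x[n] + 0.8 x[n-1] - 0.3 x[n-2]
    y = lfilter(b, [1.0], rng.standard_normal(500_000))
    r_th = np.correlate(b, b, mode='full')[len(b) - 1:]   # theory, lags k = 0, 1, 2
    for k in range(5):
        r_hat = np.dot(y[:len(y) - k], y[k:]) / len(y)
        print(k, round(r_hat, 3), round(r_th[k], 3) if k < len(b) else 0.0)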

Lec. #25

0:00 - Yule-Walker equations
6:02 - Finding H_AR(z) s.t. a given r_AR[k] is synthesized
15:09 - Finding r_AR[k] given H_AR(z)
27:09 - Example: AR(1) process auto-correlation calculation
38:01 - ARMA (Auto-regressive Moving-Average) Processes
51:50 - Auto-correlation recursions for ARMA processes
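
The Yule-Walker step at 6:02 can be sketched in a few lines (assuming the convention x[n] = sum_k a_k x[n-k] + w[n], so that the normal equations read R a = r; r_AR[k] below is the AR(1) auto-correlation from the 27:09 example with sigma_w^2 = 1):

    import numpy as np
    from scipy.linalg import solve_toeplitz

    a_true = 0.5                                     # AR(1): x[n] = 0.5 x[n-1] + w[n]
    r = lambda k: a_true ** abs(k) / (1 - a_true ** 2)
    p = 2                                            # deliberately over-modeled order
    col = [r(k) for k in range(p)]                   # first column of Toeplitz R
    rhs = [r(k) for k in range(1, p + 1)]            # right-hand side r[1..p]
    print(solve_toeplitz(col, rhs))                  # ~ [0.5, 0.0]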

Lec. #26

0:00 - Periodic (Harmonic) Processes
0:51 - | r_x[k] | \leq r_x[0]
1:31 - Proof 1: | r_x[k] | \leq r_x[0] (by Rx \geq 0)
6:50 - Proof 2: | r_x[k] | \leq r_x[0] (by Cauchy Schwarz)
8:50 - Proof 3: | r_x[k] | \leq r_x[0] (by Power Spectral Density)
11:20 - Questioning the equality case, r_x[ T ] = r_x[0]
12:46 - Mean-square (MS) periodic processes (definition)
17:01 - MS-periodicity and periodicity of r_x[k]
24:10 - MS-periodicity implication on R_x matrix
33:45 - MS-periodicity and impulsive power spectrum (line spectra)
36:07 - Wold's decomposition theorem
40:22 - Example: Line spectra for complex sinusoid process with random phase and amplitude
44:30 - Spectral Factorization
44:50 - Synthesis Filter (random process synthesis)
46:30 - Whitening Filter
49:44 - Example: Spectral Factorization P( e^{j\omega} ) = ( 5 - 4 cos(\omega) ) / ( 10 - 6 cos(\omega) )
1:08:24 - Minimum Phase Filters (causal, stable, causally-invertible) (example)
1:16:50 - Stochastic modeling and deterministic modeling
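
The 49:44 factorization can be verified numerically; a sketch assuming the minimum-phase factor H(z) = ( 1 - 0.5 z^-1 ) / ( 1 - (1/3) z^-1 ) with sigma_0^2 = 4/9, i.e. 5 - 4 cos(\omega) = 4 | 1 - 0.5 e^{-j\omega} |^2 and 10 - 6 cos(\omega) = 9 | 1 - (1/3) e^{-j\omega} |^2:

    import numpy as np

    w = np.linspace(-np.pi, np.pi, 1001)
    P = (5 - 4 * np.cos(w)) / (10 - 6 * np.cos(w))
    z = np.exp(1j * w)
    H = (1 - 0.5 / z) / (1 - (1.0 / 3.0) / z)   # zero at 0.5, pole at 1/3: minimum phase
    print(np.allclose(P, (4.0 / 9.0) * np.abs(H) ** 2))   # True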

Lec. #27

0:00 - Introduction
0:40 - Stochastic Signal Modeling (reminder)
2:35 - Deterministic Signal Modeling
5:00 - Example: H(z) = b0 / ( 1 - a1 z^-1 ) to match Z{x[n]}
8:26 - Example: H(z) = ( b0 + b1 z^-1 ) / ( 1 - a1 z^-1 ) to match Z{x[n]}
11:55 - Pade's Approximation
19:50 - Convolution Matrix
29:00 - Example: Pade's approximation with H(z) = b0 / ( 1 + a1 z^-1 + a2 z^-2 )
31:46 - Prony's method (matrix form)
37:10 - Prony's method (summation form)
44:30 - Minimizing Prony's cost function (summation form)
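
A compact sketch of Pade's approximation (11:55 - 29:00); the helper `pade` below is hypothetical, following the usual convention H(z) = B(z)/A(z) with deg B = q and deg A = p, matching x[0..p+q] exactly:

    import numpy as np

    def pade(x, p, q):
        # Treat x[n] as causal: x[n] = 0 for n < 0 or n >= len(x)
        xs = lambda n: x[n] if 0 <= n < len(x) else 0.0
        # Denominator: x[n] + sum_{k=1..p} a_k x[n-k] = 0 for n = q+1 .. q+p
        Xq = np.array([[xs(n - k) for k in range(1, p + 1)]
                       for n in range(q + 1, q + p + 1)])
        rhs = -np.array([xs(n) for n in range(q + 1, q + p + 1)])
        a = np.concatenate(([1.0], np.linalg.solve(Xq, rhs)))
        # Numerator: b_n = sum_{k=0..p} a_k x[n-k] for n = 0 .. q
        b = np.array([sum(a[k] * xs(n - k) for k in range(p + 1))
                      for n in range(q + 1)])
        return a, b

    x = 0.5 ** np.arange(8)        # x[n] = (0.5)^n is matched exactly by p = 1, q = 0
    print(pade(x, p=1, q=0))       # a ~ [1, -0.5], b ~ [1]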

Lec. #28

0:00 - All-pole modeling (deterministic signal modeling)
3:53 - Prony's cost for all pole modeling
19:45 - Deterministic auto-correlation
21:15 - How/when deterministic and stochastic modeling overlap
26:06 - All-pole modeling with finite data records
28:00 - Auto-correlation method
34:01 - Covariance method
41:10 - Comparison of auto-correlation & covariance methods
48:08 - Optimizing real-valued functions of complex variables (brief explanation)
58:46 - Link to paper: .pdf
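
The auto-correlation method at 28:00 amounts to estimating r_x[k] from the data with the biased estimator and solving the same Toeplitz normal equations; a sketch on synthetic AR data (the pole at 0.9 and the record length are arbitrary choices; the filter transient is discarded):

    import numpy as np
    from scipy.linalg import solve_toeplitz
    from scipy.signal import lfilter

    rng = np.random.default_rng(3)
    x = lfilter([1.0], [1.0, -0.9], rng.standard_normal(10_500))[500:]
    N, p = len(x), 2
    r = [np.dot(x[:N - k], x[k:]) / N for k in range(p + 1)]   # biased r_hat[k]
    print(solve_toeplitz(r[:p], r[1:]))                        # ~ [0.9, 0.0]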

Lec. #29

0:00 - Estimation Problem
6:15 - Classification of Estimation Problems (non-random, random)
11:20 - Maximum likelihood estimation
17:13 - Non-random parameter estimation
17:24 - Example: x[n] = c + w[n], w[n]: AWGN, c: non-random. Find chat_ML.
23:35 - log-likelihood (example, cont'd)
30:20 - Properties of Estimators
30:35 - Bias (properties of estimators)
39:47 - Consistency (properties of estimators)
52:03 - Efficiency (properties of estimators)
1:05:20 - Example: CRB for a parameter non-linearly related with observations (illustration)
1:10:01 - Asymptotic efficiency (properties of estimators)
1:12:27 - Folk's Theorem: ML is an asymptotically unbiased and efficient estimator
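
For the 17:24 example, chat_ML is the sample mean and the CRB is sigma^2 / N; a Monte Carlo sketch of unbiasedness and efficiency (all parameter values below are arbitrary):

    import numpy as np

    rng = np.random.default_rng(4)
    c, sigma2, N, trials = 2.0, 1.0, 50, 20_000
    x = c + np.sqrt(sigma2) * rng.standard_normal((trials, N))
    c_ml = x.mean(axis=1)                     # chat_ML = sample mean
    print(round(c_ml.mean(), 3))              # ~ c  (unbiased)
    print(round(c_ml.var(), 5), sigma2 / N)   # ~ CRB = sigma^2 / N (efficient)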

Lec. #30

0:00 - Random Parameter Estimation
1:50 - Cost function (square or absolute error)
5:10 - Risk (definition)
12:00 - Conditional mean as a minimizer of MSE (derivation)
21:48 - Regression line
23:48 - Properties of Conditional Mean Estimator
24:04 - Property 1: Conditional mean vector for multiple parameter estimation
30:36 - Property 2: Orthogonality of estimation error to non-linear processing of observations
48:45 - Example: f(x,y) uniform in 1x1 areas in the 1st and 3rd quadrants. Find yhat(x).
52:20 - Estimator derivation - conditional mean estimator - (example)
52:20 - min MSE value calculation (example)
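
For the 48:45 example the conditional mean works out to yhat(x) = sign(x)/2 with min MSE = Var{ y | x } = 1/12; a Monte Carlo sketch (the quadrant choice and uniform draws implement the stated density):

    import numpy as np

    rng = np.random.default_rng(5)
    n = 500_000
    s = np.where(rng.random(n) < 0.5, 1.0, -1.0)   # 1st or 3rd quadrant, equal prob.
    x, y = s * rng.random(n), s * rng.random(n)    # uniform on the two unit squares
    yhat = np.sign(x) / 2                          # conditional mean estimator E{ y | x }
    print(round(np.mean((y - yhat) ** 2), 4), 1 / 12)   # min MSE ~ 1/12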

EE 503 Statistical Signal Processing and Modeling
(Fall 2019–2020)

Short Description:

This course is the first course on statistical signal processing in the graduate curriculum of the Department of Electrical and Electronics Engineering, Middle East Technical University (METU). Topics covered in this course are random vectors, random processes, stationarity, wide sense stationary (WSS) processes, and the processing of WSS processes with LTI systems, with applications in optimal filtering, smoothing and prediction. A major goal is to introduce the concept of mean square error (MSE) optimal processing of random signals by LTI systems.

For the processing of random signals, it is assumed that some statistical information about the signal of interest and the distortion is known. By utilizing this information, MSE-optimal LTI filters (Wiener filters) are designed; this forms the processing part of the course. The estimation of the statistical information needed to construct Wiener filters forms the modeling part of the course. In the modeling part, we examine AR, MA and ARMA models for random signals and give a brief discussion of the Pade and Prony methods for deterministic modeling. Other topics of importance include decorrelating transforms (whitening), spectral factorization and the Karhunen-Loeve transform.

This course is a natural prerequisite (not a formal one) to EE 5506 Advanced Statistical Signal Processing. The estimation theory topics in EE 503 are mostly limited to the moment description of random processes, which forms a special, but the most important, case of EE 5506.

Outline of Topics:

  1. Review
    1. Basics of Mathematical Deduction
      1. Necessary, Sufficient Conditions
      2. Proofs via contradiction, contraposition
    2. Basics of Linear Algebra
      1. Linear independence of vectors (points in linear space)
      2. Range and null space of the linear combination operation
      3. Projection to Range/Null Space (orthogonality principle)
      4. Positive Definite Matrices
    3. Basics of Probability
      1. Probability as a mapping, axioms, conditional probability
      2. Expectation, law of large numbers
      3. Moments, moment generating function

  2. Random Processes
    1. Random variables, random vectors (or a sequence of random variables), moment descriptions (mean, variance, correlation), decorrelating transforms
    2. Random processes, stationarity, wide sense stationarity (WSS), power spectral density, spectral factorization, linear time-invariant processing of WSS random processes, ergodicity

    Ref: Therrien, Hayes, Papoulis, Ross
     
  3. Signal Modeling
    1. LS methods, Pade, Prony (Deterministic methods)
    2. AR, MA, ARMA Processes (Stochastic approach), Yule-Walker Equations, Non-linear set of equations for MA system fit
    3. Harmonic Processes

    Ref: Hayes, Papoulis
     
  4. Estimation Theory Topics
    1. Random parameter estimation
      1. Cost function, loss function, square error, absolute error
      2. Conditional mean (regression line) as the minimum mean square error (MSE) estimator, orthogonality properties
      3. Linear minimum mean square error (LMMSE) estimators, orthogonality principle
      4. Regression line, orthogonality
      5. FIR, IIR, Causal–IIR Wiener filters
      6. Linear Prediction, backward prediction
      7. Random vector LMMSE estimation (multiple parameter)
    2. Non-random parameter estimation
      1. Maximum likelihood method
      2. Best Linear Unbiased Estimator (BLUE)
      3. Discussion of linear estimators for the linear observation model y=Ax+n
    3. Karhunen – Loeve Transform

    Ref: Therrien, Hayes
     
References:

[Hayes]: M. H. Hayes, Statistical Digital Signal Processing and Modeling, Wiley, New York, NY, 1996.

[Therrien]: C. W. Therrien, Discrete Random Signals and Statistical Signal Processing, Prentice Hall, 1992.

[Papoulis]: A. Papoulis, Probability, Random Variables, and Stochastic Processes, 3rd ed., McGraw-Hill, 1991.

[Ross]: S. M. Ross, Introduction to Probability Models, 7th ed., Harcourt Academic Press, 2000.