Lecture Notes, Homework Sets, Learn Your Grades, Distribution of Grades

Announcements

  • Last Announcement:
  • The following are the points assigned to each homework assignment. The homework sets with Matlab assignments carry more points than the others; the homework sets from the book, whose solution manual is provided, carry fewer points.
    • HW 1: 100 points
    • HW 2: 150 points (points above 150 are bonus)
    • HW 3: 100 points
    • HW 4: 150 points (points above 150 are bonus)
    • HW 5: 200 points (points above 200 are bonus)
    • To get full credit for the HW assignments, you need a total of 700 points (points above 700 are bonus)
  • Final Exam: Jan 15 (Saturday) starting at 13:30 at D 134 (the location of 2nd midterm)
  • List of recommended problems for final exam: [Problems 7.4, 7.6, 7.7, 7.8, 7.11 from Hayes]
  • Linear Prediction Section of Manolakis
  • Notes on Non-Random Parameter Estimation
  • Last HW: Collection is postponed to Jan 19 (Wednesday) 17:30. You can bring your HW to my office by the deadline.
  • Final Exam: Jan 15 (Saturday) starting at 13:30. Location will be announced later.
  • Notes on Linear MMSE Estimators
  • Notes on Correlation Coefficient, Notes on Gaussian Distribution (these notes are prepared for an undergrad probability course)
  • Midterm #2: I am planning to ask two problems from the following list of problems
    given in Hayes' book: 7.1, 7.2, 7.4, 7.5, 7.9 and 4.16, 4.18, 4.23,
    4.24. (These two exam problems can be identical or trivially different
    from the problems of Hayes.)
  • Hand-out on Non-Causal Wiener Filtering
  • HW-5 is posted (Due: Jan. 07)
  • Midterm #2: Dec. 29th (Wednesday) 14:30-16:30 at D134.
  • Fall 2009 Semester MT#1 and its solutions
  • Midterm #1: Dec. 3rd (Friday) during lecture hours.
  • NEW: Corrections on HW4 (Latest Change: Nov. 24, 18:00)
  • Midterm #1 Dec. 4th (Saturday) Details will be announced later.
  • HW-4 is posted (Due: Nov. 26)
  • HW-3: Hayes: Problems 2.4, 2.5, 2.19, 3.9; Therrien: 2.22, 2.23 (Due: Nov. 2)
  • HW-2 is posted (Due: Oct 26)
  • Solutions for HW-1
  • Review notes on DSP have been added to the Lecture Notes folder
  • HW-1 is posted (Due: Oct. 15)
  • Review Notes on Linear Spaces
  • HW-0 is posted (not to be collected)

EE 503 Signal Analysis and Processing
(Fall 2010 – 2011)

Short Description:

The course aims to unify the knowledge of linear system theory, digital signal processing basics and stochastic processes into the framework of statistical signal processing. The course goal is to establish a firm foundation for estimation theory (parameter estimation, signal modeling), Wiener filtering (approached from the direction of linear MSE estimation) and linear prediction. Some more advanced topics, such as AR, MA, ARMA and harmonic processes, linear decorrelating transforms, series expansion of random processes, spectral factorization, and causal and non-causal IIR Wiener filters, are also discussed along the path.

Related Courses:

EE 501, EE 531: Highly recommended to be taken together with this course (that is, in the same semester).

EE 430 (Undergraduate DSP Course), EE 230 (Undergraduate Probability Course), EE 306 (Undergraduate Stochastic Processes Course): A fairly complete knowledge of these courses is assumed.

Outline of Topics:

  1. Review of Some Linear Algebra Concepts: Review of Some Topics from EE 501 (1 Week)
    1. Matrices as Transformations
      1. Linear Space, Linear Operators in Linear Space
      2. Equivalent representations with finite/infinite matrices
      3. Isomorphism between finite energy functions and finite power sequences (L2 ↔ l2 spaces)
      4. Representation of points in alternative coordinate systems, representation of operators in alternative coordinate systems
      5. Diagonalization of operators (Eigenfunctions ↔ Eigenvectors)
      6. Hermitian Operators ↔ Hermitian Matrices, Orthogonal Bases

                           Ref: Strang, Wolf, Lancaster
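Items 5 and 6 above can be seen numerically. The following is a minimal NumPy sketch (the matrix entries are made up for illustration): a Hermitian matrix has real eigenvalues and is diagonalized by a unitary matrix of orthonormal eigenvectors.

```python
import numpy as np

# A Hermitian matrix A = A^H (entries chosen only for illustration)
A = np.array([[2.0, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0]])
lam, V = np.linalg.eigh(A)   # eigh exploits the Hermitian structure

# Eigenvalues of a Hermitian operator are real
assert np.allclose(np.asarray(lam).imag, 0.0)
# Eigenvectors form an orthonormal basis: V^H V = I
assert np.allclose(V.conj().T @ V, np.eye(2))
# In the eigenbasis coordinates the operator is diagonal: V^H A V = diag(lam)
assert np.allclose(V.conj().T @ A @ V, np.diag(lam))
```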

    2. Matrices as Linear Combiners
      1. Range and Null space of the combination process
      2. Linear independence of vectors (points in linear space)
      3. Projection to Range/Null Space, Direct Sums

                           Ref: Scharf
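The projection ideas in this item can be sketched in a few lines of NumPy (the matrix and vector below are made up): the orthogonal projector onto range(A) splits any x, as a direct sum, into a range component plus a component in the null space of A^T.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])                  # 3x2, full column rank
P = A @ np.linalg.inv(A.T @ A) @ A.T        # projector onto range(A)

x = np.array([1.0, 2.0, 3.0])
x_range = P @ x                             # component in range(A)
x_null = (np.eye(3) - P) @ x                # component in null(A^T)

assert np.allclose(P @ P, P)                # idempotent: P^2 = P
assert np.allclose(A.T @ x_null, 0.0)       # x_null is orthogonal to the columns of A
assert np.allclose(x_range + x_null, x)     # direct sum recovers x
```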

    3. Matrices as Equation Systems
      1. Linear constraints (equations), intersection of constraints
      2. Under-Over determined systems, Unique-None-Infinite solution systems
      3. LS solution for inconsistent equation systems (over-determined)
        1. Projection to range space,
        2. Pseudo Inverse, SVD
      4. Minimum norm solutions for systems with infinite solutions
      5. SVD and its properties.

                           Ref: Scharf
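Both degenerate cases above, the LS solution of an inconsistent over-determined system and the minimum-norm solution of a system with infinitely many solutions, are produced by the pseudo-inverse (computed internally from the SVD). A NumPy sketch with random data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Over-determined (inconsistent): 5 equations, 2 unknowns
A = rng.standard_normal((5, 2))
b = rng.standard_normal(5)
x_ls = np.linalg.pinv(A) @ b
# LS residual is orthogonal to range(A), i.e. the normal equations hold
assert np.allclose(A.T @ (A @ x_ls - b), 0.0)

# Under-determined: 2 equations, 5 unknowns (infinitely many solutions)
B = rng.standard_normal((2, 5))
c = rng.standard_normal(2)
x_mn = np.linalg.pinv(B) @ c
assert np.allclose(B @ x_mn, c)             # an exact solution

# Any other solution x_mn + v, with v in null(B), has larger norm
v = np.linalg.svd(B)[2][-1]                 # last right singular vector spans null(B)
assert np.allclose(B @ v, 0.0)
assert np.linalg.norm(x_mn) < np.linalg.norm(x_mn + v)
```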

  2. Review of Some DSP Concepts: Review of EE 430 Fundamentals (2 hours)
    1. Basic Idea: Discrete time processing of continuous time signals
      1. Sampling Theorem (going to discrete time without any loss of information)
      2. Bandlimited Interpolation (going back to continuous time after processing)
    2. Discrete Time Operations:
      1. Z-Transform, discrete time LTI systems, convolution, convolution matrices, diagonalization of convolution matrices
  3. Random Processes: (4 Weeks) (Ideally should be taken in parallel with EE 531)
    1. Random variables, random vectors (or a sequence of  random variables), random processes
    2. Moment descriptors (especially 2nd order moment description of R.P’s, mean, variance, correlation, auto-correlation, power spectrum density etc.) 
    3. Stationarity, Wide Sense Stationarity
    4. PSD and its properties, spectral factorization
    5. Linear Time Invariant Processing of WSS R.P’s
    6. Ergodicity

             Ref: Therrien, Hayes, Papoulis, Ross
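A small illustration of items 2, 5 and 6 (an AR(1) example of my own choosing, not taken from the texts): LTI filtering of white noise by the first-order recursion x[n] = a x[n-1] + w[n] yields a WSS process with theoretical autocorrelation r(k) = sigma^2 a^|k| / (1 - a^2), and if the process is ergodic in correlation, a long time average approaches it.

```python
import numpy as np

rng = np.random.default_rng(1)
a, sigma2, N = 0.8, 1.0, 200_000
w = rng.standard_normal(N) * np.sqrt(sigma2)   # white noise input

# x[n] = a x[n-1] + w[n]: LTI filtering of white noise
x = np.empty(N)
x[0] = w[0]
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

# Time-average (ergodic) estimate of r(k) for a few lags
r_hat = np.array([np.mean(x[:N - k] * x[k:]) for k in range(4)])
# Theoretical autocorrelation of the AR(1) output
r_theory = sigma2 * a ** np.arange(4) / (1 - a ** 2)
assert np.allclose(r_hat, r_theory, rtol=0.05)
```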

  4. Signal Modeling (2 Weeks)
    1. LS methods, Pade, Prony (Deterministic methods)
    2. AR, MA, ARMA Processes (Stochastic approach), Yule-Walker Equations, Non-linear set of equations for MA system fit
      1. All-pole modeling
        1. Covariance Method
        2. Auto-correlation Method
    3. Harmonic Processes, Wold decomposition
    4. Decorrelating transforms such as Fourier Transforms for Harmonic Processes and the KL transform in general.
    5. Applications: Signal Compression, Signal Prediction, System Identification, Spectrum Estimation.

             Ref: Hayes, Papoulis
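A sketch of all-pole modeling through the Yule-Walker equations (a hypothetical AR(2) example in NumPy; the full development of the covariance and autocorrelation methods is in Hayes): the AR coefficients are recovered by solving the Toeplitz system built from estimated autocorrelations.

```python
import numpy as np

rng = np.random.default_rng(2)
a_true = np.array([-0.9, 0.2])   # x[n] = 0.9 x[n-1] - 0.2 x[n-2] + w[n]
N = 100_000
w = rng.standard_normal(N)

x = np.empty(N)
x[0], x[1] = w[0], w[1]
for n in range(2, N):
    x[n] = -a_true[0] * x[n - 1] - a_true[1] * x[n - 2] + w[n]

# Biased autocorrelation estimates r(0), r(1), r(2)
r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(3)])

# Yule-Walker: [[r0, r1], [r1, r0]] [a1, a2]^T = -[r1, r2]^T
R = np.array([[r[0], r[1]],
              [r[1], r[0]]])
a_hat = np.linalg.solve(R, -r[1:])
assert np.allclose(a_hat, a_true, atol=0.02)
```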

  5. Some Topics in Estimation Theory (5 weeks)
    1. Cost Functions: Mean Square, Mean absolute, max error
    2. MSE, ML, absolute error estimators
    3. Min MSE estimators
      1. Regression line, orthogonality
    4. Linear min MSE estimators
    5. Linear unbiased min MSE estimators
    6. Bias, consistency, efficiency, bias-error variance trade-off.
    7. Discussion of the LS estimator for Ax = b + n systems.
    8. Wiener Filters as optimal estimators
      1. Linear predictors defined from Wiener filters
      2. Levinson-Durbin recursion for efficient solution of Wiener-Hopf equations.
      3. Lattice Structures for efficient implementation of Wiener filters
    9. IIR Wiener Filters
      1. Non-causal, Causal

     Ref: Therrien, Hayes, Scharf
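The Levinson-Durbin recursion mentioned above can be sketched as follows (a NumPy sketch with made-up autocorrelation values; see Hayes for the full development). It solves the Toeplitz normal equations of forward linear prediction in O(p^2) operations and yields the prediction-error power along the way; here it is checked against a direct solve.

```python
import numpy as np

def levinson_durbin(r, p):
    """Solve the order-p forward prediction normal equations for the
    prediction-error filter a = [1, a1, ..., ap] given autocorrelations
    r(0..p); returns (a, final prediction-error power)."""
    a = np.array([1.0])
    err = r[0]
    for j in range(1, p + 1):
        # Reflection coefficient k_j from the order-(j-1) solution
        k = -(r[j] + np.dot(a[1:], r[j - 1:0:-1])) / err
        a = np.append(a, 0.0)
        a = a + k * a[::-1]          # order-update of the filter
        err *= 1.0 - k * k           # error power shrinks by (1 - k^2)
    return a, err

# Autocorrelation values (made up, but a valid positive-definite sequence)
r = np.array([2.0, 1.0, 0.4, 0.1])
a, err = levinson_durbin(r, 3)

# Check against a direct solve of the Toeplitz system R a(1:) = -r(1:)
R = np.array([[r[abs(m - n)] for n in range(3)] for m in range(3)])
a_direct = np.linalg.solve(R, -r[1:4])
assert np.allclose(a[1:], a_direct)
```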

References:
Textbook for Signal Modeling Topic:
[Hayes]: M. H. Hayes, Statistical Digital Signal Processing and Modeling, Wiley, New York, NY, 1996. (Level: moderate)

Textbook for Random Vectors and Processes Topics:
[Therrien]: C. W. Therrien, Discrete Random Signals and Statistical Signal Processing, Prentice Hall, 1992. (Level: moderate)

[Scharf]: L. L. Scharf, Statistical Signal Processing, Addison-Wesley, Reading, MA, 1991. (Level: advanced)

[Papoulis]: A. Papoulis, Probability, Random Variables, and Stochastic Processes, 3rd ed., McGraw-Hill, 1991. (Level: important reference book, mostly advanced)

[Ross]: S. M. Ross, Introduction to Probability Models, 7th ed., Harcourt Academic Press, 2000. (Level: introductory but complete)

[Wolf]: K. B. Wolf, Integral Transforms in Science and Engineering, Plenum Press, 1979. (Level: advanced)

[Lancaster]: P. Lancaster and M. Tismenetsky, The Theory of Matrices, 2nd ed., Academic Press, Boston, 1985. (Level: complete text, very valuable as a linear algebra reference)