# ECE 8072, Statistical Signal Processing, Fall 2009

```
Course Outline

Part I: Random Variables and Vectors

[1 ] Introduction (Lect. 1)

1.1 Basic Set Theory
1.2 Basic Probability
1.3 The Gaussian Function

[2 ] A Random Variable (Lects. 1,2)

2.1 The Definition of a Random Variable, and its Extensions
2.2 The Probability Distribution and Density Functions
2.3 Some Commonly Occurring Random Variables
2.4 Basic Detection Problems
2.5 Conditional Probability Density Functions

[3 ] Functions of a Random Variable (Lect. 3)

3.1 The pdf of a Function of a Random Variable
3.2 Random number generators

[4 ] Expectation and Moments (Lects. 3,4)

4.1 The Expectation Operator and Moments of a Random Variable
4.2 Signals in Noise and SNR
4.3 Moment-Generating and Characteristic Functions
4.4 Central Limit Theorem

[5 ] Random Vectors: Linear Transformations and Covariance Matrix Eigenstructure (Lects. 5,6)

5.1 Expectation & Moments
5.1.1 Gaussian Random Vectors
5.2 Linear Transformations
5.3 Vector Observations & the Observation Space
5.4 Diagonalization of Rx: Eigenstructure Transformation
5.4.1 Eigenstructure of Rx
5.4.2 Properties of the Eigenstructure of Rx
5.4.3 Orthogonalization of the Observation X
5.5 Diagonalization of Cx and Decorrelation of the Observation
5.6 Gaussian Vector Observation and pdf Contours
5.7 Sample Estimates of Observation Mean and Correlation Matrix (and SVD)

Part II: Random Processes and Linear Time Invariant Systems

[6 ] Random Processes (Lects. 7,8,9)

6.1 Introduction: Definitions and Basic Concepts
6.2 Correlation Functions & Power Spectral Density
6.2.1 DT Correlation Function
6.2.2 DT Power Spectral Density
6.2.3 CT Correlation Functions & Power Spectral Density
6.2.4 Sampling Wide-Sense Stationary CT Random Processes
6.3 Correlation and Covariance Matrices
6.3.1 Random Vector Observation
6.3.2 Wide-Sense Stationary Random Processes
6.4 Discrete Karhunen-Loève Transformation (DKLT)
6.5 Narrowband Signals in Additive White Noise
6.5.1 Correlation Matrix Eigenstructure
6.5.2 An Example
6.6 Whitening
6.7 Note on Cyclostationary Processes

[7 ] Linear Time-Invariant (LTI) Systems (Lect. 10)

7.1 Discrete Time LTI System Review
7.2 Wide-Sense Stationary Random Processes and DT LTI Systems (mean, correlation functions, power density spectra, examples)
7.3 Continuous Time Signals and Systems
7.4 Matched filters (various cases)
7.5 Linear Modeling and Random Processes

Part III: Estimation and Optimum Filtering

[8 ] Parameter Estimation (Lects. 11,12)

8.1 The Problem
8.2 Ad Hoc Mean and Variance Estimators
8.3 Maximum Likelihood (ML) Parameter Estimation
8.4 Cramér-Rao Bounds & the Fisher Information Matrix
8.5 Overview of Bayesian Estimation
8.6 Estimation of Discrete-Valued Parameters (a.k.a. Detection)

[9 ] Optimum Filtering (Lects. 13,14)

9.1 Problem Statement and Examples
9.2 Minimum Mean Squared Error Filtering (optimum filter, orthogonality principle, mean-square error surface)
9.3 Least Squares Filtering

[10 ] Overview of Spectrum Estimation (Lect. 14)

10.1 Problem Statement
10.2 Classical Spectrum Estimation (correlation function, periodogram, averaged/windowed periodogram, computation)
10.3 Autoregressive Spectrum Estimation (model, AR coefficient estimation, spectrum estimation)
10.4 Optimum Filter Based Spectrum Estimation (filter banks, the "ML" approach)
10.5 MUSIC: an Eigenstructure Approach (model, spectrum estimation)


```
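As a preview of the classical spectrum estimation material in Section 10.2, the averaged (Bartlett-style) periodogram can be sketched in a few lines. This is a minimal illustration, not course-supplied code; the function name `averaged_periodogram` and the segment length are choices made for this example only.

```python
import numpy as np

def averaged_periodogram(x, seg_len, fs=1.0):
    """Bartlett-style averaged periodogram: split x into
    non-overlapping segments of length seg_len and average
    the per-segment periodograms |X(f)|^2 / (seg_len * fs)."""
    n_seg = len(x) // seg_len
    psd = np.zeros(seg_len)
    for k in range(n_seg):
        seg = x[k * seg_len:(k + 1) * seg_len]
        X = np.fft.fft(seg)
        psd += np.abs(X) ** 2 / (seg_len * fs)
    psd /= n_seg
    freqs = np.fft.fftfreq(seg_len, d=1.0 / fs)
    return freqs, psd

# For unit-variance white Gaussian noise the estimate should
# hover near the noise variance (1.0) at all frequencies,
# with variance reduced by the averaging over segments.
rng = np.random.default_rng(0)
freqs, psd = averaged_periodogram(rng.standard_normal(4096), seg_len=256)
```

Averaging over segments trades frequency resolution (256 bins instead of 4096) for a lower-variance estimate, which is exactly the classical bias/variance trade-off the outline's "averaged/windowed periodogram" entry refers to.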