# ECE 534: Random Processes (Fall 2012)

### Schedule

The schedule will be updated and revised as the course progresses. Required reading in the course notes is indicated in brackets, and key topics are listed for each unit.
Tue, Aug 28; Thu, Aug 30; Tue, Sep 4; Thu, Sep 6 [Ch. 1]

Introduction, review of basic probability, and administrivia.

• The Kolmogorov axioms of probability theory; the definition of probability space (sample space, σ-algebra, probability measure); countable additivity and continuity of probability measures
• Fundamental building blocks: fair coin toss and standard unit-interval probability space
• Independence and conditional probability
• Random variables: definition; their distributions; functions of a random variable; probability density functions; some frequently used distributions
• Expectation of a random variable; properties of expectation (linearity, preservation of order, change of variables); characteristic functions
• Jointly distributed random variables: definition, examples, conditional densities and conditional expectations; transformations of random vectors
Tue, Sep 11; Thu, Sep 13; Tue, Sep 18; Thu, Sep 20; Tue, Sep 25; Thu, Sep 27 [Ch. 2]

Convergence of random variables

• Four modes of convergence for sequences of random variables: almost sure convergence, convergence in mean square sense, convergence in probability, convergence in distribution
• Implications between various modes of convergence of random variables: a.s. convergence implies convergence in probability, m.s. convergence implies convergence in probability, convergence in probability implies convergence in distribution; sufficiently rapid convergence in probability implies a.s. convergence (via the Borel-Cantelli lemma)
• Cauchy criteria for convergence of random variables
• Preservation of convergence of random variables under transformations (sums, products, continuous functions)
• Limit theorems for sums of independent random variables
• Central Limit Theorem
• Laws of Large Numbers: weak and strong
• Toolbox: moment generating functions, Bernstein-Chernoff-Hoeffding bounding technique, Hoeffding bound for a.s. bounded random variables
• Supplement 3
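The Hoeffding bound from the toolbox above can be checked empirically; the following is our own sketch (parameters are made-up illustration values), using fair coin flips as the a.s. bounded variables.

```python
import math
import random

# Our own sketch of the Hoeffding bound for a.s. bounded i.i.d. variables:
# for X_i taking values in [0, 1] with mean mu,
#   P(|S_n / n - mu| >= t) <= 2 * exp(-2 * n * t**2).
# We estimate the left-hand side for fair coin flips (mu = 1/2).
random.seed(1)
n, t, trials = 1000, 0.05, 2000

deviations = 0
for _ in range(trials):
    s = sum(random.getrandbits(1) for _ in range(n))
    if abs(s / n - 0.5) >= t:
        deviations += 1

empirical_prob = deviations / trials            # estimate of the LHS
hoeffding_bound = 2 * math.exp(-2 * n * t * t)  # the RHS
```

The bound is loose here: the true deviation probability is an order of magnitude smaller than the bound, which is consistent with (and implied by) the exponential tail.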
Mon, Oct 1; Tue, Oct 9; Thu, Oct 11; Tue, Oct 16; Thu, Oct 18 [Ch. 3]

Random vectors and minimum mean squared error estimation

• Random vectors: expectation, correlation and covariance matrices, orthogonality
• Second-order random variables
• Minimum Mean Squared Error (MMSE) estimation: orthogonality principle; conditional expectation as a projection; nonlinear and linear MMSE estimators
• Jointly Gaussian random variables
• Linear innovation sequences
• Discrete-time Kalman filtering
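The predict/correct structure of the discrete-time Kalman filter can be sketched in a few lines for a scalar state; the state-space model and noise variances below are made-up illustration values, not an example from the notes.

```python
import random

# Hedged sketch of a scalar discrete-time Kalman filter for the model
#   x_{k+1} = a x_k + w_k,  w_k ~ N(0, q)   (state)
#   y_k     = x_k + v_k,    v_k ~ N(0, r)   (observation)
random.seed(2)
a, q, r = 0.9, 0.1, 1.0
n = 500

x, xs, ys = 0.0, [], []
for _ in range(n):
    x = a * x + random.gauss(0.0, q ** 0.5)
    xs.append(x)
    ys.append(x + random.gauss(0.0, r ** 0.5))

x_hat, p = 0.0, 1.0  # state estimate and its error variance
sq_err_filt = sq_err_raw = 0.0
for x_true, y in zip(xs, ys):
    # Predict.
    x_pred = a * x_hat
    p_pred = a * a * p + q
    # Correct using the innovation y - x_pred; the gain follows from the
    # orthogonality principle for linear MMSE estimation.
    gain = p_pred / (p_pred + r)
    x_hat = x_pred + gain * (y - x_pred)
    p = (1.0 - gain) * p_pred
    sq_err_filt += (x_hat - x_true) ** 2
    sq_err_raw += (y - x_true) ** 2

mse_filt, mse_raw = sq_err_filt / n, sq_err_raw / n
```

The filtered estimate should have substantially smaller mean squared error than the raw observations, since the filter optimally blends the model prediction with each noisy measurement.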
Tue, Oct 23; Thu, Oct 25; Tue, Oct 30; Thu, Nov 1; Tue, Nov 6 [Ch. 4]

Random processes

• Definition of a random process (continuous- and discrete-time): sample paths; mean, autocorrelation, autocovariance functions
• Examples: random walk and gambler's ruin; processes with independent increments; martingales; Brownian motion; counting processes; Poisson process
• Stationarity and wide-sense stationarity (WSS)
  ◦ Mean, autocorrelation, and autocovariance functions of stationary and WSS processes
  ◦ For Gaussian processes, stationarity is equivalent to wide-sense stationarity
• Conditional independence and Markov processes
  ◦ Processes with a deterministic initial condition and independent increments are Markov
• Gauss-Markov processes
• Discrete-state Markov processes: transition matrices; Chapman-Kolmogorov equation; time-homogeneous Markov processes; invariant (or equilibrium) distributions; infinitesimal generators
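The Chapman-Kolmogorov equation can be verified directly on a toy transition matrix (the two-state chain below uses made-up numbers): for a time-homogeneous chain, the (m+n)-step transition matrix factors as the product of the m-step and n-step matrices, in either order.

```python
# Toy check of the Chapman-Kolmogorov equation P^(m+n) = P^m P^n
# for a time-homogeneous discrete-state Markov chain.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.9, 0.1],
     [0.4, 0.6]]

P2 = matmul(P, P)        # two-step transition probabilities
P3 = matmul(P2, P)       # three steps, split as (2) + (1)
P3_alt = matmul(P, P2)   # three steps, split as (1) + (2): same matrix
```

Each n-step matrix is again a stochastic matrix (rows sum to one), which is the discrete-state form of the semigroup property.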
Thu, Nov 8; Tue, Nov 13 [Ch. 6 (Secs. 1 and 2)]

Discrete-state Markov chains

• Structural properties: irreducibility; periodicity; Markov chains are contractions on the space of probability mass functions over the state space
• Invariant distributions: existence; uniqueness; convergence to equilibrium; time reversibility and detailed balance
• Markov Chain Monte Carlo (MCMC): Metropolis dynamics and its properties (detailed balance); application to function optimization
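Metropolis dynamics can be sketched on a small discrete state space (the target pmf below is a made-up example): propose a uniformly random state and accept with probability min(1, pi[proposed]/pi[current]). The resulting chain satisfies detailed balance with respect to pi, so pi is its invariant distribution and long-run state frequencies converge to it.

```python
import random

# Illustrative Metropolis sampler on a three-state space with a made-up
# target pmf pi.  The uniform proposal is symmetric, so the acceptance
# ratio reduces to pi[prop] / pi[state], giving detailed balance.
random.seed(3)
pi = [0.5, 0.3, 0.2]
steps = 200_000

state = 0
counts = [0] * len(pi)
for _ in range(steps):
    prop = random.randrange(len(pi))
    if random.random() < min(1.0, pi[prop] / pi[state]):
        state = prop
    counts[state] += 1

freqs = [c / steps for c in counts]  # should approach pi
```

Note that only ratios of pi are ever evaluated, which is what makes the method useful when the target is known only up to a normalizing constant.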
Thu, Nov 15

Review

Tue, Nov 20; Thu, Nov 22

No class: Thanksgiving break

Tue, Nov 27; Thu, Nov 29 [Ch. 7 (parts)]

Mean-square calculus of random processes

• Mean-square limits: existence; interchange of limit and expectation
• Mean-square differentiation and integration of second-order random processes: existence criteria in terms of autocorrelation functions
• Karhunen-Loève expansion
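The Karhunen-Loève expansion can be illustrated numerically for Brownian motion on [0, 1], whose covariance K(s, t) = min(s, t) has known KL eigenvalues lam_n = 1/((n - 1/2)^2 pi^2). The discretization below is our own sketch: power iteration on the discretized covariance operator recovers the largest eigenvalue.

```python
import math

# Numerical sketch of the KL expansion for Brownian motion on [0, 1]:
# discretize the covariance kernel K(s, t) = min(s, t) on a midpoint grid
# and find the top eigenvalue of the integral operator by power iteration.
N = 150
t = [(i + 0.5) / N for i in range(N)]
K = [[min(t[i], t[j]) for j in range(N)] for i in range(N)]

v = [1.0] * N
for _ in range(60):
    w = [sum(K[i][j] * v[j] for j in range(N)) for i in range(N)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# Rayleigh quotient; dividing by N converts the matrix eigenvalue into an
# eigenvalue of the integral operator (grid spacing 1/N).
Kv = [sum(K[i][j] * v[j] for j in range(N)) for i in range(N)]
lam_top = sum(v[i] * Kv[i] for i in range(N)) / N

lam_exact = 1.0 / ((0.5 * math.pi) ** 2)  # = 4 / pi**2, largest KL eigenvalue
```

The corresponding eigenvector approximates the first KL eigenfunction sqrt(2) sin(pi t / 2), and higher eigenpairs could be obtained the same way after deflation.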
Tue, Dec 4; Thu, Dec 6 [Ch. 8 (parts)]

Spectral methods

• Linear processing of random signals: input-output properties, auto- and cross-correlation; convolutions and time reversal
• Power spectral density: properties; transformation under linear time-invariant systems
• Noncausal Wiener filtering: orthogonality principle; special case of filtering a signal in additive noise; relations to linear MMSE estimation; filtering a process over a finite time interval (Wiener filter and Karhunen-Loève expansion)
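The special case of filtering a signal in additive noise can be sketched numerically; the model and parameters below are our own illustration choices. The WSS signal is AR(1) with PSD S_x(f) = q / |1 - a e^{-i 2 pi f}|^2, observed in white noise of PSD r, and the noncausal Wiener filter H(f) = S_x(f) / (S_x(f) + r) is applied circularly via the DFT (a finite-length approximation of the ideal noncausal filter).

```python
import cmath
import math
import random

# Hedged sketch: noncausal Wiener filtering of an AR(1) signal observed
# in additive white noise, implemented in the frequency domain.
random.seed(4)
N, a, q, r = 256, 0.95, 0.1, 1.0

x = [0.0] * N
for k in range(1, N):
    x[k] = a * x[k - 1] + random.gauss(0.0, math.sqrt(q))
y = [xk + random.gauss(0.0, math.sqrt(r)) for xk in x]

def dft(v, sign):
    n = len(v)
    return [sum(v[k] * cmath.exp(sign * 2j * math.pi * m * k / n)
                for k in range(n)) for m in range(n)]

Y = dft(y, -1)
X_hat = []
for m in range(N):
    Sx = q / abs(1.0 - a * cmath.exp(-2j * math.pi * m / N)) ** 2
    H = Sx / (Sx + r)  # Wiener gain at frequency m/N
    X_hat.append(H * Y[m])
x_hat = [z.real / N for z in dft(X_hat, +1)]

mse_filtered = sum((u - w) ** 2 for u, w in zip(x_hat, x)) / N
mse_raw = sum((u - w) ** 2 for u, w in zip(y, x)) / N
```

Since H(f) < 1 everywhere, the filter attenuates the flat noise spectrum while passing the low-frequency band where the AR(1) signal dominates, so the filtered estimate has a lower mean squared error than the raw observations.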
Tue, Dec 11

Review of Topics for Final Exam

• Discrete-time Markov processes: recursive (random function) representation; finite-state Markov chains (irreducibility, aperiodicity, existence of and convergence to invariant distributions, Dobrushin's contraction coefficient)
• Second-order processes: stationarity vs. wide-sense stationarity; mean-square calculus (convergence, continuity, differentiation, integration)
• Linear processing of random signals: linear time-invariant systems; input-output relations in terms of cross- and auto-correlations (time domain) and spectral densities (frequency domain)