ECE 534: Random Processes (Fall 2015)



Schedule

The schedule will be updated and revised as the course progresses. Required reading in the course notes is indicated on the left; key topics will be highlighted.
Tue, Aug 25
Thu, Aug 27
Tue, Sep 1
Thu, Sep 3 [Ch. 1]

Introduction, review of basic probability, and administrivia.

  • The Kolmogorov axioms of probability theory; the definition of probability space (sample space, σ-algebra, probability measure); countable additivity and continuity of probability measures
  • Fundamental building blocks: fair coin toss and standard unit-interval probability space
  • Independence and conditional probability
  • Random variables: definition; their distributions; functions of a random variable; probability density functions; some frequently used distributions
  • Expectation of a random variable; properties of expectation (linearity, preservation of order, change of variables); characteristic functions
  • Jointly distributed random variables: definition, examples, conditional densities and conditional expectations; transformations of random vectors
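As a quick numerical companion to the review material, the sketch below (assuming Python with only the standard library; function names are illustrative, not from the course notes) estimates E[X] and E[g(X)] for a uniform random variable by Monte Carlo, illustrating expectation and the change-of-variables formula E[g(X)] = ∫ g(x) f_X(x) dx.

```python
import random

def estimate_expectation(g, sampler, n=100_000, seed=0):
    """Monte Carlo estimate of E[g(X)] from n i.i.d. samples of X."""
    rng = random.Random(seed)
    return sum(g(sampler(rng)) for _ in range(n)) / n

# X uniform on [0, 1]: E[X] = 1/2 and, by change of variables, E[X^2] = 1/3.
uniform = lambda rng: rng.random()
m1 = estimate_expectation(lambda x: x, uniform)
m2 = estimate_expectation(lambda x: x * x, uniform)
```

With 100,000 samples both estimates land within about 0.003 of the exact values, consistent with the law of large numbers covered in Chapter 2.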
Tue, Sep 8
Thu, Sep 10
Tue, Sep 15
Thu, Sep 17
Tue, Sep 22
[Ch. 2]

Convergence of random variables

  • Getting started: pointwise convergence of a sequence of random variables, monotone convergence theorem, Fatou's lemma, dominated convergence theorem
  • Four modes of convergence for sequences of random variables: almost sure convergence, convergence in mean square sense, convergence in probability, convergence in distribution
  • Implications between various modes of convergence of random variables: a.s. convergence implies convergence in probability, m.s. convergence implies convergence in probability, convergence in probability implies convergence in distribution; sufficiently rapid convergence in probability implies a.s. convergence (via the Borel-Cantelli lemma)
  • Cauchy criteria for convergence of random variables
  • Preservation of convergence of random variables under transformations (sums, products, continuous functions)
  • Limit theorems for sums of independent random variables
    • Central Limit Theorem
    • Laws of Large Numbers: weak and strong
    • Toolbox: moment generating functions, Bernstein-Chernoff-Hoeffding bounding technique, Hoeffding bound for a.s. bounded random variables
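The Hoeffding bound from the toolbox above can be checked empirically. The sketch below (a Python illustration under assumed parameters n = 200, ε = 0.1, not taken from the notes) estimates the deviation probability of the sample mean of fair coin tosses and compares it to the bound 2·exp(−2nε²) for i.i.d. [0, 1]-valued variables.

```python
import math
import random

def deviation_probability(n, eps, trials=2000, seed=1):
    """Empirical estimate of P(|S_n/n - 1/2| >= eps) for n fair coin tosses."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= eps:
            hits += 1
    return hits / trials

n, eps = 200, 0.1
empirical = deviation_probability(n, eps)
# Hoeffding bound for i.i.d. [0,1]-valued random variables: 2 exp(-2 n eps^2)
bound = 2 * math.exp(-2 * n * eps * eps)
```

The empirical frequency is well below the bound, as expected: Hoeffding is a worst-case exponential bound, not a sharp estimate.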
Thu, Sep 24
Tue, Sep 29
Tue, Oct 6
Thu, Oct 8
[Ch. 3]
Random vectors and minimum mean squared error estimation
  • Random vectors: expectation, correlation and covariance matrices, orthogonality
  • Second-order random variables
  • Minimum Mean Squared Error (MMSE) estimation: orthogonality principle; conditional expectation as a projection; nonlinear and linear MMSE estimators
  • Jointly Gaussian random variables
  • Linear innovation sequences
  • Discrete-time Kalman filtering
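The scalar case of the discrete-time Kalman filter fits in a few lines. The sketch below (assuming Python; the model x_{k+1} = a·x_k + w_k, y_k = x_k + v_k and all numerical parameters are illustrative choices, not from the notes) runs the predict/update recursion and checks that filtering beats the raw observations in mean squared error.

```python
import math
import random

def kalman_1d(ys, a=0.9, q=0.1, r=0.5, x0=0.0, p0=1.0):
    """Scalar Kalman filter for x_{k+1} = a x_k + w_k, y_k = x_k + v_k,
    with Var(w_k) = q and Var(v_k) = r."""
    x, p = x0, p0
    out = []
    for y in ys:
        x, p = a * x, a * a * p + q          # predict
        k = p / (p + r)                      # Kalman gain
        x, p = x + k * (y - x), (1 - k) * p  # update with innovation y - x
        out.append(x)
    return out

# Simulate the model, then compare filtered vs. raw mean squared error.
rng = random.Random(2)
x, xs, ys = 0.0, [], []
for _ in range(2000):
    x = 0.9 * x + rng.gauss(0.0, math.sqrt(0.1))
    xs.append(x)
    ys.append(x + rng.gauss(0.0, math.sqrt(0.5)))
est = kalman_1d(ys)
mse_filter = sum((e - t) ** 2 for e, t in zip(est, xs)) / len(xs)
mse_raw = sum((y - t) ** 2 for y, t in zip(ys, xs)) / len(xs)
```

The update step is exactly the orthogonality principle at work: the correction is the linear MMSE estimate of the state given the innovation y − x̂.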
Tue, Oct 13
Thu, Oct 15
Tue, Oct 20
Thu, Oct 22
Thu, Oct 29
[Ch. 4]
Random processes
  • Definition of a random process (continuous- and discrete-time): sample paths; mean, autocorrelation, autocovariance functions
  • Examples: random walk and gambler's ruin; processes with independent increments; martingales; Brownian motion; counting processes; Poisson process
  • Discrete-time martingales
    • definition; martingales and martingale differences
    • the Doob martingale
    • inequalities of Azuma-Hoeffding and McDiarmid
    • examples: random walk with zero-mean increments; edge and vertex exposure martingales in random graphs; concentration of the chromatic number of a random graph
  • Stationarity and wide-sense stationarity
    • mean, autocorrelation, autocovariance functions of stationary and WSS processes
    • for Gaussian processes, stationarity is equivalent to wide-sense stationarity
  • Conditional independence and Markov processes
    • processes with deterministic initial condition and independent increments are Markov
    • Gauss-Markov processes
  • Discrete-state Markov processes: transition matrices; Chapman-Kolmogorov equation; time-homogeneous Markov processes; invariant (or equilibrium) distributions; infinitesimal generators
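The Chapman-Kolmogorov recursion μ_{k+1} = μ_k P can be iterated directly to watch a chain approach its invariant distribution. The sketch below (assuming Python; the two-state transition matrix is an illustrative example, not from the notes) does exactly that.

```python
def step(dist, P):
    """One step of the Chapman-Kolmogorov recursion: mu_{k+1} = mu_k P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Two-state chain with P(0->1) = 0.3 and P(1->0) = 0.6 (illustrative numbers).
# Solving pi = pi P gives the invariant distribution pi = (2/3, 1/3).
P = [[0.7, 0.3], [0.6, 0.4]]
mu = [1.0, 0.0]          # deterministic initial condition: start in state 0
for _ in range(50):
    mu = step(mu, P)
```

Convergence here is geometric: the second eigenvalue of P is 0.1, so after 50 steps μ is numerically indistinguishable from π.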
Tue, Oct 27
In-Class Exam 1: you will be allowed two sheets of notes, and the exam will be closed book otherwise; no calculators or other electronic devices allowed
Tue, Nov 3
Thu, Nov 5
Tue, Nov 10
Thu, Nov 12
Tue, Nov 17
[Ch. 6 (Secs. 1 and 2)]
Discrete-state Markov chains
  • Structural properties: irreducibility; periodicity; Markov chains are contractions on the space of probability mass functions over the state space
  • Invariant distributions: existence; uniqueness; convergence to equilibrium; time reversibility and detailed balance
  • Markov Chain Monte Carlo (MCMC): Metropolis dynamics and its properties (detailed balance); application to function optimization
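The Metropolis dynamics can be demonstrated on a toy finite state space. The sketch below (assuming Python; the uniform proposal and the target π(i) ∝ i + 1 are illustrative choices, not from the notes) accepts a proposed move with probability min(1, π(y)/π(x)), so detailed balance holds with respect to the unnormalized target, and the empirical state frequencies converge to π.

```python
import random

def metropolis(target, n_states, steps=200_000, seed=3):
    """Metropolis chain on {0, ..., n_states-1} with a uniform (symmetric)
    proposal; accept with probability min(1, target(y)/target(x))."""
    rng = random.Random(seed)
    x = 0
    counts = [0] * n_states
    for _ in range(steps):
        y = rng.randrange(n_states)
        if target(y) >= target(x) or rng.random() < target(y) / target(x):
            x = y
        counts[x] += 1
    return [c / steps for c in counts]

# Unnormalized target pi(i) = i + 1 on {0,1,2,3}: pi = (0.1, 0.2, 0.3, 0.4).
freq = metropolis(lambda i: i + 1, 4)
```

Note that only ratios of the target are ever evaluated, which is why MCMC works with unnormalized distributions; this is also the key to its use in optimization via simulated annealing.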

Thu, Nov 19
In-Class Exam 2: you will be allowed one sheet of notes, and the exam will be closed book otherwise; no calculators or other electronic devices allowed
Tue, Nov 24
Thu, Nov 26
No class: Thanksgiving break

Tue, Dec 1
Thu, Dec 3
[Chs. 7 and 8 (parts)]
Second-order processes and spectral methods
  • Mean-square limits: existence; interchange of limit and expectation
  • Mean-square differentiation and integration of second-order random processes: existence criteria in terms of autocorrelation functions
  • Karhunen-Loève expansion
  • Linear processing of random signals: input-output properties, auto- and cross-correlation; convolutions and time reversal
  • Power spectral density: properties; transformation under linear time-invariant systems
  • Noncausal Wiener filtering: orthogonality principle; special case of filtering a signal in additive noise; relations to linear MMSE estimation; filtering a process over a finite time interval (Wiener filter and Karhunen-Loève expansion)
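The input-output correlation relations for LTI filtering can be verified numerically. The sketch below (assuming Python; the two-tap impulse response h = (0.5, 0.5) is an illustrative choice, not from the notes) passes white noise through an FIR filter and checks the output autocorrelation against the theoretical R_Y(m) = Σ_k h[k] h[k+m].

```python
import random

def autocorrelation(xs, lag):
    """Empirical autocorrelation R(lag) = E[X_{n+lag} X_n] of a zero-mean sequence."""
    n = len(xs) - lag
    return sum(xs[k + lag] * xs[k] for k in range(n)) / n

rng = random.Random(4)
w = [rng.gauss(0.0, 1.0) for _ in range(100_001)]    # unit-variance white noise
h = [0.5, 0.5]                                       # illustrative FIR impulse response
y = [h[0] * w[k] + h[1] * w[k - 1] for k in range(1, len(w))]
# Theory: R_Y(m) = sum_k h[k] h[k+m], so R_Y(0) = 0.5, R_Y(1) = 0.25, R_Y(2) = 0.
r0, r1, r2 = (autocorrelation(y, m) for m in (0, 1, 2))
```

In the frequency domain the same statement reads S_Y(ω) = |H(ω)|² S_W(ω); the time-domain check above is its inverse-transform counterpart.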
Tue, Dec 8
Review of Topics for Final Exam
  • Convergence of sequences of random variables: almost surely, in probability, in mean-square sense; Cauchy criteria.
  • Basic concepts of random processes: sample paths; martingales; independent increments.
  • Discrete-time Markov processes: recursive (random function) representation; finite-state Markov chains (irreducibility, aperiodicity, existence of and convergence to invariant distributions, Dobrushin's contraction coefficient).
  • Construction of continuous-time Markov processes using Poisson arrivals and discrete Markov chains.
  • Second-order processes: stationarity vs. wide-sense stationarity; Karhunen-Loève expansion.
  • Linear processing of random signals: linear time-invariant systems; input-output relations in terms of cross- and auto-correlations (time domain) and spectral densities (frequency domain).