This is the first book-length treatment of the Variational Bayes (VB) approximation in signal processing. It has been written as a self-contained, self-learning guide for academic and industrial research groups in signal processing, data analysis, machine learning, identification and control. It reviews the VB distributional approximation, showing how it yields tractable algorithms for parametric model identification in both off-line and on-line contexts. Many of the principles are first illustrated via easy-to-follow scalar decomposition problems. Later chapters present successful applications in factor analysis of medical image sequences, mixture model identification and speech reconstruction, with results for both simulated and real data reported in detail. The unique development of an eight-step "VB method", applicable in all cases, enables readers to derive a VB inference algorithm from the ground up for their own particular signal or image model.
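For orientation, the VB approximation in question is the standard free-form factorized approximation, stated here in generic notation that may differ from the book's own. The joint posterior f(\theta_1, \theta_2 \,|\, D) is replaced by an independent product \tilde{f}(\theta_1 \,|\, D)\,\tilde{f}(\theta_2 \,|\, D), chosen to minimize the Kullback-Leibler divergence \mathrm{KL}(\tilde{f} \,\|\, f). The minimizer satisfies the coupled free-form updates

    \tilde{f}(\theta_i \,|\, D) \propto \exp\left\{ \mathrm{E}_{\tilde{f}(\theta_j \,|\, D)}\left[ \ln f(\theta_i, \theta_j, D) \right] \right\}, \qquad i \neq j,

which are iterated to convergence, in a manner reminiscent of the EM algorithm.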
Gaussian linear modelling cannot address current signal processing demands. In modern contexts, such as Independent Component Analysis (ICA), progress has been made precisely by imposing non-Gaussian and/or non-linear assumptions. Hence, the Wiener and Kalman theories, long the standard computational engines for such problems, no longer enjoy their traditional hegemony in the field. In their place, diverse principles have been explored, leading to a consequent diversity in the implied computational algorithms. The traditional on-line and data-intensive preoccupations of signal processing continue to demand that these algorithms be tractable.
Increasingly, full probability modelling (the so-called Bayesian approach), or partial probability modelling using the likelihood function, is the pathway for the design of these algorithms. However, the resulting posterior inferences are often intractable, and so the area of distributional approximation is of increasing relevance in signal processing. The Expectation-Maximization (EM) algorithm and the Laplace approximation, for example, are standard approaches to handling difficult models, but their underlying approximations (certainty equivalence and a local Gaussian fit, respectively) are often too drastic for the high-dimensional, multi-modal and/or strongly correlated problems encountered in practice. Since the 1990s, stochastic simulation methods have come to dominate Bayesian signal processing. Markov Chain Monte Carlo (MCMC) sampling and related methods are valued for their ability to simulate possibly high-dimensional distributions to arbitrary accuracy. More recently, particle filtering has extended stochastic simulation to on-line settings. Nevertheless, the wider acceptance of these methods, and to some extent of Bayesian signal processing itself, has been held back by the large computational demands they typically make.
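To make the contrast with sampling concrete, the following is a minimal sketch, in Python, of the classic VB treatment of i.i.d. Gaussian data with unknown mean mu and precision tau, under the conjugate priors mu ~ N(mu0, (lam0*tau)^-1) and tau ~ Gamma(a0, b0). This is an illustrative textbook example under assumed notation, not an algorithm reproduced from the book; its point is that a handful of deterministic coordinate-ascent updates replaces the many thousands of samples an MCMC run would require.

    import numpy as np

    def vb_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, n_iter=50):
        """VB for i.i.d. N(mu, 1/tau) data (standard textbook example,
        assumed for illustration). Returns the parameters of the factorized
        posterior q(mu) = N(mu_N, 1/lam_N), q(tau) = Gamma(a_N, b_N)."""
        x = np.asarray(x, dtype=float)
        N, xbar = x.size, x.mean()

        # These two quantities are available in closed form and stay fixed.
        a_N = a0 + 0.5 * (N + 1)
        mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)

        E_tau = a0 / b0                      # initial guess for E[tau]
        for _ in range(n_iter):
            # q(mu) update: Gaussian with precision lam_N = (lam0 + N) E[tau].
            lam_N = (lam0 + N) * E_tau
            # q(tau) update: expected squared residuals under q(mu).
            E_sq = np.sum((x - mu_N) ** 2) + N / lam_N           # E[sum_i (x_i - mu)^2]
            E_prior = lam0 * ((mu_N - mu0) ** 2 + 1.0 / lam_N)   # E[lam0 (mu - mu0)^2]
            b_N = b0 + 0.5 * (E_sq + E_prior)
            E_tau = a_N / b_N                # moment feeding the next q(mu) update
        return mu_N, lam_N, a_N, b_N

    # Example: recover the mean and precision of simulated data.
    rng = np.random.default_rng(0)
    x = rng.normal(2.0, 0.5, size=200)       # true mu = 2.0, tau = 4.0
    mu_N, lam_N, a_N, b_N = vb_gaussian(x)
    print(f"E[mu] = {mu_N:.3f}, E[tau] = {a_N / b_N:.3f}")

Each iteration exchanges only low-order moments between the two factors, which is the characteristic source of the VB method's tractability in the on-line and data-intensive settings described above.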