# Introduction to Fundamentals of Statistical Signal Processing


## 1.1. Motivation and Purpose

Over the last forty years there has been a virtual explosion of ideas, approaches, techniques, and applications of digital signal processing (DSP) to commercial products and military systems. The primary journal devoted to digital signal processing, the *IEEE Transactions on Acoustics, Speech, and Signal Processing*, which was founded in 1974, was originally published bimonthly, with each issue consisting of about 100 pages. Today the *IEEE Transactions on Signal Processing*, which is devoted solely to signal processing, is published monthly, with about 500 pages per issue, reflecting a *tenfold increase* in papers. This does not even account for the other more specialized journals that have been spawned, such as the *IEEE Transactions on Audio, Speech, and Language Processing*, the *IEEE Transactions on Image Processing*, and others. The algorithm designer, who must choose and implement an approach, is now faced with a bewildering cornucopia of possible algorithms. Even surveying the open literature to glean a promising approach can be an overwhelming task. As a result, it is now more important than ever for the algorithm designer to have a tried and trusted arsenal at his/her disposal. These approaches may not solve the current problem in its entirety, but will at least provide a good starting point for algorithm development.

In addition to accumulating a suite of trusted algorithms, it is critical that we understand why they work as well as when they are likely *to fail*. DSP algorithms, and more specifically *statistical signal processing* algorithms, being highly mathematical and stochastic in nature, do not yield their secrets easily. But as the designer begins to implement these algorithms and observe their behavior, his/her intuition grows, and with it the chances of making successful algorithm choices in the future. This intuition may only be gained through experience. We are fortunate today that it is not necessary to implement an algorithm in hardware to assess its performance. Software implementations are readily available and allow relatively painless assessments of performance. A popular and very versatile software language is MATLAB, and it is this vehicle that we will use to implement our proposed algorithms and examine their performance. Its use allows us to “play” with the proposed algorithm as well as to provide us with a “first-cut” implementation in software. In fact, a MATLAB implementation frequently leads to an implementation on a DSP chip or on some specialized digital hardware. For these reasons we will rely heavily on MATLAB throughout this textbook.

The material contained herein consists of algorithms for *statistical signal processing*. On the other hand, for the processing of signals whose mathematical form is completely known and that are not subject to excessive noise, many standard techniques exist and have been found to be quite reliable. As examples, these typically include algorithms for the design of digital filters or for the computation of the discrete-time Fourier transform, i.e., the fast Fourier transform (FFT). Many excellent books describe these algorithms and their implementations [Ingle and Proakis 2007, Lyons 2009]. In contrast, our purpose is to describe algorithms that can be used to analyze and extract information from *random data*. For example, the specification that a signal whose Fourier spectrum is lowpass in nature should be filtered by a digital lowpass filter prior to further processing naturally requires the design of a digital filter with a prescribed cutoff frequency. A slightly different specification might be to filter a bandpass signal whose center frequency is *unknown*. In the first case, the specification is complete. In the second case, it remains to determine how to center the filter so as to pass the signal but hopefully remove much of the noise. The former calls for deterministic signal processing, while the latter requires *estimation of the center frequency*, preferably on-line, so that if the signal center frequency changes, our algorithm will still be able to provide the appropriately centered filter. When there is uncertainty in the signal characteristics, only a *statistical* approach is appropriate.
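To make the bandpass example concrete, one simple (and by no means the only) way to estimate an unknown center frequency is to locate the peak of the periodogram of the noisy data. The book works in MATLAB; the sketch below is an analogous Python/NumPy version, with the signal parameters (sampling rate, tone frequency, noise level) chosen purely for illustration.

```python
import numpy as np

def estimate_center_frequency(x, fs):
    """Estimate the center frequency (Hz) of a narrowband signal in noise
    by locating the peak of its windowed periodogram.

    This is an illustrative estimator, not the book's specific algorithm."""
    X = np.fft.rfft(x * np.hanning(len(x)))        # windowed DFT
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)    # frequency axis in Hz
    return freqs[np.argmax(np.abs(X) ** 2)]        # frequency of spectral peak

# Synthetic example: a 1250 Hz tone buried in white Gaussian noise
rng = np.random.default_rng(0)
fs = 8000.0                                        # assumed sampling rate
n = np.arange(4096)
x = np.cos(2 * np.pi * 1250.0 / fs * n) + 0.5 * rng.standard_normal(n.size)

f0_hat = estimate_center_frequency(x, fs)
```

The estimate `f0_hat` can then be used to center a bandpass filter on the signal; re-running the estimator on successive data blocks gives the on-line tracking behavior described above. The frequency resolution of this estimator is roughly `fs / len(x)`, so longer records give finer estimates.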

Algorithms for the analysis of random data are highly problem specific. That is to say, each real-world signal processing problem, although generically related to many others, is unique and requires a specialized approach. Because of the seemingly endless development of new electronic systems and devices, it is not possible to use “off-the-shelf” algorithms. However, all is not lost! There exists a suite of “core” algorithms that appear at the heart of most practical signal processing systems. It is our intention in this book to describe these approaches and implement them in MATLAB. A general discussion of these algorithms is given next.