1.7. Lessons Learned
The lessons listed below will be a major recurring theme of our discussions. We will have much more to say about them as we examine algorithms and their performance throughout the book.
- Assess the feasibility of the algorithm requirements as the first step in the design. Typically, the Cramer-Rao lower bound is used for estimation, and the probability of detection of the Neyman-Pearson likelihood ratio test is used for detection. For classification, the maximum a posteriori (MAP) classifier is used. In all cases, the probability density functions must be known.
- Reassess the goals and/or require more accurate data if the specifications are not attainable.
- Signal processing cannot do the impossible—there must be a reasonable signal-to-noise ratio for it to succeed.
- In determining the performance of an algorithm via computer simulation, repeat the experiment many times, say 1000 or more. Make sure that your performance metric is statistical in nature: the variance for an estimator, the probability of detection for a detector, and the probability of error for a classifier. Keep increasing the number of experiments until the evaluation of these metrics produces consistent results.
- Make sure that the algorithm is tested under varying operational conditions to assess its robustness (such testing is also called a sensitivity analysis).
- Verify that the underlying algorithm assumptions hold in practice by analyzing real-world data.
- Try to transform the problem into a simpler one, for example, one that conforms to the linear signal model.
- Test the algorithm first in MATLAB by using controlled data sets generated within MATLAB. The results should agree with theoretical predictions, if available. The performance obtained under controlled conditions in MATLAB should be an upper bound on that attainable with field data.
- Before proposing an algorithm, scour the open literature for similar signal processing problems in other fields and the methods employed.
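The first and fourth lessons, feasibility assessment via the Cramer-Rao lower bound and Monte Carlo performance evaluation, can be sketched for the classic problem of estimating a DC level in white Gaussian noise. All numbers (the specification, the true level, the noise variance, the data record length) are illustrative assumptions, not values from the text; for this model the CRLB is known to be sigma^2/N and the sample mean attains it.

```python
import numpy as np

# Hypothetical specification (an assumption for illustration): the DC level A
# must be estimated with variance <= 0.02 from N = 100 samples at noise
# variance sigma2 = 1.
A_true, sigma2, N = 1.0, 1.0, 100
spec_var = 0.02

# Step 1: feasibility check against the Cramer-Rao lower bound.
# For a DC level in white Gaussian noise, CRLB = sigma2 / N.
crlb = sigma2 / N
feasible = crlb <= spec_var           # 0.01 <= 0.02, so the spec is attainable

# Step 2: Monte Carlo evaluation of the sample-mean estimator,
# repeating the experiment many times as the lesson advises.
rng = np.random.default_rng(0)
n_trials = 5000
estimates = np.empty(n_trials)
for k in range(n_trials):
    x = A_true + np.sqrt(sigma2) * rng.standard_normal(N)
    estimates[k] = x.mean()           # estimator of A

var_hat = estimates.var()             # statistical metric: estimator variance
print(f"CRLB = {crlb:.4f}, Monte Carlo variance = {var_hat:.4f}")
```

The simulated variance should settle near the CRLB as the number of trials grows; if it has not stabilized, increase `n_trials` until repeated runs give consistent results.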
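The same Monte Carlo discipline applies to detection, where the metric is the probability of detection of the Neyman-Pearson test. The sketch below, with assumed values for the signal level, noise variance, record length, and false alarm probability, detects a known DC level in white Gaussian noise; the sample mean is the test statistic, and the simulated detection probability is compared with the closed-form Gaussian prediction.

```python
import numpy as np
from statistics import NormalDist

# Assumed (illustrative) problem parameters: known DC level A in white
# Gaussian noise of variance sigma2, N samples, false alarm probability pfa.
A, sigma2, N, pfa = 0.5, 1.0, 16, 0.01
nd = NormalDist()

# Neyman-Pearson test statistic is the sample mean; under H0 it is
# N(0, sigma2/N), so the threshold follows from the desired Pfa.
gamma = nd.inv_cdf(1 - pfa) * np.sqrt(sigma2 / N)

# Theoretical detection probability: Pd = Q( Q^{-1}(Pfa) - sqrt(N A^2 / sigma2) )
d = np.sqrt(N * A**2 / sigma2)        # deflection coefficient
pd_theory = 1 - nd.cdf(nd.inv_cdf(1 - pfa) - d)

# Monte Carlo check with many trials, as the lesson advises.
rng = np.random.default_rng(1)
n_trials = 10000
x = A + np.sqrt(sigma2) * rng.standard_normal((n_trials, N))
pd_sim = np.mean(x.mean(axis=1) > gamma)
print(f"Pd theory = {pd_theory:.3f}, Pd simulated = {pd_sim:.3f}")
```

If the assumed signal-to-noise ratio is lowered, both the theoretical and simulated detection probabilities fall accordingly, which is the point of the feasibility lesson: no amount of processing rescues an unreasonably low SNR.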