Lecture Signals, systems & inference – Lecture 22: Hypothesis testing

The following will be discussed in this lecture: choosing between H = H0 and H = H1 with minimum P(error), implementing the maximum a posteriori (MAP) rule, the likelihood ratio test (LRT) implementation of the MAP rule, a binary hypothesis testing example, and the associated terminology.

Choosing between H = H0 and H = H1 with minimum P(error)

The prior probabilities of the two hypotheses are
$$P(H_0 \text{ is true}) = P(H = H_0) = P(H_0) = p_0, \qquad P(H_1 \text{ is true}) = P(H = H_1) = P(H_1) = p_1.$$
With no measurement available, choose the more probable hypothesis for minimum P(error):
$$P(H_1) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; P(H_0),$$
where the label above the inequality is the hypothesis announced ('H1') when the upper comparison holds, and the label below is the announcement ('H0') when the lower comparison holds.

Again choosing between H = H0 and H = H1, but now given R = r, for minimum P(error | R = r)

$$P(H_1 \mid R = r) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; P(H_0 \mid R = r)$$
Pick whichever hypothesis has the maximum a posteriori probability.

Implementing the maximum a posteriori (MAP) rule

By Bayes' rule, the comparison of posterior probabilities
$$P(H_1 \mid R = r) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; P(H_0 \mid R = r)$$
is equivalent to
$$p_1\, f_{R|H}(r \mid H_1) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; p_0\, f_{R|H}(r \mid H_0).$$

Likelihood ratio test (LRT) implementation of the MAP rule

Rearranging the MAP comparison gives the likelihood ratio test
$$\Lambda(r) = \frac{f_{R|H}(r \mid H_1)}{f_{R|H}(r \mid H_0)} \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \frac{p_0}{p_1} = \eta.$$
(A numerical sketch of this rule appears at the end of these notes.)

Binary hypothesis testing (example)

[Example figures not reproduced here.] With $P_M$ the conditional probability of a miss and $P_{FA}$ the conditional probability of false alarm, the overall probability of error is
$$p_1 P_M + p_0 P_{FA} = P(\text{error}).$$

Terminology

•  prevalence (p1)
•  (conditional) probability of detection, sensitivity, true positive rate, recall
•  specificity, true negative rate
•  (conditional) probability of false alarm, false positive rate (= 1 − specificity)
•  (conditional) probability of a miss, false negative rate (= 1 − sensitivity)
•  positive predictive value, precision
•  negative predictive value

MIT OpenCourseWare
Signals, Systems and Inference, Spring 2018
For information about citing these materials or our Terms of Use, visit: .
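
To connect the LRT threshold η = p0/p1 with the error decomposition p1 PM + p0 PFA = P(error), here is a minimal Python sketch. The scalar Gaussian measurement model and every numerical value in it (the priors, the means m0 and m1, and the standard deviation sigma) are hypothetical choices for illustration only, not values from the lecture.

    import math

    # Hypothetical setup (not from the lecture): under H0 the measurement R is
    # N(m0, sigma^2); under H1 it is N(m1, sigma^2).
    p0, p1 = 0.7, 0.3             # prior probabilities P(H0), P(H1)
    m0, m1, sigma = 0.0, 2.0, 1.0

    def gaussian_pdf(r, mean, sd):
        return math.exp(-0.5 * ((r - mean) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

    def gaussian_cdf(x, mean, sd):
        return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

    def map_decision(r):
        """LRT form of the MAP rule: announce 'H1' iff Lambda(r) >= eta = p0/p1."""
        lam = gaussian_pdf(r, m1, sigma) / gaussian_pdf(r, m0, sigma)
        return 1 if lam >= p0 / p1 else 0

    for r in (0.5, 1.2, 1.5, 2.5):
        print(f"r = {r:3.1f} -> announce H{map_decision(r)}")

    # For equal-variance Gaussians, Lambda(r) is monotone in r, so the LRT reduces
    # to comparing r against a single threshold gamma:
    gamma = (m0 + m1) / 2.0 + sigma**2 * math.log(p0 / p1) / (m1 - m0)

    P_FA = 1.0 - gaussian_cdf(gamma, m0, sigma)  # P(announce H1 | H0 true): false alarm
    P_M = gaussian_cdf(gamma, m1, sigma)         # P(announce H0 | H1 true): miss
    P_error = p1 * P_M + p0 * P_FA               # overall probability of error

    print(f"gamma = {gamma:.3f}, P_FA = {P_FA:.4f}, P_M = {P_M:.4f}, P(error) = {P_error:.4f}")

Because the two conditional densities in this sketch have equal variance, the likelihood ratio is monotone in r and the MAP/LRT rule collapses to a single threshold on r; with unequal variances the decision region need not be a half-line.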
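
The terminology list above can likewise be tied to the entries of a 2×2 confusion matrix. The counts below are hypothetical and serve only to show how each quantity is computed.

    # Hypothetical confusion-matrix counts for a binary detector (illustration only):
    TP, FN = 80, 20     # cases with H1 true: detections vs. misses
    FP, TN = 30, 870    # cases with H0 true: false alarms vs. correct rejections

    total = TP + FN + FP + TN
    prevalence = (TP + FN) / total     # empirical counterpart of p1

    sensitivity = TP / (TP + FN)       # probability of detection, true positive rate, recall
    specificity = TN / (TN + FP)       # true negative rate
    false_alarm_rate = FP / (FP + TN)  # false positive rate = 1 - specificity
    miss_rate = FN / (TP + FN)         # false negative rate = 1 - sensitivity

    ppv = TP / (TP + FP)               # positive predictive value, precision
    npv = TN / (TN + FN)               # negative predictive value

    print(f"prevalence = {prevalence:.3f}")
    print(f"sensitivity = {sensitivity:.3f}, specificity = {specificity:.3f}")
    print(f"false alarm rate = {false_alarm_rate:.3f}, miss rate = {miss_rate:.3f}")
    print(f"PPV = {ppv:.3f}, NPV = {npv:.3f}")

Note that PPV and NPV depend on the prevalence, while sensitivity and specificity (equivalently the conditional miss and false-alarm probabilities) do not, since the latter condition on the true hypothesis.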