Chapter 4: Bayes Classifier

Chapter 4: Bayes Classifier presents the naïve Bayes probabilistic model, constructing a classifier from the probability model, an application of the naïve Bayes classifier, and Bayesian networks.

Assoc. Prof. Dr. Duong Tuan Anh, Faculty of Computer Science and Engineering, HCMC Univ. of Technology, 3/2015.

Outline
Introduction
The naïve Bayes probabilistic model
Constructing a classifier from the probability model
An application of the naïve Bayes classifier
Bayesian network

1. Introduction to Naïve Bayes Classifier

A naïve Bayes classifier is a simple probabilistic classifier based on applying Bayes' theorem, where every feature is assumed to be class-conditionally independent. Naïve Bayes classifiers assume that the effect of a feature value on a given class is independent of the values of the other features. This assumption is called class-conditional independence. It is made to simplify the computation, and in this sense the classifier is considered "naïve". The assumption is fairly strong and sometimes not applicable, but studies comparing classification algorithms have found the naïve Bayes classifier to be comparable in performance with decision trees and neural network classifiers. Naïve Bayes classifiers have also exhibited high accuracy and speed when applied to large databases.

2. The Naïve Bayes Probabilistic Model

The probabilistic model for a classifier is a conditional model P(C | F1, ..., Fn), where n is the number of features, over a class variable C with a small number of classes, conditioned on the feature variables F1 through Fn. Using Bayes' theorem, we can write

P(C | F1, ..., Fn) = P(F1, ..., Fn | C) P(C) / P(F1, ..., Fn).

The denominator does not depend on C, and the values of the features Fi are given, so the denominator is effectively constant. The numerator is equivalent to the joint probability model P(C, F1, ..., Fn), which can be rewritten as follows, using repeated applications of the definition of conditional probability:

P(C, F1, ..., Fn) = P(C) P(F1, ..., Fn | C)
= P(C) P(F1 | C) P(F2, ..., Fn | C, F1)
= P(C) P(F1 | C) P(F2 | C, F1) P(F3, ..., Fn | C, F1, F2)
= P(C) P(F1 | C) P(F2 | C, F1) P(F3 | C, F1, F2) P(F4, ..., Fn | C, F1, F2, F3)

and so forth. Using the conditional independence assumption, each factor P(Fi | C, F1, ..., Fi-1) reduces to P(Fi | C), so the joint model simplifies to

P(C, F1, ..., Fn) = P(C) P(F1 | C) P(F2 | C) ... P(Fn | C).
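To make the construction concrete, below is a minimal sketch (not from the slides) of a categorical naïve Bayes classifier in Python. It estimates P(C) and P(Fi | C) from frequency counts and classifies by choosing the class that maximises P(C) ∏ P(Fi | C); the NaiveBayes class, the toy weather data, and the add-one smoothing are illustrative assumptions, not part of the original chapter.

```python
from collections import Counter, defaultdict


class NaiveBayes:
    """Categorical naive Bayes built from the factorisation
    P(C, F1, ..., Fn) = P(C) * prod_i P(Fi | C)."""

    def fit(self, X, y):
        # Estimate the prior P(C) and the per-feature likelihoods P(Fi | C)
        # from simple frequency counts over the training data.
        self.classes = sorted(set(y))
        n = len(y)
        self.prior = {c: y.count(c) / n for c in self.classes}
        self.cond = defaultdict(lambda: defaultdict(Counter))  # cond[i][c][value] = count
        self.values = defaultdict(set)                         # distinct values of feature i
        for features, c in zip(X, y):
            for i, v in enumerate(features):
                self.cond[i][c][v] += 1
                self.values[i].add(v)
        return self

    def scores(self, features):
        # P(C) * prod_i P(Fi = fi | C); the denominator P(F1, ..., Fn) is the
        # same for every class, so it is dropped when comparing classes.
        result = {}
        for c in self.classes:
            score = self.prior[c]
            for i, v in enumerate(features):
                counts = self.cond[i][c]
                total = sum(counts.values())
                # Add-one (Laplace) smoothing so unseen values do not zero the product.
                score *= (counts[v] + 1) / (total + len(self.values[i]))
            result[c] = score
        return result

    def predict(self, features):
        scores = self.scores(features)
        return max(scores, key=scores.get)


if __name__ == "__main__":
    # Hypothetical toy data: (Outlook, Wind) -> play tennis?
    X = [("sunny", "weak"), ("sunny", "strong"), ("rain", "weak"), ("rain", "strong")]
    y = ["yes", "no", "yes", "no"]
    clf = NaiveBayes().fit(X, y)
    print(clf.predict(("sunny", "weak")))  # prints "yes"
```

Because the constant denominator is ignored, the scores are unnormalised posterior numerators; only their relative order matters for classification, which is exactly why the model above never needs to compute P(F1, ..., Fn).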