Naive Bayes

Time:10-02

Bayes' theorem originates from the definition of conditional probability. Conditional probability is the probability that event A occurs given that event B has already occurred, written P(A|B), and its mathematical definition is:

P(A|B) = P(A ∩ B) / P(B)

In plain language: "the probability that A occurs given that B has occurred" equals "the probability that A and B occur together" divided by "the probability that B occurs".
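
Since P(A ∩ B) can also be written as P(B|A) · P(A), substituting this into the definition above gives Bayes' theorem, which is the form used for classification below:

P(A|B) = P(B|A) · P(A) / P(B)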
"Accordingly?

The formula of the vernacular interpreted as: "when the probability of B occur, without A" equal to "the probability of A and B at the same time" divided by the incidence of the "B",


Next we apply Bayes' theorem to a supervised classification setting. Let x = (x1, x2, ..., xd) be a d-dimensional feature vector, i.e. a set of d features describing an object, and let y denote the object's class. The goal of classification is to find the mapping from x to y; in other words, given an object with features x, we want the conditional probability P(y|x) that the object belongs to class y.
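
Applying Bayes' theorem from the previous section, this conditional probability can be rewritten in terms of quantities that are usually easier to estimate from data, namely the class prior P(y) and the class-conditional probability P(x|y):

P(y|x) = P(x|y) · P(y) / P(x)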

Concretely, suppose there are K possible classes, i.e. y can take the values c1, c2, ..., cK. Then for every possible value ck (k = 1, 2, ..., K) we compute the conditional probability P(y = ck | x) from the given features x, and finally, out of all K candidates, we pick the class ck with the largest probability as the most likely classification.
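
As a rough illustration of this decision rule (the sketch is not from the original post), the following Python snippet picks the most probable class. It assumes we already have a class prior P(y = ck) for every class and a function returning the likelihood P(x | y = ck); the names classify, priors, likelihood and the toy spam/ham data are all hypothetical.

# Minimal sketch of the Bayes decision rule described above.
# Assumptions (not from the original post): we already have class priors
# P(y = c_k) and a likelihood function P(x | y = c_k) for each class.

def classify(x, classes, priors, likelihood):
    """Return the class c_k that maximizes P(y = c_k | x).

    Since P(x) is the same for every class, it is enough to compare
    the unnormalized scores P(x | y = c_k) * P(y = c_k).
    """
    best_class, best_score = None, float("-inf")
    for c in classes:
        score = likelihood(x, c) * priors[c]
        if score > best_score:
            best_class, best_score = c, score
    return best_class


if __name__ == "__main__":
    # Toy example with two classes and a single binary feature.
    classes = ["spam", "ham"]
    priors = {"spam": 0.4, "ham": 0.6}

    # Hypothetical likelihood table P(x | y) for the single feature x in {0, 1}.
    table = {
        ("spam", 1): 0.8, ("spam", 0): 0.2,
        ("ham", 1): 0.1, ("ham", 0): 0.9,
    }

    def likelihood(x, c):
        return table[(c, x)]

    print(classify(1, classes, priors, likelihood))  # prints "spam" (0.32 > 0.06)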

CodePudding user response:

Please credit my blog when reposting this. Thank you.