Naive Bayes vs Logistic Regression

I recently came across Tom Mitchell's comparison of Naive Bayes and Logistic Regression. Under certain assumptions, Naive Bayes can be interpreted in terms of Logistic Regression, and as the number of training examples approaches infinity, under those same assumptions, the NB and LR classifiers become identical (a small empirical sketch of this claim appears at the end of this section). In this post I'll explain the bias of Naive Bayes, including why we use Naive Bayes instead of unbiased learning of Bayes classifiers, and the relationship between Naive Bayes and Logistic Regression.

1. Why Naive Bayes?

Consider the unbiased learning of Bayes classifiers. Given a supervised learning problem, we want to estimate a function f that maps input X to output Y, f : X -> Y, or in other words, P(Y | X), where X is a vector of n attributes and Y can take on k classes. Applying Bayes' rule, we have the following:

$$P(Y = y_k \mid X) = \frac{P(X \mid Y = y_k)\,P(Y = y_k)}{\sum_j P(X \mid Y = y_j)\,P(Y = y_j)}$$

Since X is a vector of n attributes, let's assume each attribute takes on either 0 or 1. To represe
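To make the asymptotic claim above concrete, here is a minimal empirical sketch (not from Mitchell's notes, just an illustration): it generates synthetic data from a model that satisfies the Naive Bayes conditional-independence assumption by construction, then measures how often scikit-learn's BernoulliNB and LogisticRegression agree on a held-out set as the training size grows. The feature probabilities P1/P0 and the choice of scikit-learn are my own illustrative assumptions.

```python
# Minimal sketch: agreement between Naive Bayes and Logistic Regression
# as training size grows, on data generated under the NB assumption.
# Assumptions: scikit-learn is available; P1/P0 are arbitrary illustrative values.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Class-conditional probabilities P(X_i = 1 | Y): features are conditionally
# independent given Y, so the Naive Bayes assumption holds by construction.
P1 = np.array([0.8, 0.3, 0.6, 0.2])  # P(X_i = 1 | Y = 1)
P0 = np.array([0.4, 0.7, 0.1, 0.5])  # P(X_i = 1 | Y = 0)

def sample(n):
    """Draw n examples with boolean attributes and a binary label."""
    y = rng.integers(0, 2, size=n)
    probs = np.where(y[:, None] == 1, P1, P0)
    X = (rng.random((n, len(P1))) < probs).astype(int)
    return X, y

X_test, y_test = sample(5000)
for n in (100, 1_000, 10_000, 100_000):
    X_tr, y_tr = sample(n)
    nb = BernoulliNB().fit(X_tr, y_tr)
    lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    agreement = np.mean(nb.predict(X_test) == lr.predict(X_test))
    print(f"n = {n:>6}: NB and LR agree on {agreement:.3f} of test points")
```

With data that truly follows the Naive Bayes model, the two classifiers' decisions should agree on an increasing fraction of test points as n grows; this is only an empirical illustration of the asymptotic equivalence, not a proof.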