Learn Naive Bayes Algorithm For Machine Learning [With Examples]

Updated on 03 July, 2023

7.57K+ views
15 min read

Introduction  

In mathematics and programming, the simplest solutions are often the most powerful ones. The Naïve Bayes Algorithm is a classic example of this. Even with the rapid advancement of Machine Learning, Naïve Bayes still stands as one of the most widely used and efficient algorithms. It finds applications in a variety of problems, including classification tasks and Natural Language Processing (NLP).

The Bayes Theorem serves as the fundamental concept behind the Naïve Bayes Algorithm. In this article, we shall go through the basics of the Bayes Theorem and the Naïve Bayes Algorithm, along with its implementation in Python on a real example problem. We shall also look at the different types of Naïve Bayes classifiers, along with their benefits and applications.

Basics of Probability 

Before we venture into the Bayes Theorem and the Naïve Bayes Algorithm, let us brush up on the fundamentals of Probability. 


By definition, given an event A, the probability of that event occurring is denoted by P(A). Two events A and B are termed independent if the occurrence of event A does not alter the probability of occurrence of event B, and vice versa. On the other hand, if the occurrence of one changes the probability of the other, they are termed dependent events.

Let us now introduce the term Conditional Probability. The Conditional Probability of two events A and B, written P(A | B), is defined as the probability of event A occurring given that event B has already occurred. Depending on whether the two events are dependent or independent, the Conditional Probability is calculated in one of two ways.

  • For two dependent events A and B, the conditional probability is given by P(A | B) = P(A and B) / P(B)
  • For two independent events A and B, the conditional probability reduces to P(A | B) = P(A). (A short Python sketch after this list illustrates these quantities with simple counts.)
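
As a quick illustration (not part of the original article), here is a minimal Python sketch that estimates these quantities from raw event counts; all the counts are made up purely for demonstration.

# Estimating P(A), P(B) and P(A | B) from hypothetical event counts.
total = 100          # total number of trials (assumed)
count_a = 40         # trials in which event A occurred
count_b = 50         # trials in which event B occurred
count_a_and_b = 25   # trials in which both A and B occurred

p_a = count_a / total
p_b = count_b / total
p_a_and_b = count_a_and_b / total

# Dependent case: P(A | B) = P(A and B) / P(B)
p_a_given_b = p_a_and_b / p_b
print(f"P(A | B) = {p_a_given_b:.2f}")  # 0.50, which differs from P(A) = 0.40,
                                        # so A and B are not independent in this toy data.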

Knowing the math behind Probability and Conditional Probability, let us now move on to the Bayes Theorem.

Bayes Theorem 

In statistics and probability theory, the Bayes' Theorem, also known as Bayes' rule, is used to determine the conditional probability of events. In other words, the Bayes' Theorem describes the probability of an event based on prior knowledge of the conditions that might be relevant to that event.

To understand it in simpler terms, suppose we need to know the probability that the price of a house is very high. If we know about other parameters, such as the presence of schools, medical shops and hospitals nearby, then we can make a more accurate assessment. This is exactly what the Bayes Theorem does. It is stated as:

P(A | B) = [P(B | A) * P(A)] / P(B)

Such that,

  • P(A | B) – the conditional probability of event A occurring given that event B has occurred, also known as the Posterior Probability.
  • P(B | A) – the conditional probability of event B occurring given that event A has occurred, also known as the Likelihood Probability.
  • P(A) – the probability of event A occurring, also known as the Prior Probability.
  • P(B) – the probability of event B occurring, also known as the Marginal Probability. (A small numeric sketch after these definitions applies the formula to the house-price example.)
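
To make the formula concrete, here is a minimal Python sketch (not part of the original article) that plugs assumed numbers for the house-price example into Bayes' rule; every probability below is a made-up value used purely for illustration.

# Hypothetical numbers for the house-price example (assumed, for illustration only).
p_high_price = 0.3           # P(A): prior probability that a house is very highly priced
p_school_given_high = 0.8    # P(B | A): probability a school is nearby, given a very high price
p_school = 0.4               # P(B): overall probability that a school is nearby

# Bayes' rule: P(A | B) = P(B | A) * P(A) / P(B)
p_high_given_school = p_school_given_high * p_high_price / p_school
print(f"P(high price | school nearby) = {p_high_given_school:.2f}")  # prints 0.60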

Suppose we have a simple Machine Learning problem with 'n' independent variables, and the dependent variable (the output) is a Boolean value (True or False). Suppose also that the independent attributes are categorical; let us consider 2 categories for this example. To apply the Bayes Theorem directly, we would need to estimate the Likelihood Probability, P(B | A), for every possible combination of the attribute values.

Working this out, we find that we need to estimate 2*(2^n - 1) parameters in order to learn this Machine Learning model. For example, if we have 30 Boolean independent attributes, the total number of parameters to be calculated is over 2 billion, which is an extremely high computational cost (see the quick check below).
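
A one-line check of that count (a quick sketch, not part of the original article):

# Number of parameters needed for n = 30 Boolean attributes: 2 * (2^n - 1)
n = 30
print(2 * (2 ** n - 1))  # 2147483646, i.e. a little over 2 billion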

This difficulty in building a Machine Learning model with the Bayes Theorem led to the birth and development of the Naïve Bayes Algorithm.

Naïve Bayes Algorithm  

In order to be practical, the above-mentioned complexity of the Bayes Theorem needs to be reduced. This is exactly what the Naïve Bayes Algorithm achieves by making a few assumptions. The assumptions are that each feature makes an independent and an equal contribution to the outcome. 

The Naïve Bayes Algorithm is a supervised learning algorithm based on the Bayes Theorem and is primarily used for solving classification problems. It is one of the simplest and most effective classifiers for building Machine Learning models that make quick predictions. Mathematically, it is a probabilistic classifier, as it makes predictions using the probabilities of the events.

Example Problem 

In order to understand the logic behind the assumptions, let us go through a simple dataset to get a better intuition.

Colour   Type    Origin     Theft?
Black    Sedan   Imported   Yes
Black    SUV     Imported   No
Black    Sedan   Domestic   Yes
Black    Sedan   Imported   No
Brown    SUV     Domestic   Yes
Brown    SUV     Domestic   No
Brown    Sedan   Imported   No
Brown    SUV     Imported   Yes
Brown    Sedan   Domestic   No

From the above-given dataset, we can derive the concepts of the two assumptions that we defined for the Naïve Bayes Algorithm above.

  • The first assumption is that all the features are independent of each other. Here, each attribute is treated as independent; for instance, the colour "Brown" is assumed to be independent of the Type and Origin of the car.
  • Next, each feature is given equal importance. Knowing only the Type and Origin of the car, for example, is not sufficient to predict the output. Hence, none of the variables is treated as irrelevant, and they all make an equal contribution to the outcome.

To sum it up, A and B are conditionally independent given C if and only if, given the knowledge that C occurs, knowledge of whether A occurs provides no information on the likelihood of B occurring, and knowledge of whether B occurs provides no information on the likelihood of A occurring. These assumptions are what make the Bayes algorithm "naive". Hence the name, Naïve Bayes Algorithm.

Hence, for the above-given problem, the Bayes Theorem can be rewritten as:

P(y | X) = [P(X | y) * P(y)] / P(X)

Such that,

  • X = (x1, x2, x3, …, xn) is the independent feature vector, representing features such as the Colour, Type and Origin of the car.
  • y is the output variable, which has only two outcomes, Yes or No.

Hence, by substituting the above values and applying the independence assumption, we obtain the Naïve Bayes formula as:

P(y | x1, x2, …, xn) = [P(x1 | y) * P(x2 | y) * … * P(xn | y) * P(y)] / [P(x1) * P(x2) * … * P(xn)]

In order to calculate the posterior probability P(y | X), we first create a Frequency Table for each attribute against the output. We then convert the frequency tables into Likelihood Tables, after which we use the Naïve Bayes equation to calculate the posterior probability for each class. The class with the highest posterior probability is chosen as the outcome of the prediction. Below are the Frequency and Likelihood Tables for all three predictors, derived from the dataset above (4 of the 9 cars are stolen, 5 are not).

Frequency and Likelihood Tables of Colour

Colour   Yes   No     P(Colour | Yes)   P(Colour | No)
Black    2     2      2/4               2/5
Brown    2     3      2/4               3/5

Frequency and Likelihood Tables of Type

Type     Yes   No     P(Type | Yes)     P(Type | No)
Sedan    2     3      2/4               3/5
SUV      2     2      2/4               2/5

Frequency and Likelihood Tables of Origin

Origin     Yes   No   P(Origin | Yes)   P(Origin | No)
Imported   2     3    2/4               3/5
Domestic   2     2    2/4               2/5

Consider the case where we need to calculate the posterior probabilities for the below-given conditions – 

Colour   Type   Origin
Brown    SUV    Imported

Thus, from the above-given formula, we can calculate the posterior probabilities (dropping the common denominator P(X), which does not affect the comparison) as shown below:

P(Yes | X) ∝ P(Brown | Yes) * P(SUV | Yes) * P(Imported | Yes) * P(Yes)

    = 2/4 * 2/4 * 2/4 * 4/9

    ≈ 0.056

P(No | X) ∝ P(Brown | No) * P(SUV | No) * P(Imported | No) * P(No)

    = 3/5 * 2/5 * 3/5 * 5/9

    = 0.08

From the above-calculated values, since the posterior probability for No is greater than that for Yes (0.08 > 0.056), a Brown-coloured car of SUV type and Imported origin is classified as "No". Hence, the car is predicted not to be stolen (the short sketch below verifies these numbers in code).
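
To double-check the hand calculation, here is a small Python sketch (not part of the original article) that rebuilds the frequency counts from the same nine-row dataset with pandas and compares the two unnormalised posteriors.

import pandas as pd

# The nine-row car-theft dataset from the worked example above.
data = pd.DataFrame({
    "Colour": ["Black", "Black", "Black", "Black", "Brown", "Brown", "Brown", "Brown", "Brown"],
    "Type":   ["Sedan", "SUV", "Sedan", "Sedan", "SUV", "SUV", "Sedan", "SUV", "Sedan"],
    "Origin": ["Imported", "Imported", "Domestic", "Imported", "Domestic",
               "Domestic", "Imported", "Imported", "Domestic"],
    "Theft":  ["Yes", "No", "Yes", "No", "Yes", "No", "No", "Yes", "No"],
})

def unnormalised_posterior(query, label):
    """Prior P(label) times the product of per-feature likelihoods P(value | label)."""
    subset = data[data["Theft"] == label]
    score = len(subset) / len(data)                 # prior
    for feature, value in query.items():
        score *= (subset[feature] == value).mean()  # likelihood
    return score

query = {"Colour": "Brown", "Type": "SUV", "Origin": "Imported"}
for label in ["Yes", "No"]:
    print(label, round(unnormalised_posterior(query, label), 3))  # Yes 0.056, No 0.08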

Implementation in Python 

Now that we have understood the math behind the Naïve Bayes algorithm and also visualized it with an example, let us go through its Machine Learning code in Python language.


Problem Analysis  

In order to implement the Naïve Bayes Classification program in Python, we will be using the famous Iris Flower Dataset. The Iris flower data set, or Fisher's Iris data set, is a multivariate data set introduced by the British statistician, eugenicist, and biologist Ronald Fisher in 1936. This is a small and basic dataset consisting of numeric data about 3 classes of flowers belonging to the Iris species, which are – 

  • Iris Setosa
  • Iris Versicolour
  • Iris Virginica

There are 50 samples of each of the three species, amounting to a total dataset of 150 rows. The 4 attributes, or independent variables, used in this dataset are – 

  • sepal length in cm
  • sepal width in cm
  • petal length in cm
  • petal width in cm

The dependent variable is the "species" of the flower, which is identified by the above-given four attributes.

Step 1 – Importing the Libraries

As always, the primary step in building any Machine Learning model is to import the relevant libraries. For this, we shall load the NumPy, Matplotlib and Pandas libraries for pre-processing the data.

import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

Step 2 – Loading the Dataset

The Iris flower dataset to be used for training the Naïve Bayes Classifier shall be loaded into a Pandas DataFrame. The 4 independent variables shall be assigned to the variable X and the final output species variable is assigned to y.

dataset = pd.read_csv('https://raw.githubusercontent.com/mk-gurucharan/Classification/master/IrisDataset.csv')

# The first four columns are the independent variables (features),
# and the 'species' column is the dependent variable (target).
X = dataset.iloc[:, :4].values
y = dataset['species'].values

dataset.head(5)>>
sepal_length  sepal_width  petal_length  petal_width   species
5.1           3.5          1.4           0.2           setosa
4.9           3.0          1.4           0.2           setosa
4.7           3.2          1.3           0.2           setosa
4.6           3.1          1.5           0.2           setosa
5.0           3.6          1.4           0.2           setosa

Step 3 – Splitting the dataset into the Training set and Test set

After loading the dataset and the variables, the next step is to prepare the variables that will undergo the training process. In this step, we have to split the X and y variables into training and test datasets. For this, we shall randomly assign 80% of the data to the training set, which will be used for training, and keep the remaining 20% as the test set, on which the trained Naïve Bayes Classifier shall be evaluated for accuracy.

from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.2)

Step 4 – Feature Scaling

Though this step is not strictly necessary for such a small dataset (and Gaussian Naïve Bayes is largely insensitive to feature scale), I am adding it so you can reuse it on larger datasets. Here, the StandardScaler standardises the training and test features to zero mean and unit variance. If you specifically need values in the 0-to-1 range instead, MinMaxScaler can be used, as sketched after the code below.

from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
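
As an aside (not part of the original walkthrough), this is how the same step would look with MinMaxScaler if scaling to the 0-to-1 range is what you actually want:

from sklearn.preprocessing import MinMaxScaler

# Scales each feature to the [0, 1] range; the scaler is fitted on the training set only.
mm = MinMaxScaler()
X_train = mm.fit_transform(X_train)
X_test = mm.transform(X_test)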

Step 5 – Training the Naive Bayes Classification model on the Training Set

It is in this step that we import the Naïve Bayes class from the sklearn library. For this model, we use the Gaussian variant; there are several other variants, such as Bernoulli, Categorical and Multinomial. The classifier is then fitted on X_train and y_train for training.

from sklearn.naive_bayes import GaussianNB
classifier = GaussianNB()
classifier.fit(X_train, y_train)

Step 6 – Predicting the Test Set Results

We predict the species class for the test set using the trained model and compare the predictions with the real values of the species class.

y_pred = classifier.predict(X_test)

# Put the real and predicted labels side by side for comparison
df = pd.DataFrame({'Real Values': y_test, 'Predicted Values': y_pred})
df>>
Real Values   Predicted Values
setosa        setosa
setosa        setosa
virginica     virginica
versicolor    versicolor
setosa        setosa
setosa        setosa
…  …   …  …  …
virginica     versicolor
virginica     virginica
setosa        setosa
setosa        setosa
versicolor    versicolor
versicolor    versicolor

In the above comparison, we see that there is one incorrect prediction, where the model predicted versicolor instead of virginica.

Step 7 – Confusion Matrix and Accuracy

As we are dealing with Classification, the best way to evaluate our classifier model is to print the Confusion Matrix along with its accuracy on the test set.

from sklearn.metrics import confusion_matrix
from sklearn.metrics import accuracy_score

cm = confusion_matrix(y_test, y_pred)

print("Accuracy : ", accuracy_score(y_test, y_pred))

cm>>

Accuracy :  0.9666666666666667

array([[14,  0,  0],
      [ 0,  7,  0],
      [ 0,  1,  8]])

Types of Naive Bayes Classifiers

The different types of Naive Bayes classifiers in machine learning are as follows:

  • Multinomial Naive Bayes Classifier

In this event model, the feature vectors represent the frequencies with which certain events (for example, words) have been generated by a multinomial distribution. It is particularly useful for document classification. 

  • Gaussian Naive Bayes Classifier

The values associated with each feature are assumed to follow a Gaussian (normal) distribution. When plotted, this gives a bell-shaped curve that is symmetric about the mean of the feature values. 

  • Bernoulli Naive Bayes Classifier

Features in a multivariate Bernoulli event model are independent binary variables or booleans describing inputs. Similar to the multinomial model, it is popular for performing document classification tasks. During these tasks, binary term occurrence features are leveraged instead of term frequencies. 

  • Optimal Naive Bayes Classifier

The optimal Naive Bayes classifier chooses the class with the greatest posterior probability of occurring. Even though the name suggests it is optimal, it has to go through multiple possibilities, so it is comparatively slow and time-consuming. (The first three variants are all available in scikit-learn; a short sketch follows below.)
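
As a quick illustration (not from the original article), here is a minimal sketch showing how the Gaussian, Multinomial and Bernoulli variants are instantiated in scikit-learn; any of them can be dropped into the fit/predict workflow shown earlier, provided the features match that variant's assumptions.

from sklearn.naive_bayes import GaussianNB, MultinomialNB, BernoulliNB

# GaussianNB    - continuous features, assumed normally distributed within each class
# MultinomialNB - count/frequency features, e.g. word counts in documents
# BernoulliNB   - binary (present/absent) features
models = {
    "gaussian": GaussianNB(),
    "multinomial": MultinomialNB(),
    "bernoulli": BernoulliNB(),
}

# Every variant exposes the same interface, for example:
# models["gaussian"].fit(X_train, y_train)
# models["gaussian"].predict(X_test)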

How to Construct a Naive Bayes Classifier

To build a Naive Bayes text classifier, you will have to combine different preprocessing techniques and build a dictionary of words, along with each word's count in the training data. 

  1. Calculate the probability of each word in the text and filter out the words whose probability falls below a threshold; such words are treated as irrelevant.
  2. Next, compute a probability for every word in the dictionary, based on how often the word appears in each class (for example, in sincere versus insincere questions in a question-classification task). These per-class conditional probabilities are what the Naive Bayes algorithm uses.
  3. The last step is to make predictions with the help of these conditional probabilities (see the short sketch after this list).
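
In practice, scikit-learn's CountVectorizer and MultinomialNB already implement this word-count recipe; the sketch below (with made-up example sentences, not from the original article) shows the end-to-end flow.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny made-up corpus: label 1 = spam-like text, label 0 = normal text.
texts = ["win a free prize now", "meeting agenda for tomorrow",
         "claim your free prize today", "please review the project report"]
labels = [1, 0, 1, 0]

# CountVectorizer builds the word-count dictionary; MultinomialNB learns the
# per-class word probabilities and the class priors from those counts.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["free prize now"]))  # expected to lean towards class 1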

Benefits of the Naive Bayes Classifier

The different benefits of the Naive Bayes algorithm are as follows:

  • Easy and simple to implement
  • Can handle discrete as well as continuous data
  • Does not need a large amount of training data
  • Extremely scalable with multiple data points and predictors
  • Fast and useful for real-time predictions
  • Relatively insensitive to irrelevant features 

Applications of the Naive Bayes Algorithm

A few applications of the Naive Bayes algorithm in machine learning are as follows:

  • Spam Filtering: It can help determine whether the emails and messages you are receiving are spam.
  • Text Classification: It is a popular probabilistic learning technique beneficial for classifying text. The Naive Bayes algorithm is often used for classifying documents of one or more classes. 
  • Weather Prediction: The algorithm is useful for predicting whether weather conditions will be favourable, given the observed attributes.
  • Recommendation System: The Naive Bayes algorithm is used, together with collaborative filtering, to build hybrid recommendation systems that predict whether a particular user would be interested in a given resource. 
  • Medical Diagnosis: The Naive Bayes algorithm is extremely beneficial for medical diagnosis. It helps predict the risk level of patients for particular diseases. 
  • Sentiment Analysis: The Naive Bayes algorithm can help analyze whether a particular feeling or sentiment is positive, negative, or neutral. 
  • Face Recognition: The algorithm is also quite popular for identifying faces. 

Conclusion  

Thus, in this article, we have gone through the basics of the Naïve Bayes Algorithm, understood the math behind the Classification along with a hand-solved example. Finally, we implemented a Machine Learning code to solve a popular dataset using the Naïve Bayes Classification algorithm.

If you're interested in learning more about AI and machine learning, check out IIIT-B & upGrad's PG Diploma in Machine Learning & AI, which is designed for working professionals and offers 450+ hours of rigorous training, 30+ case studies & assignments, IIIT-B alumni status, 5+ practical hands-on capstone projects, and job assistance with top firms.

Frequently Asked Questions (FAQs)

1. How is probability helpful in Machine Learning?

In real-world scenarios, we may have to make decisions based on partial or incomplete information. Probability helps us quantify the uncertainty in such systems and manage the risk involved in the task. Traditional, deterministic methods work only when a specific action has a fixed outcome, but there is always some uncertainty in any prediction model. This uncertainty can come from many sources in the input data, such as noise. Bayesian reasoning also helps with pattern recognition from the input data, and concepts such as maximum likelihood estimation are used to produce reliable results.

2. What is the use of the Confusion Matrix?

For a binary classification problem, the confusion matrix is a 2x2 matrix used to interpret the performance of the classification model; for a problem with n classes it is an n x n matrix, as in the 3x3 matrix printed above. The true values for the input data must be known for this to work, so it cannot be computed for unlabelled data. It consists of the number of true positives (TP), false positives (FP), false negatives (FN), and true negatives (TN), counted from the predictions on the test set. It helps us compute useful metrics such as accuracy, precision, recall, and specificity. It is relatively easy to understand and gives you a clear idea of how the algorithm is performing.

3. What are the different types of Naive Bayes model?

All types are primarily based on the Bayes Theorem. The Naive Bayes model generally has three types: Gaussian, Bernoulli, and Multinomial. Gaussian Naive Bayes handles continuous input values and assumes that, within each class, the feature values follow a normal (Gaussian) distribution. Bernoulli Naive Bayes is an event-based model where the data features are independent boolean values. Multinomial Naive Bayes is also an event-based model; its features are vectors of frequencies that record how often the relevant events occur.