5) Create a Naive Bayesian Network with a Laplace Estimator.

This Blog entry is from the Naive Bayesian section in Learn R.

To create a Bayesian model with a nominal Laplace estimator of 1, which means that a feature level never observed alongside a class is treated as having occurred at least once, simply change the laplace parameter value in the training:

SafeBayesianModel <- naiveBayes(CreditRisk,CreditRisk$Dependent,laplace=1)
1.png

Run the line of script to console:

2.png
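The effect of the Laplace estimator can be sketched in base R. In this hypothetical example, a feature level never occurs alongside one class, so its raw conditional probability is zero; adding the estimator of 1 to every count lifts it to a small non-zero value:

```r
# Hypothetical counts: how often a feature level co-occurs with a class
count <- 0        # level never seen with this class in training
n <- 20           # observations of the class
k <- 3            # number of levels the feature can take

# Without smoothing the probability collapses to zero, which would
# zero out the whole product of likelihoods for that class
p_raw <- count / n

# Laplace estimator of 1: add 1 to every count (and k to the total)
laplace <- 1
p_smooth <- (count + laplace) / (n + laplace * k)

p_raw     # 0
p_smooth  # 1/23, small but non-zero
```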

A Bayesian model has been created as SafeBayesianModel.  Recall the model:

ClassPredictions <- predict(SafeBayesianModel,CreditRisk,type = "class")
3.png

Run the line of script to console:

4.png

The de facto method of appraising the performance of the model is to create a confusion matrix:

library(gmodels)
CrossTable(CreditRisk$Dependent, ClassPredictions)
5.png

Run the block of script to console:

6.png

It can be seen that this naive Bayesian model appears to be startlingly accurate, which stands to reason, as the same data is being used for testing as was used for training.  It follows that this would benefit from an element of cross-validation, which was introduced in Gradient Boosting Machines.
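As a sketch of such validation, assuming a data frame of the same general shape as CreditRisk (the columns here are hypothetical), the rows can be partitioned into training and test sets before the model is trained, so that the confusion matrix is built from records the model has never seen:

```r
set.seed(123)  # reproducible split

# Hypothetical stand-in for the CreditRisk data frame
CreditRisk <- data.frame(
  Dependent = factor(sample(c("Good", "Bad"), 100, replace = TRUE)),
  Income    = factor(sample(c("Low", "High"), 100, replace = TRUE))
)

# Hold out 30% of the rows for testing
TestIndex <- sample(nrow(CreditRisk), size = 0.3 * nrow(CreditRisk))
Train <- CreditRisk[-TestIndex, ]
Test  <- CreditRisk[TestIndex, ]

nrow(Train)  # 70
nrow(Test)   # 30
```

The model would then be trained on Train only, with the confusion matrix comparing Test$Dependent against predictions made on Test.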

4) Recalling a Naive Bayesian Classifier for Classification.

This Blog entry is from the Naive Bayesian section in Learn R.

To recall the pivotal classification, rather than recalling the probability for each class and deriving the classification from the larger of the values, the type parameter can be set to "class":

ClassPredictions <- predict(BayesianModel,CreditRisk,type = "class")
1.png

Run the line of script to console:

2.png
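What type = "class" does internally can be sketched in base R: given a matrix of per-class probabilities (of the kind returned when type = "raw" is specified), the classification is simply the column with the larger value in each row. The probability matrix below is hypothetical:

```r
# Hypothetical posterior probabilities for three records
Posteriors <- matrix(c(0.9, 0.1,
                       0.2, 0.8,
                       0.6, 0.4),
                     ncol = 2, byrow = TRUE,
                     dimnames = list(NULL, c("Good", "Bad")))

# Pick the class with the highest probability in each row
Classes <- colnames(Posteriors)[max.col(Posteriors)]
Classes  # "Good" "Bad" "Good"
```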

Merge the classification predictions into the CreditRisk data frame, referencing the dplyr library first:

library(dplyr)
CreditRisk <- mutate(CreditRisk, ClassPredictions)
3.png

Run the line of script to console:

4.png
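For reference, the same merge can be performed in base R without dplyr, by assigning the predictions to a new column directly; the toy data frame here is hypothetical:

```r
# Hypothetical data frame and predictions
CreditRisk <- data.frame(Income = c("Low", "High", "Low"))
ClassPredictions <- factor(c("Bad", "Good", "Bad"))

# Base R equivalent of mutate(CreditRisk, ClassPredictions)
CreditRisk$ClassPredictions <- ClassPredictions

names(CreditRisk)  # "Income" "ClassPredictions"
```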

Viewing the CreditRisk data frame:

View(CreditRisk)
5.png

Run the line of script to console:

6.png

Scroll to the last column in the RStudio viewer to reveal the classification for each record:

7.png

2) Training a Naive Bayesian Classifier.

This Blog entry is from the Naive Bayesian section in Learn R.

As a Naive Bayesian classifier is rather simple in concept, with all independent variables treated as conditionally independent and the arcs flowing away from the dependent variable, it is to be expected that the process of training such a classifier is trivial.  To train a Bayesian model, simply pass the data frame, specify the factor that is to be treated as the dependent variable, and set the Laplace estimator (zero in this example).  The naiveBayes() function exists as part of the e1071 package; as such, begin by installing the package via RStudio:

1.png

Click install to download and install this package:

2.png

Reference the library:

library(e1071)

3.png

Run the line of script to console. To train a Naive Bayesian model:

BayesianModel <- naiveBayes(CreditRisk,CreditRisk$Dependent,laplace=0)
4.png

Run the line of script to console. The BayesianModel object now contains a model that can be used to make probability predictions as well as classifications.
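The computation the trained model performs at prediction time can be sketched by hand in base R. With hypothetical priors and conditional probabilities for a single independent variable, the posterior for each class is proportional to the prior multiplied by the per-variable likelihood, then normalised:

```r
# Hypothetical prior probabilities for two classes
prior <- c(Good = 0.7, Bad = 0.3)

# Hypothetical conditional probabilities P(Income = "Low" | class)
likelihood <- c(Good = 0.2, Bad = 0.6)

# Naive Bayes score: prior times the product of likelihoods
# (only one variable in this sketch)
score <- prior * likelihood

# Normalise the scores into posterior probabilities
posterior <- score / sum(score)

round(posterior, 4)          # Good 0.4375, Bad 0.5625
names(which.max(posterior))  # "Bad" - the classification
```

With more independent variables, the likelihood term becomes the product of one conditional probability per variable, which is exactly where a zero count (and hence the Laplace estimator) matters.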