My guiding vision for the book was simple: one does not need years of culinary schooling to prepare a great meal; all one needs is a fantastic recipe. I have tried to pack a great deal of practical usefulness into small recipes that you can execute on a Sunday afternoon. Take a look through the menu below, and choose your adventure!

**Table of Contents for Volume 2: Practice**

**Note #1:** You will notice that most of the table of contents is not yet hyperlinked. This is because I am still working on those posts! I am adding new content every week, so please check back soon.

**Note #2:** Articles that include downloadable content are marked with a small graphic indicating the file type: an Excel file, a SQL file, an R script, etc.

>> 2.1.2.2 AiXQL

>> 2.1.2.4 Installing Anaconda

>> 2.1.2.5 Installing Julia Language

>> 2.1.2.6 Installing Python

>> 2.1.2.7 TensorFlow

> 2.3.2 Discriminant Analysis

> 2.3.3 Linear Discriminants

> 2.3.4 Quadratic Discriminants

> 2.3.5 Logistic Discriminants

>> 2.4.1.2 Linear Least Median Squares Regression

>> 2.4.1.3 Robust Regression

>> 2.4.1.4 Logistic Regression

>> 2.4.1.5 Probabilistic Regression

>> 2.4.1.6 Generalized Linear Model (GLM)

>> 2.4.1.7 Generalized Additive Model (GAM)

>> 2.4.1.8 Multivariate Adaptive Regression Splines

>> 2.4.1.9 PACE Regression

>> 2.4.1.10 Isotonic Regression

>> 2.4.1.11 Projection Pursuit Regression

>> 2.4.1.12 Gaussian Process Regression

>> 2.4.2.2 Types of Neural Networks

>> 2.4.2.3 The Multi-Layer Perceptron

>>> 2.4.2.3.2 Example in MS Excel

>>> 2.4.2.3.3 AiXQL: Neural Networks

>> 2.4.2.5 Radial Basis Functions

>> 2.4.2.6 Vector Quantization

>> 2.4.3.3 Simulated Annealing

>> 2.4.3.4 Genetic Algorithms

>> 2.4.3.5 Matrix Factorization Method

>> 2.4.4.2 ARIMA

>> 2.5.1.3 Bayesian Rules Classifier

>> 2.5.1.4 Locally Weighted Learning

>> 2.5.3.2 Naive Bayes Tree

>> 2.5.3.3 CART

>> 2.5.3.4 CHAID

>> 2.5.3.5 Decision Stump

>> 2.5.3.6 Random Tree

>> 2.5.3.7 Random Forest

>> 2.5.3.8 C4.5 (J48)

>> 2.5.3.9 ID3

>> 2.5.3.10 M5P

>> 2.5.3.11 Alternating Decision Tree

>> 2.5.3.12 QUEST

>> 2.5.3.13 CRUISE

>> 2.5.3.14 GUIDE

>> 2.5.3.15 LOTUS

>> 2.5.4.2 BayesNet

>> 2.5.4.3 Complement Naive Bayes

>> 2.5.4.4 Naive Bayes

>> 2.5.4.5 Naive Bayes Multinomial

>> 2.5.4.6 Naive Bayes Multinomial Updateable

>> 2.5.4.7 Hidden Naive Bayes

>> 2.5.4.8 DBNBText

>> 2.5.4.9 AODEsr (Subsumption Resolution)

>> 2.5.4.10 WAODE

>> 2.5.5.2 OneR

>> 2.5.5.3 ZeroR

>> 2.5.5.4 Conjunctive Rule

>> 2.5.5.5 PART

>> 2.5.5.6 NNGE

>> 2.5.5.7 PRISM

>> 2.5.5.8 M5Rules

>> 2.5.5.9 RIDOR

>> 2.5.5.10 JRIP

>> 2.5.5.11 Ordinal Learning Method

>> 2.5.5.12 Fuzzy Unordered Rule Induction

>> 2.6.1.3 Bayesian Rules Classifier

>> 2.6.1.4 Locally Weighted Learning

>> 2.7.1.2 Complex Adaptive Systems

> 2.8.2 Loss Functions

> 2.8.3 Performance Metrics

>> 2.8.4.2 Cross-Validation

>> 2.8.4.3 Bootstrapping

>> 2.8.5.2 Weighted R-Square

>> 2.8.5.3 Adjusted R-Square

>> 2.8.5.4 Absolute Error

>> 2.8.5.5 Prediction Error

>> 2.8.5.6 RMSE

>> 2.8.5.7 Correlation Coefficient

>> 2.8.6.2 Sensitivity & Specificity

>> 2.8.6.3 Precision & Accuracy

>> 2.8.6.4 Entropy

>> 2.8.6.5 Kappa Statistic

>> 2.8.7.2 ROC Curves

>> 2.8.7.3 Lorenz Curves & Gini Coefficient

> 2.9.2 Crazy, Great Model!

**Recent posts under: Practice**

Since most of the table of contents is not yet hyperlinked, the most recent posts are listed below for easier access.