Linear Regression Models: Multiple and Parsimonious Linear Regression


Overview/Description
Several factors usually influence an outcome, and regression lets users account for all of them; regression models help us evaluate our hunches mathematically. This course explores machine learning techniques and the risks involved with multiple-factor linear regression. Key concepts covered here include reasons to use multiple features in a regression and how to configure, train, and evaluate a linear regression model. Next, learn to create a dataset with multiple features in a form that can be fed to a neural network for training and validation. Review Keras sequential model architecture, its training parameters, and ways to test its predictions. Learn how to use Pandas and Seaborn to view correlations and enumerate risks. Conclude by applying parsimonious regression to rebuild linear regression models.
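
As a rough illustration of the multiple-feature workflow described above, the sketch below prepares a dataset with several input features, then trains and evaluates a linear regression model. The CSV file, column names, and the use of scikit-learn are assumptions made for this example, not the course's exact materials.

    # Minimal sketch, not the course's exact code: the CSV file, column names,
    # and the choice of scikit-learn are illustrative assumptions.
    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error, r2_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("auto_mpg.csv")                    # hypothetical dataset
    X = df[["horsepower", "weight", "displacement"]]    # multiple input features
    y = df["mpg"]                                       # target to predict

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    model = LinearRegression()
    model.fit(X_train, y_train)                         # train on several features at once

    predictions = model.predict(X_test)
    print("R^2:", r2_score(y_test, predictions))        # gauge prediction quality
    print("MSE:", mean_squared_error(y_test, predictions))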

Expected Duration (hours)
1.2

Lesson Objectives

Linear Regression Models: Multiple and Parsimonious Linear Regression

  • Course Overview
  • identify the reasons to use multiple features when doing a regression and the technique involved in creating such a multiple regression model
  • prepare a dataset containing multiple features to be used for training and evaluating a linear regression model
  • configure, train, and evaluate a linear regression model that makes predictions from multiple input features
  • create a dataset with multiple features in a form that can be fed to a neural network for training and validation
  • define the architecture for a Keras sequential model and set training parameters such as the loss function and optimizer (see the sketch after this list)
  • make predictions on the test data and examine the metrics to gauge the quality of the neural network model
  • use Pandas and Seaborn to visualize correlations in a dataset and identify features that convey similar information
  • identify the risks involved with multiple regression and the need to select features carefully
  • apply the principle of parsimonious regression to rebuild the linear regression model and compare the results with the kitchen-sink approach
  • build a Keras model after selecting only the important features from a dataset
  • encode categorical integers for ML algorithms as well as use Pandas and Seaborn to view correlations, and enumerate risks
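
The Keras, correlation, and parsimonious-regression objectives above are sketched minimally below. The synthetic dataframe, feature names, layer choice, and training settings are made-up assumptions for illustration; the sketch assumes TensorFlow/Keras, Pandas, Seaborn, and Matplotlib are installed.

    # Minimal sketch of the objectives above, with made-up data and parameters.
    import numpy as np
    import pandas as pd
    import seaborn as sns
    import matplotlib.pyplot as plt
    from tensorflow import keras

    # Synthetic dataframe standing in for a real dataset with several features.
    rng = np.random.default_rng(0)
    df = pd.DataFrame(rng.normal(size=(500, 4)),
                      columns=["feat_a", "feat_b", "feat_c", "target"])

    # Pandas + Seaborn: visualize pairwise correlations to spot features
    # that convey similar information (a risk for multiple regression).
    sns.heatmap(df.corr(), annot=True, cmap="coolwarm")
    plt.show()

    # Parsimonious regression: keep only the features judged important,
    # rather than the kitchen-sink approach of using every column.
    X = df[["feat_a", "feat_b"]].values
    y = df["target"].values

    # Keras sequential model: a single dense unit reproduces linear regression.
    model = keras.Sequential([
        keras.layers.Input(shape=(X.shape[1],)),
        keras.layers.Dense(1),
    ])
    model.compile(loss="mean_squared_error", optimizer="adam")   # training parameters
    model.fit(X, y, epochs=20, validation_split=0.2, verbose=0)

    # Examine a metric to gauge the quality of the model's predictions.
    print("MSE:", model.evaluate(X, y, verbose=0))
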
Course Number
it_mllrmddj_03_enus

Expertise Level
Intermediate