Linear Models & Gradient Descent: Gradient Descent and Regularization


Overview/Description

Explore the features of simple and multiple regression, implement simple and multiple regression models, and examine gradient descent and regularization, including their different types. Key concepts covered in this 12-video course include the characteristics of the prominent types of linear regression; the essential features of simple and multiple regression and how they are used to implement linear models; and how to implement simple regression models using Python machine learning libraries. Next, observe how to implement multiple regression models in Python using Scikit-learn and StatsModels; learn the different types of gradient descent; and see how to classify the prominent gradient descent optimization algorithms from the perspective of their mathematical representation. Learn how to implement a simple representation of gradient descent in Python and how to implement linear regression with mini-batch gradient descent to compute the hypothesis and predictions. Finally, learn the benefits of regularization, the objectives of L1 and L2 regularization, and how to implement L1 and L2 regularization of linear models using Scikit-learn.
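
For orientation, the following is a minimal sketch of the kind of simple and multiple regression implementations described above, assuming Scikit-learn and StatsModels are installed; the synthetic dataset and variable names are illustrative assumptions, not material from the course itself.

    # Sketch: simple and multiple linear regression (illustrative only)
    import numpy as np
    from sklearn.linear_model import LinearRegression
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))                                   # two predictors (assumed synthetic data)
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 1.0 + rng.normal(scale=0.1, size=100)

    # Simple regression: a single predictor, fit with Scikit-learn
    simple = LinearRegression().fit(X[:, [0]], y)
    print("simple regression:", simple.intercept_, simple.coef_)

    # Multiple regression: both predictors, fit with StatsModels for a detailed summary
    multiple = sm.OLS(y, sm.add_constant(X)).fit()
    print(multiple.summary())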



Expected Duration (hours)
0.9

Lesson Objectives

Linear Models & Gradient Descent: Gradient Descent and Regularization

  • discover the key concepts covered in this course
  • list and describe the characteristics of the prominent types of linear regression
  • describe the essential features of simple and multiple regressions and how they're used to implement linear models
  • demonstrate how to implement simple regression models using Python libraries
  • implement multiple regression models in Python using Scikit-learn and StatsModels
  • define gradient descent and the different types of gradient descent
  • classify the prominent gradient descent optimization algorithms from the perspective of their mathematical representation
  • implement a simple representation of gradient descent using Python
  • implement linear regression using mini-batch gradient descent to compute the hypothesis and predictions (a sketch follows this list)
  • describe the benefits of regularization and the objectives of L1 and L2 regularization
  • demonstrate how to implement L1 and L2 regularization of linear models using Scikit-learn (a sketch follows this list)
  • recall the essential features of simple and multiple regression, implement a simple regression model using Python, and implement L1 regularization using Scikit-learn
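
As a companion to the mini-batch gradient descent objective above, here is a minimal sketch of linear regression fit with mini-batch gradient descent in plain Python/NumPy; the hyperparameters, function name, and synthetic data are illustrative assumptions rather than the course's own code.

    # Sketch: mini-batch gradient descent for linear regression (illustrative only)
    import numpy as np

    def minibatch_gd(X, y, lr=0.05, epochs=100, batch_size=16):
        """Fit y ~ X @ theta (plus bias) by minimizing mean squared error."""
        n, d = X.shape
        Xb = np.hstack([np.ones((n, 1)), X])        # prepend a bias column
        theta = np.zeros(d + 1)
        rng = np.random.default_rng(0)
        for _ in range(epochs):
            idx = rng.permutation(n)                # shuffle, then step through mini-batches
            for start in range(0, n, batch_size):
                batch = idx[start:start + batch_size]
                pred = Xb[batch] @ theta            # hypothesis: h(x) = x . theta
                grad = Xb[batch].T @ (pred - y[batch]) / len(batch)
                theta -= lr * grad
        return theta

    # Example usage on synthetic data (true bias 2.0, slope 4.0)
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 1))
    y = 4.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=200)
    print(minibatch_gd(X, y))                       # approximately [2.0, 4.0]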
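
Similarly, the L1 and L2 regularization objective above might look roughly like the following sketch using Scikit-learn's Lasso and Ridge estimators; the alpha values and synthetic data are illustrative assumptions.

    # Sketch: L1 (Lasso) and L2 (Ridge) regularization with Scikit-learn (illustrative only)
    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    rng = np.random.default_rng(2)
    X = rng.normal(size=(150, 5))
    y = X @ np.array([2.0, 0.0, -1.5, 0.0, 0.5]) + rng.normal(scale=0.1, size=150)

    l1 = Lasso(alpha=0.1).fit(X, y)   # L1 penalty can drive some coefficients exactly to zero
    l2 = Ridge(alpha=1.0).fit(X, y)   # L2 penalty shrinks coefficients toward zero
    print("L1 coefficients:", l1.coef_)
    print("L2 coefficients:", l2.coef_)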

Course Number
it_mlgdrldj_02_enus

Expertise Level
Intermediate