Explore the concept of machine learning linear models, the classifications of linear models, and prominent statistical approaches used to implement them. This 11-video course also explores the concepts of bias, variance, and regularization. Key concepts covered here include: linear models and the various classifications used in predictive analytics; the statistical approaches used to implement linear models, including single regression, multiple regression, and analysis of variance (ANOVA); and the essential components of a generalized linear model (random component, linear predictor, and link function). Next, discover the differences between the ANOVA and analysis of covariance (ANCOVA) approaches to statistical testing; learn how to implement linear regression models using Scikit-learn; and examine the concepts of bias, variance, and regularization and their use in evaluating predictive models. Learners then explore ensemble techniques, see how bagging and boosting algorithms are used to manage predictions, and learn to implement bagging algorithms with the random forest approach using Scikit-learn. Finally, observe how to implement boosting ensemble algorithms using the AdaBoost classifier in Python.
Linear Models & Gradient Descent: Managing Linear Models
discover the key concepts covered in this course
define linear models and the various classifications of linear models used in predictive analytics
recognize the different statistical approaches used to implement linear models (single regression, multiple regression, and ANOVA)
define generalized linear models and their essential components (random component, linear predictor, and link function)
compare the ANOVA and ANCOVA approaches to statistical testing
demonstrate the implementation of linear regression models using Scikit-learn
describe the concepts of bias, variance, and regularization and their use in evaluating predictive models
define the concept of ensemble techniques and illustrate how bagging and boosting algorithms are used to manage predictions
implement bagging algorithms with the random forest approach using Scikit-learn
implement boosting ensemble algorithms using the AdaBoost classifier in Python
list the classifications of linear models, recall the essential components of generalized linear models, and implement a boosting algorithm using the AdaBoost classifier
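The one-way ANOVA approach covered in the objectives above can be sketched in Python. The course does not name a library for ANOVA, so this is a minimal illustration assuming SciPy's `f_oneway`; the three score groups are made-up data:

```python
from scipy.stats import f_oneway

# made-up test scores for three groups (illustrative data only)
group_a = [85, 86, 88, 75, 78, 94]
group_b = [91, 92, 93, 85, 87, 84]
group_c = [79, 78, 88, 94, 92, 85]

# one-way ANOVA: does at least one group mean differ from the others?
f_stat, p_value = f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

A small p-value would suggest the group means are not all equal; ANCOVA additionally adjusts for a continuous covariate before comparing group means.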
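The Scikit-learn linear regression objective above might look like the following minimal sketch; the single-feature toy data (roughly y = 2x plus noise) is invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# toy single-regression data: y is roughly 2x plus noise (illustrative only)
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# fit ordinary least squares and inspect the learned parameters
model = LinearRegression().fit(X, y)
print("slope:", model.coef_[0], "intercept:", model.intercept_)
```

The fitted slope should land close to 2, recovering the relationship hidden in the toy data.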
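The bias-variance-regularization objective can be illustrated with ridge regression, one common regularized linear model in Scikit-learn. This sketch (random synthetic data, coefficients chosen for illustration) shows regularization trading a little bias for lower variance by shrinking the coefficient norm:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

# synthetic regression data (illustrative only): 5 features, known true weights
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
y = X @ np.array([3.0, -2.0, 0.5, 0.0, 1.0]) + rng.normal(scale=0.5, size=30)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=10.0).fit(X, y)  # alpha controls regularization strength

# the L2 penalty shrinks coefficients toward zero relative to plain OLS
print("OLS coefficient norm:  ", np.linalg.norm(ols.coef_))
print("Ridge coefficient norm:", np.linalg.norm(ridge.coef_))
```

Larger `alpha` shrinks the coefficients further: more bias, less variance, which is the trade-off used when evaluating predictive models.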
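The bagging objective above, using the random forest approach in Scikit-learn, might be sketched as follows; the Iris dataset stands in for whatever data the course actually uses:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Iris is a stand-in dataset for illustration
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# a random forest bags bootstrap samples of the training data and
# averages the votes of many decorrelated decision trees
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
accuracy = forest.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.3f}")
```

Because each tree sees a different bootstrap sample and a random feature subset at each split, averaging their predictions reduces variance compared with a single tree.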
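Similarly, the boosting objective with the AdaBoost classifier might look like this minimal sketch, again using Iris as an illustrative stand-in dataset:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Iris is a stand-in dataset for illustration
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# AdaBoost fits weak learners sequentially, reweighting misclassified
# samples so that later learners concentrate on the hard cases
booster = AdaBoostClassifier(n_estimators=50, random_state=0)
booster.fit(X_train, y_train)
accuracy = booster.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.3f}")
```

Where bagging trains its learners independently to reduce variance, boosting trains them sequentially to reduce bias, which is why the two are contrasted in the objectives above.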