An outcome is usually influenced by several factors, and regression lets you account for all of them at once. Explore machine learning techniques, and the risks involved, when using multiple features for linear regression in this 11-video course. Key concepts covered here include reasons to use multiple features in a regression and the techniques involved in creating such a model; preparing a data set containing multiple features for training and evaluating a linear regression model; and how to configure, train, and evaluate that model. Next, learn to create a data set with multiple features in a form that can be fed to a neural network for training and validation; study the architecture of a Keras sequential model and how to set its training parameters; and make predictions on test data, examining the metrics to gauge the quality of the neural network model. Finally, learn to use Pandas and Seaborn to view correlations and enumerate risks; apply the principle of parsimonious regression to rebuild the linear regression model; and build a Keras model after selecting only the important features from a data set.
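The multiple-feature linear regression workflow described above (prepare a data set, train, evaluate) can be sketched as follows. This is a minimal illustration assuming scikit-learn and a synthetic three-feature data set; the course's actual data set and library choices may differ.

```python
# Multiple-feature linear regression: prepare data, train, evaluate.
# Synthetic data and the 200-sample size are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # three features influencing the outcome
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.1, 200)

# Hold out a test split so evaluation reflects unseen data
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)   # configure and train
r2 = r2_score(y_test, model.predict(X_test))       # evaluate the fit
```

With three genuinely informative features and little noise, the learned coefficients land close to the true weights and the held-out R-squared is near 1.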
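The Keras sequential model mentioned above, with its training parameters, predictions on test data, and quality metrics, might look roughly like this. Layer sizes, optimizer, epoch count, and the synthetic data are assumptions for illustration, not the course's exact configuration.

```python
# A sketch of a Keras Sequential regression model on multi-feature data.
# Architecture and training parameters here are illustrative assumptions.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(1)
X = rng.normal(size=(256, 3)).astype("float32")
y = X @ np.array([2.0, -1.5, 0.5], dtype="float32")

model = keras.Sequential([
    keras.layers.Input(shape=(3,)),            # three input features
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),                     # single regression output
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=50, batch_size=32, validation_split=0.2, verbose=0)

preds = model.predict(X[:5], verbose=0)        # predictions on sample rows
loss, mae = model.evaluate(X, y, verbose=0)    # metrics to gauge quality
```

The `validation_split` argument carves a validation set out of the training data, matching the training/validation workflow the course describes.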
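The final steps, viewing correlations with Pandas and rebuilding a parsimonious model on only the important features, can be sketched as below. The column names, the synthetic data, and the |correlation| > 0.3 cutoff are assumptions; Seaborn's role is noted in a comment since the visualization itself is optional.

```python
# Inspect feature-target correlations, then rebuild a leaner
# (parsimonious) regression using only the relevant features.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "size": rng.normal(size=300),
    "age": rng.normal(size=300),
    "noise": rng.normal(size=300),   # an irrelevant feature
})
df["price"] = 3.0 * df["size"] - 2.0 * df["age"] + rng.normal(0, 0.1, 300)

corr = df.corr()["price"].drop("price")        # correlation with the target
selected = corr[corr.abs() > 0.3].index.tolist()

# seaborn.heatmap(df.corr(), annot=True) would visualize the same matrix
lean = LinearRegression().fit(df[selected], df["price"])
```

Dropping near-zero-correlation features keeps the model simpler and reduces the risk of fitting noise, which is the essence of parsimonious regression.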