### Regression Analysis with Python

(REG-PYTHON.AJ1) / ISBN: 978-1-61691-688-6

This course includes

Lessons

TestPrep

Lab

AI Tutor (Add-on)


Gain the knowledge to build fast, better-performing linear models in Python and deploy them, with uCertify's course Regression Analysis with Python. The course provides hands-on experience with concepts such as Regression – The Workhorse of Data Science, Approaching Simple Linear Regression, Multiple Regression in Action, Logistic Regression, Data Preparation, and Achieving Generalization.

Get the support you need. Enroll in our Instructor-Led Course.

10+ Lessons | 52+ Exercises | 60+ Quizzes | 38+ Flashcards | 38+ Glossary of terms

35+ Pre-Assessment Questions | 35+ Post-Assessment Questions

Lesson 1

- What this course covers
- What you need for this course
- Who this course is for
- Conventions

Lesson 2

- Regression analysis and data science
- Python for data science
- Python packages and functions for linear models
- Summary
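The "Python packages and functions for linear models" topic above might be sketched as follows; the toy data and numbers here are illustrative, not course material:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data following y = 2x + 1 with no noise, so the fit is exact
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0

model = LinearRegression()
model.fit(X, y)

# The recovered slope and intercept match the generating line
print(model.coef_[0], model.intercept_)
```

scikit-learn is one of the packages commonly used for this; statsmodels is another typical choice when a statistical summary of the fit is needed.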

Lesson 3

- Defining a regression problem
- Starting from the basics
- Extending to linear regression
- Minimizing the cost function
- Summary
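The "Minimizing the cost function" step in this outline can be illustrated with batch gradient descent on a mean-squared-error cost; the learning rate, iteration count, and synthetic data below are illustrative assumptions, not values from the course:

```python
import numpy as np

# Simple linear regression y = w*x + b fit by gradient descent on MSE
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 3.0 * x + 5.0 + rng.normal(0, 0.1, 200)

w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    err = (w * x + b) - y
    w -= lr * 2 * (err * x).mean()   # partial derivative of MSE w.r.t. w
    b -= lr * 2 * err.mean()         # partial derivative of MSE w.r.t. b

# w and b converge toward the generating values 3 and 5
print(round(w, 2), round(b, 2))
```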

Lesson 4

- Using multiple features
- Revisiting gradient descent
- Estimating feature importance
- Interaction models
- Polynomial regression
- Summary
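The "Polynomial regression" item above can be sketched as a scikit-learn pipeline; the quadratic toy data is an illustrative assumption:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Quadratic data: a degree-2 polynomial model should recover it almost exactly
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = 1.5 * X.ravel() ** 2 - 2.0 * X.ravel() + 0.5

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
r2 = model.score(X, y)   # coefficient of determination on the training data
print(r2)
```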

Lesson 5

- Defining a classification problem
- Defining a probability-based approach
- Revisiting gradient descent
- Multiclass Logistic Regression
- An example
- Summary
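A minimal multiclass logistic regression sketch in the spirit of this lesson; the dataset parameters are arbitrary choices for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Dummy three-class problem, mirroring the multiclass discussion above
X, y = make_classification(n_samples=300, n_features=4, n_informative=3,
                           n_redundant=0, n_classes=3, random_state=0)

clf = LogisticRegression(max_iter=1000)
clf.fit(X, y)

# predict_proba returns one probability per class, summing to 1
proba = clf.predict_proba(X[:1])
print(proba.shape)
```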

Lesson 6

- Numeric feature scaling
- Qualitative feature encoding
- Numeric feature transformation
- Missing data
- Outliers
- Summary
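Two of the preparation steps listed above, mean imputation of missing data and feature standardization, could be sketched as follows (the four-value column is made up for illustration):

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

# One numeric column with a missing entry
col = np.array([[1.0], [2.0], [np.nan], [4.0]])

# Fill the NaN with the mean of the observed values, then standardize
filled = SimpleImputer(strategy="mean").fit_transform(col)
scaled = StandardScaler().fit_transform(filled)

print(filled.ravel())                 # NaN replaced by the mean of 1, 2, 4
print(scaled.mean(), scaled.std())    # standardized: mean 0, std 1
```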

Lesson 7

- Checking on out-of-sample data
- Greedy selection of features
- Regularization optimized by grid-search
- Stability selection
- Summary
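The out-of-sample check and grid-search-tuned regularization topics above can be combined in one short sketch; the alpha grid and dataset are illustrative choices, not course code:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, train_test_split

# Hold out a test set, then tune L2 regularization strength by grid search
X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

search = GridSearchCV(Ridge(), {"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X_tr, y_tr)

# Score on data the search never saw: the generalization check
test_score = search.score(X_te, y_te)
print(search.best_params_, test_score)
```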

Lesson 8

- Batch learning
- Online mini-batch learning
- Summary
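Online mini-batch learning, as named above, typically means calling `partial_fit` on one chunk of data at a time instead of fitting the whole dataset at once; the batch size and epoch count here are illustrative:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

# Noise-free linear data so the online fit can be checked easily
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.3

sgd = SGDRegressor(random_state=0)
for epoch in range(10):                      # several passes over the stream
    for start in range(0, len(X), 100):      # mini-batches of 100 rows each
        sgd.partial_fit(X[start:start + 100], y[start:start + 100])

print(sgd.coef_.round(2), round(float(sgd.intercept_[0]), 2))
```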

Lesson 9

- Least Angle Regression
- Bayesian regression
- SGD classification with hinge loss
- Regression trees (CART)
- Bagging and boosting
- Gradient Boosting Regressor with LAD
- Summary

Lesson 10

- Downloading the datasets
- A regression problem
- An imbalanced and multiclass classification problem
- A ranking problem
- A time series problem
- Summary

- Creating a One-Column Matrix Structure
- Visualizing the Distribution of Errors
- Plotting a Normal Distribution Graph
- Plotting a Scatterplot
- Standardizing a Variable
- Showing Regression Analysis Parameters
- Showing the Summary of Regression Analysis
- Printing the Residual Sum of Squared Errors
- Plotting Standardized Residuals
- Predicting with a Regression Model
- Regressing with Scikit-learn
- Using the fmin Minimization Procedure
- Finding Mean and Median
- Obtaining the Inverse of a Matrix
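The matrix-inverse lab above maps to a one-liner in NumPy; the 2×2 matrix below is made up for illustration (the same operation underlies the normal equations of linear regression):

```python
import numpy as np

# Invert a small matrix and verify the result against the identity
A = np.array([[4.0, 7.0], [2.0, 6.0]])
A_inv = np.linalg.inv(A)

print(A_inv)
print(A @ A_inv)   # identity matrix, up to floating-point error
```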

- Printing Eigenvalues
- Visualizing the Correlation Matrix
- Obtaining the Correlation Matrix
- Standardizing Using the Scikit-learn Preprocessing Module
- Printing Standardized Coefficients
- Obtaining the R-squared Baseline
- Recording Coefficient of Determination Using R-squared
- Reporting All R-squared Increment Above 0.03
- Representing LSTAT Using the Scatterplot
- Testing Degree of a Polynomial
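The correlation-matrix and eigenvalue labs above can be sketched with NumPy; the two correlated variables here are synthetic:

```python
import numpy as np

# Two variables with a built-in positive correlation of about 0.8
rng = np.random.default_rng(0)
a = rng.normal(size=500)
b = 0.8 * a + 0.6 * rng.normal(size=500)

corr = np.corrcoef(a, b)               # 2x2 correlation matrix
eigvals = np.linalg.eigvalsh(corr)     # its eigenvalues (sum to the trace, 2)

print(corr.round(2))
print(eigvals)
```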

- Creating a Dummy Dataset
- Obtaining a Classification Report
- Representing a Confusion Matrix Using Heatmap
- Creating a Confusion Matrix
- Plotting the sigmoid Function
- Fitting a Multiple Linear Regressor
- Creating and Fitting a Logistic Regressor Classifier
- Obtaining the Feature Vector and its Original and Predicted Labels
- Visualizing Multiclass Logistic Regressor
- Creating a Dummy Four-Class Dataset

- Centering the Variables
- Demonstrating the Logistic Regression
- Analyzing Qualitative Data Using Logit
- Transforming Qualitative Data
- Using LabelBinarizer
- Using the Hashing Trick
- Obtaining Residuals
- Replacing Missing Values With the Mean Value
- Representing Outliers Among Predictors
- Showing Outliers
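The "Using the Hashing Trick" lab above refers to mapping qualitative values into a fixed-width numeric vector without storing a vocabulary; `n_features` and the color values below are illustrative:

```python
from sklearn.feature_extraction import FeatureHasher

# Each sample is a list of string tokens; the hasher assigns each token
# a column (and a sign) by hashing, with no fitted vocabulary
hasher = FeatureHasher(n_features=8, input_type="string")
rows = [["red"], ["green"], ["blue"]]
X = hasher.transform(rows).toarray()

print(X.shape)   # three rows, eight hashed columns
```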

- Splitting a Dataset
- Bootstrapping a Dataset
- Applying Third-Degree Polynomial Expansion
- Plotting the Distribution of Scores
- Demonstrating Working of Recursive Elimination
- Implementing L2 Regularization
- Performing Random Grid Search
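The "Performing Random Grid Search" lab corresponds to scikit-learn's `RandomizedSearchCV`, which samples from a parameter space instead of trying every value; the alpha range and dataset below are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import RandomizedSearchCV

# Sample 10 alpha values at random from a log-spaced range of 100
X, y = make_regression(n_samples=150, n_features=8, noise=3.0, random_state=0)
search = RandomizedSearchCV(Ridge(),
                            {"alpha": np.logspace(-3, 2, 100)},
                            n_iter=10, cv=5, random_state=0)
search.fit(X, y)
print(search.best_params_)
```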

- Demonstrating Mini-Batch Learning

- Obtaining LARS Coefficients
- Using Bayesian Regression
- Using the SGDClassifier Class With the hinge Loss
- Implementing SVR
- Implementing CART
- Implementing Random Forest Regressor
- Implementing Bagging
- Implementing Boosting
- Implementing Gradient Boosting Regressor with LAD