*This article was written by Madhu Sanjeevi (Mady).*

In the previous story we talked about Linear Regression for solving regression problems in machine learning. In this story we will talk about Logistic Regression for classification problems.

You may be wondering why the name says regression if it is a classification algorithm. Well, it uses regression internally to build the classification algorithm.

Classification separates the data into distinct classes. In this story we talk about binary classification (0 or 1), where the target variable is either 0 or 1. The goal is to find the straight line (shown in green in the original illustrations) that best separates the two classes. So we use regression to draw that line. Makes sense, right?

To make a decision (yes/no), we only accept values between 0 and 1. There is an awesome function, called the Sigmoid or Logistic function, that gives us exactly that: it squashes any real value into a value between 0 and 1: σ(z) = 1 / (1 + e^(−z)).
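As a minimal sketch (NumPy assumed), the squashing behavior is easy to see directly:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real value into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))    # 0.5 exactly: an even "maybe"
print(sigmoid(10.0))   # very close to 1
print(sigmoid(-10.0))  # very close to 0
```

Large positive inputs map near 1, large negative inputs map near 0, and 0 maps to exactly 0.5, which is why the output can be read as a probability of the positive class.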

So far we know that we first apply the linear equation and then apply the Sigmoid function to its result, which gives us a value between 0 and 1. The hypothesis for linear regression is h(X) = θ0 + θ1*X; for logistic regression we pass it through the Sigmoid: h(X) = σ(θ0 + θ1*X).

**How does it work?**

- First we calculate the logit: logit = θ0 + θ1*X (the linear-regression hypothesis).
- We apply the Sigmoid (Logistic) function above to the logit.
- We calculate the error with the cost function: the negative log-likelihood (equivalently, we maximize the log-likelihood).
- Finally, we apply gradient descent to update the θ values in our hypothesis.
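The four steps above can be sketched in Python (NumPy assumed; the toy data, learning rate, and iteration count are illustrative choices, not taken from the original article):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy data: one feature, linearly separable labels 0/1.
X = np.array([0.5, 1.0, 1.5, 3.0, 3.5, 4.0])
y = np.array([0, 0, 0, 1, 1, 1])

theta0, theta1 = 0.0, 0.0  # parameters of the logit: theta0 + theta1 * X
lr = 0.1                   # learning rate (arbitrary choice)

for _ in range(5000):
    logit = theta0 + theta1 * X                # step 1: linear part
    h = sigmoid(logit)                         # step 2: squash to (0, 1)
    # step 3: cost = negative log-likelihood (cross-entropy)
    cost = -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))
    # step 4: gradient descent on theta (gradient of the cost above)
    error = h - y
    theta0 -= lr * error.mean()
    theta1 -= lr * (error * X).mean()

# Predict: classify as 1 when the predicted probability exceeds 0.5.
pred = (sigmoid(theta0 + theta1 * X) >= 0.5).astype(int)
print(pred)
```

On this separable toy data the fitted model recovers the training labels; the 0.5 threshold on the Sigmoid output is what turns the regression value into a 0/1 decision.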

With that, logistic regression is ready: we can now predict new data with the model we just built.

*To read the whole article, with examples and illustrations, click here.*

© 2020 TechTarget, Inc.
