ankita paunikar
  • Female
  • Hartford, CT
  • United States

ankita paunikar's Page

Latest Activity

George Joseph liked ankita paunikar's blog post Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression
Jan 15, 2018
Duane Baker liked ankita paunikar's blog post Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression
Jan 12, 2018
Matt Reaney liked ankita paunikar's blog post Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression
Jan 11, 2018
Gerardo Rojas liked ankita paunikar's blog post Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression
Jan 10, 2018
Hui Li liked ankita paunikar's blog post Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression
Jan 8, 2018
Harsh Sarda liked ankita paunikar's blog post Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression
Jan 7, 2018
Justin McBride liked ankita paunikar's blog post Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression
Jan 5, 2018
ankita paunikar posted a blog post

Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression

Jan 4, 2018
ankita paunikar's blog post was featured

Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression

Jan 4, 2018

Profile Information

My Web Site Or LinkedIn Profile
http://www.linkedin.com/in/ankitapaunikar
Professional Status
Student
Your Job Title:
Graduate Student
Interests:
Finding a new position, Networking, New venture

Ankita paunikar's Blog

Intuition behind Bias-Variance trade-off, Lasso and Ridge Regression

Posted on January 4, 2018 at 9:30am · 0 Comments

Linear regression uses the ordinary least squares (OLS) method to find the best coefficient estimates. One of the assumptions of linear regression is that the predictor variables are not correlated with each other. However, when multicollinearity exists in the dataset (two or more variables are highly correlated with each other), the OLS estimates become unstable and unreliable. In this blog, we will talk about two methods that can do better than OLS in this setting: lasso and ridge regression…
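The behavior the excerpt describes can be sketched in a few lines. This is a minimal illustration, not code from the post itself: it builds two nearly collinear predictors with scikit-learn's `LinearRegression`, `Ridge`, and `Lasso`, using illustrative penalty strengths (`alpha=1.0` and `alpha=0.1` are assumptions chosen for the demo).

```python
# Sketch: how ridge and lasso behave vs. OLS when two predictors are
# highly correlated. Data, seed, and alpha values are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)    # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 3 * x1 + rng.normal(scale=0.5, size=n)  # true signal lives on x1

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

# OLS splits the shared signal between x1 and x2 in an unstable way
# (high variance); ridge shrinks the two coefficients toward each other;
# lasso tends to zero one of them out entirely.
print("OLS:  ", ols.coef_)
print("Ridge:", ridge.coef_)
print("Lasso:", lasso.coef_)
```

In all three fits the coefficients sum to roughly 3 (the true combined effect), but only ridge and lasso keep the individual coefficients stable, which is the bias-variance trade-off the post goes on to explain.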


Comment Wall

  • No comments yet!


© 2019   Data Science Central ®