Webinar Series: 3 Ways to Improve your Regression
January 20th and 27th, 10AM – 11AM PT
- If the time is inconvenient, please register and we will send you a recording.
Click to Register
- Linear regression plays a big part in the everyday life of a data analyst, but the results aren’t always satisfactory.
- What if you could drastically improve prediction accuracy in your regression with a new model that handles missing values, interactions, AND nonlinearities in your data?
- Instead of proceeding with a mediocre analysis, join us for this 2-part webinar series.
- We will show you how modern algorithms can take your regression model to the next level and expertly handle your modeling woes.
- You will walk away with several different methods to turn your ordinary regression into an extraordinary regression!
This webinar will be a step-by-step presentation that you can repeat on your own! Included with Registration:
- Webinar recording
- 30-day software evaluation
- Dataset used in presentation
- Step-by-step instructions for you to try at home
Who should attend:
- Attend if you want to implement data science techniques even without a background in data science, statistics, or programming.
- Attend if you want to understand why data science techniques are so important for forecasting.
Click to Register
Alternative Link: http://info.salford-systems.com/3-ways-to-improve-your-regression-part1
Agenda Part 1: January 20
- We will introduce MARS nonlinear regression, TreeNet gradient boosting, and Random Forests, and show you how to extract actionable insights.
- Nonlinear regression splines (via MARS): this tool is ideal for users who prefer results in a form similar to traditional regression while allowing for bends, thresholds, and other departures from straight-line methods.
- Stochastic gradient boosting (via TreeNet): this flexible and powerful data mining tool generates hundreds of decision trees in a sequential, error-correcting process to produce an extremely accurate model.
- Random Forests: this method combines many decision trees, each built independently of the others, and is best suited to analyses of small to moderate datasets.
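The webinar itself uses Salford Systems' software, but the contrast the agenda draws can be previewed with open-source analogues. The sketch below (scikit-learn, synthetic data chosen purely for illustration) fits ordinary linear regression, gradient boosting, and a random forest on a target that contains exactly the nonlinearity and interaction that trip up a straight-line model:

```python
# Illustrative only: scikit-learn stand-ins for the tools covered in the
# webinar, on a synthetic dataset with a nonlinearity and an interaction.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 4))
# max(0, x0) is a threshold effect; x1 * x2 is an interaction --
# the two patterns ordinary linear regression cannot represent.
y = np.maximum(0, X[:, 0]) + X[:, 1] * X[:, 2] + rng.normal(0, 0.3, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "linear regression": LinearRegression(),
    "gradient boosting (TreeNet-style)": GradientBoostingRegressor(random_state=0),
    "random forest": RandomForestRegressor(random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: R^2 = {r2_score(y_te, model.predict(X_te)):.2f}")
```

On data like this, both tree ensembles score a markedly higher out-of-sample R^2 than the linear baseline, which is the gap the webinar's techniques are designed to close.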
Agenda Part 2: January 27
- We will show you how to take these techniques even further and take advantage of advanced modeling features.
- Part 2 overlaps somewhat with Part 1; watching Part 1 first is recommended, but not required.
- Stochastic gradient boosting: TreeNet plots show you the impact of every variable in your model. Take it a step further by creating spline approximations of these plots and using them in a conventional linear regression for a boost in model performance!
- Nonlinear regression splines: MARS nonlinear regression will still give you what looks like a standard regression equation, but its coefficients apply to transformations of your original variables rather than to the raw variables themselves.
- Modeling automation: learn how to cycle through numerous modeling scenarios automatically to discover best-fit parameters.
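The "transformations of your original variables" that MARS produces are hinge functions such as max(0, x - knot). A minimal NumPy sketch (the knot is fixed by hand here; MARS searches for knots automatically, which is the part the software does for you) shows how a linear fit on hinge bases captures a bend that defeats a straight line:

```python
# Minimal illustration of MARS-style hinge basis functions.
# The knot location (0.5) is hard-coded for clarity; real MARS
# selects knots and terms automatically.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 400)
y = 3.0 * np.maximum(0, x - 0.5) + rng.normal(0, 0.1, 400)  # bend at x = 0.5

knot = 0.5
basis = np.column_stack([
    np.ones_like(x),          # intercept
    np.maximum(0, x - knot),  # right hinge
    np.maximum(0, knot - x),  # left hinge
])
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
print(coef)  # right-hinge coefficient recovers the true slope (~3)
```

The result still reads like a regression equation, as the agenda item above describes, only with hinge terms in place of the raw variable.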