XGBoost on GPUs: Unlocking Machine Learning Performance and Productivity with RAPIDS
Data scientists are increasingly turning to XGBoost, an easily accessible and versatile machine learning algorithm. Exciting new advances in GPU-accelerated machine learning make it easier than ever to use XGBoost and get results faster.

On December 18th at 11:00am PT, join NVIDIA for a technical deep dive into GPU-accelerated machine learning that explores:
  Why XGBoost has become one of the most popular and versatile machine learning algorithms
  The benefits of running XGBoost on GPUs rather than CPUs, and how to get started (a minimal sketch follows this list)
  How to speed up and scale out ML workflows with RAPIDS and its pandas-like ease of use (see the cuDF sketch below)
  How to tame terabyte-scale datasets with multi-GPU, multi-node configurations (see the Dask sketch below)
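
On the second point, here is a minimal sketch of GPU training with XGBoost's Python package, using synthetic data as a stand-in for a real dataset. In XGBoost releases of this era, switching the tree_method parameter from "hist" to "gpu_hist" is the only change needed to move histogram construction and split evaluation onto the GPU:

    import numpy as np
    import xgboost as xgb

    # Synthetic stand-in for a real dataset.
    X = np.random.rand(100_000, 50).astype(np.float32)
    y = np.random.randint(2, size=100_000)
    dtrain = xgb.DMatrix(X, label=y)

    params = {
        "objective": "binary:logistic",
        "max_depth": 8,
        "tree_method": "gpu_hist",  # the CPU equivalent is "hist"
    }
    booster = xgb.train(params, dtrain, num_boost_round=100)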
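
The RAPIDS point can be made concrete with cuDF, whose DataFrame API mirrors pandas while keeping the data on the GPU. The file and column names below are hypothetical, and note that only fairly recent XGBoost builds accept cuDF objects directly; older releases may need an explicit conversion step:

    import cudf
    import xgboost as xgb

    # cudf.read_csv mirrors pandas.read_csv but parses on the GPU.
    df = cudf.read_csv("transactions.csv")  # hypothetical input file

    # Familiar pandas-style filtering and column handling, run on the GPU.
    df = df[df["amount"] > 0]               # hypothetical column names
    label = df["label"]
    features = df.drop(columns=["label"])

    # The data stays on the GPU on its way into XGBoost.
    dtrain = xgb.DMatrix(features, label=label)
    booster = xgb.train(
        {"objective": "binary:logistic", "tree_method": "gpu_hist"},
        dtrain,
        num_boost_round=100,
    )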
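
For the multi-GPU, multi-node point, the usual pattern pairs RAPIDS with Dask. The sketch below uses dask-cuda and XGBoost's Dask module as found in current releases (the integration has shifted across versions, and the file glob is hypothetical): LocalCUDACluster spreads work across the GPUs in one machine, and pointing Client at a remote scheduler extends the same code to a multi-node cluster.

    import dask_cudf
    import xgboost as xgb
    from dask.distributed import Client
    from dask_cuda import LocalCUDACluster

    # One Dask worker per local GPU; swap in a remote scheduler
    # address to run across a multi-node cluster instead.
    client = Client(LocalCUDACluster())

    # Partitions of the dataset are loaded onto and kept on separate GPUs.
    ddf = dask_cudf.read_csv("transactions-*.csv")  # hypothetical files
    label = ddf["label"]
    features = ddf.drop(columns=["label"])

    dtrain = xgb.dask.DaskDMatrix(client, features, label)
    result = xgb.dask.train(
        client,
        {"objective": "binary:logistic", "tree_method": "gpu_hist"},
        dtrain,
        num_boost_round=100,
    )
    booster = result["booster"]
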
REGISTER NOW







WEBINAR DETAILS
Date: Tuesday, Dec 18, 2018
Time: 11:00am – 12:00pm PST
Host: NVIDIA
Register Now ►





