XGBoost on GPUs: Unlocking Machine Learning Performance and Productivity with RAPIDS
Data scientists are increasingly turning to XGBoost as their preferred machine learning algorithm: an easily accessible, versatile tool. Exciting advances in GPU-accelerated machine learning are making it easier than ever to use XGBoost and get results faster.
On December 18th at 11:00am PT, join NVIDIA for a technical deep dive into GPU-accelerated machine learning to explore:
Why XGBoost is one of today's most popular and versatile machine learning algorithms
The benefits of running XGBoost on GPUs vs CPUs, and how to get started
How to scale up ML workflows with RAPIDS, combining greater speed with pandas-like ease of use
How to tame terabyte-scale datasets using multi-GPU, multi-node configurations