forked from dmlc/xgboost

Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Flink and DataFlow


sxinger/xgboost

This branch is 36 commits ahead of and 3292 commits behind dmlc/xgboost:master.


eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can scale to problems with billions of examples.

License

© Contributors, 2019. Licensed under the Apache-2.0 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originated as a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors


Sponsors

[Become a sponsor]

NVIDIA

Backers

[Become a backer]

Other sponsors

The sponsors in this list are donating cloud hours in lieu of cash donations.

Amazon Web Services
