Commit 0e0849f

Mention dask in readme. [skip ci] (dmlc#4942)
1 parent 3d46bd0 commit 0e0849f

File tree

1 file changed (+2 -2 lines changed)


README.md

+2 -2
@@ -17,7 +17,7 @@
 XGBoost is an optimized distributed gradient boosting library designed to be highly ***efficient***, ***flexible*** and ***portable***.
 It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
 XGBoost provides a parallel tree boosting (also known as GBDT, GBM) that solve many data science problems in a fast and accurate way.
-The same code runs on major distributed environment (Kubernetes, Hadoop, SGE, MPI) and can solve problems beyond billions of examples.
+The same code runs on major distributed environment (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems beyond billions of examples.
 
 License
 -------
@@ -38,7 +38,7 @@ Sponsors
 Become a sponsor and get a logo here. See details at [Sponsoring the XGBoost Project](https://xgboost.ai/sponsors). The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).
 
 ## Open Source Collective sponsors
-[![Backers on Open Collective](https://opencollective.com/xgboost/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/xgboost/sponsors/badge.svg)](#sponsors)
+[![Backers on Open Collective](https://opencollective.com/xgboost/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/xgboost/sponsors/badge.svg)](#sponsors)
 
 ### Sponsors
 [[Become a sponsor](https://opencollective.com/xgboost#sponsor)]
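For context on the Dask mention added above: XGBoost ships a Python Dask interface (`xgboost.dask`) for distributed training. Below is a minimal sketch of how it is typically used; the local cluster, synthetic data, and parameter choices are illustrative assumptions and are not part of this commit or the README change.

```python
# Minimal sketch: distributed XGBoost training via the Dask interface.
# Cluster setup and data here are placeholders for illustration only.
import dask.array as da
from dask.distributed import Client, LocalCluster
import xgboost as xgb

if __name__ == "__main__":
    # Start a local Dask cluster; in practice this would connect to an
    # existing scheduler (e.g. running on Kubernetes or Hadoop/YARN).
    with LocalCluster(n_workers=2) as cluster, Client(cluster) as client:
        # Synthetic data split into chunks that Dask distributes across workers.
        X = da.random.random((100_000, 20), chunks=(10_000, 20))
        y = da.random.randint(0, 2, size=100_000, chunks=10_000)

        # DaskDMatrix keeps the data distributed rather than collecting it locally.
        dtrain = xgb.dask.DaskDMatrix(client, X, y)

        # xgb.dask.train coordinates one XGBoost worker per Dask worker.
        output = xgb.dask.train(
            client,
            {"objective": "binary:logistic", "tree_method": "hist"},
            dtrain,
            num_boost_round=50,
        )
        booster = output["booster"]  # trained model, usable like any Booster
        print(booster)
```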
