XGBoost JVM Package

JVM Package for XGBoost

License: Apache-2
GroupId: ai.rapids
ArtifactId: xgboost-jvm
Last Version: 0.90.1-Beta
Release Date:
Type: pom
Description: XGBoost JVM Package (JVM Package for XGBoost)
Project URL: https://github.com/dmlc/xgboost/tree/master/jvm-packages
Source Code Management: https://github.com/rapidsai/xgboost

Download xgboost-jvm

How to add to project

Maven:

<!-- https://jarcasting.com/artifacts/ai.rapids/xgboost-jvm/ -->
<dependency>
    <groupId>ai.rapids</groupId>
    <artifactId>xgboost-jvm</artifactId>
    <version>0.90.1-Beta</version>
    <type>pom</type>
</dependency>

Gradle (Groovy DSL):

// https://jarcasting.com/artifacts/ai.rapids/xgboost-jvm/
implementation 'ai.rapids:xgboost-jvm:0.90.1-Beta'

Gradle (Kotlin DSL):

// https://jarcasting.com/artifacts/ai.rapids/xgboost-jvm/
implementation("ai.rapids:xgboost-jvm:0.90.1-Beta")

Apache Buildr:

'ai.rapids:xgboost-jvm:pom:0.90.1-Beta'

Apache Ivy:

<dependency org="ai.rapids" name="xgboost-jvm" rev="0.90.1-Beta">
  <artifact name="xgboost-jvm" type="pom" />
</dependency>

Groovy Grape:

@Grapes(
  @Grab(group='ai.rapids', module='xgboost-jvm', version='0.90.1-Beta')
)

sbt:

libraryDependencies += "ai.rapids" % "xgboost-jvm" % "0.90.1-Beta"

Leiningen:

[ai.rapids/xgboost-jvm "0.90.1-Beta"]

Dependencies

compile (5)

Group / Artifact Type Version
com.esotericsoftware.kryo : kryo jar 2.22
org.scala-lang : scala-compiler jar 2.11.12
org.scala-lang : scala-reflect jar 2.11.12
org.scala-lang : scala-library jar 2.11.12
commons-logging : commons-logging jar 1.2

test (1)

Group / Artifact Type Version
org.scalatest : scalatest_2.11 jar 3.0.0

Project Modules

  • xgboost4j
  • xgboost4j-spark
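
The two modules sit at different levels: xgboost4j exposes the core DMatrix/Booster training API, while xgboost4j-spark wraps it in Spark ML estimators. As a minimal single-machine sketch of the core API, assuming this build keeps the upstream DMLC Scala package layout (ml.dmlc.xgboost4j.scala) and using placeholder LIBSVM file paths:

import ml.dmlc.xgboost4j.scala.{DMatrix, XGBoost}

object TrainSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder paths: any LIBSVM-format files will do.
    val train = new DMatrix("train.libsvm")
    val test  = new DMatrix("test.libsvm")

    // Booster parameters for a simple binary classification task.
    val params = Map(
      "objective" -> "binary:logistic",
      "eta"       -> 0.1,
      "max_depth" -> 6
    )

    // Run 50 boosting rounds, reporting the test-set metric after each round.
    val booster = XGBoost.train(train, params, round = 50, watches = Map("test" -> test))

    // One array of scores per row in the test matrix.
    val preds: Array[Array[Float]] = booster.predict(test)
    booster.saveModel("xgb.model")
  }
}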

eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems beyond billions of examples.
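
On the JVM side, distributed training typically goes through the xgboost4j-spark module and the Spark ML Pipelines API. A rough sketch, assuming the upstream XGBoostClassifier estimator from ml.dmlc.xgboost4j.scala.spark is available in this build, that the input DataFrame already carries an assembled "features" vector column and a numeric "label" column, and that the GPU-oriented gpu_hist tree method (the focus of this RAPIDS-published artifact) is supported:

import org.apache.spark.sql.SparkSession
import ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier

val spark = SparkSession.builder().appName("xgboost-spark-sketch").getOrCreate()

// Placeholder input: a DataFrame with a vector "features" column and a numeric "label" column.
val train = spark.read.parquet("train.parquet")

val classifier = new XGBoostClassifier(Map(
  "objective"   -> "binary:logistic",
  "num_round"   -> 100,
  "max_depth"   -> 6,
  "num_workers" -> 4,           // one XGBoost worker per Spark task
  "tree_method" -> "gpu_hist"   // assumption: GPU histogram algorithm targeted by this build
)).setFeaturesCol("features").setLabelCol("label")

val model = classifier.fit(train)
model.transform(train).select("label", "prediction").show(5)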

License

© Contributors, 2019. Licensed under an Apache-2 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originates from a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors


Sponsors

[Become a sponsor]

NVIDIA

Backers

[Become a backer]

Other sponsors

The sponsors in this list are donating cloud hours in lieu of a cash donation.

Amazon Web Services

ai.rapids

RAPIDS

Open GPU Data Science

Versions

  • 0.90.1-Beta
  • 0.90-Beta