XGBoost JVM Package

JVM Package for XGBoost

GroupId: ai.rapids
ArtifactId: xgboost-jvm_2.x
Last Version: 1.0.0-Beta5
Type: pom
Description: JVM Package for XGBoost
Project URL: https://github.com/dmlc/xgboost/tree/master/jvm-packages
Source Code Management: https://github.com/rapidsai/xgboost

Download xgboost-jvm_2.x

How to add to project

Maven:

<!-- https://jarcasting.com/artifacts/ai.rapids/xgboost-jvm_2.x/ -->
<dependency>
    <groupId>ai.rapids</groupId>
    <artifactId>xgboost-jvm_2.x</artifactId>
    <version>1.0.0-Beta5</version>
    <type>pom</type>
</dependency>

Gradle (Groovy DSL):

// https://jarcasting.com/artifacts/ai.rapids/xgboost-jvm_2.x/
implementation 'ai.rapids:xgboost-jvm_2.x:1.0.0-Beta5'

Gradle (Kotlin DSL):

// https://jarcasting.com/artifacts/ai.rapids/xgboost-jvm_2.x/
implementation("ai.rapids:xgboost-jvm_2.x:1.0.0-Beta5")

Buildr:

'ai.rapids:xgboost-jvm_2.x:pom:1.0.0-Beta5'

Ivy:

<dependency org="ai.rapids" name="xgboost-jvm_2.x" rev="1.0.0-Beta5">
  <artifact name="xgboost-jvm_2.x" type="pom" />
</dependency>

Groovy Grape:

@Grapes(
  @Grab(group='ai.rapids', module='xgboost-jvm_2.x', version='1.0.0-Beta5')
)

SBT:

libraryDependencies += "ai.rapids" % "xgboost-jvm_2.x" % "1.0.0-Beta5"

Leiningen:

[ai.rapids/xgboost-jvm_2.x "1.0.0-Beta5"]

Dependencies

compile (5)

Group / Artifact Type Version
com.esotericsoftware.kryo : kryo jar 2.22
org.scala-lang : scala-compiler jar 2.11.12
org.scala-lang : scala-reflect jar 2.11.12
org.scala-lang : scala-library jar 2.11.12
commons-logging : commons-logging jar 1.2

test (2)

Group / Artifact Type Version
org.scalatest : scalatest_2.11 jar 3.0.8
org.scalactic : scalactic_2.11 jar 3.0.8

Project Modules

  • xgboost4j
  • xgboost4j-spark

eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the gradient boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems quickly and accurately. The same code runs in major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can scale to problems beyond billions of examples.
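As a rough illustration of what the xgboost4j module provides, the sketch below trains a booster through the Scala API (`ml.dmlc.xgboost4j.scala`). It assumes the xgboost4j jar and its native library are on the classpath; the file paths and parameter values are illustrative only, not defaults.

```scala
import ml.dmlc.xgboost4j.scala.{DMatrix, XGBoost}

object TrainExample {
  // Booster parameters; values here are illustrative, not tuned defaults
  val params: Map[String, Any] = Map(
    "eta"       -> 0.1,
    "max_depth" -> 6,
    "objective" -> "binary:logistic"
  )

  def main(args: Array[String]): Unit = {
    // Load LibSVM-format training data (path is hypothetical)
    val train = new DMatrix("train.libsvm")
    // Train for 10 boosting rounds and persist the model
    val booster = XGBoost.train(train, params, round = 10)
    booster.saveModel("model.bin")
  }
}
```

For cluster-scale training on Spark DataFrames, the xgboost4j-spark module listed under Project Modules wraps this same core behind a Spark ML estimator interface.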

License

© Contributors, 2019. Licensed under the Apache-2.0 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originated from a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors


Sponsors


NVIDIA


Other sponsors

The sponsors in this list are donating cloud hours in lieu of cash donation.

Amazon Web Services

ai.rapids

RAPIDS

Open GPU Data Science

Versions

Version
1.0.0-Beta5
1.0.0-Beta4
1.0.0-Beta3