XGBoost JVM Package

JVM Package for XGBoost optimized with Apache Arrow

License:
Categories: Data
GroupId: com.intel.bigdata.xgboost
ArtifactId: xgboost-jvm_2.12
Last Version: 1.3.3
Release Date:
Type: jar
Description: XGBoost JVM Package. JVM Package for XGBoost optimized with Apache Arrow.
Project URL: https://github.com/Intel-bigdata/xgboost/tree/arrow-to-dmatrix/jvm-packages
Source Code Management: https://github.com/dmlc/xgboost


How to add to project

Maven:

<!-- https://jarcasting.com/artifacts/com.intel.bigdata.xgboost/xgboost-jvm_2.12/ -->
<dependency>
    <groupId>com.intel.bigdata.xgboost</groupId>
    <artifactId>xgboost-jvm_2.12</artifactId>
    <version>1.3.3</version>
</dependency>

Gradle (Groovy DSL):

// https://jarcasting.com/artifacts/com.intel.bigdata.xgboost/xgboost-jvm_2.12/
implementation 'com.intel.bigdata.xgboost:xgboost-jvm_2.12:1.3.3'

Gradle (Kotlin DSL):

// https://jarcasting.com/artifacts/com.intel.bigdata.xgboost/xgboost-jvm_2.12/
implementation("com.intel.bigdata.xgboost:xgboost-jvm_2.12:1.3.3")

Buildr:

'com.intel.bigdata.xgboost:xgboost-jvm_2.12:jar:1.3.3'

Ivy:

<dependency org="com.intel.bigdata.xgboost" name="xgboost-jvm_2.12" rev="1.3.3">
  <artifact name="xgboost-jvm_2.12" type="jar" />
</dependency>

Grape (Groovy):

@Grapes(
  @Grab(group='com.intel.bigdata.xgboost', module='xgboost-jvm_2.12', version='1.3.3')
)

sbt:

libraryDependencies += "com.intel.bigdata.xgboost" % "xgboost-jvm_2.12" % "1.3.3"

Leiningen:

[com.intel.bigdata.xgboost/xgboost-jvm_2.12 "1.3.3"]
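
With one of the declarations above on the classpath, training goes through the XGBoost4J Scala API. The sketch below is a minimal, illustrative example only: it assumes this artifact (or its xgboost4j submodule) exposes the upstream ml.dmlc.xgboost4j.scala.{DMatrix, XGBoost} classes, and the file paths, parameters, and round count are placeholders rather than values prescribed by this package.

import ml.dmlc.xgboost4j.scala.{DMatrix, XGBoost}

object TrainExample {
  def main(args: Array[String]): Unit = {
    // Load training and evaluation data from LIBSVM files (paths are placeholders).
    val trainMat = new DMatrix("train.libsvm")
    val testMat  = new DMatrix("test.libsvm")

    // Common booster parameters; tune these for the data set at hand.
    val params: Map[String, Any] = Map(
      "eta" -> 0.1,
      "max_depth" -> 6,
      "objective" -> "binary:logistic",
      "eval_metric" -> "auc"
    )

    // Train for 10 boosting rounds; the watches map only controls per-round metric logging.
    val booster = XGBoost.train(trainMat, params, round = 10,
      watches = Map("train" -> trainMat, "test" -> testMat))

    // predict returns one array of scores per row of the input DMatrix.
    val preds: Array[Array[Float]] = booster.predict(testMat)

    // Persist the trained model for later scoring.
    booster.saveModel("xgb-model.bin")
  }
}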

Dependencies

compile (5)

Group                   Artifact           Type   Version
com.esotericsoftware    kryo               jar    4.0.2
org.scala-lang          scala-compiler     jar    2.12.8
org.scala-lang          scala-reflect      jar    2.12.8
org.scala-lang          scala-library      jar    2.12.8
commons-logging         commons-logging    jar    1.2

test (2)

Group                   Artifact           Type   Version
org.scalatest           scalatest_2.12     jar    3.0.8
org.scalactic           scalactic_2.12     jar    3.0.8

Project Modules

There are no modules declared in this project.

eXtreme Gradient Boosting


Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems beyond billions of examples.
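
For the distributed environments listed above, upstream XGBoost also publishes an xgboost4j-spark module, which is a separate artifact from xgboost-jvm_2.12. Assuming that module is on the classpath and this fork keeps the upstream XGBoostClassifier API, a Spark training job might look like the following sketch; the CSV path, column names, worker count, and parameters are illustrative placeholders.

import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession
import ml.dmlc.xgboost4j.scala.spark.XGBoostClassifier

object DistributedTrainExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("xgboost-distributed").getOrCreate()

    // Placeholder input: a CSV with numeric feature columns and a binary "label" column.
    val raw = spark.read.option("header", "true").option("inferSchema", "true").csv("data.csv")
    val assembled = new VectorAssembler()
      .setInputCols(raw.columns.filter(_ != "label"))
      .setOutputCol("features")
      .transform(raw)

    // Parameters and worker count here are examples, not recommendations.
    val classifier = new XGBoostClassifier(Map(
      "eta" -> 0.1,
      "max_depth" -> 6,
      "objective" -> "binary:logistic"
    )).setNumRound(50)
      .setNumWorkers(4)
      .setFeaturesCol("features")
      .setLabelCol("label")

    // fit launches numWorkers XGBoost workers as Spark tasks that cooperate on a single model.
    val model = classifier.fit(assembled)
    model.transform(assembled).select("label", "prediction").show(5)

    spark.stop()
  }
}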

License

© Contributors, 2019. Licensed under an Apache-2 license.

Contribute to XGBoost

XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone. Check out the Community Page.

Reference

  • Tianqi Chen and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016.
  • XGBoost originates from a research project at the University of Washington.

Sponsors

Become a sponsor and get a logo here. See details at Sponsoring the XGBoost Project. The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).

Open Source Collective sponsors


Sponsors


NVIDIA

Backers


Other sponsors

The sponsors in this list are donating cloud hours in lieu of a cash donation.

Amazon Web Services


Versions

1.3.3