aws-kinesis-spark

Categories: AWS Container PaaS Providers
GroupId: jp.co.bizreach
ArtifactId: aws-kinesis-spark_2.12
Last Version: 0.0.12
Type: jar
Description: aws-kinesis-spark
Project URL: https://github.com/bizreach/aws-kinesis-scala
Project Organization: jp.co.bizreach
Source Code Management: https://github.com/bizreach/aws-kinesis-scala

Download aws-kinesis-spark_2.12

How to add to project

Maven:

<!-- https://jarcasting.com/artifacts/jp.co.bizreach/aws-kinesis-spark_2.12/ -->
<dependency>
    <groupId>jp.co.bizreach</groupId>
    <artifactId>aws-kinesis-spark_2.12</artifactId>
    <version>0.0.12</version>
</dependency>

Gradle (Groovy DSL):

// https://jarcasting.com/artifacts/jp.co.bizreach/aws-kinesis-spark_2.12/
implementation 'jp.co.bizreach:aws-kinesis-spark_2.12:0.0.12'

Gradle (Kotlin DSL):

// https://jarcasting.com/artifacts/jp.co.bizreach/aws-kinesis-spark_2.12/
implementation("jp.co.bizreach:aws-kinesis-spark_2.12:0.0.12")

Buildr:

'jp.co.bizreach:aws-kinesis-spark_2.12:jar:0.0.12'

Ivy:

<dependency org="jp.co.bizreach" name="aws-kinesis-spark_2.12" rev="0.0.12">
  <artifact name="aws-kinesis-spark_2.12" type="jar" />
</dependency>

Grape:

@Grapes(
@Grab(group='jp.co.bizreach', module='aws-kinesis-spark_2.12', version='0.0.12')
)

sbt:

libraryDependencies += "jp.co.bizreach" % "aws-kinesis-spark_2.12" % "0.0.12"

Leiningen:

[jp.co.bizreach/aws-kinesis-spark_2.12 "0.0.12"]

Dependencies

compile (3)

Group / Artifact Type Version
org.scala-lang : scala-library jar 2.12.8
jp.co.bizreach : aws-kinesis-scala_2.12 jar 0.0.12
com.amazonaws : aws-java-sdk-sts jar 1.11.579

provided (1)

Group / Artifact Type Version
org.apache.spark : spark-core_2.12 jar 2.4.3

Project Modules

There are no modules declared in this project.

aws-kinesis-scala

Scala client for Amazon Kinesis with Apache Spark support.

For Apache Spark, reading from Kinesis is supported by the Spark Streaming Kinesis Integration, but writing to Kinesis is not. This library makes it possible to write Spark RDDs and Spark Streaming DStreams to Kinesis.

Installation

Add the following dependency to your build.sbt:

core only:

libraryDependencies += "jp.co.bizreach" %% "aws-kinesis-scala" % "0.0.12"

Spark integration:

libraryDependencies += "jp.co.bizreach" %% "aws-kinesis-spark" % "0.0.12"

Usage

First, create an AmazonKinesis client.

import com.amazonaws.ClientConfiguration
import com.amazonaws.auth.InstanceProfileCredentialsProvider
import com.amazonaws.regions.Regions
import jp.co.bizreach.kinesis._

implicit val region = Regions.AP_NORTHEAST_1

// use DefaultAWSCredentialsProviderChain
val client = AmazonKinesis()

// specify an explicit credentials provider
val client = AmazonKinesis(new InstanceProfileCredentialsProvider())

// specify an explicit client configuration
val client = AmazonKinesis(new ClientConfiguration().withProxyHost("proxyHost"))

// both
val client = AmazonKinesis(
  new InstanceProfileCredentialsProvider(),
  new ClientConfiguration().withProxyHost("proxyHost")
)

Then you can access Kinesis as follows:

val request = PutRecordRequest(
  streamName   = "streamName",
  partitionKey = "partitionKey",
  data         = "data".getBytes("UTF-8")
)

// no retry
client.putRecord(request)

// retries on failure, up to 3 times (SDK default)
client.putRecordWithRetry(request)
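The underlying Kinesis API also supports writing multiple records in one call. Assuming the library mirrors that API with a PutRecordsRequest and a PutRecordsEntry (these names and the corresponding client methods are assumptions, not confirmed by this README), a batch write might be sketched as:

```scala
// Sketch only: PutRecordsRequest / PutRecordsEntry and the putRecords
// methods are assumed to mirror the AWS PutRecords API
import jp.co.bizreach.kinesis._

val batch = PutRecordsRequest(
  streamName = "streamName",
  records    = Seq(
    PutRecordsEntry(partitionKey = "key-1", data = "data-1".getBytes("UTF-8")),
    PutRecordsEntry(partitionKey = "key-2", data = "data-2".getBytes("UTF-8"))
  )
)

// single attempt
client.putRecords(batch)

// retries failed entries, up to the SDK-default retry count
client.putRecordsWithRetry(batch)
```

Batching reduces per-request overhead when writing at volume; check the library's source for the actual request types before relying on these names.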

Amazon Kinesis Firehose

First, create an AmazonKinesisFirehose client.

import com.amazonaws.regions.Regions
import jp.co.bizreach.kinesisfirehose._

implicit val region = Regions.US_EAST_1

// use DefaultAWSCredentialsProviderChain
val client = AmazonKinesisFirehose()

// ... credentials provider and client configuration options as with Kinesis ...

Then you can access Kinesis Firehose as follows:

val request = PutRecordRequest(
  deliveryStreamName = "firehose-example",
  record             = "data".getBytes("UTF-8")
)

// no retry
client.putRecord(request)

// retries on failure, up to 3 times (SDK default)
client.putRecordWithRetry(request)
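Firehose likewise exposes a batch delivery API (PutRecordBatch). Assuming the library wraps it with a PutRecordBatchRequest taking raw record payloads (the request type and method names here are assumptions, not confirmed by this README), a sketch might look like:

```scala
// Sketch only: PutRecordBatchRequest and the putRecordBatch methods are
// assumed to mirror the AWS Firehose PutRecordBatch API
import jp.co.bizreach.kinesisfirehose._

val batch = PutRecordBatchRequest(
  deliveryStreamName = "firehose-example",
  records            = Seq(
    "data-1".getBytes("UTF-8"),
    "data-2".getBytes("UTF-8")
  )
)

// single attempt
client.putRecordBatch(batch)

// retries failed entries, up to the SDK-default retry count
client.putRecordBatchWithRetry(batch)
```

Verify the actual signatures against the library's source before use.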

Apache Spark

aws-kinesis-spark provides integration with Spark: for writing, it adds methods that work on any RDD.

Import jp.co.bizreach.kinesis.spark._ to gain the saveToKinesis method on your RDDs:

import com.amazonaws.regions.Regions
import org.apache.spark.rdd.RDD
import jp.co.bizreach.kinesis.spark._

val rdd: RDD[Map[String, Option[Any]]] = ...

rdd.saveToKinesis(
  streamName = "streamName",
  region     = Regions.AP_NORTHEAST_1,
  chunk      = 30
)

You can also write data to Kinesis from Spark Streaming with DStreams.

import org.apache.spark.streaming.dstream.DStream
import jp.co.bizreach.kinesis.spark._

val dstream: DStream[Map[String, Option[Any]]] = ...

dstream.foreachRDD { rdd =>
  rdd.saveToKinesis( ... )
}
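Putting the pieces together, a minimal end-to-end streaming sketch might look as follows. The socket source and the Map payload shape are placeholders chosen for illustration; the saveToKinesis parameters are the ones shown earlier in this README.

```scala
import com.amazonaws.regions.Regions
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import jp.co.bizreach.kinesis.spark._

val conf = new SparkConf().setAppName("kinesis-writer")
val ssc  = new StreamingContext(conf, Seconds(10))

// placeholder source: any DStream[Map[String, Option[Any]]] works
val dstream = ssc.socketTextStream("localhost", 9999)
  .map(line => Map("message" -> Option(line)))

// write each micro-batch to Kinesis
dstream.foreachRDD { rdd =>
  rdd.saveToKinesis(
    streamName = "streamName",
    region     = Regions.AP_NORTHEAST_1,
    chunk      = 30
  )
}

ssc.start()
ssc.awaitTermination()
```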
jp.co.bizreach

BizReach, Inc.

Versions
0.0.12
0.0.11
0.0.10
0.0.9
0.0.8
0.0.7
0.0.6
0.0.5
0.0.4