Streaming with Apache Spark 2.0


Hello geeks, we discussed Apache Spark 2.0 with Hive in an earlier blog.

Now I am going to describe how we can use Spark to stream data.

First, we need to understand the new Spark Streaming architecture.

Spark 2.0 simplifies the streaming API and lets us access streamed data as a DataFrame or Dataset. With the new architecture, we can apply our business logic to streamed data through DataFrames. That is the simple idea behind the architecture.

So here we have two approaches to using Spark Streaming programmatically:

  • by using a predefined receiver, and
  • by creating a custom receiver

First, we will stream our data using a predefined receiver.

Add the following dependencies to your build.sbt:

"org.apache.spark" %% "spark-core" % "2.0.0",
"org.apache.spark" %% "spark-sql" % "2.0.0",
"org.apache.spark" %% "spark-hive" % "2.0.0",
"org.apache.spark" %% "spark-streaming" % "2.0.0"

Now, as we know, the entry point to Spark in the current version is SparkSession. So,
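a minimal sketch of creating one looks like this (the application name and the local master are assumptions for illustration; adjust them for your cluster):

import org.apache.spark.sql.SparkSession

// SparkSession is the single entry point to Spark 2.0
// (app name and master below are placeholders for this sketch)
val spark = SparkSession
  .builder()
  .appName("spark-streaming-example")
  .master("local[*]")
  .getOrCreate()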

Now you need a stream receiver:
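here is a minimal sketch using the predefined socket source; the host and port are assumptions, so point them at your own data source:

import org.apache.spark.sql.DataFrame

// Read a stream of lines from a TCP socket as an unbounded DataFrame
// (localhost:9999 is a placeholder for this sketch)
val streamedData: DataFrame = spark.readStream
  .format("socket")
  .option("host", "localhost")
  .option("port", 9999)
  .load()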

Now that we have the stream data, we can apply any business logic to the DataFrame.
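For example, a simple word count over the streamed lines, written to the console sink; the aggregation here is only a placeholder for your own business logic:

import spark.implicits._

// Example business logic: split incoming lines into words and count them
val words = streamedData.as[String].flatMap(_.split(" "))
val wordCounts = words.groupBy("value").count()

// Start the streaming query and print the running counts to the console
val query = wordCounts.writeStream
  .outputMode("complete")
  .format("console")
  .start()

query.awaitTermination()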

Find the demo code here.
Continue reading for Streaming with a Custom Receiver.

Thanks

