Building Apache Spark with Scala 2.10

As we know, Spark is a MapReduce-like cluster computing framework designed to make data analytics fast.
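
To get a feel for what this means in practice, here is a minimal word count in Scala. This is only a sketch: it assumes the pre-Apache package layout (spark.SparkContext) used by the 0.7.x/0.8.0-SNAPSHOT line, a local master, and an input.txt file in the working directory.

import spark.SparkContext
import spark.SparkContext._   // implicit conversions that enable reduceByKey on pair RDDs

object WordCount {
  def main(args: Array[String]) {
    // "local" runs Spark in-process, so no cluster is needed to try this out
    val sc = new SparkContext("local", "WordCount")
    val counts = sc.textFile("input.txt")      // assumed sample input file
                   .flatMap(_.split("\\s+"))   // split each line into words
                   .map(word => (word, 1))     // pair each word with a count of 1
                   .reduceByKey(_ + _)         // sum the counts per word
    counts.collect().foreach(println)
    sc.stop()
  }
}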

On the official website of Spark, the latest release is Spark 0.7.3. But this release requires Scala 2.9.3, so if you are using Scala 2.10, it will not work.

Since Spark has not announced any Scala 2.10 compatible release yet, to build Spark with Scala 2.10 you have to download the latest code from here: Spark-Scala 2.10.

The installation instructions are the same as for the previous release.

Since the Spark API for this release is not available in any public repository, if you want to use it in your project you need to do the following (a concrete sketch of these steps follows the list):

1) Go to the Spark directory.

2) Run sbt compile publish-local to publish the Spark artifacts to your local Ivy repository.

3) Add
libraryDependencies += "org.spark-project" %% "spark-core" % "0.8.0-SNAPSHOT"
to the build.sbt of your Scala project.
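
Concretely, steps 1 and 2 might look like this on the command line (the directory name is an assumption; use wherever you unpacked the download):

$ cd spark-scala-2.10
$ sbt compile publish-local   # publishes spark-core to the local Ivy repository (~/.ivy2/local)

And a minimal build.sbt for a project that uses the locally published artifact might look like this; the project name and Scala patch version are assumptions:

name := "my-spark-app"   // hypothetical project name

scalaVersion := "2.10.0" // match the Scala version Spark was built with

// resolved from the local Ivy repository populated by publish-local
libraryDependencies += "org.spark-project" %% "spark-core" % "0.8.0-SNAPSHOT"

sbt resolves the local Ivy repository by default, so no extra resolver is needed for the locally published snapshot.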

