As we know, Spark is a MapReduce-like cluster computing framework designed to make data analytics fast.
On the official Spark website, the latest release is Spark 0.7.3. However, this release requires Scala 2.9.3, so it will not work if you are using Scala 2.10.
Since Spark has not announced a Scala 2.10 compatible release yet, to build Spark with Scala 2.10 you have to download the latest code from here: Spark-Scala 2.10.
The installation instructions are the same as for the previous release.
Since the Spark API for this release is not available in any repository, if you want to use it in your project you need to:
1) Go to the Spark directory
2) Run sbt compile publish-local
3) Add
libraryDependencies += "org.spark-project" %% "spark-core" % "0.8.0-SNAPSHOT"
in the build.sbt of your Scala project.
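
For anyone following these steps, here is a minimal sketch of what the consuming project might look like. The project name, Scala version, and file layout below are my own assumptions for illustration, not something prescribed by Spark; sbt resolves the 0.8.0-SNAPSHOT artifact from the local Ivy repository that publish-local writes to, so no extra resolver is needed.

// build.sbt (sketch; project name and versions are assumed)
name := "spark-scala210-demo"

scalaVersion := "2.10.0"

libraryDependencies += "org.spark-project" %% "spark-core" % "0.8.0-SNAPSHOT"

To verify the wiring, a tiny word-count job can be run in local mode. Note that this pre-Apache branch still used the plain spark package (spark.SparkContext); later Apache releases renamed it to org.apache.spark.

// WordCount.scala (smoke test; assumes the pre-Apache package layout)
import spark.SparkContext
import spark.SparkContext._  // implicit conversions for pair RDD operations like reduceByKey

object WordCount {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "WordCount")  // run locally, no cluster needed
    val counts = sc.parallelize(Seq("spark", "scala", "spark"))
                   .map(word => (word, 1))
                   .reduceByKey(_ + _)
    counts.collect().foreach(println)
    sc.stop()
  }
}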
Or you can wait until they release version 0.8, right? 🙂
Yes, Alex. I have given an alternative for those who want to work with Scala 2.10.0. Of course, it would be great if Spark releases version 0.8 very soon.