As we know, Spark is a MapReduce-like cluster computing framework designed to make data analytics fast.
The installation instructions are the same as for the previous release.
Since the Spark API for this release is not available in any public repository, if you want to use it in your project you need to:
1) Go to the Spark directory and compile and publish the artifacts to your local repository:

sbt compile publish-local

2) Add the dependency to the build.sbt of your Scala project (a full build.sbt sketch follows below):

libraryDependencies += "org.spark-project" %% "spark-core" % "0.8.0-SNAPSHOT"
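For reference, here is a minimal build.sbt sketch that pulls in the locally published artifact. The project name and version are placeholders, and the scalaVersion is an assumption: it has to match the Scala version your local Spark build was compiled with (the 0.8.x line of Spark targeted Scala 2.9.3). No extra resolver is needed, because publish-local writes to the local Ivy repository, which sbt already consults by default.

name := "my-spark-app"  // placeholder project name

version := "0.1"  // placeholder version

scalaVersion := "2.9.3"  // assumption: must match the Scala version used by the local Spark build

libraryDependencies += "org.spark-project" %% "spark-core" % "0.8.0-SNAPSHOT"

With this in place, running sbt update in your project should resolve spark-core from the local Ivy cache instead of trying a remote repository.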