Spark On Mesos(Installation)


In this article we will learn how to run Spark on Mesos, so let's get started. All you need as a prerequisite is Spark installed on your machine. Here are the steps to configure it:

1. Download the latest Mesos release from the Apache Mesos downloads page.

2. Extract the downloaded archive.
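Steps 1 and 2 can be sketched as follows (the version number here is an assumption; substitute the latest release from the downloads page):

```shell
# Fetch and unpack a Mesos source release (1.4.0 is assumed; use the latest)
MESOS_VERSION=1.4.0
MESOS_TARBALL="mesos-${MESOS_VERSION}.tar.gz"
wget "https://archive.apache.org/dist/mesos/${MESOS_VERSION}/${MESOS_TARBALL}"
tar -xzf "$MESOS_TARBALL"
cd "mesos-${MESOS_VERSION}"
```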

3. Install the Mesos dependencies.

$ sudo apt-get -y install build-essential python-dev python-six python-virtualenv libcurl4-nss-dev libsasl2-dev libsasl2-modules maven libapr1-dev libsvn-dev

4. Install zlib, which is required to build Mesos.

$ sudo apt install zlib1g-dev

5. Configure and build Mesos (run these from the extracted Mesos directory):

$ mkdir build
$ cd build
$ ../configure
$ make
$ sudo make install

This will create the libmesos.so file in the /usr/local/lib folder.
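A quick way to confirm the build landed where Spark will look for it (the path is the one produced by the install step above):

```shell
# Sanity-check that the native library exists at the expected path
MESOS_LIB=/usr/local/lib/libmesos.so
[ -f "$MESOS_LIB" ] && echo "found: $MESOS_LIB" || echo "missing: $MESOS_LIB"
```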

6. Start the Mesos master and slave.

$ cd build
# start master
$ ./bin/mesos-master.sh --ip=127.0.0.1 --work_dir=/tmp/mesos
# start slave
$ ./bin/mesos-slave.sh --master=127.0.0.1:5050 --work_dir=/tmp/mesos

Navigate to the Mesos UI at localhost:5050 to check whether it is working; under the Agents (workers) tab there should be one active worker.
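The same check can be scripted against the master's HTTP endpoints (addresses are the local ones used above; this assumes the master from step 6 is running):

```shell
# Query the master over HTTP; the UI at :5050 renders this same state
MASTER_URL=http://127.0.0.1:5050
curl -sf "${MASTER_URL}/health" && echo "master is up"
curl -s "${MASTER_URL}/master/state" | grep -o '"activated_slaves":[0-9]*'
```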


7. Now go to $SPARK_HOME/conf and add the following parameters to your spark-env.sh:

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=/path/to/spark-2.2.0-bin-hadoop2.7.tgz
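Equivalently, the two lines can be appended from the shell (SPARK_HOME is assumed to point at your Spark install; the executor URI path is the placeholder from above):

```shell
# Append the Mesos settings to spark-env.sh
cat >> "${SPARK_HOME}/conf/spark-env.sh" <<'EOF'
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=/path/to/spark-2.2.0-bin-hadoop2.7.tgz
EOF
```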

8. Start the Spark shell with Mesos as the master:

$ ./bin/spark-shell --master mesos://127.0.0.1:5050


Go to your Mesos UI; here is my active task. From the Sandbox link you can find the complete logs of your slave.
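Beyond the interactive shell, batch jobs can be submitted against the same master; a sketch using the SparkPi example bundled with Spark 2.2.0 (the jar path is assumed relative to your Spark directory):

```shell
# Submit the bundled SparkPi example to Mesos (run from the Spark install directory)
MESOS_MASTER=mesos://127.0.0.1:5050
./bin/spark-submit \
  --master "$MESOS_MASTER" \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples_2.11-2.2.0.jar 100
```

The submitted job then shows up as a framework task in the Mesos UI, just like the shell did.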

I hope this blog helps.




 
