In this article we will learn how to run Apache Spark on Mesos, so let's get started. All you need as a prerequisite is Spark on your machine. Here are the steps to configure Mesos:
1. Download the latest Mesos release from here.
2. Extract the archive.
3. Install the Mesos build dependencies.
$ sudo apt-get -y install build-essential python-dev python-six python-virtualenv libcurl4-nss-dev libsasl2-dev libsasl2-modules maven libapr1-dev libsvn-dev
4. Install zlib, which is also required to build Mesos.
$ sudo apt install zlib1g-dev
5. Configure and build Mesos. From the extracted Mesos source directory:
$ mkdir build
$ cd build
$ ../configure
$ make
$ sudo make install
This installs libmesos.so into /usr/local/lib.
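The whole sequence, from download to install, can be sketched as a single script. The version number and mirror URL below are assumptions for illustration; substitute the release you actually downloaded in step 1:

```shell
# Hedged sketch: build Mesos from a source tarball.
# MESOS_VERSION and the download URL are placeholders; use your release.
MESOS_VERSION=1.4.0
wget https://archive.apache.org/dist/mesos/${MESOS_VERSION}/mesos-${MESOS_VERSION}.tar.gz
tar -xzf mesos-${MESOS_VERSION}.tar.gz
cd mesos-${MESOS_VERSION}
mkdir build && cd build
../configure          # checks for the dependencies installed above
make                  # can take a while; add -j4 to parallelize
sudo make install     # installs libmesos.so into /usr/local/lib
```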
6. Start the Mesos master and slave.
$ cd build
# start the master
$ ./bin/mesos-master.sh --ip=127.0.0.1 --work_dir=/tmp/mesos
# start the slave
$ ./bin/mesos-slave.sh --master=127.0.0.1:5050 --work_dir=/tmp/mesos
Navigate to the Mesos UI at localhost:5050 to check whether it is working; on the Agents (workers) tab there should be 1 active agent.
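Besides the web UI, you can also confirm registration from the command line by querying the master's /state HTTP endpoint. This assumes the master is running on 127.0.0.1:5050 as above and that jq is installed:

```shell
# Query the master's /state endpoint; "activated_slaves" should be 1
# once the agent from step 6 has registered. Assumes jq is installed.
curl -s http://127.0.0.1:5050/state | jq '.activated_slaves'
```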
7. Now go to $SPARK_HOME/conf and add the following parameters to your spark-env.sh:
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=/path/to/spark-2.2.0-bin-hadoop2.7.tgz
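Alternatively, if you prefer not to export environment variables, the same settings can go in conf/spark-defaults.conf as standard Spark properties (the path below is a placeholder matching the one above):

```shell
# conf/spark-defaults.conf -- an alternative to the spark-env.sh exports.
# spark.executor.uri tells Mesos agents where to fetch the Spark
# distribution; replace the placeholder path with your own tarball.
spark.master        mesos://127.0.0.1:5050
spark.executor.uri  /path/to/spark-2.2.0-bin-hadoop2.7.tgz
```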
8. Start the Spark shell with Mesos as the master:
$ ./bin/spark-shell --master mesos://127.0.0.1:5050
Go to your Mesos UI; the Spark shell should now appear as an active framework.
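To confirm that work is actually being scheduled onto Mesos, run a trivial job, either non-interactively with the bundled SparkPi example or by piping an expression into the shell. The examples jar path below assumes the spark-2.2.0-bin-hadoop2.7 tarball layout named earlier:

```shell
# Smoke test 1: run the bundled SparkPi example against the Mesos master.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://127.0.0.1:5050 \
  examples/jars/spark-examples_2.11-2.2.0.jar 100

# Smoke test 2: pipe a one-liner into the interactive shell;
# it should print the sum of 1..100.
echo 'println(sc.parallelize(1 to 100).sum)' | ./bin/spark-shell --master mesos://127.0.0.1:5050
```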