Setting Up a Multi-Node Hadoop Cluster Just Got Easy!


In this blog, we are going to embark on the journey of setting up a multi-node Hadoop cluster in a distributed environment.

So let's not waste any time and get started.
Here are the steps you need to perform.


1. Download and install Hadoop 2.7.3 on your local machine (single-node setup).
   Use Java: jdk1.8.0_111
2. Download Apache Spark.
   Choose Spark release: 1.6.2
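The two download steps above can be sketched as follows. The `archive.apache.org` mirror paths are assumptions (the post does not give a URL); any Apache mirror carrying these releases works, and you should adjust versions and paths to your environment.

```shell
# Sketch of the download step for Hadoop 2.7.3 and Spark 1.6.2.
# The archive.apache.org paths below are assumptions -- pick any
# Apache mirror you trust.
HADOOP_VERSION=2.7.3
SPARK_VERSION=1.6.2

HADOOP_URL="https://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}.tar.gz"
SPARK_URL="https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop2.6.tgz"

# On each node you would then fetch and unpack, e.g.:
#   wget "$HADOOP_URL" && tar -xzf "hadoop-${HADOOP_VERSION}.tar.gz" -C /opt
#   wget "$SPARK_URL"  && tar -xzf "spark-${SPARK_VERSION}-bin-hadoop2.6.tgz" -C /opt
echo "Hadoop archive: $HADOOP_URL"
echo "Spark archive:  $SPARK_URL"
```

Check your Java install first with `java -version`; the post assumes jdk1.8.0_111.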

1. Mapping the nodes

First of all, we have to edit the hosts file in the /etc/ folder on all nodes, specifying the IP address of each system followed by its host name.

# vi /etc/hosts

Enter the following lines in /etc/hosts, one per node (the IP addresses were not given in the excerpt; use your own):

<master-IP>    hadoop-master
<slave-2-IP>   hadoop-slave-2
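Instead of editing with vi, the mapping can also be appended with a heredoc. The IP addresses below are placeholders, not values from the post; on a real node you would write to /etc/hosts (with sudo), not the demo file used here.

```shell
# Append host mappings for the cluster nodes.
# 192.168.1.10 / 192.168.1.12 are placeholder addresses -- substitute
# your nodes' real IPs. Use /etc/hosts (with sudo) on a real node.
HOSTS_FILE=./hosts.demo

cat >> "$HOSTS_FILE" <<'EOF'
192.168.1.10   hadoop-master
192.168.1.12   hadoop-slave-2
EOF

# Afterwards every node should resolve the others by name, e.g.:
#   ping -c 1 hadoop-master
cat "$HOSTS_FILE"
```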


