Multinode Hadoop cluster on VMware

+1 vote
619 views

I want to set up a multinode Hadoop cluster on VMware. Can anybody suggest some good material for doing so?

I have tried these instructions https://www.dropbox.com/s/05aurcp42asuktp/Chiu%20Hadoop%20Pig%20Install%20Instructions.docx to set up a single-node Hadoop cluster on VMware.

Now can anybody help me in creating a multinode Hadoop cluster (3 nodes) on VMware?

posted Dec 18, 2013 by Jagan Mishra


2 Answers

+2 votes
answer Dec 18, 2013 by Deepak Dasgupta
+1 vote

If I understand correctly, Hadoop and Big Data applications are highly I/O bound, so for processing performance you would generally prefer many small physical machines. Apart from that, setting up Hadoop on VMware raises no issues that would not also occur on physical installations.

Depending on your performance needs, with three virtual machines you would probably go with one dedicated master and two slave nodes. The master should run the NameNode and ResourceManager; each slave runs a DataNode and a NodeManager. Other Hadoop daemons are optional.
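As a rough sketch of the configuration that layout implies (assuming a Hadoop 2.x install on each VM and the hypothetical hostnames master, slave1, slave2 — substitute your own):

```shell
#!/bin/sh
# Sketch: generate minimal conf files for a 1-master / 2-slave cluster.
# Hostnames master, slave1, slave2 are assumptions -- replace with yours.
HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-./hadoop-conf}
mkdir -p "$HADOOP_CONF_DIR"

# Point every node at the NameNode running on the master.
cat > "$HADOOP_CONF_DIR/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
</configuration>
EOF

# With only two DataNodes, a replication factor above 2 cannot be met.
cat > "$HADOOP_CONF_DIR/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
</configuration>
EOF

# The slaves file lists the machines that run DataNode/NodeManager.
printf 'slave1\nslave2\n' > "$HADOOP_CONF_DIR/slaves"
```

From there you would copy the conf directory to all three VMs, format the NameNode once with `hdfs namenode -format`, and start the daemons from the master with `start-dfs.sh` and `start-yarn.sh`.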

answer Dec 19, 2013 by Seema Siddique
Similar Questions
+1 vote

I am trying to set up a Hadoop cluster and would like to know how many VMs are needed for this. My main interest is to measure the shuffle-phase network traffic.

What are the basic requirements, such as the NameNode and DataNodes?

+3 votes

In my cluster, I want to have multiple users for different purposes. The usual method is to add a user through the OS on the Hadoop NameNode.

I notice Hadoop also supports LDAP. Could I add a user through LDAP instead of through the OS, so that a user authenticated by LDAP can also access the HDFS directory?

+2 votes

I am running VMware Player on CentOS 5.4 and it is working fine. However, it does not allow me to increase the RAM beyond 3 GB. It keeps throwing an error stating: "Requested memory size is greater than allowed maximum of 3072 MB. Could not initiate memory hot plug."

I understand from a few threads that a 32-bit OS has this kind of limitation, but I am unable to understand why I am seeing this issue when I am using a 64-bit OS and VMware Player is also the 64-bit build.

+1 vote

If I run Windows XP in a VM, am I exposed to Windows security risks? I have whatever protection Fedora 19 delivers out of the box and my DD-WRT router. I am still uneasy booting Windows ... There are no Windows boxes on my LAN, only Apple machines in addition to mine.

+2 votes

Suppose we change the default block size to 32 MB and the replication factor to 1, and the Hadoop cluster consists of 4 DataNodes (DNs). The input data size is 192 MB. Now I want to place the data on the DNs as follows: DN1 and DN2 hold 2 blocks (32+32 = 64 MB) each, and DN3 and DN4 hold 1 block (32 MB) each. Is this possible? How can I accomplish it?
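The arithmetic behind that target layout can be checked quickly (a plain shell sketch using only the numbers from the question):

```shell
#!/bin/sh
# Check that the requested per-DataNode layout accounts for every block.
BLOCK_MB=32        # proposed block size
INPUT_MB=192       # input data size
REPLICATION=1      # each block stored exactly once

BLOCKS=$(( INPUT_MB / BLOCK_MB ))      # 192 / 32 = 6 blocks
COPIES=$(( BLOCKS * REPLICATION ))     # 6 copies to place
WANTED=$(( 2 + 2 + 1 + 1 ))            # DN1..DN4 as requested

echo "$BLOCKS blocks, $COPIES copies to place, layout holds $WANTED"
```

So the layout is consistent in total size, but note that HDFS's default block placement policy does not let a client pin individual blocks to specific DataNodes; achieving an exact per-node layout would, as far as I know, require a custom block placement policy on the NameNode side.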

...