The archive file created in Hadoop always has the extension of

0 votes
832 views
A. hrc
B. har
C. hrh
D. hrar

Correct Option: B (har)
posted Nov 30, 2017 by anonymous
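
For context, Hadoop archives are created with the hadoop archive tool, and the resulting file always carries the .har extension. A minimal invocation looks like this (the archive name and paths are placeholders):

hadoop archive -archiveName files.har -p /user/hadoop/input /user/hadoop/output

The archive can then be read through the har:// filesystem scheme, e.g. hadoop fs -ls har:///user/hadoop/output/files.har.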


Similar Questions
+2 votes

I have a working installation of Java 7 and can execute Java programs on the workstation. However, when I start HDFS, the startup process aborts with a message that JAVA_HOME is not set.

OS: Ubuntu 13.04 (Raring Ringtail)
Hadoop version: 2.1.1-beta
Java version: java-7-openjdk-amd64
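
The usual cause (an assumption here, since the poster does not say how Hadoop was launched) is that the start scripts run over ssh and do not inherit JAVA_HOME from the login shell, so it has to be set explicitly in etc/hadoop/hadoop-env.sh. The path below matches the java-7-openjdk-amd64 package named above; adjust it on other systems:

# etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64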

+2 votes
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MaxMinReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    // Running state, accumulated across every key this reducer instance sees.
    private int max_sum = 0;
    private int min_sum = Integer.MAX_VALUE;
    private int mean = 0;
    private int count = 0;
    private final Text max_occured_key = new Text();
    private final Text min_occured_key = new Text();
    private final Text mean_key = new Text("Mean : ");
    private final Text count_key = new Text("Count : ");

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Total occurrences of the current word.
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
            count++;
        }

        // Remember the word with the smallest total so far.
        if (sum < min_sum) {
            min_sum = sum;
            min_occured_key.set(key);
        }

        // Remember the word with the largest total so far.
        if (sum > max_sum) {
            max_sum = sum;
            max_occured_key.set(key);
        }

        // Parentheses matter: without them only min_sum is divided by count.
        // Note this is still integer division.
        mean = (max_sum + min_sum) / count;
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        context.write(max_occured_key, new IntWritable(max_sum));
        context.write(min_occured_key, new IntWritable(min_sum));
        context.write(mean_key, new IntWritable(mean));
        context.write(count_key, new IntWritable(count));
    }
}

Here I am computing the minimum, maximum, and mean of the word counts.

My input file:

high low medium high low high low large small medium

Expected output is:

high - 3    (maximum)
low - 3     (maximum)
large - 1   (minimum)
small - 1   (minimum)

But I am not getting the above output. Can anyone please help me?
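
Note that the expected output lists two maximum keys (high and low, both 3) and two minimum keys, while the reducer above keeps only a single max_occured_key and min_occured_key, so when two words tie, only one of them is reported. One way to keep ties is to track a list of keys per extreme; the sketch below is illustrative only (class and method names are mine, not from the post):

import java.util.ArrayList;
import java.util.List;

public class MaxTieTracker {
    private int maxSum = 0;
    private final List<String> maxKeys = new ArrayList<String>();

    // Call once per key with that key's total; every tied key is kept.
    public void offer(String key, int sum) {
        if (sum > maxSum) {
            maxSum = sum;       // new maximum: restart the tie list
            maxKeys.clear();
            maxKeys.add(key);
        } else if (sum == maxSum) {
            maxKeys.add(key);   // equal to the current maximum: a tie
        }
    }

    public int getMaxSum() { return maxSum; }
    public List<String> getMaxKeys() { return maxKeys; }
}

The minimum side is the mirror image, starting from Integer.MAX_VALUE; in cleanup() the reducer would then write one line per key in each list, which is what the expected output shows.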

+1 vote

The original local file has execute permission, and it was distributed to multiple NodeManager nodes with the DistributedCache feature of Hadoop-2.2.0, but the distributed copy has lost the execute permission.

However, I did not encounter this issue in Hadoop-1.1.1.

Why did this happen? Were there changes to the dfs.umask option or related settings?
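
One possible workaround (an assumption on my part, not something the poster confirmed) is to restore the execute bit on the localized file before invoking it, using java.io.File directly; the file name here is hypothetical:

import java.io.File;

public class RestoreExecBit {
    public static void main(String[] args) {
        // "myscript.sh" stands in for the localized cache file, which the
        // DistributedCache symlinks into the container's working directory.
        File script = new File("myscript.sh");
        if (!script.canExecute()) {
            // Second argument false grants the execute bit to all users,
            // not only the file owner.
            script.setExecutable(true, false);
        }
    }
}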

0 votes

I have just realized that my installation of hadoop-2.4.1 is pulling in all of the *-default.xml files.

I have three copies of each in different directories, so obviously at least one of them is on the classpath.

Anyway, after all the effort of setting up a site, it seems strange to me that I would be running with settings I had no idea existed, and which may not be how I would have chosen to configure them.
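
To see where any effective setting actually comes from, Configuration can report the resource that supplied each property. A small sketch, assuming Hadoop 2.x's Configuration.getPropertySources; fs.defaultFS is just an example property:

import org.apache.hadoop.conf.Configuration;

public class WhereFrom {
    public static void main(String[] args) {
        Configuration conf = new Configuration(); // loads core-default.xml, then core-site.xml
        String name = "fs.defaultFS";             // property to trace
        String[] sources = conf.getPropertySources(name);
        System.out.println(name + " = " + conf.get(name));
        if (sources != null) {
            // Sources are listed oldest first; the last one set the final value.
            for (String source : sources) {
                System.out.println("  set by: " + source);
            }
        }
    }
}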

...