Issue while running Hadoop Pipes in Hadoop 1.2.1

2014-08-16T13:53:05

Hello everybody,

Earlier, I was getting the following error while running C++ binaries in Hadoop:

 syscon@syscon-OptiPlex-3020:~/uday/hadoop-1.2.1$ bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input /dft1 -output dft1 -program /bin/wordcount

14/08/16 11:11:12 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/08/16 11:11:12 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/08/16 11:11:12 WARN snappy.LoadSnappy: Snappy native library not loaded
14/08/16 11:11:12 INFO mapred.FileInputFormat: Total input paths to process : 0
14/08/16 11:11:12 INFO mapred.JobClient: Running job: job_201408161011_0003
14/08/16 11:11:13 INFO mapred.JobClient:  map 0% reduce 0%
14/08/16 11:11:20 INFO mapred.JobClient: Task Id : attempt_201408161011_0003_r_000000_0, Status : FAILED
java.io.IOException
    at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
    at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
    at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
    at org.apache.hadoop.mapred.pipes.PipesReducer.startApplication(PipesReducer.java:81)
    at org.apache.hadoop.mapred.pipes.PipesReducer.close(PipesReducer.java:107)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:532)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)

After a few failed attempts like the above, it terminates:

attempt_201408161011_0003_r_000000_2: Server failed to authenticate. Exiting
14/08/16 11:11:37 INFO mapred.JobClient: Job complete: job_201408161011_0003
14/08/16 11:11:37 INFO mapred.JobClient: Counters: 6
14/08/16 11:11:37 INFO mapred.JobClient:   Job Counters 
14/08/16 11:11:37 INFO mapred.JobClient:     Launched reduce tasks=4
14/08/16 11:11:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=2096
14/08/16 11:11:37 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/08/16 11:11:37 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/08/16 11:11:37 INFO mapred.JobClient:     Failed reduce tasks=1
14/08/16 11:11:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=21447
14/08/16 11:11:37 INFO mapred.JobClient: Job Failed: # of failed Reduce Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201408161011_0003_r_000000
Exception in thread "main" java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
    at org.apache.hadoop.mapred.pipes.Submitter.runJob(Submitter.java:248)
    at org.apache.hadoop.mapred.pipes.Submitter.run(Submitter.java:479)
    at org.apache.hadoop.mapred.pipes.Submitter.main(Submitter.java:494)
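
For context, the wordcount binary I am submitting is essentially the standard Hadoop Pipes wordcount example; a minimal sketch of what my source looks like (the usual example code, compiled against the c++ libraries that ship with hadoop-1.2.1):

#include <stdint.h>
#include <string>
#include <vector>

#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"
#include "hadoop/StringUtils.hh"

// Mapper: emit ("word", "1") for every whitespace-separated token in the line.
class WordCountMap : public HadoopPipes::Mapper {
public:
  WordCountMap(HadoopPipes::TaskContext& context) {}
  void map(HadoopPipes::MapContext& context) {
    std::vector<std::string> words =
        HadoopUtils::splitString(context.getInputValue(), " ");
    for (size_t i = 0; i < words.size(); ++i) {
      context.emit(words[i], "1");
    }
  }
};

// Reducer: sum the counts emitted for each word.
class WordCountReduce : public HadoopPipes::Reducer {
public:
  WordCountReduce(HadoopPipes::TaskContext& context) {}
  void reduce(HadoopPipes::ReduceContext& context) {
    int sum = 0;
    while (context.nextValue()) {
      sum += HadoopUtils::toInt(context.getInputValue());
    }
    context.emit(context.getInputKey(), HadoopUtils::toString(sum));
  }
};

int main(int argc, char* argv[]) {
  // Runs the map/reduce tasks handed to this binary by the Pipes framework.
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<WordCountMap, WordCountReduce>());
}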

In order to solve this issue, I tried searching Google for a solution. While searching:

* I found the link below, which seemed useful for fixing the issue.

* As per the given solution, I tried running all the steps mentioned in the link (my rough paraphrase of them follows the link):

  http://www.linuxquestions.org/questions/showthread.php?p=5221898#post5221898
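
(My rough paraphrase of those steps, which rebuild the native Pipes libraries from source; the exact commands and step numbering are in the link:)

cd ~/uday/hadoop-1.2.1/src/c++/utils
./configure
make install
cd ../pipes
./configure          # step 5: this is where it fails for me
make install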

* Up to the 4th step, everything was clear.

* When I proceeded with the fifth step (./configure under src/c++/pipes), it threw an error like this:


syscon@syscon-OptiPlex-3020:~/uday/hadoop-1.2.1/src/c++/pipes$ ./configure
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking for style of include used by make... GNU
checking for gcc... gcc
checking for C compiler default output file name... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking dependency style of gcc... gcc3
checking for special C compiler options needed for large files... no
checking for _FILE_OFFSET_BITS value needed for large files... no
checking how to run the C preprocessor... gcc -E
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking pthread.h usability... yes
checking pthread.h presence... yes
checking for pthread.h... yes
checking for pthread_create in -lpthread... yes
checking for HMAC_Init in -lssl... no
configure: error: Cannot find libssl.so
./configure: line 4809: exit: please: numeric argument required
./configure: line 4809: exit: please: numeric argument required



syscon@syscon-OptiPlex-3020:~/uday/hadoop-1.2.1/src/c++/pipes$ locate libssl.so
/home/syscon/uday/hadoop-1.2.1/c++/Linux-amd64-64/lib/libssl.so
/lib/x86_64-linux-gnu/libssl.so.0.9.8
/lib/x86_64-linux-gnu/libssl.so.1.0.0
/usr/lib/libssl.so
/usr/lib/x86_64-linux-gnu/libssl.so
/usr/lib/x86_64-linux-gnu/libssl.so.0.9.8
/usr/local/bin/libssl.so

Note: libssl.so is installed on my PC (see the locate output above), but configure still throws the error.

Where do I need to make a change in the configure file in order to get it working?
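
From my searching so far, the check that fails is configure's test for HMAC_Init in -lssl. As far as I can tell, HMAC_Init is actually defined in libcrypto rather than libssl, so these are the things I am planning to try (guesses based on my searching, not confirmed fixes):

sudo apt-get install libssl-dev    # make sure the OpenSSL headers and the linker's libssl.so are present
./configure LIBS=-lcrypto          # add libcrypto so the HMAC_Init link test can resolve

or, failing that, editing the HMAC_Init check in configure so that it passes -lcrypto alongside -lssl.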

Can someone help me please?

Copyright License:
Author: 「user3532122」, reproduced under the CC 4.0 BY-SA copyright license with link to original source & disclaimer.
Link: https://stackoverflow.com/questions/25337363/issue-while-running-hadoop-pipes-in-hadoop-1-2-1
