-   Linux - Software
-   -   Hadoop 1.0.3 Pipes ("Server failed to authenticate")

kfn 09-30-2012 05:31 PM

Hadoop 1.0.3 Pipes ("Server failed to authenticate")
Hi all!

I'm trying to run a pseudo-distributed C++ job with Hadoop Pipes (version 1.0.3) on Slackware 14 64-bit. The problem is that after the nodes initialize and the job starts, the OutputHandler throws an exception while waiting for authentication.
(line 188: here)

The output handler's waitForAuthentication() is called by org.apache.hadoop.mapred.pipes.Application
(line 149: here).

Therefore, after a few failed attempts the job fails with a "server failed to authenticate" message.

SSH seems to be working correctly, though. I tried putting the Hadoop folder in both /opt and /home/user, but saw no difference; the exact same error.

Any ideas? :redface:

Here's an example run:

12/09/30 18:35:26 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
12/09/30 18:35:26 INFO util.NativeCodeLoader: Loaded the native-hadoop library
12/09/30 18:35:26 WARN snappy.LoadSnappy: Snappy native library not loaded
12/09/30 18:35:26 INFO mapred.FileInputFormat: Total input paths to process : 1
12/09/30 18:35:27 INFO mapred.JobClient: Running job: job_201209301832_0001
12/09/30 18:35:28 INFO mapred.JobClient:  map 0% reduce 0%
12/09/30 18:35:40 INFO mapred.JobClient: Task Id : attempt_201209301832_0001_m_000000_0, Status : FAILED
        at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(
        at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(
        at org.apache.hadoop.mapred.pipes.Application.<init>(
        at org.apache.hadoop.mapred.MapTask.runOldMapper(
        at org.apache.hadoop.mapred.Child$
        at Method)
        at org.apache.hadoop.mapred.Child.main(

attempt_201209301832_0001_m_000000_0: Server failed to authenticate. Exiting

After a few failed attempts like the above, it terminates:

12/09/30 18:36:04 INFO mapred.JobClient: Job complete: job_201209301832_0001
12/09/30 18:36:04 INFO mapred.JobClient: Counters: 7
12/09/30 18:36:04 INFO mapred.JobClient:  Job Counters
12/09/30 18:36:04 INFO mapred.JobClient:    SLOTS_MILLIS_MAPS=42396
12/09/30 18:36:04 INFO mapred.JobClient:    Total time spent by all reduces waiting after reserving slots (ms)=0
12/09/30 18:36:04 INFO mapred.JobClient:    Total time spent by all maps waiting after reserving slots (ms)=0
12/09/30 18:36:04 INFO mapred.JobClient:    Launched map tasks=8
12/09/30 18:36:04 INFO mapred.JobClient:    Data-local map tasks=8
12/09/30 18:36:04 INFO mapred.JobClient:    SLOTS_MILLIS_REDUCES=0
12/09/30 18:36:04 INFO mapred.JobClient:    Failed map tasks=1
12/09/30 18:36:04 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201209301832_0001_m_000000
Exception in thread "main" Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(
        at org.apache.hadoop.mapred.pipes.Submitter.runJob(
        at org.apache.hadoop.mapred.pipes.Submitter.main(

kfn 10-04-2012 01:33 AM

Hello everybody!

After some googling and trial-and-error, I finally found the solution to this one. I suspect that other people may also come across this problem, so I'm posting it here.

I use Hadoop 1.0.3 (the tar.gz package, not the .deb or .rpm).
In my program's Makefile I was initially linking against the libraries in $(HADOOP_INSTALL)/c++/Linux-amd64-64/.
I actually had to recompile these from source (with a couple of tweaks first) and link against the new ones instead.

So, first of all, since I'm running Slackware64 14.0, I enabled multilib support.


1. Export a variable LIB=-lcrypto. (I actually put it in /etc/profile, so that I don't have to export it every time.)

2. In $(HADOOP_INSTALL)/src/c++/pipes/impl/ add

#include <unistd.h>
3. In $(HADOOP_INSTALL)/src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/, replace the two (2) lines described here.

4. In $(HADOOP_INSTALL)/src/c++/utils run

make install

5. In $(HADOOP_INSTALL)/src/c++/pipes run

make install

6. In the new Makefile, use

-L$(HADOOP_INSTALL)/src/c++/install/lib -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread
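Put together, the steps above can be sketched as a small shell script. This is only a sketch under assumptions: HADOOP_INSTALL stands for wherever the hadoop-1.0.3 tarball was unpacked, and the default path below is hypothetical.

```shell
# Sketch of steps 1-6 above. HADOOP_INSTALL is an assumed variable pointing
# at the unpacked hadoop-1.0.3 tarball (default below is hypothetical).
HADOOP_INSTALL=${HADOOP_INSTALL:-/opt/hadoop-1.0.3}
export LIBS=-lcrypto   # step 1 (a later reply notes the variable is LIBS)

# Steps 4 and 5: rebuild the C++ utils and pipes libraries from source.
# Commented out here because they need the real Hadoop source tree:
#   make -C "$HADOOP_INSTALL/src/c++/utils" install
#   make -C "$HADOOP_INSTALL/src/c++/pipes" install

# Step 6: the linker flags to put in the job's own Makefile.
PIPES_LDFLAGS="-L$HADOOP_INSTALL/src/c++/install/lib -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread"
echo "$PIPES_LDFLAGS"
```

Note that the rebuilt libraries land under src/c++/install/, not the prebuilt c++/Linux-amd64-64/ directory the stock Makefile pointed at.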

That was it. The program runs fine now. :cool:

Nishant Kelkar 09-14-2013 07:47 PM

Many thanks for this excellent post! It helped me resolve my issue, which was exactly the one you were having when you posted this. I'd also like to add that this is a common problem on Stack Overflow that hasn't been addressed with any clarity.

Thanks once again! :) :D


Nishant Kelkar

sudarsun 02-04-2014 02:19 AM

Brilliant, man. Thanks very much for your post.

I ran into the same problem on Linux Mint 12 x64. I only needed steps (1), (4), (5), and (6) to get my C++ wordcount application working. But instead of LIB=-lcrypto it should be LIBS=-lcrypto, and this can be passed as an argument to the ./configure script.
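For reference, autoconf-generated configure scripts accept VAR=value arguments, so the per-invocation form described above would look roughly like this. A sketch only, assuming the 1.0.3 tarball layout and a HADOOP_INSTALL variable pointing at it:

```shell
# Hypothetical invocation: pass LIBS on the configure command line instead
# of exporting it globally, then rebuild as in steps 4-5 of the earlier post.
# The cd/configure/make lines are commented out because they need the real
# Hadoop source tree; the variable just captures the command shown.
#   cd "$HADOOP_INSTALL/src/c++/pipes"
#   ./configure LIBS=-lcrypto
#   make install
cfg_cmd="./configure LIBS=-lcrypto"
echo "$cfg_cmd"
```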
