LinuxQuestions.org
Old 09-30-2012, 05:31 PM   #1
kfn
LQ Newbie
 
Registered: Sep 2012
Location: NYC
Distribution: Slackware64-current
Posts: 11

Rep: Reputation: Disabled
Hadoop 1.0.3 Pipes ("Server failed to authenticate")


Hi all!

I'm trying to run a pseudo-distributed C++ job on Hadoop Pipes (version 1.0.3) on Slackware 14 64-bit. The problem is that after the nodes initialize and the job starts, the OutputHandler throws an exception while waiting for authentication (line 188: here).

The output handler's waitForAuthentication() is called by org.apache.hadoop.mapred.pipes.Application (at line 149 here).

After a few failed attempts, the job therefore fails with a "Server failed to authenticate" message.

SSH seems to be working correctly, though. I tried putting the Hadoop folder in both /opt and /home/user, but saw no difference; exactly the same error.

Any ideas?


Here's an example run:
Code:
12/09/30 18:35:26 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
12/09/30 18:35:26 INFO util.NativeCodeLoader: Loaded the native-hadoop library
12/09/30 18:35:26 WARN snappy.LoadSnappy: Snappy native library not loaded
12/09/30 18:35:26 INFO mapred.FileInputFormat: Total input paths to process : 1
12/09/30 18:35:27 INFO mapred.JobClient: Running job: job_201209301832_0001
12/09/30 18:35:28 INFO mapred.JobClient:  map 0% reduce 0%
12/09/30 18:35:40 INFO mapred.JobClient: Task Id : attempt_201209301832_0001_m_000000_0, Status : FAILED
java.io.IOException
        at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
        at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
        at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
        at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

attempt_201209301832_0001_m_000000_0: Server failed to authenticate. Exiting
After a few failed attempts like the above, it terminates:
Code:
12/09/30 18:36:04 INFO mapred.JobClient: Job complete: job_201209301832_0001
12/09/30 18:36:04 INFO mapred.JobClient: Counters: 7
12/09/30 18:36:04 INFO mapred.JobClient:   Job Counters 
12/09/30 18:36:04 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=42396
12/09/30 18:36:04 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/09/30 18:36:04 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/09/30 18:36:04 INFO mapred.JobClient:     Launched map tasks=8
12/09/30 18:36:04 INFO mapred.JobClient:     Data-local map tasks=8
12/09/30 18:36:04 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/09/30 18:36:04 INFO mapred.JobClient:     Failed map tasks=1
12/09/30 18:36:04 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201209301832_0001_m_000000
Exception in thread "main" java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
        at org.apache.hadoop.mapred.pipes.Submitter.runJob(Submitter.java:248)
        at org.apache.hadoop.mapred.pipes.Submitter.run(Submitter.java:479)
        at org.apache.hadoop.mapred.pipes.Submitter.main(Submitter.java:494)

Last edited by kfn; 10-04-2012 at 01:35 AM.
 
Old 10-04-2012, 01:33 AM   #2
kfn
LQ Newbie
 
Registered: Sep 2012
Location: NYC
Distribution: Slackware64-current
Posts: 11

Original Poster
Rep: Reputation: Disabled

Hello everybody!

After some googling and trial and error, I finally found the solution to this one. I suspect other people may come across this problem too, so I'm posting it here.

I use Hadoop 1.0.3 (the tar.gz package, not the .deb or .rpm).
In my program's Makefile I was initially linking against the libraries in $(HADOOP_INSTALL)/c++/Linux-amd64-64/.
I actually had to recompile these from source, with a couple of tweaks first, and link against the new ones instead.

So, first of all, since I'm running Slackware64 14.0, I enabled multilib support.

Then

1. Export a variable LIB=-lcrypto. (I actually put it in /etc/profile, so that I don't have to export it every time.)

2. In $(HADOOP_INSTALL)/src/c++/pipes/impl/HadoopPipes.cc, add
Code:
#include <unistd.h>
3. In $(HADOOP_INSTALL)/src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java, replace the two (2) lines described here.

4. In $(HADOOP_INSTALL)/src/c++/utils run
Code:
./configure
make install
5. In $(HADOOP_INSTALL)/src/c++/pipes run
Code:
./configure
make install
6. In the new Makefile, use
Code:
-I$(HADOOP_INSTALL)/src/c++/install/include
-L$(HADOOP_INSTALL)/src/c++/install/lib -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread
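For illustration, a hypothetical minimal Makefile using these flags might look like the sketch below (wordcount and wordcount.cpp are placeholder names, and the install prefix only matches the layout described above):

```makefile
# Sketch only: HADOOP_INSTALL and the target/source names are assumptions.
HADOOP_INSTALL ?= /opt/hadoop-1.0.3
CXX       = g++
CPPFLAGS  = -I$(HADOOP_INSTALL)/src/c++/install/include
LDFLAGS   = -L$(HADOOP_INSTALL)/src/c++/install/lib
LDLIBS    = -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread

wordcount: wordcount.cpp
	$(CXX) $(CPPFLAGS) $< -o $@ $(LDFLAGS) $(LDLIBS)
```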
That was it. The program runs fine now.
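The steps above (minus the Gridmix.java edit) can be condensed into a shell sketch. HADOOP_INSTALL is assumed to point at the unpacked tar.gz; prepending the include at the top of the file is a simplification, adding it next to the other includes works just as well:

```shell
# Condensed sketch of steps 1, 2, 4 and 5 above; the default path is only an example.
export LIB=-lcrypto                      # step 1

SRC="${HADOOP_INSTALL:-/opt/hadoop-1.0.3}/src/c++"

# Step 2: make sure HadoopPipes.cc includes <unistd.h>.
PIPES_CC="$SRC/pipes/impl/HadoopPipes.cc"
if [ -f "$PIPES_CC" ] && ! grep -q '<unistd.h>' "$PIPES_CC"; then
    sed -i '1i #include <unistd.h>' "$PIPES_CC"
fi

# Steps 4 and 5: rebuild the utils and pipes libraries from source.
for d in utils pipes; do
    if [ -d "$SRC/$d" ]; then
        ( cd "$SRC/$d" && ./configure && make install )
    fi
done
```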
 
Old 09-14-2013, 07:47 PM   #3
Nishant Kelkar
LQ Newbie
 
Registered: Sep 2013
Posts: 3

Rep: Reputation: Disabled
Thumbs up Many thanks to this excellent solution!

I'd like to thank you for this excellent post! It helped me resolve my issue, which was exactly the one you were having when you posted this. I'd also like to add that this is a common problem on Stack Overflow that hasn't been addressed with any clarity.

Thanks once again!

Regards,

Nishant Kelkar
 
Old 02-04-2014, 02:19 AM   #4
sudarsun
LQ Newbie
 
Registered: Jun 2008
Posts: 2

Rep: Reputation: 0
Thumbs up

Brilliant, man. Thanks very much for your post.

I ran into the same problem on Linux Mint 12 x64 as well. I only needed steps (1), (4), (5), and (6) to get my C++ wordcount application working. But instead of using LIB=-lcrypto, it should be LIBS=-lcrypto, and this can be passed as an argument to the ./configure script.
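As a sketch, passing the flag directly to configure looks like this (the default path is an assumption based on the layout in the original post; adjust it to your tree):

```shell
# Sketch: pass LIBS directly to configure rather than exporting it beforehand.
PIPES="${HADOOP_INSTALL:-/opt/hadoop-1.0.3}/src/c++/pipes"
if [ -d "$PIPES" ]; then
    ( cd "$PIPES" && ./configure LIBS=-lcrypto && make install )
fi
```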
 
Old 08-16-2014, 12:26 AM   #5
udaya111
LQ Newbie
 
Registered: Feb 2014
Posts: 2

Rep: Reputation: Disabled
Issue while running pipes in hadoop-1.2.1

Hello everybody,

I tried the steps mentioned by kfn above.

Everything was clear up to the fourth step.

When I proceeded with the fifth step, it threw an error like this:


syscon@syscon-OptiPlex-3020:~/uday/hadoop-1.2.1/src/c++/pipes$ ./configure
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking for style of include used by make... GNU
checking for gcc... gcc
checking for C compiler default output file name... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking dependency style of gcc... gcc3
checking for special C compiler options needed for large files... no
checking for _FILE_OFFSET_BITS value needed for large files... no
checking how to run the C preprocessor... gcc -E
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking pthread.h usability... yes
checking pthread.h presence... yes
checking for pthread.h... yes
checking for pthread_create in -lpthread... yes
checking for HMAC_Init in -lssl... no
configure: error: Cannot find libssl.so
./configure: line 4809: exit: please: numeric argument required
./configure: line 4809: exit: please: numeric argument required



syscon@syscon-OptiPlex-3020:~/uday/hadoop-1.2.1/src/c++/pipes$ locate libssl.so
/home/syscon/uday/hadoop-1.2.1/c++/Linux-amd64-64/lib/libssl.so
/lib/x86_64-linux-gnu/libssl.so.0.9.8
/lib/x86_64-linux-gnu/libssl.so.1.0.0
/usr/lib/libssl.so
/usr/lib/x86_64-linux-gnu/libssl.so
/usr/lib/x86_64-linux-gnu/libssl.so.0.9.8
/usr/local/bin/libssl.so
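One diagnostic worth running (a sketch; the paths below are examples for a Debian/Mint-style layout, adjust them to the locate output above): configure's test looks for HMAC_Init in -lssl, but on many systems the HMAC_* symbols are exported by libcrypto, so the check can fail even though libssl.so is installed:

```shell
# Diagnostic sketch: show which OpenSSL library actually exports HMAC_Init.
for lib in /usr/lib/x86_64-linux-gnu/libssl.so /usr/lib/x86_64-linux-gnu/libcrypto.so; do
    if [ -e "$lib" ]; then
        echo "== $lib"
        nm -D "$lib" | grep HMAC_Init || echo "   (no HMAC_Init exported here)"
    fi
done
```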


Note: I've installed libssl.so on my PC, but it's still throwing the error.

Where in the configure file do I need to make a change in order to get it working?

Can someone help me, please?
 
  

