Hadoop 1.0.3 Pipes ("Server failed to authenticate")
I'm trying to run a pseudo-distributed C++ job with Hadoop Pipes (version 1.0.3) on Slackware 14 64-bit. The problem is that after the nodes initialize and the job starts, the OutputHandler throws an exception while waiting for authentication (OutputHandler, line 188). waitForAuthentication() is called by org.apache.hadoop.mapred.pipes.Application (line 149).
Therefore, after a few failed attempts the job fails with a "server failed to authenticate" message.
SSH seems to be working correctly, though. I tried putting the Hadoop folder both in /opt and in /home/user, but it made no difference: same exact error.
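For reference, I'm submitting the job with the standard 1.x pipes launcher; the input/output paths and the program name below are placeholders, not my exact command:

    # Standard Hadoop 1.x pipes invocation (placeholder paths/program).
    bin/hadoop pipes \
        -D hadoop.pipes.java.recordreader=true \
        -D hadoop.pipes.java.recordwriter=true \
        -input input -output output \
        -program bin/mytask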
Any ideas?
After some googling and trial and error, I finally found the solution to this one. I suspect other people may also come across this problem, so I'm posting it here.
I use Hadoop 1.0.3 (the tar.gz package, not the .deb or .rpm).
In my program's Makefile I was initially linking against the prebuilt libraries in $(HADOOP_INSTALL)/c++/Linux-amd64-64/.
It turned out I had to recompile them from source, with a couple of tweaks first, and link against the new ones instead (see the sketches after the steps below).
So, first of all, since I'm running Slackware64 14.0, I enabled multilib support. Then:
1. Export the variable LIBS=-lcrypto. (I actually set it in /etc/profile, so that I don't have to export it every time.)
2. In $(HADOOP_INSTALL)/src/c++/pipes/impl/HadoopPipes.cc, add the tweak noted in the sketch below.
4. In $(HADOOP_INSTALL)/src/c++/utils, run the build commands shown in the sketch below.
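Roughly, the whole rebuild looks like this. This is a sketch rather than my exact history: the src/c++/install prefix is the stock default for these configure scripts as far as I recall, the HadoopPipes.cc tweak in the comment is the one commonly needed on newer GCC, and I'm assuming the pipes library gets the same configure/make treatment as utils, since both ship in Linux-amd64-64:

    # Sketch of the rebuild (assumptions noted above).
    # Step 2's tweak is commonly adding '#include <unistd.h>' near the
    # top of impl/HadoopPipes.cc, which newer GCC needs to compile it.
    export LIBS=-lcrypto             # step 1: configure's tests link against libcrypto

    cd $HADOOP_INSTALL/src/c++/utils
    ./configure && make install      # installs under src/c++/install by default

    cd ../pipes                      # same rebuild for the pipes library
    ./configure && make install

After that, the program's Makefile links against the freshly built libraries instead of the prebuilt Linux-amd64-64 ones. As a plain command line (the file names here are placeholders):

    # Link against the rebuilt libraries; mytask.cc is a placeholder name.
    g++ -I$HADOOP_INSTALL/src/c++/install/include mytask.cc \
        -L$HADOOP_INSTALL/src/c++/install/lib \
        -lhadooppipes -lhadooputils -lpthread -lcrypto \
        -o mytask

Note the -lcrypto at the end: the pipes library uses OpenSSL's HMAC for the authentication handshake, which is exactly the part that was failing.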
Many thanks for this excellent solution!
Thank you for this excellent post! It helped me resolve my issue, which was exactly the one you were having when you posted this. I'd also like to add that this is a common problem on Stack Overflow that hasn't been addressed with any clarity.
Thanks once again! :) :D