Other *NIX: This forum is for the discussion of any UNIX platform that does not have its own forum. Examples include HP-UX, IRIX, Darwin, Tru64 and OS X.
I'm just starting to use Linux and am trying to install Hadoop onto my computer. I downloaded the Hadoop file from a mirror recommended by Apache. The file is hadoop-2.2.0.tar.gz. Every website I have read says that the command "tar -xzvf hadoop-2.2.0.tar.gz" should unpack the file, but I keep getting the error "Error opening archive: Failed to open 'hadoop-2.2.0.tar.gz'".
I have tried both xzf and xvzf; neither solves the problem.
Was I perhaps supposed to do something before I unpacked the file? What is going on?
(My computer is running OSX 10.9.3)
Note: I just started using this forum so I wasn't sure which category I should have posted this thread under; excuse me if this is the wrong category.
not to sound snarky, but if you don't know the file command, how do you expect to administer a multi-node hadoop cluster with a distributed filesystem (in fact i'm not even convinced hadoop can be installed on mac -- i just looked it up on duckduckgo: https://wiki.apache.org/hadoop/Runni...ode_Cluster%29)?
what are you trying to do? i am not trying to dissuade you, but perhaps you are going about it the wrong way?
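on the original error: a quick sanity check with file will tell you whether the download is actually a gzipped tar, or a saved HTML error page from the mirror (a common cause of that failure). a sketch, run from the directory you downloaded into:

```shell
# Ask what the file actually contains; a healthy download reports
# "gzip compressed data". "HTML document" or "ASCII text" means the
# mirror gave you an error page instead of the archive.
file hadoop-2.2.0.tar.gz

# If it really is gzip data, extraction should work:
tar -xzf hadoop-2.2.0.tar.gz
```

if file reports something other than gzip data, re-download from a different mirror before trying tar again.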
Quote:
Originally Posted by bbdynamite
I'm just starting to use Linux...
(My computer is running OSX 10.9.3)
a clarification: although mac os is unix-based, it is not the same as gnu/linux. the linux kernel is similar to, but not the same as, unix.
No offense taken. This is my first computer science-related job so I'm still trying to find my way.
I'm almost certain that hadoop can be installed on Mac; there are a few websites with instructions on how to do so. I don't doubt that I could be going about it the wrong way though.
Quote:
a clarification: although mac os is unix-based, it is not the same as gnu/linux. the linux kernel is similar to, but not the same as, unix.
Thanks, good to know!
Last edited by bbdynamite; 06-27-2014 at 03:04 PM.
I think it worked! I didn't get an error this time. Thank you so much!!
In retrospect, this was a really dumb question to post. Sorry about that.
I have 2 more questions:
1. After tar -xzf hadoop-2.2.0.tar.gz the instructions say to type chown -R hadoop hadoop-2.2.0.tar.gz. What does the second command do?
2. The Apache website says to verify the integrity of the downloaded hadoop file using the PGP signature. I didn't do this because I couldn't find the public keys and the asc signature file. Will this be an issue later on when I am configuring or using hadoop?
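on question 1: chown changes which user owns a file, and -R applies the change recursively to everything underneath a directory. the guide presumably means the extracted directory rather than the tarball, and the "hadoop" user is whatever account the guide had you set up; a hedged sketch:

```shell
# Make the user "hadoop" the owner of the extracted tree and every
# file and subdirectory inside it (-R = recursive). The .tar.gz
# archive itself does not need its ownership changed.
sudo chown -R hadoop hadoop-2.2.0
```

this matters because the hadoop daemons are typically run as that dedicated user, which must be able to read and write its own installation directory.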
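on question 2: skipping signature verification won't break configuration or use later; it only means you haven't confirmed the download is authentic. the .asc signature and the KEYS file normally sit alongside the release on the apache archive. a sketch of the usual check (URLs assume the standard apache release layout; adjust if your mirror differs):

```shell
# Fetch the detached signature and the Apache committers' public keys
curl -LO https://archive.apache.org/dist/hadoop/common/hadoop-2.2.0/hadoop-2.2.0.tar.gz.asc
curl -LO https://archive.apache.org/dist/hadoop/common/KEYS

# Import the keys, then verify the tarball against its signature;
# a good result reports "Good signature from ..."
gpg --import KEYS
gpg --verify hadoop-2.2.0.tar.gz.asc hadoop-2.2.0.tar.gz
```

note that gpg is not installed on os x by default, so you may need to install it first.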
So now I am supposed to edit the .bashrc file, but I can't locate the .bashrc file. Instructions say I should "update its appropriate configuration files instead of .bashrc" if I don't have a .bashrc file. (Although I'm pretty certain I do have a .bashrc file.)
What are my appropriate configuration files, can you tell?
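on os x, Terminal opens login shells, which read ~/.bash_profile rather than ~/.bashrc; that is most likely the "appropriate configuration file" your instructions mention. a sketch of the kind of lines the guide wants (the extract location is an assumption; use wherever you actually unpacked hadoop):

```shell
# Append hadoop environment settings to ~/.bash_profile
# (read by login shells, which is what Terminal.app starts on OS X).
cat >> ~/.bash_profile <<'EOF'
export HADOOP_PREFIX="$HOME/hadoop-2.2.0"   # assumed extract location
export PATH="$PATH:$HADOOP_PREFIX/bin"
EOF

# Re-read the file so the current shell picks up the new variables
source ~/.bash_profile
```

to check whether you really do have a ~/.bashrc, run ls -a ~ (the leading dot hides it from a plain ls).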