Linux - Newbie
This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-to's, this is the place!
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
Yes, I found this step in the HDP installation guide. I am facing the same problem. I have also tried mkdir with the whole path instead of the variable $DFS_NAME_DIR, but it doesn't work.
Thanks shivaa, I have done it as you mentioned. Some of the folders are created, but some are not. How? Is there any condition for that?
and many more variables they have declared. I have run this directories.sh file, and the folders "/grid/hadoop/hdfs/namenode" are created automatically; however, the folders "/var/run/hadoop/hdfs" are not created. What could be the cause of it?
Quote:
however the folders "/var/run/hadoop/hdfs" are not created. What could be the cause of it?
Without seeing the script or its output we can only guess. Here goes:
1. There is no line in the script like "mkdir $HDFS_PID_DIR"
2. The script exits before calling "mkdir $HDFS_PID_DIR"
3. The user running the script does not have permission to write to /var/run/hadoop/hdfs
4. The directory /var/run/hadoop does not exist and/or mkdir is called without -p
5. etc...
6. etc...
7. Ad infinitum
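Cause 4 is easy to reproduce on any system. A quick sketch, using a throwaway path under /tmp rather than /var/run so it needs no root access:

```shell
# Start from a clean slate: the parent directory does not exist yet
rm -rf /tmp/demo

# Without -p, mkdir refuses to create a path whose parents are missing:
mkdir /tmp/demo/hadoop/hdfs 2>/dev/null || echo "mkdir failed: parent missing"

# With -p, mkdir creates every missing parent along the way:
mkdir -p /tmp/demo/hadoop/hdfs && echo "mkdir -p succeeded"

rm -rf /tmp/demo
```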
Also, when you run a script, a new shell is started to run it.
You need to "source" the script to run it in the current shell.
Also, variables are not passed down to a child shell unless you "export" them.
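A quick illustration of the difference (DEMO_DIR here is just a stand-in for variables like DFS_NAME_DIR):

```shell
# Create a tiny script that only assigns a variable:
cat > /tmp/vars.sh <<'EOF'
DEMO_DIR="/tmp/demo_dir"
EOF

# Running it starts a child shell; the variable dies with that shell:
sh /tmp/vars.sh
echo "after running: '$DEMO_DIR'"     # empty, unless DEMO_DIR was already set

# Sourcing it runs the assignment in the current shell instead:
. /tmp/vars.sh
echo "after sourcing: '$DEMO_DIR'"    # now prints /tmp/demo_dir

rm -f /tmp/vars.sh
```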
1. My script doesn't contain the command "mkdir".
2. In the script there is a line like DFS_NAME_DIR="/grid/hadoop/hdfs/namenode /grid1/hadoop/hdfs/namenode". After running the script, I am supposed to run a command in a terminal as
# mkdir -p $DFS_NAME_DIR
After running the above command, it displays the following:
mkdir: missing operand
Try `mkdir --help' for more information.
3. After running the above command successfully, I am supposed to run the next command:
# chmod -R 755 $DFS_NAME_DIR
I also want to ask: after assigning a variable in a script and running that script, can we get that variable's value by using the command # echo $variable_name? I have tried it, but I was unable to get the value with the echo command. Does this mean the variable doesn't hold the assigned value permanently?
Then it will not create a directory. You should add an mkdir line, e.g.:
Code:
mkdir $DIRNAME
Quote:
2.In the script there is line like DFS_NAME_DIR="/grid/hadoop/hdfs/namenode /grid1/hadoop/hdfs/namenode". After running the script, i am supposed to run a command in a terminal as
# mkdir -p $DFS_NAME_DIR
After running the above command, it displays the following:
mkdir: missing operand
Try `mkdir --help' for more information.
Simply running mkdir will not create a directory; you need to provide a directory name as an argument to the mkdir command, as said above.
Quote:
3.after running above command successfully i am supposed to run next command
# chmod -R 755 $DFS_NAME_DIR
I also want to ask: after assigning a variable in a script and running that script, can we get that variable's value by using the command # echo $variable_name? I have tried it, but I was unable to get the value with the echo command. Does this mean the variable doesn't hold the assigned value permanently?
Yes, you can display any variable name using echo, as:
Code:
echo $VARIABLE_NAME
If it doesn't show anything, then it means there's nothing assigned to variable VARIABLE_NAME.
Note: Without seeing the full script, it's not easy to suggest anything. Also, as said earlier, invoke your script with sh -xv and post the script's output here.
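As an aside, tracing makes it obvious whether any mkdir ever runs inside a script. A sketch using sh -x on a throwaway script (the file name and path are made up for the example):

```shell
# Throwaway script: one assignment, one mkdir
cat > /tmp/trace_demo.sh <<'EOF'
DEMO_DIR="/tmp/trace_demo_dir"
mkdir -p "$DEMO_DIR"
EOF

# -x prints each command before executing it, with variables expanded,
# so you can see exactly which mkdir calls happen (trace goes to stderr):
sh -x /tmp/trace_demo.sh

rm -rf /tmp/trace_demo.sh /tmp/trace_demo_dir
```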
My directories.sh file is as follows. After running this script, I am supposed to run the rest of the commands, like
# mkdir -p $DFS_NAME_DIR
# chmod -R 755 $DFS_NAME_DIR
Please have a look at the following script:
#!/bin/sh
#
# Directories Script
#
# 1. To use this script, you must edit the TODO variables below for your environment.
#
# 2. Warning: Leave the other parameters as the default values. Changing these default values will require you to
# change values in other configuration files.
#
#
# Hadoop Service - HDFS
#
# Space separated list of directories where NameNode will store file system image.
DFS_NAME_DIR="/grid/hadoop/hdfs/namenode /grid1/hadoop/hdfs/namenode"
# Space separated list of directories where DataNodes will store the blocks.
DFS_DATA_DIR="/grid/hadoop/hdfs/datanode /grid1/hadoop/hdfs/datanode /grid2/hadoop/hdfs/datanode"
# Space separated list of directories where SecondaryNameNode will store checkpoint image.
FS_CHECKPOINT_DIR="/grid/hadoop/hdfs/secondarynamenode /grid1/hadoop/hdfs/secondarynamenode /grid2/hadoop/hdfs/secondarynamenode"
# Directory to store the Hadoop configuration files.
HADOOP_CONF_DIR="/etc/hadoop/conf"
# Directory to store the HDFS logs.
HDFS_LOG_DIR="/var/log/hadoop/hdfs"
# Directory to store the HDFS process ID.
HDFS_PID_DIR="/var/run/hadoop/hdfs"
#
# Hadoop Service - MapReduce
#
# Space separated list of directories where MapReduce will store temporary data.
MAPREDUCE_LOCAL_DIR="/grid/hadoop/mapred /grid1/hadoop/mapred /grid2/hadoop/mapred"
# Directory to store the MapReduce logs.
MAPRED_LOG_DIR="/var/log/hadoop/mapred"
# Directory to store the MapReduce process ID.
MAPRED_PID_DIR="/var/run/hadoop/mapred"
#
# Hadoop Service - Hive
#
# Directory to store the Hive configuration files.
HIVE_CONF_DIR="/etc/hive/conf"
# Directory to store the Hive logs.
HIVE_LOG_DIR="/var/log/hive"
# Directory to store the Hive process ID.
HIVE_PID_DIR="/var/run/hive"
#
# Hadoop Service - Templeton
#
# Directory to store the Templeton configuration files.
WEBHCAT_CONF_DIR="/usr/lib/hcatalog/conf"
# Directory to store the Templeton logs.
WEBHCAT_LOG_DIR="/var/log/webhcat"
# Directory to store the Templeton process ID.
WEBHCAT_PID_DIR="/var/run/webhcat"
#
# Hadoop Service - HBase
#
# Directory to store the HBase configuration files.
HBASE_CONF_DIR="/etc/hbase/conf"
# Directory to store the HBase logs.
HBASE_LOG_DIR="/var/log/hbase"
# Directory to store the HBase process ID.
HBASE_PID_DIR="/var/run/hbase"
#
# Hadoop Service - ZooKeeper
#
# Directory where ZooKeeper will store data.
ZOOKEEPER_DATA_DIR="/grid1/hadoop/zookeeper/data"
# Directory to store the ZooKeeper configuration files.
ZOOKEEPER_CONF_DIR="/etc/zookeeper/conf"
# Directory to store the ZooKeeper logs.
ZOOKEEPER_LOG_DIR="/var/log/zookeeper"
# Directory to store the ZooKeeper process ID.
ZOOKEEPER_PID_DIR="/var/run/zookeeper"
#
# Pig
#
# Directory to store the Pig configuration files.
PIG_CONF_DIR="/etc/pig/conf"
#
# Hadoop Service - Oozie
#
# Directory to store the Oozie configuration files.
OOZIE_CONF_DIR="/etc/oozie/conf"
# Directory to store the Oozie data.
OOZIE_DATA="/var/db/oozie"
# Directory to store the Oozie logs.
OOZIE_LOG_DIR="/var/log/oozie"
# Directory to store the Oozie process ID.
OOZIE_PID_DIR="/var/run/oozie"
# Directory to store the Oozie temporary files.
OOZIE_TMP_DIR="/var/tmp/oozie"
#
# Hadoop Service - Sqoop
#
SQOOP_CONF_DIR="/etc/sqoop/conf"
That script, as posted, does nothing but set some variables that go out of scope as soon as it exits. I.e., it does absolutely nothing on its own. Is that really the whole script?
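If the goal is for the script itself to create the directories, one way is to append a loop after the variable assignments. This is only a sketch, not part of the official HDP script: it is reduced to two variables, and the paths are prefixed with /tmp so it runs without root; the real script would keep the original paths and list all the variables.

```shell
#!/bin/sh
# Sketch: directories.sh reduced to two variables, with /tmp-prefixed
# paths so the example runs without root.
DFS_NAME_DIR="/tmp/grid/hadoop/hdfs/namenode /tmp/grid1/hadoop/hdfs/namenode"
HDFS_PID_DIR="/tmp/var/run/hadoop/hdfs"

# Create every path in the space-separated lists and set permissions.
# The variables are left unquoted in the 'for' list on purpose, so
# word-splitting yields one loop iteration per path.
for dir in $DFS_NAME_DIR $HDFS_PID_DIR
do
    mkdir -p "$dir"
    chmod -R 755 "$dir"
done
```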