Linux - Newbie
This Linux forum is for members that are new to Linux.
How do I generate a SQL result into XML?
I tried the approach shown below, but it puts all the data into a single column.
Please tell me what else I should post here to get help.
Thanks in advance.
Code:
generate_xml()
{
f1=_tmp1
f2=_tmp2
cat $SQL_QUERY_OUT |sed -e 's/^"//g;s/"$//g;s/","/|/g'|tr '|' '\t' > $f1
[ -e $SQL_QUERY_OUT ] && rm $SQL_QUERY_OUT
# prepare XML header
while read LINE;
do
print $LINE >> $SQL_QUERY_OUT
done <<EOF
<?xml version="1.0"?>
<?mso-application progid="Excel.Sheet"?>
<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"
xmlns:o="urn:schemas-microsoft-com:office:office"
xmlns:x="urn:schemas-microsoft-com:office:excel"
xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet"
xmlns:html="http://www.w3.org/TR/REC-html40">
<DocumentProperties xmlns="urn:schemas-microsoft-com:office:office">
<Author>b</Author>
<LastAuthor>b</LastAuthor>
<Created>2013-08-18T18:36:57Z</Created>
<Company>b</Company>
<Version>11.9999</Version>
</DocumentProperties>
<ExcelWorkbook xmlns="urn:schemas-microsoft-com:office:excel">
<WindowHeight>11385</WindowHeight>
<WindowWidth>15180</WindowWidth>
<WindowTopX>360</WindowTopX>
<WindowTopY>90</WindowTopY>
<ProtectStructure>False</ProtectStructure>
<ProtectWindows>False</ProtectWindows>
</ExcelWorkbook>
<Styles>
<Style ss:ID="Default" ss:Name="Normal">
<Alignment ss:Vertical="Bottom"/>
<Borders/>
<Font ss:FontName="Arial Cyr"/>
<Interior/>
<NumberFormat/>
<Protection/>
</Style>
<Style ss:ID="s21">
<Borders>
<Border ss:Position="Bottom" ss:LineStyle="Continuous" ss:Weight="1"/>
<Border ss:Position="Left" ss:LineStyle="Continuous" ss:Weight="1"/>
<Border ss:Position="Right" ss:LineStyle="Continuous" ss:Weight="1"/>
<Border ss:Position="Top" ss:LineStyle="Continuous" ss:Weight="1"/>
</Borders>
<Font ss:FontName="Arial Cyr" ss:Bold="1"/>
</Style>
<Style ss:ID="s22">
<Borders>
<Border ss:Position="Bottom" ss:LineStyle="Continuous" ss:Weight="1"/>
<Border ss:Position="Left" ss:LineStyle="Continuous" ss:Weight="1"/>
<Border ss:Position="Right" ss:LineStyle="Continuous" ss:Weight="1"/>
<Border ss:Position="Top" ss:LineStyle="Continuous" ss:Weight="1"/>
</Borders>
</Style>
</Styles>
<Worksheet ss:Name="Sheet1">
<Table>
<Column ss:Width="50"/>
<Column ss:Width="50"/>
<Column ss:Width="50"/>
EOF
print "<Row>" >> $SQL_QUERY_OUT
for i in `awk 'NR==1' $f1`
do
print ' <Cell ss:StyleID="s21"><Data ss:Type="String">'$i'</Data></Cell>' >> $SQL_QUERY_OUT
done
print "</Row>" >> $SQL_QUERY_OUT
awk 'NR>1' $f1|while read LINE;
do
print "<Row>" >> $SQL_QUERY_OUT
print "$LINE"|awk -F'\t' '{for (i=1;i<=NF;i++){print $i}}' > $f2 # quote the tab: unquoted, -F\t reaches awk as -Ft
while read i;
do
k=$(print "$i"|sed 's/&/\&amp;/g') # escape ampersand as &amp; for XML
print ' <Cell ss:StyleID="s22"><Data ss:Type="String">'$k'</Data></Cell>' >> $SQL_QUERY_OUT
done < $f2
print "</Row>" >> $SQL_QUERY_OUT
done
# add XML footer
while read LINE;
do
print $LINE >> $SQL_QUERY_OUT
done <<EOF
</Table>
<WorksheetOptions xmlns="urn:schemas-microsoft-com:office:excel">
<PageSetup>
<PageMargins x:Bottom="0.984251969" x:Left="0.78740157499999996"
x:Right="0.78740157499999996" x:Top="0.984251969"/>
</PageSetup>
<Print>
<ValidPrinterInfo/>
<HorizontalResolution>200</HorizontalResolution>
<VerticalResolution>200</VerticalResolution>
</Print>
<Selected/>
<Panes>
<Pane>
<Number>3</Number>
<ActiveRow>1</ActiveRow>
<ActiveCol>2</ActiveCol>
</Pane>
</Panes>
<ProtectObjects>False</ProtectObjects>
<ProtectScenarios>False</ProtectScenarios>
</WorksheetOptions>
</Worksheet>
</Workbook>
EOF
[ -e $f1 ] && rm $f1
[ -e $f2 ] && rm $f2
}
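The cell loop above escapes only the ampersand. A small helper along these lines (hypothetical, not part of the original script; written with printf rather than ksh's print so it runs in any POSIX shell) would cover the other XML special characters as well:

```shell
# Hypothetical helper: escape the XML special characters &, < and > in a
# cell value before embedding it in <Data>...</Data>. The ampersand must
# be replaced first, or the later entities would be double-escaped.
xml_escape()
{
    printf '%s\n' "$1" | sed -e 's/&/\&amp;/g' -e 's/</\&lt;/g' -e 's/>/\&gt;/g'
}

xml_escape 'AT&T <new>'    # prints: AT&amp;T &lt;new&gt;
```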
ora_process_query()
{
[ $err1 -ne 0 ] && return
print "Start Oracle ..."|tee -a $DELTA_LOG
print "Query: " |tee -a $DELTA_LOG
print "~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~" |tee -a $DELTA_LOG
print $SQL_QUERY |tee -a $DELTA_LOG
print "~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~" |tee -a $DELTA_LOG
print "conn $DBCONN" > $SQL_SCRIPT
print "set feed off" >> $SQL_SCRIPT
print "set pagesize 0" >> $SQL_SCRIPT
print 'set colsep ""","""' >> $SQL_SCRIPT
print "set linesize 5000" >> $SQL_SCRIPT
print "set pagesize 5000" >> $SQL_SCRIPT
print "set trimspool on" >> $SQL_SCRIPT
print "spool $SQL_QUERY_OUT" >> $SQL_SCRIPT
print "$SQL_QUERY" >> $SQL_SCRIPT
print "exit" >> $SQL_SCRIPT
sqlplus -s /nolog @$SQL_SCRIPT >> $SQL_LOG
err1=$? # save exit code
err2=`grep -i ora- $SQL_LOG|wc -l|awk '{print $1}'` # if SQL_LOG has any ORA- errors
if [[ $err1 != 0 || $err2 != 0 ]]; then
finish "SQL Failure!"
fi
sed -e 's/[ ]*["][,]["]/","/g;s/^[ ]*/"/g;s/$/"/g' $SQL_QUERY_OUT|sed -e '3d'|sed -e '1d' > /tmp/_f1
mv /tmp/_f1 $SQL_QUERY_OUT
[ "$SQL_QUERY_TYPE" == "XML" ] && generate_xml
}
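The long sed in ora_process_query is what turns the raw sqlplus colsep output into quoted CSV: it trims the padding before each "," separator and wraps the whole line in quotes. A minimal illustration on a made-up sample line (the real input comes from the spool file):

```shell
# Made-up sample of a spooled line: fields padded by sqlplus, separated
# by the  ","  colsep configured in the script above.
printf '%s\n' 'RB   ","FB   ","FBH' \
  | sed -e 's/[ ]*["][,]["]/","/g;s/^[ ]*/"/g;s/$/"/g'
# prints: "RB","FB","FBH"
```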
ora_work()
{
[ $err1 -ne 0 ] && return
print "Start Oracle ..." |tee -a $DELTA_LOG
print "conn $DBCONN" > $SQL_SCRIPT
# put all commands to Oracle database in ordered way
for i in `awk '/^SQL_FILE/ {print substr($0,9,256)}' $CFG_FILE|sort -k 1n|awk -F= '{print $1}'`
do
eval print "\$SQL_FILE$i" >> $SQL_SCRIPT
done
sqlplus /nolog @$SQL_SCRIPT >> $SQL_LOG
err1=$? # save exit code
print "SQL LOG:" |tee -a $DELTA_LOG
cat $SQL_LOG |tee -a $DELTA_LOG # save sql log to the main log
err2=`grep -i ora- $SQL_LOG|wc -l|awk '{print $1}'` # if SQL_LOG has any ORA- errors
if [[ $err1 != 0 || $err2 != 0 ]]; then
finish "SQL Failure!"
fi
}
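The awk|sort|awk pipeline in ora_work collects the numeric suffixes of the SQL_FILEn= entries from the config in numeric order: substr($0,9,256) drops the "SQL_FILE" prefix, sort -k 1n orders by the leading number, and the final awk keeps only the suffix. A sketch on a made-up config file:

```shell
# Made-up config: SQL_FILEn=... entries in arbitrary order.
cat > /tmp/_cfg_demo <<'EOF'
SQL_FILE2=@second.sql
SQL_FILE10=@tenth.sql
SQL_FILE1=@first.sql
EOF

awk '/^SQL_FILE/ {print substr($0,9,256)}' /tmp/_cfg_demo | sort -k 1n | awk -F= '{print $1}'
# prints 1, 2, 10, one per line
rm /tmp/_cfg_demo
```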
save_local()
{
[ $err1 -ne 0 ] && return
[ "$HIST_DIR" == "" ] && return
print |tee -a $DELTA_LOG
print -n "Save files to the local dir ..." |tee -a $DELTA_LOG
for i in `awk '/^SFTP_FILE/ {print substr($0,10,256)}' $CFG_FILE|sort -k 1n|awk -F= '{print $1}'`
do
generated_file=$(eval print "put \$SFTP_FILE$i"|awk '{print $2}') # fullname of the generated file
fn=$(basename $generated_file) # shortname of the generated file
history_file=${fn%%.*}_$TS.${fn##*.} # add timestamp between name and extension
cp -p $generated_file $HIST_DIR/$history_file # copying
done
print "done." |tee -a $DELTA_LOG
}
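The history name in save_local is built with two parameter expansions: ${fn%%.*} strips everything from the first dot onward, and ${fn##*.} keeps everything after the last dot. Shown on a sample name (TS is assumed to be a timestamp set elsewhere in the script):

```shell
# Sample values; in the script, fn comes from basename and TS is set elsewhere.
fn="report.xml"
TS="20141205"
echo "${fn%%.*}_$TS.${fn##*.}"    # prints: report_20141205.xml
```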
Ok, so now we have more of the script. There are several issues with the code, so I will ask specifically: which function is not performing as you wish? Please narrow it down to what you have found.
You must understand that you have presented nearly 200 lines of code which, to be perfectly honest, looks a mess.
My suggestion would be to start with a single function and work out what it does and what you do not understand, then come back and ask questions about those specific parts.
It would also help, once you have done some work, to provide both the current and the expected output for the section you are looking at.
Well, it worked just fine before the migration; now it no longer generates the result file properly.
We have these columns in the result file:
RB FB FBH AB ABH FBAD FBHAD FBAC FBHAC ABAA ABHAA
but the values of all these columns end up in the single RB column:
941 189 211 2 3 0 0 0 0 3 0
Each value should go into its own column.
How do I fix this?
cat /etc/*-release
LSB_VERSION=base-4.0-amd64:base-4.0-noarch:core-4.0-amd64:core-4.0-noarch:graphics-4.0-amd64:graphics-4.0-noarch:printing-4.0-amd64:printing-4.0-noarch
Red Hat Enterprise Linux Server release 6.6 (Santiago)
Red Hat Enterprise Linux Server release 6.6 (Santiago)
This looks fine, because the result file is created successfully.
That's up to you; my job is to suggest the right use of the utilities involved.
Quote:
941 189 211 2 3 0 0 0 0 3 0
I see the values are separated by a space. There are many ways to store them, one of which is arrays:
use an array to store all the values at different indexes, then print them in the columns you need.
Since you think you are not making any other mistake, this is my suggestion.
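One way to sketch that suggestion (hypothetical code, not from the script): set -- word-splits the row into the positional parameters, which act as a poor man's array in any POSIX shell.

```shell
# Split one space-separated result row into fields and emit one <Cell>
# per field, instead of putting the whole row into a single cell.
line="941 189 211 2 3 0 0 0 0 3 0"
set -- $line          # unquoted on purpose: word-splitting does the work
for v in "$@"
do
    printf ' <Cell ss:StyleID="s22"><Data ss:Type="Number">%s</Data></Cell>\n' "$v"
done
```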
So, as you have said it works great, I am guessing this is SOLVED? (Please mark it as such.)
I am interested that your previous reply to me suddenly mentioned a migration, which you never mentioned before.
Little details are important if you want people to be able to help you properly.
As for your new solution: you should virtually always quote unusual characters, otherwise the shell may make changes you are not prepared for.
I would suggest quoting only the tab (\t); -F itself is a switch to awk, and quoting it too may make awk not behave as you expect (maybe not now, but later on).
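A quick demonstration of that quoting point, on sample data in any POSIX shell:

```shell
# Quoted: awk receives the two characters \t and splits on real tabs.
printf 'RB\tFB\tFBH\n' | awk -F'\t' '{print NF}'    # prints: 3
# Unquoted: the shell eats the backslash, awk receives -Ft and splits on
# the letter "t"; this line contains no "t", so it stays one field.
printf 'RB\tFB\tFBH\n' | awk -F\t '{print NF}'      # prints: 1
```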