LinuxQuestions.org (/questions/)
-   Slackware (https://www.linuxquestions.org/questions/slackware-14/)
-   -   .bash_profile, "-bash: export: command not found" (https://www.linuxquestions.org/questions/slackware-14/bash_profile-bash-export-command-not-found-853444/)

Squall90 12-31-2010 09:02 AM

.bash_profile, "-bash: export: command not found"
 
Hi there,

I was trying to get rid of qtconfig's error message "QGtkStyle was unable to detect the current GTK+ theme."

So I created the file ~/.gtkrc-2.0 with this content:
Code:

gtk-theme-name="Xfce"
Then I needed this file to be read after logging in (or at least when I start X). So I created a ~/.bash_profile file with the following line:
Code:

export GTK2_RC_FILES="$HOME/.gtkrc-2.0"
When I log in now, I get this message:
Code:

-bash: export: command not found
Any idea what could cause this problem?

I am running Slackware 13.1.

GazL 12-31-2010 09:15 AM

export is a shell built-in command. It should be impossible to not find it. I'd be inclined to check your profile file for any non-printable/control characters that may have crept in:
Code:

grep export < ~/.bash_profile | od -tx1c
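Something like the following should also make any stray bytes visible (assuming GNU coreutils' cat and a reasonably recent file(1)):
Code:

# show non-printing characters explicitly; a UTF-8 BOM shows up as M-oM-;M-?
cat -A ~/.bash_profile
# recent versions of file(1) usually report a BOM outright
file ~/.bash_profile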

Squall90 01-01-2011 05:11 AM

The file is UTF-8 encoded. Could this be the problem?
Code:

christian@tux-netbook:~$ grep export < ~/.bash_profile | od -tx1c
0000000  ef  bb  bf  65  78  70  6f  72  74  20  47  54  4b  32  5f  52
        357 273 277  e  x  p  o  r  t      G  T  K  2  _  R
0000020  43  5f  46  49  4c  45  53  3d  22  24  48  4f  4d  45  2f  2e
          C  _  F  I  L  E  S  =  "  $  H  O  M  E  /  .
0000040  67  74  6b  72  63  2d  32  2e  30  22  0a
          g  t  k  r  c  -  2  .  0  "  \n
0000053

"less" prints this:
Code:

<U+FEFF>export GTK2_RC_FILES="$HOME/.gtkrc-2.0"
So is that the UTF-8 marker? Is bash not UTF-8 compatible?

Squall90 01-02-2011 04:36 PM

I still don't know what caused this problem, but I have now added these lines to my /etc/profile:
Code:

#  GTK QT Support
if [ -r "$HOME/.gtkrc-2.0" ]; then
        export GTK2_RC_FILES="$HOME/.gtkrc-2.0"
fi
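After logging back in, a quick sanity check from any shell should confirm that the variable is being set:
Code:

echo "$GTK2_RC_FILES"    # should print something like /home/<user>/.gtkrc-2.0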


amadeov 05-14-2012 09:38 PM

I know that this thread is a bit old, but I had the same problem with a script (check_snmp_printer from exchange.nagios.org) that I downloaded and ran on Ubuntu Server.

Somehow, the encoding at the start of the file was messed up. No extraneous characters appeared in nano, but the problem persisted even after I had opened the file there. So I removed the first line, retyped "#!/bin/bash" by hand, and inserted a blank line below it.

After doing so, the script ran fine. Hope this helps others! :D
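For anyone who prefers not to retype the line by hand, the same fix can be scripted. This is only a sketch (it assumes GNU sed, and it only changes anything if the three UTF-8 BOM bytes are actually present):
Code:

# strip a leading UTF-8 BOM (EF BB BF) in place; harmless if the file has none
sed -i '1s/^\xef\xbb\xbf//' check_snmp_printer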

Richard Cranium 05-15-2012 12:31 AM

Odds are that the file had Windows/DOS end-of-line characters (that's CRLF, or 0x0D 0x0A in hex). See "man fromdos" for how to fix such a file.
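For example (fromdos is part of the tofrodos tools; the sed line is a rough equivalent if it isn't installed):
Code:

# convert DOS (CRLF) line endings to Unix (LF) in place
fromdos check_snmp_printer

# roughly the same with GNU sed: strip the trailing CR from every line
sed -i 's/\r$//' check_snmp_printer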

David the H. 05-15-2012 08:39 AM

No, it wasn't the line endings, it was a byte order mark. You can see it in the output in post #3. MS programs sometimes add them to text files. Bash doesn't strip it, so the first word on that line is really the BOM followed by "export", which isn't a recognized command name; and since U+FEFF is zero-width, the BOM doesn't even show up in the error message.

http://en.wikipedia.org/wiki/Byte_order_mark

rg3 05-16-2012 02:59 PM

Even Vim, under certain configurations, writes the BOM. To make sure it never does, put "set nobomb" in your .vimrc file.
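The same thing can also be checked and fixed per file from inside Vim (standard options, nothing exotic):
Code:

" show whether the current buffer has a BOM, then drop it and rewrite the file
:set bomb?
:set nobomb
:w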

colucix 05-16-2012 03:07 PM

The BOM shown above denotes UTF-16 encoding. You can easily convert to UTF-8 using iconv:
Code:

iconv -f UTF-16 -t UTF-8 file > newfile
and then rename the resulting file as the original.
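Spelled out with the rename (and assuming the file really is UTF-16), that would be something like:
Code:

iconv -f UTF-16 -t UTF-8 file > newfile && mv newfile file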

David the H. 05-16-2012 03:45 PM

I'd say chances are they actually are UTF-8 files, just with BOMs inserted.

The Wikipedia entry says that while the BOM is not necessary for UTF-8, it's not actually invalid, so some programs add it anyway. But most *nix applications choke on it in some way or another, so you really need to remove it when you encounter one.

