Quote:
Thanks Red |
Mysterious. Is echo * output complete?
|
Catkin,
Thank you for the reply. I am really showing my ignorance now, but how do I implement the echo * command? Do I add it as another line after the ls -l command? Thanks and sorry for the basic question. Red |
Quote:
It was a theoretical answer of no practical significance for your issue. |
You could try echo * at the command line. In your script you can use it instead of ls -l or as well as ls -l (as another line after). This suggestion is to investigate the problem, not to workaround it.
The * is a shell "filename expansion" pattern. The shell expands it into the names of all files and directories not beginning with a dot, and echo writes those names to stdout. The reason for asking you to try it is to try something more basic than dir or ls, to show whether the files are visible at all. If your ls has options to change the order files are listed in (like -t, -S, -r) then you could experiment with them and maybe see whether the problem is that ls does not "see" the files or whether its output is truncated. At this stage we are varying the action in the hope of gaining a clearer understanding of the problem. It's particularly strange that only two of the four very similar "Any Human Heart_<nnnnnn>.rec" files are shown, and ls is pretty robust ... :scratch: Can you try ls -l > lsl.$$.out when running from cron? That would show whether the problem is in ls itself or somehow in the post-processing of its output. Can you post the part of the script that runs ls, up to emailing its output? |
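A minimal sketch of the diagnostics being suggested above. The directory and file names here are stand-ins for illustration, not the poster's actual paths:

```shell
#!/bin/sh
# Sketch: compare shell filename expansion with ls output.
# File names below are examples only.
dir=$(mktemp -d)
cd "$dir" || exit 1
touch "small.rec" "large.rec"

# The shell expands * into every non-hidden name BEFORE echo runs,
# so echo never stats the files -- it just prints the names:
echo *

# ls, by contrast, stats each file; an ls built without large-file
# support can choke on files of 2 GB (2^31 bytes) or more:
ls -l

# Capture ls output in a file to rule out problems in whatever
# post-processing (emailing, logging) follows it:
ls -l > "lsl.$$.out"
cat "lsl.$$.out"
```

If `echo *` shows all the files while `ls -l` misses some, the problem is inside ls itself rather than in the directory contents or the script's post-processing.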
Catkin
what a fantastic observation! I moved some files around and let it run in cron again. The following is the output from the script. Again, only the files with a size below the cut-off you identified are listed by ls -l, whilst dir -l lists all files. I confirmed this was all the files by FTPing in to check. I have a new question now: why is there this cut-off, and is there a way around it? Quote:
Red |
Catkin
apologies for the pause; I was away for a couple of days. Here is the text of the script that does the listing. I've added the parts you suggested, but it is due to run in cron later. The text files that are created become the body of an email sent to me. Thank you for your help. Red Code:
|
Hello Red :)
Quote:
Quote:
The ls -l > lsl.$$.out /mnt/nas1/Films/keep >> /public/ftop2nas/nas1keep.log line should put nothing in the log because it should all go to lsl.$$.out. In case ls is producing any error messages you could try ls -l > lsl.$$.out /mnt/nas1/Films/keep 2>> /public/ftop2nas/nas1keep.log |
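A sketch of the redirection distinction being made in that suggestion, using a throwaway directory and a deliberately nonexistent path in place of the poster's real ones:

```shell
#!/bin/sh
# Sketch: separating ls's stdout from its stderr. Paths here are
# hypothetical; substitute the real directory and log file.
workdir=$(mktemp -d)
log="$workdir/nas1keep.log"

# ">" sends only stdout (the listing) to the .out file; stderr is
# untouched, so under cron any error text disappears into cron's
# mail instead of the log:
ls -l "$workdir" > "$workdir/lsl.$$.out"

# "2>>" additionally appends stderr to the log, so complaints from
# a failing ls are captured rather than silently lost:
ls -l "$workdir" /no/such/dir > "$workdir/lsl.$$.out" 2>> "$log"

cat "$log"    # the error about /no/such/dir lands here
```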
Catkin, thanks for the thought about 2^31.
I modified the script to use echo *. This lists all the files that dir shows. I used lsl.$$.out and redirected the output to the text file; it is included below and behaves as anticipated. I also modified the script so that ls sorted the files by size, and in reverse size order as well. Essentially it makes no difference. It looks like ls is broken on Unslung. Thank you very much for all your help trying to understand what is going on. Red Quote:
|
I have concluded this is probably an Unslung-slug-specific problem, so I have moved on to an Unslung-specific forum at
http://tech.groups.yahoo.com/group/n...neral/messages Thank you to catkin for all the help. Red |
Environments
Perhaps it is an environment thing... where your interactive shell and cron use different environments.
Test it by writing some small scripts, one per shell, and having cron call these scripts instead. Code:
#!/bin/bash Code:
#!/bin/sh Code:
#!/bin/csh Code:
#!/bin/tcsh Code:
#!/bin/zsh Code:
#!/bin/ksh |
In case anyone wants to follow, here's the specific NSLU2 thread. Apparently the NSLU2 has a /bin/ls with the 2 GB limit and a /opt/bin/ls which does not, so this problem is probably a cron $PATH issue.
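Two common ways to resolve that kind of cron $PATH mismatch, sketched as a config fragment. The /opt paths match the NSLU2 layout described above, but verify them on your own box:

```shell
# 1) In the script itself, call the large-file-capable ls explicitly
#    instead of relying on whatever PATH cron provides:
#      /opt/bin/ls -l /mnt/nas1/Films/keep

# 2) Or set PATH at the top of the crontab so /opt/bin is searched
#    before /bin (crontab lines, not shell commands):
#      PATH=/opt/bin:/opt/sbin:/usr/bin:/bin
#      30 2 * * * /public/ftop2nas/listing-script.sh
```

The script name in the crontab line is hypothetical; the point is that environment assignments at the top of a crontab apply to every job below them.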
|
Catkin and cin_
thank you very much. I was dropping back to tidy up with the answer and saw it had already been posted. The issue is as cin_ suggested and as catkin outlined: using /opt/bin/ls in the script makes it work perfectly. Thank you all very much. Red |
Glad you found a solution in the end :)
We might have got there if we had focussed more on the cron/terminal aspect than on the bizarre 2 GB limit. Threads can be marked SOLVED via the Thread Tools menu. |