steyr 02-11-2004 02:21 PM

Large file transfers crash Slack 9.1?
 
For the last couple of days I've been trying to back up my laptop (running rh9) onto my file server (slack 9.1), so I can set up dual boot on the laptop.

Anyway, while backing up large directories using scp, the file server always crashes. If I try mounting the drive and copying files over, it crashes too. It crashes at somewhat random points, which leads me to believe it may be a hardware issue. Before I rule everything out, I wanted to ask if anyone else has had this issue.
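
To give an idea of what I'm doing, the transfer is basically a recursive scp from the laptop to the server, something like this (the user, host, and paths here are just examples, not my real ones):
Code:

scp -r /home/paul paul@fileserver:/backup/laptop/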

For reference, this doesn't happen while any of my roommates play dvd-rips (3 gb+ of data) off the file server (they all run windows), or even while I play movies off the server. So I'm not sure it's dependent on file size, but when I have to move a large amount of data spread across lots of files, it gets pretty far and then always crashes.

So does anyone have any advice on where I can go to troubleshoot this, or has anyone experienced anything similar?

Thanks in advance,
Paul

nesware 02-11-2004 03:26 PM

any error messages in the logs?
how much disk space is available?
check the server docs in case some maximum file size setting is blocking anything

does the file server really crash, or does it just drop the session? meaning, do you need to restart the daemon?
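
for the logs, on a slackware box I'd look at something like this after a crash and reboot (exact file names depend on your syslog setup, and dmesg only shows the current boot):
Code:

dmesg | tail -n 50
tail -n 200 /var/log/messages
tail -n 200 /var/log/syslog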

steyr 02-11-2004 11:26 PM

No, it definitely crashes the system. Can't log in at the machine itself, can't ssh in, can't ftp in, nothing.

Everything from the /var/log dir around the times of the crashes looks normal. Hard drive in question is a 45 gig with 86% used (according to df), so disk space shouldn't be the issue.

Does scp create a log anywhere? Also which server docs should I be looking through?

Thanks,
Paul

nesware 02-14-2004 02:23 PM

Quote:

Everything from the /var/log dir around the times of the crashes looks normal. Hard drive in question is a 45 gig with 86% used (according to df), so disk space shouldn't be the issue.

14% free of 45 gb is only 6.3 gb free.
how big is the large directory, and how much data are you backing up in total?
Code:

du -hs /the/large/dir/where/it/crashes
du -hs /

is there maybe a softlink somewhere in that directory tree that points back up to a parent dir? if so, use a command-line switch so the copy ignores softlinks..
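for example, to check whether there are any softlinks in the tree at all (same example path as above):
Code:

find /the/large/dir/where/it/crashes -type l -ls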

and a good thing to try is to keep track of the available disk space on the slackware server while the backup runs..
Code:

touch /var/log/diskspace
crontab -e

add this line
Code:

1 * * * * df -h >> /var/log/diskspace

now try backing up again, and when it crashes, reboot and check the file... the last line will tell you how much disk space was available the last time the job ran before the crash..
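
and if hourly turns out to be too coarse to catch the moment it dies, a per-minute variant with a timestamp would be something like:
Code:

* * * * * (date; df -h) >> /var/log/diskspace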

