Linux - Newbie. This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I am new to Linux and normally do uploads and downloads using ftp, monitoring via Webmin. However, I have a large data file (26 GB) that I am trying to copy out of the data area to pull it off the server.
Trying to cp the file, I have twice brought the server to a standstill and had the session terminated after about 15 GB. So I would like to run this copy as a low-priority batch job (the data file will be write-locked).
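A low-priority copy along those lines can be sketched with nice and ionice. This is only a sketch assuming GNU coreutils and util-linux are installed; the paths are small demo stand-ins, not the real 26 GB file.

```shell
# Demo stand-ins -- replace with the real source and destination paths.
SRC=$(mktemp)
DST=$SRC.copy
echo "demo data" > "$SRC"

# nice -n 19 lowers CPU priority; ionice -c 3 (the "idle" class) lets
# the copy touch the disk only when nothing else is waiting for it.
if command -v ionice >/dev/null 2>&1; then
    nice -n 19 ionice -c 3 cp "$SRC" "$DST"
else
    # ionice not available -- fall back to lowering CPU priority only.
    nice -n 19 cp "$SRC" "$DST"
fi
```

An alternative worth considering is rsync with a bandwidth cap, e.g. `rsync --partial --bwlimit=10m "$SRC" "$DST"`, which throttles the transfer and can resume an interrupted copy; the 10 MB/s figure is only an example.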
The file is an index file for a MySQL database table. It can be re-created from the corresponding .frm and .MYD files, so copying it for backup is a bit pointless; use mysqldump if you want a backup. If you delete the file, MySQL will have problems, though. I think the real reason it is so big is that there are too many indexes on the table, or there is simply too much data in the table. I would use Webmin or phpMyAdmin to check and possibly delete some. The database name is live01 and the table name is stage01. How big is "/var/lib/mysql/live01/stock01.MYD"?
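The mysqldump route could look roughly like this. The database and table names are taken from this thread, but the output path and the absence of credentials are assumptions: on a real server you would add -u/-p options, and note that mysqldump read-locks a MyISAM table while it runs. The fallback branch is only there so the pipeline can be tried on a machine with no MySQL server.

```shell
# Compressed dump of one table -- a backup that can rebuild the
# indexes later, unlike a raw copy of the .MYI file.
OUT=$(mktemp).sql.gz

if command -v mysqldump >/dev/null 2>&1 && mysqladmin ping >/dev/null 2>&1; then
    # --quick streams rows instead of buffering the whole table in memory.
    nice -n 19 mysqldump --quick live01 stage01 | gzip > "$OUT"
else
    # No reachable MySQL server here; stand-in output so the pipe runs.
    echo "-- placeholder dump" | gzip > "$OUT"
fi
```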
linosaurusroot - firstly, the script file did appear to have lost its header, but I had subsequently opened it in vi (wow - there's another story) and thought that had caused the problem. I will retry with a clean script file and the corrected syntax.
shneidz - I waited 5 minutes and more; I will have a look at man atq.
guttorm - you are quite correct; I listed the index file, which is 6 GB in this case, and I copied that using cp at the terminal. I put it in the script file to test a smaller copy on the test system. It is the .MYD file that is huge. mysqldump works fine on smaller files, but on this one it blocks the MySQL processing and I eventually lose the network connection. It's too big, I know - I have just inherited it.
For now I would like to nail down how to submit a script that hopefully copies a file and creates a log - part of my Linux adventure.
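A minimal sketch of such a script, assuming plain sh with nice and cp available; the SRC/DST/LOG paths default to small demo stand-ins and should be pointed at the real file, destination, and log location.

```shell
#!/bin/sh
# copyjob.sh -- copy one file at low CPU priority and append to a log.
# When SRC is not set, create a tiny demo file so the sketch can run.
if [ -z "$SRC" ]; then
    SRC=$(mktemp)
    echo "payload" > "$SRC"
fi
DST=${DST:-$SRC.copy}    # stand-in for the real destination path
LOG=${LOG:-$SRC.log}     # stand-in for the real log file

echo "$(date): starting copy of $SRC -> $DST" >> "$LOG"
if nice -n 19 cp "$SRC" "$DST"; then
    echo "$(date): copy finished OK" >> "$LOG"
else
    echo "$(date): copy FAILED" >> "$LOG"
fi
```

To run it when the machine is quiet, it can be submitted with batch(1), e.g. `echo /path/to/copyjob.sh | batch` - batch starts the job once the load average drops below a threshold, atq lists jobs still queued, and any output the job produces is mailed to the owner.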