LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Networking (https://www.linuxquestions.org/questions/linux-networking-3/)
-   -   Serial port : not able to write big chunk of data (https://www.linuxquestions.org/questions/linux-networking-3/serial-port-not-able-to-write-big-chunk-of-data-794364/)

anujmehta 03-10-2010 12:31 AM

Serial port : not able to write big chunk of data
 
Hi

I am trying to send text data from one PC to another over a serial cable. One of the PCs is running Linux, and I am sending data from it using the write(2) system call. The data is approximately 65K bytes, but write(2) returns around 4K (i.e. only that much data gets transferred). I tried breaking the data into 4K chunks, but then write(2) returns -1.

My question is: is there any buffer limit for writing data to a serial port, or can I send data of any size? Also, do I need to continuously read data on the other PC while I write the 4K chunks?

Do I need any special configuration in the termios structure for sending large amounts of data?

theNbomr 03-10-2010 09:44 AM

Can you post the relevant code fragment(s)? Also, get the output of perror() when write() returns -1. I just cobbled up a quick test, and saw no such problems. Are you sure you are not getting some intervention from a flow-control mechanism?
--- rod.

anujmehta 03-10-2010 08:52 PM

Hi rod

Thanks for your reply. As I mentioned earlier, I was not able to send the complete data in one write(2) call, so I looped and transferred the remaining bytes. This way I am able to transfer the complete data.
I observed that in general each write call transferred only 32 or 48 bytes. Is this normal for a serial port, or is there a way to optimize it?

Code:

/*
 * buff - buffer to be transferred
 * len  - length of 'buff'
 */
void sendData(int len, const char * buff)
{
  int begin = 0;
  while (begin < len)
  {
      int writtenBytes = write(fd, buff + begin, len - begin);
      if (writtenBytes < 0)
      {
          if (errno == EINTR || errno == EAGAIN)
          {
              usleep(9000);   /* let the UART drain, then retry */
              continue;
          }
          TRACE << "Error in sending data: " << strerror(errno) << endl;
          return;
      }
      begin += writtenBytes;
  }
}


theNbomr 03-11-2010 11:05 AM

I don't see anything that looks out of sorts with your code, and it is remarkably similar to my test code. Is there a consistency to the size of the bytes written per write? What about the nature of the data? Are there end-of-lines embedded in the data? If so, I guess I would try setting the serial port to raw mode:
Code:

struct termios serialPortCfg;

tcgetattr(fd, &serialPortCfg);
serialPortCfg.c_oflag &= ~OPOST;
tcsetattr(fd, TCSANOW, &serialPortCfg);

I'm not completely sure about how cooked output deals with CRs & LFs or NULLs. A reasonable speculation might be that NULLs could cause the output stream to be broken into blocks, but that shouldn't be the case here, as it looks like your data is one long string. Can you confirm that the buffer length value actually being requested to write() is greater than the value returned as writtenBytes?
Sorry for the sketchy reply; I'm at a loss for better suggestions.

--- rod.

anujmehta 03-11-2010 09:08 PM

Rod, thanks for your reply. I have set both input and output to raw mode. I am reading data from a file, and it does contain newlines.

Code:

options.c_lflag &= ~(ICANON | ECHO | ECHOE | ISIG);
options.c_oflag &= ~OPOST;

Yes, the buffer length passed to write() is greater than writtenBytes.
I hadn't thought of CR, LF and NULL; probably one of them is the reason the data goes out in packets of 32/48 bytes.

Thanks again for your help.

