LinuxQuestions.org


Jay Braun 10-15-2013 11:54 AM

Raw terminal mode with less than 1/10 second delay
 
I am helping a colleague port a legacy application from OpenVMS to Linux. He wishes to retain a command-based interface that uses "raw" terminal mode to detect errors immediately, i.e., not wait until a record is input and entered. The legacy interface also inserts prompts at various points.

Although I have recommended a GUI-oriented approach, which interests him, he is understandably concerned about acceptance by legacy users.

It appears that non-canonical terminal input using ttydly() allows for a minimum delay of 0.1 second (1 unit of tenths-of-a-second).

Is there an alternative that would allow smaller delays, perhaps a variant of ttydly() that allows floating-point arguments less than 1?

Thanks,
Jay

TB0ne 10-16-2013 01:31 PM

Quote:

Originally Posted by Jay Braun (Post 5046230)
I am helping a colleague port a legacy application from OpenVMS to Linux. He wishes to retain a command-based interface that uses "raw" terminal mode to detect errors immediately, i.e., not wait until a record is input and entered. The legacy interface also inserts prompts at various points.

Although I have recommended a GUI-oriented approach, which interests him, he is understandably concerned about acceptance by legacy users.

Having done things like this, the best thing I can suggest would be "Tell your legacy users to enjoy the new interface".

Seriously...keeping old/outdated stuff around because a few users will complain about the new interface is pointless. Update and move forward...and it's not like you're asking them to input characters in hex to get things done. A GUI is a MUCH easier learning curve than a CLI.
Quote:

It appears that non-canonical terminal input using ttydly() allows for a minimum delay of 0.1 second (1 unit of tenths-of-a-second).

Is there an alternative that would allow smaller delays, perhaps a variant of ttydly() that allows floating-point arguments less than 1?
What kind of software are you porting over? And why would a 0.1 second delay be what you're after, since the USER is going to be the slow part of the equation, and they absolutely will NOT be responding that quickly.

Can you give some screenshots/examples of what you're talking about?

Jay Braun 10-16-2013 02:32 PM

Maybe it's not the legacy users so much as the original developer's perception of what the original users expect. I think you are correct that they would adjust well to a GUI.

The second part of your response, I think, gets to the heart of the matter. I think that you envision a system where a key is pressed (or a few keys are entered) to respond to some query. In that case, yes, the user would be the bottleneck. But in this system, the user's input would be a military order that might begin with verbiage such as:

DEFINE SECTOR TIGER . . .

The design calls for a "beep" if the user makes an error, e.g., types SECTR instead of SECTOR, with no further input until it is corrected. But wait, it gets better. After the user enters the above, a prompt will appear in-line:

DEFINE SECTOR TIGER (centered at)

after which the user enters a latitude and longitude, with further prompting for radius and other characteristics.

In other words, the user is essentially typing whole sentences with no enter key until the very end. A good percentage of people type faster than 10 characters per second, at least in short bursts.
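In other words, the beep boils down to a prefix check against the command vocabulary as each character arrives. A rough sketch of the idea (the vocabulary and helper name are made up for illustration):

```python
COMMANDS = ["DEFINE SECTOR", "DELETE SECTOR"]  # hypothetical vocabulary

def accept_char(buffer, ch, commands=COMMANDS):
    """Return (new_buffer, beep): reject ch (beep=True) when buffer+ch
    can no longer grow into -- or extend past -- a known command."""
    candidate = buffer + ch
    ok = any(cmd.startswith(candidate) or candidate.startswith(cmd)
             for cmd in commands)
    return (candidate, False) if ok else (buffer, True)
```

Typing D-E-F-I-N-E ... S-E-C-T-R would beep at the R, leaving the buffer at "DEFINE SECT" until the user types the O.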

In any case, I would agree with whatever criticism you might have related to this mode of input; at this point, I am more interested in the technical answer -- is there a way to make the delay shorter on Linux?

Thanks,
Jay

TB0ne 10-17-2013 10:31 AM

Quote:

Originally Posted by Jay Braun (Post 5047059)
Maybe it's not the legacy users so much as the original developer's perception of what the original users expect. I think you are correct that they would adjust well to a GUI.

The second part of your response, I think, gets to the heart of the matter. I think that you envision a system where a key is pressed (or a few keys are entered) to respond to some query. In that case, yes, the user would be the bottleneck. But in this system, the user's input would be a military order that might begin with verbiage such as:

DEFINE SECTOR TIGER . . .

The design calls for a "beep" if the user makes an error, e.g., types SECTR instead of SECTOR, with no further input until it is corrected. But wait, it gets better. After the user enters the above, a prompt will appear in-line:

DEFINE SECTOR TIGER (centered at)

after which the user enters a latitude and longitude, with further prompting for radius and other characteristics.

In other words, the user is essentially typing whole sentences with no enter key until the very end. A good percentage of people type faster than 10 characters per second, at least in short bursts.

In any case, I would agree with whatever criticism you might have related to this mode of input; at this point, I am more interested in the technical answer -- is there a way to make the delay shorter on Linux?

There MAY be a way to do so, but I don't see this as something that would be handled through the terminal device, but through the program RUNNING on that terminal. Your software will be accepting the input and doing something...and passing that along later, at whatever point you deem the data sanitized.

And if you go for a GUI based solution, this is done ALL the time...simply by greying out the button(s) needed to move forward, until the data passes muster. A good case in point would be a field to enter an activation key that has to be 20 characters...until you actually PUT 20 characters in, the "OK" button is greyed out...and INSTANTLY becomes 'active' after the 20th key is pressed. This would be in the same vein...your context/spell checking would be running, and the GUI box (or CLI window), would beep/flash/whatever if there were error(s)/omissions, and once corrected, would allow the program to continue.
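Stripped of any particular GUI toolkit, that greying-out is just a predicate re-evaluated on every key event. A rough sketch of the activation-key example (all names made up):

```python
def button_states(keystrokes, required_len=20):
    """Simulate re-validating after each keypress: yield
    (text_so_far, ok_enabled), where the OK button becomes active
    only once the key field holds required_len characters."""
    text = ""
    for ch in keystrokes:
        text += ch
        yield text, len(text) == required_len
```

In a real GUI the same predicate would run in the field's on-change handler and toggle the button's enabled state.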


All times are GMT -5.