Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
gettimeofday is a C function. You have to write a C program, compile it, and execute it, treating it as a Linux command. However, I agree with H_TeXMeX_H: use the time command.
If you really want to do your own calculation, you can add the %N specification to the date format to get nanoseconds. Then use bc or awk to compute the difference (since bash itself does not handle floating-point numbers). E.g.
I must have a stripped-down Fedora or something; I didn't have the "bc" command installed. I have it now, and it works like a charm. This place really kicks ass; I learn so much in a friendly way.
How accurate is this millisecond time? I've read around a few places, and I get the impression that it can be affected if there are other intensive processes running. Or do they have no effect on the time?
I guess it depends on what you want to do, but there's a subtle difference between accuracy and precision. Even using APIs you'll be millisecond-precise but not necessarily accurate. That is, you get units of milliseconds, but no guarantee that you're reading the timer at the right time.
Plus, if you're trying to time, say, a visual display to the millisecond, then you're likely to be out of luck. PCs, Macs and Linux machines all have similar problems, as they use the same hardware. Even RTOS systems won't help you all that much as soon as you interact with a standard TFT or keyboard.
For some background on the issues, take a look at our website and read about our Black Box ToolKit, which helps users achieve millisecond timing accuracy in experimental work. Just Google Black Box ToolKit to find our site.
Just to add something: you can easily create a C application that returns the current time in your preferred format. It looks like this:
Code:
#include <stdio.h>
#include <sys/time.h>

int main(int argc, char *argv[])   /* argv[1] could carry a format string, or
                                      whatever you need to influence the result */
{
    struct timeval tv;

    /* call gettimeofday; the obsolete timezone argument should be NULL */
    if (gettimeofday(&tv, NULL) != 0) {
        perror("gettimeofday");
        return 1;
    }

    /* process tv and format the output: seconds.microseconds */
    printf("%ld.%06ld\n", (long)tv.tv_sec, (long)tv.tv_usec);
    return 0;
}
Here you can find a lot of similar examples.
But actually we do not know why you want to use it; maybe the time command would be sufficient for you.
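For comparison, timing with bash's built-in time keyword might look like this (a sketch assuming bash, whose TIMEFORMAT variable controls the output; %3R prints elapsed real time with three decimal places, i.e. millisecond resolution):

```shell
#!/bin/bash
# bash's time keyword reports real, user, and sys time for a command.
# TIMEFORMAT='%3R' limits the report to elapsed (real) time in seconds,
# printed with millisecond resolution on stderr.
TIMEFORMAT='%3R'
time sleep 0.2
```

No compilation, no bc, and no manual subtraction needed.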
I am quite sure the last digits in the nanosecond format displayed are not very accurate.
If I am not mistaken, the kernel runs an internal counter faster than the 18 ms interrupt PCs have (do they still use this interrupt?). Still, intervals smaller than 20 ms measured in this way should be considered meaningless. Not just because of this timer interrupt: the OS has many more tasks to do, and you never know whether time was spent running your program or some kernel task with a higher priority.
If your command takes a few seconds to run, run it several times and the inaccuracy will average out. I am sure some statistics whizzkid will be able to tell you what accuracy you can expect if you run the program N times and the inaccuracy is Y ms.