LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   gettimeofday versus TSC register - coding (https://www.linuxquestions.org/questions/linux-software-2/gettimeofday-versus-tsc-register-coding-589097/)

RedOctober45 10-03-2007 07:56 AM

gettimeofday versus TSC register - coding
 
Ok, so for a class I am debugging a code snippet that compares the gettimeofday() function to the TSC register values. The idea is to grab a value from each, sleep for X microseconds, and then grab another value from each. The difference between the before and after values of each call is then analyzed (the delta of gettimeofday compared to the delta of the TSC). When I run the code on a Red Hat-based machine, the results make sense: the two deltas differ by only about 0.005%, which is normal.

However, when I try to run this code on a different Linux machine (FC7), the deltas are very sporadic and the percent difference can range from 2% to 80%. This makes no sense to me and I don't know why it is different. I ran it on a third platform (FC4) to see what was going on; that one was also somewhat sporadic, but the percentages were lower, usually around 5%, and more consistent than on FC7.

The following is a snippet of the main code so you can get the idea of what is going on:

Code:

#include <stdio.h>
#include <math.h>
#include <time.h>
#include <sys/time.h>

/* tsc_t, Init_TSC(), Read_TSC(), Convert_TSC(), and TSC_Microseconds
   come from the course's TSC helper header (not shown here). */

int main(void)
{
  struct timespec dt;
  struct timeval  bt, et;
  tsc_t           tsc_bt, tsc_et;
  double          delta_t, tsc_delta_t;
  int             i;

  printf("CPU clock: %.2f MHz\n", Init_TSC() / 1000000.0);

  for (i = 100; i < 500000; i += 100) {

    dt.tv_sec  = 0;
    dt.tv_nsec = i * 1000;          /* i microseconds, expressed in ns */

    gettimeofday( &bt, NULL );
    Read_TSC( tsc_bt );
    nanosleep( &dt, NULL );
    Read_TSC( tsc_et );
    gettimeofday( &et, NULL );

    /* Elapsed time in microseconds: the seconds part must be scaled
       by 1e6 before the microseconds part is added, otherwise any
       interval that crosses a second boundary is wildly wrong. */
    delta_t  = (et.tv_sec  - bt.tv_sec) * 1000000.0;
    delta_t += (et.tv_usec - bt.tv_usec);
    tsc_delta_t = Convert_TSC( tsc_et - tsc_bt, TSC_Microseconds );

    printf("Requested %6d uS sleep:  actual %8.3f, tsc %8.3f, delta %.6f, %.3f%% err\n",
           i, delta_t, tsc_delta_t, delta_t - tsc_delta_t,
           (fabs(delta_t - tsc_delta_t) / delta_t) * 100.0);

  }
  return 0;
}

Convert_TSC() just converts the raw tick count from the TSC into the time unit specified.

Thanks guys for any help!

sarans1987 08-27-2009 12:10 PM

Want ur entire code...
 
hi,
First of all sry..since i am not goin to giv the soln but wait.. iam a newbie n interested in ur code ..so i request u to upload the entire code...it will be very useful for me.

thanx...

TB0ne 08-27-2009 12:57 PM

Quote:

Originally Posted by sarans1987 (Post 3659734)
hi,
First of all sry..since i am not goin to giv the soln but wait.. iam a newbie n interested in ur code ..so i request u to upload the entire code...it will be very useful for me.

thanx...

Spell out your words, and don't reopen dead threads. This is two years old.

And sorry, I doubt anyone here is going to give you code so you don't have to write it. There are lots of C examples for measuring elapsed time that you can find on the Internet...try Google.

