Dear All,
I was experimenting with timing and timing accuracy on the BBB, and I came across something very interesting yet strange!
Just calling clock_gettime() twice and measuring the time difference gives you horrendous timing. Here is my simple C code:
```
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec sT, eT;  /* clock_gettime() takes a timespec, not a timeval */

    clock_gettime(CLOCK_REALTIME, &sT);
    printf("\n%ld.%09ld\n", sT.tv_sec, sT.tv_nsec);

    clock_gettime(CLOCK_REALTIME, &eT);
    printf("\n%ld.%09ld  %ld uSec\n", eT.tv_sec, eT.tv_nsec,
           (eT.tv_sec - sT.tv_sec) * 1000000L + (eT.tv_nsec - sT.tv_nsec) / 1000);

    return 0;
}
```
Just two clock_gettime() calls show a difference of 2894 to 3200 microseconds (µs). How is such a huge gap possible!??
Can someone throw some light on this? And can it be improved or fixed?
Here is what I have tried so far to investigate:
- Running the same code on my i7: it gives an 80 µs to ~300 µs delay.
- Trying struct timeval with gettimeofday(): gives worse results!
- Running NTP: no impact.
Thank you, and I hope you find it interesting as well.