Also ... "you'll never get nanoseconds." Especially when it comes to I/O latencies, which are almost exclusively ruled by the vagaries of hardware devices. The best you can hope for is a collection of samples from which to take means and standard deviations. Millisecond accuracy is certainly good enough, since a physical device is the ruling constraint, so I would probably round or truncate the measurements to milliseconds and treat the rest as noise.
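
As a rough sketch of what I mean (the file path, block size, and sample count here are arbitrary placeholders, not anything from a real benchmark), you can collect a batch of per-read timings, truncate them to whole milliseconds, and summarize with a mean and standard deviation:

```python
import statistics
import time

def sample_read_latency_ms(path, block_size=4096, samples=100):
    """Time repeated reads, returning latencies truncated to whole milliseconds."""
    latencies = []
    # buffering=0 avoids Python-level buffering; note the OS page cache
    # may still serve repeated reads, so this is only illustrative.
    with open(path, "rb", buffering=0) as f:
        for _ in range(samples):
            f.seek(0)
            start = time.perf_counter()
            f.read(block_size)
            elapsed_ms = (time.perf_counter() - start) * 1000
            latencies.append(int(elapsed_ms))  # truncate sub-ms noise
    return latencies

# Hypothetical usage: any readable file stands in for the device under test.
lat = sample_read_latency_ms("/tmp/testfile")
print(f"mean={statistics.mean(lat):.1f} ms, stdev={statistics.stdev(lat):.1f} ms")
```

The point is not the exact code but the shape of it: many samples, coarse units, summary statistics, rather than chasing a single high-resolution number.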