Thanks for all the replies.
Just to clarify :-):
- I know what a millisecond, a microsecond, and a nanosecond are.
- I did not assume the resolution I got was nanoseconds just because the interface is expressed in nanoseconds.
- Our product needs to process complex events within latencies of 10 milliseconds (and millions of events per second), and it does so in our tests on other platforms.
- We require a time-sampling resolution of about 1 microsecond or better. Given these requirements, I do not think this is a design defect.
- We are just now starting to port to Linux.
Upgrading the Linux machine is what I'm going to ask for (and I will pass this on as a requirement to our clients as well).
It is still a bit of a disappointment to find that a Linux kernel from not so long ago does not properly expose the underlying machine's timing capability to user applications.
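For what it's worth, here is a minimal sketch (my own illustration, not taken from any of the earlier replies) of how I plan to check what the kernel actually hands to user space on a given box: clock_getres() reports the advertised resolution of CLOCK_MONOTONIC, and back-to-back clock_gettime() calls show the smallest step observable in practice.

#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec res, a, b;
    long long min_step = -1;

    /* Resolution the kernel claims for the monotonic clock. */
    if (clock_getres(CLOCK_MONOTONIC, &res) == 0)
        printf("clock_getres: %ld s %ld ns\n", (long)res.tv_sec, res.tv_nsec);

    /* Smallest non-zero step seen between two consecutive reads. */
    for (int i = 0; i < 1000000; i++) {
        clock_gettime(CLOCK_MONOTONIC, &a);
        clock_gettime(CLOCK_MONOTONIC, &b);
        long long step = (b.tv_sec - a.tv_sec) * 1000000000LL
                       + (b.tv_nsec - a.tv_nsec);
        if (step > 0 && (min_step < 0 || step < min_step))
            min_step = step;
    }
    printf("smallest observed step: %lld ns\n", min_step);
    return 0;
}

(On older glibc versions this needs -lrt when linking; newer ones have clock_gettime in libc itself.)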
Thanks again for all the information,