From: Michael Fawcett (michael.fawcett_at_[hidden])
Date: 2006-10-31 12:27:12


On 10/31/06, Philippe Vaucher <philippe.vaucher_at_[hidden]> wrote:
> >
> > You could use SetThreadAffinity to force QPC to only run on one core,
> > although I'm not sure what other ramifications to the timer's design
> > and use this might have.
> >
>
> This is interesting but unfortunately it'd mean that the whole thread runs
> on one core, which very likely most programmers won't be happy with... and
> running the QPC timer in a thread of its own just looks like overkill to me.

Definitely something the user should be made aware of. I agree it's
overkill for most, but it might be something users want (see below).
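
For illustration only (this isn't meant as the actual implementation), a
rough sketch of pinning the calling thread to one core around the QPC
reads might look like this:

    #include <windows.h>

    // Minimal sketch, not Boost code: force the QPC reads onto one core
    // by restricting the calling thread's affinity. The obvious drawback,
    // as noted above, is that the whole thread is now stuck on that core.
    class qpc_timer
    {
    public:
        qpc_timer()
        {
            // Pin the calling thread to the first CPU before any reads.
            SetThreadAffinityMask(GetCurrentThread(), 1);
            QueryPerformanceFrequency(&freq_);
            QueryPerformanceCounter(&start_);
        }

        double elapsed() const // seconds since construction
        {
            LARGE_INTEGER now;
            QueryPerformanceCounter(&now);
            return double(now.QuadPart - start_.QuadPart) / double(freq_.QuadPart);
        }

    private:
        LARGE_INTEGER start_;
        LARGE_INTEGER freq_;
    };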

> Maybe a mid solution would be to provide some macro allowing the user to
> make the lib automatically use SetThreadAffinityMask... but I think that
> simply mentioning the issue in the documentation is better.
>
> Is there really that much of a need for a QPC-based timer? In my current
> state of mind I really provide it as an alternative to the microsec_timer
> for those who specifically need it, but microsec_timer is portable and
> offers the same resolution as QPC...

Is that really the case? Microsoft's own documentation states:

"The default precision of the timeGetTime function can be five
milliseconds or more, depending on the machine. You can use the
timeBeginPeriod and timeEndPeriod functions to increase the precision
of timeGetTime. If you do so, the minimum difference between
successive values returned by timeGetTime can be as large as the
minimum period value set using timeBeginPeriod and timeEndPeriod. Use
the QueryPerformanceCounter and QueryPerformanceFrequency functions to
measure short time intervals at a high resolution."
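
Concretely, the timeGetTime path that quote describes looks roughly like
this (it needs <mmsystem.h> and linking against winmm.lib), and even at
its best it only reports whole milliseconds:

    #include <windows.h>
    #include <mmsystem.h>   // timeGetTime, timeBeginPeriod; link with winmm.lib

    // Sketch of the timeGetTime path described in the quote above.
    // Even with timeBeginPeriod(1), the returned value is in whole
    // milliseconds, so intervals below ~1 ms are not measurable this way.
    DWORD measure_ms()
    {
        timeBeginPeriod(1);             // request 1 ms timer resolution
        DWORD before = timeGetTime();
        // ... work being timed ...
        DWORD after = timeGetTime();
        timeEndPeriod(1);               // always pair with timeBeginPeriod
        return after - before;          // elapsed milliseconds, 1 ms steps at best
    }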

I have not done any tests to verify that QPC is indeed more accurate
over short intervals, but if that is the case, I think it should be
provided. Note that games often base their physics calculations on the
elapsed time per frame, and they need to behave the same regardless of
framerate. These intervals are often as small as 0.003 seconds,
sometimes smaller.
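
As a rough illustration (again, not Boost code), a per-frame delta with
QPC might look like the sketch below; a millisecond-granularity clock
would round a 0.003 second frame to 3 or 4 ms, which is a large relative
error:

    #include <windows.h>

    // Illustration only: a typical per-frame delta for physics. At ~300 fps
    // a frame is ~0.003 s; a millisecond-granularity clock would report it
    // as 3 ms or 4 ms (a 25-33% error), while QPC ticks are typically well
    // under a microsecond.
    double frame_delta_seconds(LARGE_INTEGER& last)
    {
        LARGE_INTEGER now, freq;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&now);
        double dt = double(now.QuadPart - last.QuadPart) / double(freq.QuadPart);
        last = now;
        return dt;
    }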

Perhaps of interest, NVIDIA has a Timer Function Performance test app
that shows the performance of various timing methods. I have no idea
whether the benchmark is well written, but the speed of the actual
timing function may be of interest to some users, as well as its
precision.

http://developer.nvidia.com/object/timer_function_performance.html
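
I haven't looked at how that app takes its measurements, but the basic
idea would just be timing a large number of back-to-back calls to the
timing function, something like:

    #include <windows.h>
    #include <cstdio>

    // Crude sketch of the idea behind such a benchmark: call the timing
    // function many times and see how long the calls themselves take.
    int main()
    {
        LARGE_INTEGER freq, begin, end, dummy;
        QueryPerformanceFrequency(&freq);

        const int calls = 1000000;
        QueryPerformanceCounter(&begin);
        for (int i = 0; i < calls; ++i)
            QueryPerformanceCounter(&dummy);
        QueryPerformanceCounter(&end);

        double total = double(end.QuadPart - begin.QuadPart) / double(freq.QuadPart);
        std::printf("%f microseconds per QueryPerformanceCounter call\n",
                    total / calls * 1e6);
        return 0;
    }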

--Michael Fawcett

