.NET Timer Resolution


The .NET Framework provides access to three different timer objects. Each type of timer can be configured to execute some code at periodic intervals. Windows Forms timers (System.Windows.Forms.Timer) are useful for user interface applications, but are difficult and inconvenient to use in non-UI applications.
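
For completeness, a minimal Windows Forms timer looks something like this sketch (the form and the one-second interval are just placeholders); the Tick event is raised on the UI thread, and only while a message loop is running, which is why this timer is awkward outside of UI code.

using System;
using System.Windows.Forms;

class TimerForm : Form
{
    public TimerForm()
    {
        // System.Windows.Forms.Timer raises Tick on the UI thread,
        // but only while the message loop is pumping.
        Timer uiTimer = new Timer();
        uiTimer.Interval = 1000;    // placeholder: one second
        uiTimer.Tick += (sender, e) => Text = DateTime.Now.ToLongTimeString();
        uiTimer.Start();
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new TimerForm());
    }
}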

Thread timers (System.Threading.Timer) are lightweight timers that you can configure to execute a callback function either one time after a specified delay, or periodically at a specified frequency.
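
As a quick illustration, here's a minimal sketch of both modes (the delay and period values are arbitrary): one callback that fires exactly once, and one that fires repeatedly until the timer is disposed.

using System;
using System.Threading;

class ThreadTimerDemo
{
    static void Main()
    {
        // One-shot: fires once after two seconds.
        // Timeout.Infinite as the period prevents it from repeating.
        Timer oneShot = new Timer(s => Console.WriteLine("one-shot fired"),
                                  null, 2000, Timeout.Infinite);

        // Periodic: first callback after one second, then every 500 ms.
        Timer periodic = new Timer(s => Console.WriteLine("periodic tick"),
                                   null, 1000, 500);

        // Let the timers run for a while, then clean up.
        Thread.Sleep(5000);
        oneShot.Dispose();
        periodic.Dispose();
    }
}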

The System timer (System.Timers.Timer) is a component that provides an event-oriented wrapper around the thread timer object. You can configure the timer for one-shot or periodic operation, and you can also stop and restart the timer as well as modify its interval.
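
Here's a similar sketch for the system timer (interval values are again arbitrary), showing the stop/restart and interval change mentioned above; setting AutoReset to false would give one-shot behavior instead.

using System;
using System.Timers;

class SystemTimerDemo
{
    static void Main()
    {
        Timer timer = new Timer(1000);              // one-second interval
        timer.Elapsed += (sender, e) =>
            Console.WriteLine("Elapsed at {0}", e.SignalTime);
        timer.AutoReset = true;                     // false = one-shot
        timer.Start();

        System.Threading.Thread.Sleep(3000);

        // Stop the timer, change its interval, and restart it.
        timer.Stop();
        timer.Interval = 250;
        timer.Start();

        System.Threading.Thread.Sleep(2000);
        timer.Dispose();
    }
}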

Thread timers, which also provide the functionality behind the system timers, are based on the Windows Timer Queue Timers. The timer queue is essentially a single underlying timer plus a queue that keeps track of the times at which callbacks are due; the timer is always set to fire at the next required time. This design makes very good use of system resources because it requires only one real timer object and a list of very small data structures that hold the timer frequencies and references to the callback functions.
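
To make that design concrete, here's a rough sketch of the idea in C#. This is my own illustration (it only handles one-shot entries), not the actual Windows implementation: a single underlying timer is always re-armed for the earliest due time in the queue.

using System;
using System.Collections.Generic;
using System.Threading;

// Rough sketch of the "one real timer plus a queue of due times" idea.
// Illustration only; the real timer queue lives inside Windows.
class TinyTimerQueue
{
    private readonly List<KeyValuePair<DateTime, Action>> entries =
        new List<KeyValuePair<DateTime, Action>>();
    private readonly Timer timer;

    public TinyTimerQueue()
    {
        // The single underlying timer; re-armed whenever the queue changes.
        timer = new Timer(s => Fire(), null, Timeout.Infinite, Timeout.Infinite);
    }

    public void Add(TimeSpan delay, Action callback)
    {
        lock (entries)
        {
            entries.Add(new KeyValuePair<DateTime, Action>(DateTime.UtcNow + delay, callback));
            Rearm();
        }
    }

    private void Fire()
    {
        List<Action> ready = new List<Action>();
        lock (entries)
        {
            DateTime now = DateTime.UtcNow;
            // Pull out every entry whose due time has arrived, then
            // re-arm the timer for whatever is due next.
            foreach (var e in entries)
                if (e.Key <= now) ready.Add(e.Value);
            entries.RemoveAll(e => e.Key <= now);
            Rearm();
        }
        foreach (Action callback in ready) callback();
    }

    private void Rearm()
    {
        if (entries.Count == 0) return;
        entries.Sort((a, b) => a.Key.CompareTo(b.Key));
        TimeSpan wait = entries[0].Key - DateTime.UtcNow;
        if (wait < TimeSpan.Zero) wait = TimeSpan.Zero;
        timer.Change(wait, TimeSpan.FromMilliseconds(-1));   // fire once
    }
}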

These timers are quite good, too, at lower resolutions. For example, if you want to be notified once per second or even 10 times per second (once every 100 milliseconds), the timers handle that, no problem. We can even write a program to prove it.

The program below creates a timer at the requested tick frequency and then runs for a minute, recording the elapsed time for each tick. When the minute is up, it does some quick analysis to determine how closely the program matched the requested frequency.

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;
namespace TimerTest
{
    class Program
    {
        const int TickFrequency = 1000;
        const int TestDuration = 60000;   // 60 seconds
        static void Main(string[] args)
        {
            // Create a list to hold the tick times
            // Pre-allocate to prevent list resizing from slowing down the test.
            List<double> tickTimes = new List<double>(2 * TestDuration / TickFrequency);
            // Start a stopwatch so we can keep track of how long this takes.
            Stopwatch Elapsed = Stopwatch.StartNew();
            // Create a timer that saves the elapsed time at each tick
            Timer ticker = new Timer((s) =>
            {
                tickTimes.Add(Elapsed.ElapsedMilliseconds);
            }, null, 0, TickFrequency);
            // Wait for the test to complete
            Thread.Sleep(TestDuration);
            // Destroy the timer and stop the stopwatch
            ticker.Dispose();
            Elapsed.Stop();
            // Now let's analyze the results
            Console.WriteLine("{0:N0} ticks in {1:N0} milliseconds",
             tickTimes.Count, Elapsed.ElapsedMilliseconds);
            Console.WriteLine("Average tick frequency = {0:N2} ms",
             Elapsed.ElapsedMilliseconds / tickTimes.Count);
            // Compute min and max deviation from requested frequency
            double minDiff = double.MaxValue;
            double maxDiff = double.MinValue;
            for (int i = 1; i < tickTimes.Count; ++i)
            {
                double diff = (tickTimes[i] - tickTimes[i - 1]) - TickFrequency;
                minDiff = Math.Min(diff, minDiff);
                maxDiff = Math.Max(diff, maxDiff);
            }
            Console.WriteLine("min diff = {0:N4} ms", minDiff);
            Console.WriteLine("max diff = {0:N4} ms", maxDiff);
            Console.WriteLine("Test complete.  Press Enter.");
            Console.ReadLine();
        }
    }
}

Running that program with a requested tick frequency of 1,000 (one tick per second) produces this output:

60 ticks in 60,001 milliseconds
Average tick frequency = 1,000.00 ms
min diff = 5.0000 ms
max diff = 15.0000 ms

Translation: the ticks happened once per second, with an error between 5 and 15 milliseconds. No tick occurred exactly on time. Every tick was at least five milliseconds late, and at least one was 15 milliseconds late.

Running the program at a requested tick frequency of 100 milliseconds shows that some ticks happened exactly on time, but at least one was 21 milliseconds late.

All these tests, by the way, are run on a quad-core 2 GHz machine with Windows 2008 and .NET 3.5. The machine isn't completely idle while I'm running the tests, but it's running at less than 10% CPU and isn't doing anything that should cause a big slowdown. In any case, real-world situations would have me using these timers in programs that do put high demands on the CPU, so I consider the results from these tests to be "best case."

The timers are reasonably good if you don't push them too hard. But at higher frequencies they tend to miss ticks. For example, a requested frequency of 20 times per second (once every 50 milliseconds) results in:

962 ticks in 60,000 milliseconds
Average tick frequency = 62.37 ms
min diff = 0.0000 ms
max diff = 25.0000 ms

It's possible that the missed ticks (we should have received 1,200 ticks in that period) are due to the timer callback taking too much time. That would be true if the timer were implemented to prevent re-entrancy, but I don't think that's the case: if I run with a requested frequency of one tick every 60 ms, I get the same total number of ticks and slightly different error stats:

962 ticks in 60,000 milliseconds
Average tick frequency = 62.37 ms
min diff = -1.0000 ms
max diff = 14.0000 ms

It looks like the timer's resolution is a multiple of 15 milliseconds. To test that, I tried to get a 15 millisecond timer. Here's what I got:

3,846 ticks in 60,000 milliseconds
Average tick frequency = 15.60 ms
min diff = -1.0000 ms
max diff = 16.0000 ms

That's pretty close. Here's what happens at 16 ms:

2,309 ticks in 60,000 milliseconds
Average tick frequency = 25.99 ms
min diff = -1.0000 ms
max diff = 16.0000 ms

Anything less than 15 ms, by the way, gives results that are almost identical to the 15 ms times.

What does it mean?

The conclusion I drew from all my testing is that, at best, you can count on a timer to tick within 15 milliseconds of its target time. It's rare for the timer to tick before its target time, and I never saw it be early by more than one millisecond. The worst case appears to be that the timer will tick within 30 milliseconds of the target time. Figure the worst case by taking your desired tick interval (e.g., once every 100 milliseconds), rounding up to the next multiple of 15, and then adding 15. So, absent very heavy CPU load that prevents normal processing, a 100 ms timer will tick once every 99 to 120 ms.
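
Expressed as code, that rule of thumb looks like this (a hypothetical helper of my own, not anything in the Framework):

using System;

static class TimerMath
{
    // Rule of thumb from above: round the requested interval up to the
    // next multiple of 15 ms, then add 15 ms for the worst case.
    public static int WorstCaseTickInterval(int requestedMs)
    {
        int roundedUp = ((requestedMs + 14) / 15) * 15;
        return roundedUp + 15;
    }

    static void Main()
    {
        // Prints 120: a 100 ms timer should tick once every 99 to 120 ms.
        Console.WriteLine(WorstCaseTickInterval(100));
    }
}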

You definitely can't get better resolution than 15 milliseconds using these timers. If you want something to happen more frequently than that, you have to find a different notification mechanism. No .NET timer object will do it.