Type: Bug
Resolution: Done
Priority: Not Evaluated
Affects Versions: 4.8.4, 4.8.5, 4.8.6, 4.8.7
Labels: None
Environment: Tested on Arch Linux, Fedora 21-22, RHEL 5-6 + custom Qt builds; Core i7 4.3 GHz
Commit: bf5f2a9e3e3bf70c373b65bf95a332f4e1c514f9
 
Using the glib event dispatcher, a simple application that does nothing but start a timer with a 10 ms timeout consumes about 10% of a CPU (Core i7, 4.3 GHz). See the project in the attachment.
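For reference, here is a minimal sketch of such a reproducer (an assumption of what the attached project does; the attachment itself is authoritative):

    // main.cpp -- hypothetical minimal reproducer, not the attached project.
    // Build against Qt 4; by default QCoreApplication uses the glib dispatcher
    // when Qt was built with glib support.
    #include <QCoreApplication>
    #include <QTimer>

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);

        QTimer timer;
        timer.start(10); // repeating 10 ms timer; no connected slot is needed
                         // to exercise the dispatcher's timer handling

        return app.exec();
    }

Running this under the glib dispatcher shows the ~10% CPU usage described above; with QT_NO_GLIB set it does not.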
The results were the same for versions 4.8.4 to 4.8.7. I haven't tested with older versions.
Using the unix event dispatcher (by setting the "QT_NO_GLIB" environment variable), the CPU usage is almost negligible.
This performance issue does not exist on Qt 5.
Going through the source code, there appears to be a timeout rounding issue in the timerSourcePrepareHelper() function in qeventdispatcher_glib.cpp.
Indeed, in Qt 5 the timeout is rounded up to the next millisecond:
    timeout = (tv.tv_sec * 1000) + ((tv.tv_nsec + 999999) / 1000 / 1000);
while in Qt 4 the sub-millisecond part is truncated:
    timeout = (tv.tv_sec * 1000) + (tv.tv_usec / 1000);
Because of the truncation, once less than a millisecond remains before the timer fires, the computed timeout is 0 and glib polls in a tight loop until the timer actually expires; with a 10 ms timer that wastes roughly the last millisecond of every interval, which is consistent with the ~10% CPU usage. With a similar round-up applied to the microseconds, the performance issue disappears.
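For illustration, a sketch of what the correction looks like in timerSourcePrepareHelper(), mirroring the Qt 5 round-up (the actual attached patch may differ in details):

    // qeventdispatcher_glib.cpp, timerSourcePrepareHelper() -- sketch of the fix:
    // round the microsecond remainder up, as Qt 5 does with nanoseconds, so the
    // dispatcher never asks glib to wake up before the timer can have expired.
    timeout = (tv.tv_sec * 1000) + ((tv.tv_usec + 999) / 1000);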
I have attached a patch with the correction.