Details
- Type: Bug
- Resolution: Done
- Priority: P2: Important
- Fix Version/s: 5.0.0
- Labels: None
- Commit: 8fb379dc8a4d7069a99c3181283e581b86e99ceb
Description
While developing a QtWebKit2-based browser, I noticed that a file download can consume 100% CPU for many seconds. This happens when the file is served from QNetworkDiskCache and is not fully read in a slot directly connected to the QNetworkReply::readyRead() signal. I attached a test app, "slowcachedownload.tar.gz", demonstrating this code path (which is hit in QtWebKit2).
The problem comes down to this: the file is read into a QByteArray, which is appended to QNetworkReplyHttpImplPrivate::downloadMultiBuffer. From there it is read via QIODevice::readAll() 16 kB at a time, and on each read the remaining data is copied back to the start of the array.
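To see why this read pattern is slow, here is a minimal Qt-free sketch (the function name and use of std::string are ours, not Qt's) of consuming a buffer 16 kB at a time by compacting it after every read. Each compaction moves the entire tail to offset 0, so draining n bytes costs roughly n²/chunk byte copies overall, which is what burns the CPU for a ~30 MB download:

```cpp
#include <algorithm>
#include <cstddef>
#include <string>

// Hypothetical illustration of the slow path: after each 16 kB read, the
// remaining data is shifted to the front of the array. Returns the total
// number of bytes moved by those compactions (the wasted work).
std::size_t drainWithFrontErase(std::string buffer, std::size_t chunk)
{
    std::size_t copied = 0;
    while (!buffer.empty()) {
        const std::size_t n = std::min(chunk, buffer.size());
        // ... the caller would consume buffer.substr(0, n) here ...
        buffer.erase(0, n);          // shifts the whole tail to offset 0
        copied += buffer.size();     // cost of that shift, in bytes
    }
    return copied;
}
```

For a 64 kB buffer read in 16 kB chunks this already copies 96 kB of data it never needed to touch; the cost grows quadratically with buffer size, so a ~30 MB buffer moves on the order of tens of gigabytes.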
I made a fix that eliminates the unnecessary copying by adding a read-position variable to QByteDataBuffer. The fix and a unit test that includes benchmarks are attached as "qbytedatabuffer.tar.gz". The fix adds very little overhead in the best-case scenario and, in the worst case I encountered (a ~30 MB file), reduces user time from 10 s to practically 0 s.
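The idea behind the read-position fix can be sketched like this (a simplified, Qt-free model; the class and member names are ours, not the actual QByteDataBuffer patch): instead of compacting the array after every partial read, the buffer only advances an offset into the data and frees the chunk once it is fully consumed, so draining is O(n) regardless of how small the reads are:

```cpp
#include <algorithm>
#include <cstddef>
#include <string>

// Sketch of a buffer with a read position: consumed bytes are skipped,
// not copied back to the front of the array.
class OffsetBuffer
{
public:
    explicit OffsetBuffer(std::string data) : m_data(std::move(data)) {}

    // Read up to maxLen bytes; only advances the offset.
    std::string read(std::size_t maxLen)
    {
        const std::size_t n = std::min(maxLen, m_data.size() - m_pos);
        std::string out = m_data.substr(m_pos, n);
        m_pos += n;                      // advance instead of erasing
        if (m_pos == m_data.size()) {    // chunk fully consumed: drop it
            m_data.clear();
            m_pos = 0;
        }
        return out;
    }

    std::size_t bytesLeft() const { return m_data.size() - m_pos; }

private:
    std::string m_data;
    std::size_t m_pos = 0;               // the added read position
};
```

The trade-off is that partially consumed data stays resident until the chunk is fully read, which is why the best-case overhead of the real fix is small but not zero.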
Attachments
For Gerrit Dashboard: QTBUG-27522

| # | Subject | Branch | Project | Status | CR | V |
|---|---------|--------|---------|--------|----|---|
| 38035,3 | Improve QByteDataBuffer::read() performance with partial reads | master | qt/qtbase | MERGED | +2 | 0 |