Details
- Type: Bug
- Resolution: Duplicate
- Priority: P2: Important
- Affects Versions: 5.9.3, 5.11.0, 5.11.2
- Environment: Ubuntu 16.04 64bit, Intel Haswell card, Mesa 17.2.4
Description
I am experiencing very high CPU usage even when there are no changes in the scene.
My use case is a deferred pipeline, so I modified the deferred-renderer-qml example and disabled all the animations, making it essentially a still frame.
See attached screenshot.
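To give an idea of what I mean by a still frame, the scene can be reduced to roughly the following (a minimal C++ sketch using Qt3DExtras, not my actual modified QML example; the entity setup is purely illustrative):

// Minimal, completely static Qt 3D scene (no animations); just an
// illustration of a "still frame" setup, not the modified example itself.
#include <QGuiApplication>
#include <QVector3D>
#include <Qt3DCore/QEntity>
#include <Qt3DExtras/Qt3DWindow>
#include <Qt3DExtras/QTorusMesh>
#include <Qt3DExtras/QPhongMaterial>
#include <Qt3DRender/QCamera>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    Qt3DExtras::Qt3DWindow view;

    // Root of the scene graph; nothing below it changes after setup.
    auto *root = new Qt3DCore::QEntity;

    // A single static mesh with a plain material.
    auto *torus = new Qt3DCore::QEntity(root);
    torus->addComponent(new Qt3DExtras::QTorusMesh(torus));
    torus->addComponent(new Qt3DExtras::QPhongMaterial(torus));

    // Fixed camera, no animations or input handlers attached.
    Qt3DRender::QCamera *camera = view.camera();
    camera->lens()->setPerspectiveProjection(45.0f, 16.0f / 9.0f, 0.1f, 1000.0f);
    camera->setPosition(QVector3D(0.0f, 0.0f, 20.0f));
    camera->setViewCenter(QVector3D(0.0f, 0.0f, 0.0f));

    view.setRootEntity(root);
    view.show();
    return app.exec();
}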
In this case I am deliberately testing on a quad-core i5-4590 CPU @ 3.30GHz with Intel Haswell graphics on Mesa:
Vendor: Intel Open Source Technology Center (0x8086)
Device: Mesa DRI Intel(R) Haswell Desktop (0x412)
Version: 17.2.4
Accelerated: yes
Video memory: 1536MB
Unified memory: yes
Preferred profile: core (0x1)
Max core profile version: 4.5
Max compat profile version: 3.0
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.1
However, I have also seen this on an octa-core i7-7700 with an NVIDIA GTX 960M.
I believe Qt3D is not correctly processing changes to the G-buffer. In this case there are no changes at all, so I would expect nearly zero CPU usage, since none of the render textures change over time.
I am not sure whether this behaviour can also be seen with a forward pipeline, but I believe it is a major blocker when dealing with complex pipelines in Qt3D.
Just for my understanding: is the Qt3D core event-driven (e.g. camera changes, Qt Quick animations, etc.), or does it poll every entity in the frame graph looking for changes on every frame?
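For reference, the render policy on Qt3DRender::QRenderSettings seems related to this; a rough sketch of switching it to OnDemand is below (QRenderSettings and its OnDemand policy are public API; the helper function name is just for illustration, and whether this is the intended way to get near-zero CPU usage for a static scene is part of my question):

// Rough sketch: switching the render policy to OnDemand, which should make
// Qt 3D only submit render commands when the scene graph actually changes.
#include <Qt3DExtras/Qt3DWindow>
#include <Qt3DRender/QRenderSettings>

// Helper name is just for illustration.
void useOnDemandRendering(Qt3DExtras::Qt3DWindow &view)
{
    // The default policy is QRenderSettings::Always (render every frame).
    view.renderSettings()->setRenderPolicy(Qt3DRender::QRenderSettings::OnDemand);
}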
Attachments
Issue Links
- duplicates QTBUG-57791: High CPU usage with static 3D scene (Closed)