QQuickRenderControl, i.e. offscreen (render-to-texture) rendering, is not currently supported with the QRhi-based rendering path.
It is a key enabler for a number of use cases, such as:
- rendering Qt Quick content into textures, to be then used in a VR system
- the same, to be used by an external 3D engine
- reading back image data from the texture and saving to images or image sequences
- QQuickWidget (even though QQuickWidget itself is not supported in 6.0, that does not invalidate the need for QQuickRenderControl)
One interesting issue is that QRhi is private. Yet building a rendercontrol-based pipeline involves creating a QRhi instance on the application side, since there is no scenegraph render loop to do it for us. Then there is the question of the render target: we cannot simply pass a QRhiTexture / QRhiTextureRenderTarget through the public Qt Quick API while those QRhi classes are not public.
- For 6.0 this means working with the typical void * native object input (so one can pass in a VkImage and a layout, or an ID3D11Texture2D, etc.); this becomes the replacement for the OpenGL-specific QQuickWindow::setRenderTarget(GLuint). It is then complemented by helper functions that initialize a QRhi under the hood, just like a built-in scenegraph render loop would. This of course raises a number of issues to consider (importing existing device objects, etc.)
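To illustrate the idea behind the void * native object input (this is purely a hypothetical sketch, not Qt API: the struct and factory names below are invented for illustration), the public-facing type can carry a type-erased native handle plus any per-API extra state, such as a Vulkan image layout, without ever naming QRhi types:

```cpp
#include <cstdint>

// Hypothetical illustration only: a type-erased render target handle in the
// spirit of the "pass in a native object" approach. Not an existing Qt type.
struct NativeRenderTarget {
    uint64_t nativeObject = 0; // e.g. VkImage, ID3D11Texture2D *, GLuint
    uint32_t layout = 0;       // meaningful for Vulkan (VkImageLayout), 0 otherwise
    int width = 0;
    int height = 0;
};

// Per-graphics-API factory helpers, mirroring the kind of entry points the
// notes suggest (names are assumptions, not final API).
NativeRenderTarget fromVulkanImage(uint64_t image, uint32_t layout, int w, int h)
{
    return { image, layout, w, h };
}

NativeRenderTarget fromOpenGLTexture(uint32_t textureId, int w, int h)
{
    return { textureId, 0, w, h };
}
```

The application constructs such a value for whichever graphics API it uses, and the Qt Quick side wraps the handle in the appropriate private QRhi objects internally.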
- Once QRhi becomes at least semi-public, this debt can be addressed by introducing proper QRhi-based APIs.
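For illustration, a rendercontrol-driven frame on the application side could be shaped roughly like the following C++-style pseudocode sketch. The QQuickRenderTarget-style factory and the begin/sync/render/end sequence are assumptions about where the design could land, not final or existing API:

```cpp
// Sketch only: assumed API shape, not a committed interface.
QQuickRenderControl renderControl;
QQuickWindow window(&renderControl);   // window backed by the rendercontrol, never shown onscreen
QQmlEngine engine;
QQmlComponent component(&engine, QUrl(QStringLiteral("qrc:/main.qml")));
QQuickItem *root = qobject_cast<QQuickItem *>(component.create());
root->setParentItem(window.contentItem());

// Helper that initializes a QRhi under the hood, the way a built-in
// scenegraph render loop would.
renderControl.initialize();

// Render target given via native objects, not QRhi types (hypothetical factory):
window.setRenderTarget(QQuickRenderTarget::fromVulkanImage(image, layout, pixelSize));

// One frame:
renderControl.polishItems();
renderControl.beginFrame();
renderControl.sync();
renderControl.render();
renderControl.endFrame();
```

The point of the sketch is the division of labor: the application owns the loop and the target texture, while QRhi stays an internal detail behind the helpers.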
Ideally this should come with a set of new tests and examples, replacing the lonely old rendercontrol example. Examples are needed for more than one graphics API. Also consider the image-sequence use case described in https://www.qt.io/blog/2017/02/21/making-movies-qml as that may make a useful test or demo in some form.