We need to streamline Qt Multimedia to have more predictable cross-platform support and to reduce our maintenance load.
We should aim to support the 90% use case and make the rest possible through hooks in the API, and all Qt Multimedia APIs should be supported and work on all backends. Additional APIs that are needed but can only be supported on specific backends should be exposed through platform-specific APIs, for example in a Qt GStreamer Extras module or similar.
A preliminary roadmap for Qt 6 was drafted here: Qt6 multimedia roadmap
In order to serve QML customers in Qt 6, we shall create a framework that fulfils the requirements of the high-level use cases and uses a lean-and-mean multimedia architecture. It will be built on Qt components originating from the old Qt Multimedia, but these will be kept private until further notice.
The components should also be isolated, so that if a customer does not require audio, the audio subsystem is not imported, and so on.
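The isolation goal above can be expressed at build time: an application links only the modules it actually uses, and anything it does not link is never pulled in. A minimal sketch in CMake, assuming a hypothetical module split (the target names below are illustrative, not an existing Qt module layout):

```cmake
# Hypothetical build file for a video-only application.
# Because nothing here links an audio target, the audio
# subsystem is never loaded by this application.
find_package(Qt6 REQUIRED COMPONENTS Quick Multimedia)

qt_add_executable(videoplayer main.cpp)

target_link_libraries(videoplayer PRIVATE
    Qt6::Quick
    Qt6::Multimedia   # video playback only in this app
)
```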
Existing Qt Multimedia plugins (video/videonode, audio, mediaservice, playlistformats, resourcepolicy) should work with the new Qt Multimedia API.
Overview of the high-level use cases in priority order. We should implement these from 1 to n in order of priority: higher-priority items in earlier Qt 6 releases and lower-priority items in later Qt 6 releases. That said, there may be dependencies where, architecturally, the implementation order may vary.
Supported Qt 6 Multimedia use cases must work on all supported reference targets (in the order of priority and sequence set in the Platforms Area roadmap). Performance (FPS, memory consumption, etc.) may vary between targets, but FPS should not drop below 24 on our samples, and our own pipeline should not be the limiting factor (the hardware may be).
Supported functionality in the first release is marked in bold in the list below.
- Play back (low- and high-resolution) video from a local file.
- Play back audio from a local file: MP3, possibly also AAC/Vorbis/Ogg. If the MP3 patents do not allow shipping the codec, document how to switch to AAC/Vorbis/Ogg.
- Sound effects: low-latency (max 20 ms delay) audio from WAV/PCM files.
- Support for both low- and high-resolution cameras.
- Ability to control video and audio streams (for example play, pause, resume, seek).
- Ability to draw overlays and add your own functions on top of the video stream.
- Stream and play back video from the network on the device.
- Utilization of GPU / HW acceleration in embedded HW.
- Extracting frames / video for analysis from a buffer/file.
- Ability to select which camera to use when more than one is available.
- Capture a picture (a frame) from camera to a buffer/file.
- Recording video from the camera to a buffer/file. If the camera stream contains audio, the recording captures both video and audio.
- Recording audio (assumed to be from a microphone).
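The core playback and control items above (play/pause/resume/seek of a local file, plus low-latency sound effects) map onto a small API surface. A minimal sketch, assuming the QMediaPlayer, QAudioOutput, and QSoundEffect classes as they shipped in Qt 6; the file paths are placeholders:

```cpp
#include <QGuiApplication>
#include <QMediaPlayer>
#include <QAudioOutput>
#include <QSoundEffect>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Video/audio playback from a local file with play/pause/resume/seek.
    QMediaPlayer player;
    QAudioOutput audioOutput;                            // Qt 6: audio routing is explicit
    player.setAudioOutput(&audioOutput);
    player.setSource(QUrl::fromLocalFile("movie.mp4"));  // placeholder path
    player.play();
    player.setPosition(60000);                           // seek to 1:00 (milliseconds)
    player.pause();
    player.play();                                       // resume

    // Low-latency sound effect from a WAV/PCM file.
    QSoundEffect effect;
    effect.setSource(QUrl::fromLocalFile("click.wav"));  // placeholder path
    effect.setVolume(0.8);
    effect.play();

    return app.exec();
}
```

The same controls are exposed to QML through the MediaPlayer and SoundEffect types, so the QML-first priority does not require a different architecture underneath.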
All other features not listed above will not be supported as part of Qt Multimedia; they will instead need a native / OS-specific solution. Multimedia works in both QML and Widget UI applications (if prioritization is needed, QML comes first).
- Wider set of codecs and formats (MP4, MP3, Ogg/Vorbis, VP8, VP9, AV1, etc.).
- Streaming video (and camera) out from the device (to the network over RTSP or TCP/IP).
- Camera/video settings, such as changing FPS, resolution, pixel formats, zoom, focus, white balance, etc.
- Custom GStreamer pipelines. Note that there are no technical limitations preventing this; it is currently supported in Qt 5.
- Support for more than one camera at a time.
- Capture image settings: codec, resolution.
- Image processing settings for captured images: brightness, contrast, etc.
- Extracting audio frames from camera and media player.
- Locking the camera.
- Exposure settings for the camera.
- Focus settings for the camera.
- Flash settings for the camera.
- Audio roles; supported MIME types for the media player.
- Metadata in media player.
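For reference, the custom-pipeline item above is what Qt 5 exposes on the GStreamer backend through the special gst-pipeline URL scheme. A minimal Qt 5 sketch (the pipeline string is only an example; any pipeline the backend accepts can be used):

```cpp
#include <QGuiApplication>
#include <QMediaPlayer>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Qt 5 (GStreamer backend): a custom pipeline is passed via the
    // "gst-pipeline:" URL scheme instead of a media file URL.
    QMediaPlayer player;
    player.setMedia(QUrl("gst-pipeline: videotestsrc ! autovideosink"));
    player.play();

    return app.exec();
}
```

Keeping this out of the core Qt 6 API and behind a platform-specific module (such as the Qt GStreamer Extras idea above) would preserve the capability without tying the cross-platform API to one backend.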