Details
- Type: Bug
- Resolution: Fixed
- Priority: P1: Critical
- 6.4.0
- None
- Commits: 476960a858 (qt/qtmultimedia/6.4), bd2e80867e (qt/qtmultimedia/dev), 476960a858 (qt/tqtc-qtmultimedia/6.4), bd2e80867e (qt/tqtc-qtmultimedia/dev)
Description
I've written code that uses some of the new multimedia classes to capture video frames from the device's camera. I've attached the .cpp and .h files for the module that implements this support. After we call videoInputs() to enumerate the available cameras, select a camera, and select a video format, we execute the following code to do the necessary "plumbing" and initiate the flow of video frames:
// Create the capture session and a sink to receive frames
QMediaCaptureSession* session = new QMediaCaptureSession();
QVideoSink* sink = new QVideoSink();
// Wire the selected camera and the sink into the session
session->setCamera(selectedCamera);
session->setVideoSink(sink);
// Each new frame is delivered via onVideoFrameChanged()
connect(sink, &QVideoSink::videoFrameChanged, this, &UIVideoFrameGrabber::onVideoFrameChanged);
App.ScLog("AR background camera start");
selectedCamera->start();
On Windows, this code works perfectly, and we start receiving video frames via our onVideoFrameChanged() function, as expected.
On Android, although I can successfully detect the camera, get the list of formats, and select one, our onVideoFrameChanged() function never gets called. So we simply don't get any video frames.
Can you see anything wrong with my code that would account for this?
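For reference, the steps above can be collected into a minimal self-contained sketch. This is an assumption-laden reconstruction, not the attached module: UIVideoFrameGrabber, selectedCamera, and App.ScLog come from the attached files, so this version picks the first available camera, uses a lambda slot, and logs with qDebug() instead:

```cpp
#include <QGuiApplication>
#include <QCamera>
#include <QMediaCaptureSession>
#include <QMediaDevices>
#include <QVideoFrame>
#include <QVideoSink>
#include <QDebug>

int main(int argc, char* argv[])
{
    QGuiApplication app(argc, argv);

    // Enumerate cameras, as videoInputs() does in the original code
    const QList<QCameraDevice> cameras = QMediaDevices::videoInputs();
    if (cameras.isEmpty()) {
        qDebug() << "no camera available";
        return 1;
    }
    // Hypothetical choice: first camera, default format (the report
    // selects a specific camera and video format instead)
    QCamera camera(cameras.first());

    // The "plumbing": the session ties the camera to a sink that
    // surfaces decoded frames
    QMediaCaptureSession session;
    QVideoSink sink;
    session.setCamera(&camera);
    session.setVideoSink(&sink);

    // Stand-in for onVideoFrameChanged(): log each delivered frame
    QObject::connect(&sink, &QVideoSink::videoFrameChanged,
                     [](const QVideoFrame& frame) {
                         qDebug() << "frame" << frame.size();
                     });

    camera.start();
    return app.exec();
}
```

Note this sketch requires a Qt 6.4+ build with QtMultimedia and a camera device, so it will only exhibit the reported symptom (no frames) when deployed to an Android target.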