Three of the four NXP i.MX 8 families support dual display output (two HDMI ports), and the i.MX 8 QuadMax/QuadPlus support up to four displays, on Yocto-based embedded Linux (as well as INTEGRITY and QNX). On Yocto-based embedded Linux the displays are exposed to developers as separate framebuffers.
We may need documentation and sample sources, and possibly a demo. In addition, if both displays have touch input, we need to clarify how a developer handles touch on two displays (and how the application determines which display a touch event originates from).
We need an engineering study to identify what is needed on the Qt side to support multi-display use cases: Are sample applications enough? What about the virtual keyboard and the touch interaction model?
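As one input to the study: on eglfs with the KMS/DRM backend, Qt can already drive several outputs and map a touchscreen to a specific display through the QT_QPA_EGLFS_KMS_CONFIG file. A minimal sketch follows; the DRM device path, connector names, and input event node numbers are illustrative assumptions for a particular board, not verified i.MX 8 values.

```shell
# Sketch: configure Qt's eglfs KMS/DRM backend for two displays with
# per-display touch mapping. Paths and connector names below are
# illustrative assumptions for a given board.
cat > /tmp/qt-kms-config.json <<'EOF'
{
  "device": "/dev/dri/card0",
  "outputs": [
    { "name": "HDMI1", "virtualIndex": 0, "touchDevice": "/dev/input/event2" },
    { "name": "HDMI2", "virtualIndex": 1, "touchDevice": "/dev/input/event3" }
  ]
}
EOF
export QT_QPA_PLATFORM=eglfs
export QT_QPA_EGLFS_INTEGRATION=eglfs_kms
export QT_QPA_EGLFS_KMS_CONFIG=/tmp/qt-kms-config.json
```

With such a mapping each connected display appears to the application as a separate QScreen, the application can place one fullscreen QWindow per screen, and touch events are delivered in the context of the screen whose touchDevice generated them. Whether this is sufficient for the use cases above (or needs samples and documentation on top) is exactly what the study should answer.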