Details
- Bug
- Resolution: Unresolved
- P3: Somewhat important
- 5.10.0 Beta 4
Description
I have a config file like this (syntax is from http://doc.qt.io/qt-5/embedded-linux.html#touch-input-in-systems-with-multiple-screens-on-kms-drm):
{
  "device": "/dev/dri/card0",
  "hwcursor": false,
  "outputs": [
    {
      "name": "DVI1",
      "touchDevice": "/dev/input/by-id/usb-Advanced_Silicon_S.A_CoolTouch_TM__System-event-if00",
      "virtualIndex": 0
    },
    {
      "name": "DP2",
      "virtualIndex": 1
    }
  ]
}
I set QT_QPA_EGLFS_KMS_CONFIG=/home/rutledge/.screenlayout/eglfs-work.json and start up a Wayland compositor which spans both screens. Then I run a qml example which uses touch. I can touch on the right screen (the touchscreen, which is DVI1) to generate touchpoints on the left screen (DP2, which is not a touchscreen), which is of course not the intention.
If I run the qml example as qml mpta-crosshairs.qml -platform eglfs, it appears on the left monitor (the one at index 1, not index 0). And again, touchpoints on the right screen (which is not displaying the scene) appear on the left screen.
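Since eglfs reads this file as plain JSON, a quick sanity check before pointing QT_QPA_EGLFS_KMS_CONFIG at it is to parse it first; a single missing comma is enough to make the whole file invalid. A minimal sketch (not part of the original report):

```python
import json

# The KMS/DRM output config from the report, with correct JSON punctuation.
# A missing comma between key/value pairs would make json.loads raise here,
# which is easier to spot than debugging eglfs behavior at startup.
config_text = """
{
  "device": "/dev/dri/card0",
  "hwcursor": false,
  "outputs": [
    {
      "name": "DVI1",
      "touchDevice": "/dev/input/by-id/usb-Advanced_Silicon_S.A_CoolTouch_TM__System-event-if00",
      "virtualIndex": 0
    },
    { "name": "DP2", "virtualIndex": 1 }
  ]
}
"""

config = json.loads(config_text)
print([o["name"] for o in config["outputs"]])  # → ['DVI1', 'DP2']
```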
Attachments
Issue Links
- is duplicated by QTBUG-78839 Map touchscreen input device to specific display (Reported)
- relates to QTBUG-54151 Support for multiple touchscreens using eglfs (Closed)