
Linux/AM5728: External Touchscreen HID

Part Number: AM5728

Tool/software: Linux

Customer is prototyping a dual-display application using the AM572x EVM. They are using an external HDMI display that supports a capacitive touchscreen; the touchscreen connects to the EVM as a USB HID device. However, they are seeing touches on the HDMI screen/USB HID also being recorded as touches on the EVM local display. Using a USB mouse does not show this issue.

They are using Linux PSDK v5.02, running Qt, and they tried following https://doc.qt.io/qt-5/embedded-linux.html#touch-input-in-systems-with-multiple-screens-on-kms-drm

Any suggestions to fix this?

  • Hello Lawrence,

    Please tell us more about the screen setup - e.g., the screens are not mirrored, are they?

    Could you provide more details about "they are seeing touches on the HDMI screen/USB HID being recorded also as touches on the EVM local display"? E.g., are they seeing touch values reported in the command line that are associated with the EVM local display even though they were made on the HDMI screen? Is there any debug information they can provide?

    Regards,
    Nick
  • Here is some sample debug output. They have overridden the QApplication::notify() method to snoop on QEvent::TouchBegin, QEvent::TouchUpdate, etc.
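    For illustration, a minimal sketch of that kind of override (illustrative only, not the customer's actual code):

        // Illustrative sketch: snooping on touch events from a
        // QApplication::notify() override. Not the customer's actual code.
        #include <QApplication>
        #include <QDebug>
        #include <QEvent>

        class SnoopApp : public QApplication {
        public:
            using QApplication::QApplication;

            bool notify(QObject *receiver, QEvent *event) override {
                switch (event->type()) {
                case QEvent::TouchBegin:
                case QEvent::TouchUpdate:
                case QEvent::TouchEnd:
                    // Qt's QEvent debug operator prints the full
                    // QTouchEvent(...) lines shown below.
                    qDebug() << event;
                    break;
                default:
                    break;
                }
                return QApplication::notify(receiver, event);
            }
        };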

    Here's a sample output when you touch a small portion of the primary (EVM) screen:

     

    QTouchEvent(TouchBegin device: "" states: TouchPointPressed, 1 points: (TouchPoint(2000001 (107,224) TouchPointPressed pressure 1 ellipse (8 x 8 angle 0) vel (0,0) start (107,224) last (107,224) delta (0,0)))

     

    QTouchEvent(TouchEnd device: "" states: TouchPointReleased, 1 points: (TouchPoint(2000001 (105,224) TouchPointReleased pressure 0 ellipse (8 x 8 angle 0) vel (0,0) start (107,224) last (105,224) delta (0,0)))


    and here's a sample output when you touch in more or less the same location on the HDMI touchscreen:

     

    QTouchEvent(TouchBegin device: "" states: TouchPointPressed, 1 points: (TouchPoint(2000001 (117,172) TouchPointPressed pressure 1 ellipse (8 x 8 angle 0) vel (0,0) start (117,172) last (117,172) delta (0,0)))

     

    QTouchEvent(TouchEnd device: "" states: TouchPointReleased, 1 points: (TouchPoint(2000002 (117,202) TouchPointReleased pressure 0 ellipse (8 x 8 angle 0) vel (0,0) start (117,172) last (117,202) delta (0,0)))


    As you can see, the touch point coordinates are very similar, (107, 224) vs. (117, 172), so of course these are going to get passed on to the same QWidget.

     

    If the mapping they set up in their eglfs_kms_cfg.json file had been used, they would have expected touch events from the HDMI screen to be remapped to something like (800 + 117, 172).

    Then Qt's event handlers would conceivably know to pass the event on to widgets displayed on the HDMI screen. But as it stands now, events look identical regardless of which screen they are generated on.
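
    For reference, the Qt page linked in the first post maps touchscreens to outputs with a "touchDevice" entry per output in that same KMS config file. A minimal sketch, where the connector names and event-device paths are assumptions that have to match the actual hardware:

        {
            "device": "/dev/dri/card0",
            "outputs": [
                { "name": "LVDS1", "touchDevice": "/dev/input/event0", "virtualIndex": 0 },
                { "name": "HDMI1", "touchDevice": "/dev/input/event5", "virtualIndex": 1 }
            ]
        }

    The file is picked up via the QT_QPA_EGLFS_KMS_CONFIG environment variable, which only matters when Qt is actually running on eglfs with the eglfs_kms backend - see the next posts.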

     

    Also, the only thing I think you really need to add to the EVM is a device like this:

     

     https://www.amazon.com/10-1inch-HDMI-LCD-case-Touchscreen/dp/B01H013FGC/ref=sr_1_14?keywords=hdmi+monitor+with+capacitive+touch&qid=1549906546&s=electronics&sr=1-14 

     

    and you should be able to see this exact issue with any of the sample Qt GUI applications.

  • Additional information:

    On TI's wiki here:

     

    http://processors.wiki.ti.com/index.php/Processor_Linux_SDK_Graphics_and_Display#QT_Graphics_Framework

     

    it is clear that Qt is currently running on wayland-egl. However, on this Qt website:

     

    https://doc.qt.io/qt-5/embedded-linux.html#display-output

     

    it looks like that applies only when Qt is running on eglfs. Since this latter link is the set of instructions the customer followed for setting up multiple screens, it's understandable that the JSON file is not being used.

     

    Therefore, it seems like perhaps they need to rebuild Qt with the -platform=eglfs option? So the question is: does this sound correct, and can anyone provide insight as to whether QWebEngine will run on eglfs, or is eglfs in the same boat as linuxfb (as you mentioned) and therefore likely not supported?

  • Hello Lawrence,

    In order to disable Wayland/Weston, please run the following command on the EVM: "/etc/init.d/weston stop". Next, take your Qt application and run it using the following command: "<application> -platform eglfs".
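
    Put together, the launch sequence would look roughly like this (the eglfs_kms integration and the config file path are assumptions; adjust them to your setup):

        # Sketch only: the integration and config path are assumptions.
        /etc/init.d/weston stop
        export QT_QPA_EGLFS_INTEGRATION=eglfs_kms
        export QT_QPA_EGLFS_KMS_CONFIG=/path/to/eglfs_kms_cfg.json
        <application> -platform eglfs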

    As an example, we have used the eglfs_kms QPA to render content on two different screens. Also, I was able to run the "simplebrowser" application using the eglfs option.

    Regards,

    Krunal

  • Update from the customer:

     

    This has not yet fully solved their issues, and running without Weston has created some new challenges. They explain below where they are at, both with and without Weston. As a refresher, they will have two Qt applications that they want to run full screen on the two displays, each receiving touch input from its respective display.

     

    Without Weston, using eglfs and the eglfs_kms backend

    1) Touch input from both screens is always routed to the application running on the primary display. There is a minor improvement: when they intercept the touch events, they are able to see which screen the events originated on. If this were the only problem, they could work around it by manually remapping the touch events based on the screen they originated on (see the sketch after this list).

    2) They can place the application in whichever screen they want, a big improvement over Weston.

    3) The second application does not run properly. When they instantiate the second QWebEngineView, the view is not displayed and they do not get any of the normal callbacks, such as loadStarted(), that they would expect.

     

    - They suspect that items 1 and 3 might be related - perhaps there is something wrong with how they create/instantiate the QApplications, and one is behaving like the "primary" one while the other isn't really a full-fledged application. They are investigating how this could be.
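
    As a rough illustration of the manual remapping mentioned in item 1, a sketch along these lines could extend the notify() snoop shown earlier - the "hdmi" device-name match and the 800-pixel primary-screen width are assumptions, and TouchPoint::setPos()/setTouchPoints() are documented as internal Qt API:

        // Hypothetical sketch: offset touch points that come from the HDMI
        // touch device so they land past the primary screen. The device
        // name match and the 800 px offset are assumptions.
        #include <QApplication>
        #include <QPointF>
        #include <QTouchEvent>

        class RemapApp : public QApplication {
        public:
            using QApplication::QApplication;

            bool notify(QObject *receiver, QEvent *event) override {
                const QEvent::Type t = event->type();
                if (t == QEvent::TouchBegin || t == QEvent::TouchUpdate ||
                    t == QEvent::TouchEnd) {
                    auto *touch = static_cast<QTouchEvent *>(event);
                    if (touch->device() &&
                        touch->device()->name().contains("hdmi")) {
                        QList<QTouchEvent::TouchPoint> pts = touch->touchPoints();
                        for (QTouchEvent::TouchPoint &p : pts)
                            p.setPos(p.pos() + QPointF(800, 0));
                        touch->setTouchPoints(pts);
                    }
                }
                return QApplication::notify(receiver, event);
            }
        };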

     

    With Weston

    1) Touch input from both screens is always routed to the application running on the primary display, with no way to determine which screen the events originated on and therefore no chance to remap them.

    2) Weston does not allow a program to decide which screen to draw itself on - it uses some kind of algorithm. They thought they had found a complicated but deterministic way to ensure that apps always came up on the right screen (it involved creating some temporary widgets before the app launches), but this turned out not to be reliable, so sometimes the applications come up on the wrong screen. They read briefly about some possible solutions that might be available through the QtWaylandClient interface, but there is very little documentation about it.

    They are kind of halfway there with both options right now, trying anything they can read about or think of that could poke away at a solution. Is it supposed to be possible to run two fully functional QApplications in full screen on different displays using eglfs? They understand that eglfs assumes the application is running full screen, and that running two applications at the same time would therefore not be supported unless there are two displays. Is it perhaps necessary to split up the HDMI and LVDS screens so they don't appear as one large framebuffer device (they only see /dev/fb0)?

    Any suggestions?

  • Hi Lawrence,

    If you are using eglfs, you cannot have two applications running; it only supports one application. Also, fb0 is no longer supported; please use the DRM driver instead.

    With regards to your touchscreen issue, could you please confirm whether a USB mouse shows a similar pattern when touch is disabled?

    Lastly, please refer to the following link to learn more about IVI shell: 

    Regards,

    Krunal

  • Hi Krunal,

    You probably missed it, but in my original post I stated that a USB mouse does not show this issue. I will let the customer know about the other information.

    Regards,

    Lawrence

  • Krunal,

    Customer looked into the IVI shell, but they do not understand exactly how it would help. They can run the examples on that page, and they can create layers and surfaces for the two screens. But only applications such as weston-simple-shm will run in those surfaces; they can't get any of the Qt sample applications to run. Are any of them supposed to, and if so, do you know what arguments they need to specify to get them to run? They get output like this:

     

    Using Wayland-EGL

    wlpvr: PVR Services Initialised

    Failed to load shell integration xdg-shell-v5

    Could not create a shell surface object.

    Could not create a shell surface object.

    Segmentation fault (core dumped)
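
    For what it's worth, the "Failed to load shell integration xdg-shell-v5" line suggests the client is not picking the compositor's IVI interface. In Qt 5, QtWayland lets you select the shell integration through environment variables; something along these lines might be worth trying (the surface ID value is an arbitrary assumption):

        # Sketch only: selecting the ivi-shell client integration.
        # The surface ID is an arbitrary, assumed value.
        export QT_WAYLAND_SHELL_INTEGRATION=ivi-shell
        export QT_IVI_SURFACE_ID=1337
        <application> -platform wayland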

    Regards,

    Lawrence

  • Hi Lawrence,

    Please look at the following example: 

     

    Regards,

    Krunal

  • Krunal,

     

    The IVI compositor runs for my customer, but they could not find any Qt applications that would then run with the -platform wayland option. Only apps such as weston-simple-shm ran with that compositor.

     

    However, they did find the multi-screen compositor example. It runs, and all the Qt applications, including multiple instances of their client app, can run simultaneously. They are currently trying to adapt that to their needs. Basically, they are trying to add code to the Wayland compositor to allow their client application to place its window on a specific monitor (because, for reasons they do not understand, Wayland completely forbids user applications from specifying where windows are placed on the desktop).

     

    They might also have a workaround for correctly mapping touch events to the correct application, by bypassing Qt's touch handling and using mtdev to pass messages to their JavaScript code (a rough sketch follows).
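
    For what it's worth, a minimal sketch of the mtdev-based reading they describe (the event node path is an assumption, and error handling is mostly omitted):

        // Hypothetical sketch: read multitouch events from one device via
        // libmtdev. /dev/input/event5 is an assumed node for the HDMI touch.
        #include <fcntl.h>
        #include <linux/input.h>
        #include <mtdev.h>
        #include <stdio.h>
        #include <unistd.h>

        int main(void)
        {
            int fd = open("/dev/input/event5", O_RDONLY);
            struct mtdev dev;
            if (fd < 0 || mtdev_open(&dev, fd) < 0)
                return 1;

            struct input_event ev;
            // mtdev_get() converts protocol-A streams into clean type-B
            // slot events that are easier to forward to the application.
            while (mtdev_get(&dev, fd, &ev, 1) > 0) {
                if (ev.type == EV_ABS && (ev.code == ABS_MT_POSITION_X ||
                                          ev.code == ABS_MT_POSITION_Y))
                    printf("code=%d value=%d\n", ev.code, ev.value);
            }
            mtdev_close(&dev);
            close(fd);
            return 0;
        }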

     

    Is there a better way to handle these issues?

    Regards,

    Lawrence

  • Hi Lawrence,

    I apologize for the delayed response, but I am wondering if you are still having problems with the Qt application.

    Regards,
    Krunal
  • Hi Lawrence,

    I will be closing this ticket; if you are still experiencing issues, please feel free to reopen it in the future.

    Regards,
    Krunal