
Linux/PROCESSOR-SDK-AM335X: LVDS video issues

Part Number: PROCESSOR-SDK-AM335X
Other Parts Discussed in Thread: DS90C385A

Tool/software: Linux

I’m having a problem getting LVDS video working.

Background: We've got a custom AM335x board that supports LVDS and HDMI video output (note: we use only one of these two video outputs at a time, based on which product the board is placed in – there is no need to be able to switch between outputs; EDIT: we don't even have a configuration for HDMI in the device tree when using LVDS). We've had HDMI working for a while and are now trying to get LVDS to work.

The CPU's LCD video pins go to three mux chips (one for the red pins and video control signals – e.g. vsync – one for the green pins, and one for the blue pins), and then the mux is switched to send the video signals to either an LVDS transmitter chip or an HDMI transmitter chip. The upshot of this is that the LCD pin mappings out of the CPU are the same for both HDMI and LVDS.

The Problem: Display timing appears to be set correctly in the device tree – the simple Qt GUI I'm displaying for testing is steady and positioned correctly – but I appear to be having color issues. More specifically:

  • Red and blue are swapped.
  • The LVDS display wants RGB888, and the data path between the CPU and LVDS display is wired for this. I’ve put some prints into the DRM and TILCDC code, and it’s reporting a pixel format of DRM_MODE_RGB888. But not only are red and blue swapped, it appears that something is set for 6-bit color depth (maybe RGB565?).

I know the above because, using my test GUI, I had Qt ramp the red channel of the window background from 0 to 255, but instead of seeing a steadily brightening red color, I saw a blue that brightened as the value stepped from 0 to 63, went black at 64, ramped in brightness to 127, went black at 128, ramped to 191, went black at 192, and then ramped up to 255.
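
That behavior is what you would expect if the two most-significant bits of each 8-bit channel were being dropped. A minimal C sketch of the arithmetic (illustrative only, not part of our actual test code):

        #include <stdio.h>

        /* If only the low 6 bits of each 8-bit color value reach the panel,
         * an 8-bit ramp wraps to black every 64 counts: black at 64, 128 and
         * 192, brightest at 63, 127, 191 and 255 - matching what I see. */
        int main(void)
        {
                for (int v = 0; v <= 255; v++) {
                        int effective = v & 0x3f;   /* two MSBs lost */
                        printf("requested %3d -> effective 6-bit value %2d\n",
                               v, effective);
                }
                return 0;
        }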

 At this point, I should mention that I’m aware of the AM335x LCD pin errata for red and blue pins being swapped when using 24 bpp (see AM335x errata document at https://tinyurl.com/y8y48dk9, page 8) and that our board has swapped the red and blue pins as described in the document to support RGB888 on the HDMI display. Given that video sent to our HDMI display is displaying correctly, I think the swap was done correctly (and our schematic bears this out).

I’ve spent several days messing around with this and can provide various information that any responders may want, but I’ll start with the following.

Device Tree:

Relevant device tree nodes.

        panel {
                compatible = "ti,tilcdc,panel";
                pinctrl-names = "default", "sleep";
                pinctrl-0 = <&lcd_pins_default>;
                /* pinctrl-1 = <&lcd_pins_sleep>; */
                status = "okay";

                panel-info {
                        ac-bias           = <255>;
                        ac-bias-intrpt    = <0>;
                        dma-burst-sz      = <16>;
                        bpp               = <24>;
                        fdd               = <0x80>;
                        sync-edge         = <0>;
                        sync-ctrl         = <1>;
                        raster-order      = <0>;
                        fifo-th           = <0>;
                };

                display-timings {
                        800x600 {
                                clock-frequency = <40000000>;
                                hactive = <800>;
                                vactive = <600>;
                                hfront-porch = <40>;
                                hback-porch = <88>;
                                hsync-len = <128>;
                                vback-porch = <23>;
                                vfront-porch = <1>;
                                vsync-len = <4>;
                                hsync-active = <1>;
                                vsync-active = <1>;
                                de-active = <1>;
                                pixelclk-active = <1>;
                        };
                };
        };

        &lcdc {
                status = "okay";
        };

 Note: I’ve omitted the pinctrl mapping.

 I also tried setting bpp in the device tree to 16, just for giggles, and saw no difference in the behavior.

 

Modetest:

Another thing that’s going on is that the LVDS display does not show any video at all until I run modetest at least once. This was not needed when using the HDMI display.

Speaking of modetest, I've also used it to display its color bar test image. This shows up fine when using the HDMI display, but when using the LVDS display, not only are the colors off (as expected based on the other color problems I'm seeing), but the color bars appear to be offset toward the bottom of the screen, with only a small portion of them showing. This would imply that maybe the display resolution or timing is not correct, but when I display my Qt test GUI, which is sized specifically for the display, everything appears in the correct location.

There is some other information I could relate – some things I'm seeing in the driver's print output that don't make sense to me – but given that I'm completely unfamiliar with the DRM video subsystem, and thus don't really know whether what I think is wrong actually is wrong, I'll hold off on that for now.

Any help would be greatly appreciated.

  • Hi Jeff,

    Given that everything works correctly on HDMI, this sounds more like a hardware issue. Are you sure that the LVDS transmitter is connected properly to the LCD signals? You could use this wiki as a reference: processors.wiki.ti.com/.../LCD_connectivity
  • Thanks for the prompt response, Biser.

    Not sure we've seen that wiki page, so we'll read it, and then check our pin mappings again. I'll let you know.

    Best,
    Jeff

  • Hi Biser -

    Thanks again for the wiki page. I've read it. As discussed on that page, our display is indeed capable of 18- or 24-bit RGB, and as far as we can tell, we're asserting the pin that puts it in 24-bit mode.

    As for our onboard video connections, we match the diagram labeled "Connection to 24-bit LCD Format 2", but please note the following:

    • We're using a DS90C385AMT LVDS transmitter instead of an LVDS83B transmitter. Our transmitter's inputs are all labeled with the same number as the LVDS83B, though, and our pins are mapped as in the diagram.
    • We're not connected directly from the AM335x to the transmitter (as noted in my OP, each color channel goes through a mux).
    • The OMAP DSS pins in the Format 2 picture on the wiki page do not appear to be arranged as they are on the AM335x. This can be seen in the errata document (as linked in my OP):

    OMAP DSS pins (as shown on the wiki page):

    DSS 0-7   = Blue 0-7
    DSS 8-15  = Green 0-7
    DSS 16-23 = Red 0-7

    AM335x, corrected for the errata:

    LCD_DATA pin:  23    22    21    20    19    18    17    16    15-11   10-5    4-0
    Color bit:     B[0]  G[0]  R[0]  B[1]  G[1]  R[1]  B[2]  R[2]  B[7:3]  G[7:2]  R[7:3]

    I'd note that even without the errata correction (please see the document for the details; we have implemented it on our board), R, G, and B appear to be mapped to the AM335x LCD_DATA pins differently than they are to the OMAP DSS pins in the picture on the linked-to page.

    Regardless, both I and the original board designer have - using our board and LVDS cable schematics and the mux, LVDS transmitter, and LCD display datasheets - verified the pixel connections several times (again, as far as we can determine), and I just went over the connections between the muxes and the transmitter one more time using the wiki page.

    Short of somehow putting a scope on these connections, we're pretty sure we're connected correctly.

    I'm still wondering why I need to run the modetest command before I'm able to get any video. I'm also wondering whether doing so is putting the driver in some sort of 6-bit mode instead of the 8-bit mode we need. As far as I know, the DRM/TILCDC driver is not capable of RGB666, but maybe it's getting put into some 565 mode?

    Thanks again!
    Jeff

  • Jeff Fuller said:
    We're using a DS90C385AMT LVDS transmitter instead of an LVDS83B transmitter. Our transmitter's inputs are all labeled with the same number as the LVDS83B, though, and our pins are mapped as in the diagram.

    What is important here is the LVDS transmitter sequencing, i.e. how the parallel signals are mapped on the serial side. Can you compare both transmitters to see if they use the same LVDS format?

    Jeff Fuller said:
    The OMAP DSS pins in the Format 2 picture on the wiki page do not appear to be arranged as they are on the AM335x.

    This is true and you should ignore it. Just make sure your parallel signals match the AM335x arrangement as given in the Errata.

  • Hi Biser,

    Yes, we've compared the transmitter data from the wiki page (LVDS83B) to the transmitter we're using (DS90C385AMT), and the parallel to serial mapping is the same:

    (Note that the two diagrams list the serial lines in opposite order).

    Any other suggestions?

    Thanks!

  • Jeff:

    Could you confirm if the PCLK is muxed or directly connected to both the LVDS and HDMI transmitter devices?

    If you happen to have access to the PCLK (from the SoC), or the TXCLKIN pin of the LVDS transmitter device on the board, can you check it on the scope to see if the clock signal is clean when switched to the LVDS transmitter?

    Regards

    Jian

  • Thanks for the response, Jian.

    Indeed, PCLK is muxed. Yes, we can try to scope it out. In fact, we need to scope some of the pixel lines too, because frankly, I've become quite confused, as described below.

    I changed our device tree back to its HDMI configuration and used printks from the DRM driver, output from the fbset tool, and a dump of the contents of the frame buffer itself (via /dev/fb0) to inspect what color depth/pixel order the DRM driver is in when using HDMI. Surprisingly, everything points to the driver being in RGB565 mode. This stunned me as I had assumed all along that the driver would be in RGB888 for HDMI because we're sending all 24 LCD_DATA pins through the muxes and out the HDMI transmitter. And since the driver video mode when using HDMI is set from EDID information, as opposed to the device tree, and the video had just worked, I had never had reason to actually check what mode the driver was in.

    The problem is this: if, when using HDMI, the driver is actually in RGB565, as opposed to RGB888, then red and blue should be swapped for HDMI just as we see on LVDS, because according to our board schematic, we've wired the LCD_DATA pins for RGB888 as shown in Figure 2 of the AM335x errata. And as the errata states, if the driver is configured for RGB565 with the pins wired that way, red and blue should be swapped. But those colors are not swapped when using HDMI (another reason I thought the DRM driver was in RGB888 mode when using HDMI). So this is confusing.

    Now, back to LVDS: I had set the bpp field in the panel device tree node (this node is only used for LVDS) to 24 (again, based on the fact we had all 24 LCD_DATA pins connected), and that indeed puts the driver in RGB888 mode (as confirmed via fbset, the frame buffer contents, and driver printks, same as above). Of course, if what we're actually wired for is Figure 3 in the Errata doc, then yes, it makes sense that with the driver in RGB888 mode we'd see red and blue swapped, and it also makes sense that we'd have only 6 bits of color (as described in my original post, my test program that ramps a flat field of red or blue from 0 to 255 results in black at 0, 64, 128, 192, and the brightest color at 63, 127, 191, and 255, just as if we were limited to 6 bits per color).

    Now, our LVDS display happens to be capable of RGB666 as well as RGB888. I've tried putting the display in that mode and setting the bpp field in the device tree to 16, and while that does put the DRM driver into RGB565 mode, the display shows garbage when I do this.

    Does anyone know if the display's RGB666 mode would be compatible with the DRM driver's RGB565 mode, assuming all the signals are hooked up correctly? I expect that the LCD_DATA pins for Red[0] and Blue[0] would just carry 0, and thus, if we have the hardware connected correctly, our display would work?

    Also, based on the things I've posted, does anyone have the impression we're reading the errata document wrong here?

    Some things we're going to try here:

    • Scoping various signals as discussed above
    • Follow the LVDS data path from the CPU pins out through the cable to try to ensure that, if we are for some reason (incorrect schematic?) wired as per Figure 3 in the errata doc, the display would work in its RGB666 mode. Maybe our cable needs some pins swapped.

    Any thoughts anyone has would be appreciated.

    Thanks,

    Jeff

  • Jeff:

    Thanks for the detailed description. I think there are several items:
    1. why R/B color was not swapped when the HDMI display set the LCDC to RGB565 mode via EDID
    2. why the color got swapped on the LVDS display
    3. why we need to run modetest once before we can see the LVDS display
    I will check with the driver team on the EDID conditions, assuming that there will be EDID via HDMI but none from LVDS. I will also confirm why we are seeing the opposite of what the Errata describes.

    I have a couple of further questions for you in the meantime:
    1. I read the DS90C385A spec; it does not support RGB666 mode. So even if the panel does support RGB666 (we can wire RGB565 to RGB666), you will get garbage due to the DS90C385A. In fact, RGB666 only uses three data lanes instead of four, so the serial data packing is quite different. I suspect the panel just ignored the fourth data lane and interpreted only three.
    2. Can you confirm whether you can set your HDMI monitor to RGB888 mode? Hopefully that will trigger EDID and put the driver in the right mode.

    regards
    jian
  • Hi Jian,

    Thanks for the prompt reply - I agree with your item list. And thanks for pointing out that the LVDS transmitter doesn't support RGB666 - that explains why I'm seeing the garbage.

    I have tried to put things in RGB888 when using HDMI - I figured it would lend further evidence toward determining which way our LCD_DATA pins are wired. I've tried a few things, but I'm not sure how to force it into that mode (default is RGB666). I used modetest to dump the various modes that EDID says the display supports, but there is no color order or bit depth information in the output from modetest (at least not that I can see), only resolution and display timing, so I'm not sure I can use modetest to do it.

    I also tried hacking the DRM driver to force RGB888, but that hung the kernel during boot (I did get an error message and so I know where in the driver it died).

    Do you have any idea how I might force RGB888 when using HDMI?
  • Jeff:

    I asked the driver team the same question on how to force RGB888 when using HDMI. I will post back when he answers. It may be overnight, as he is in the Europe time zone.

    On the HDMI display, could you confirm what resolution you were trying to display? If it is a standard HD monitor, I am guessing the monitor gives back its supported modes and the AM335x picked the lower-resolution settings (as it does not support HD), and the choices were limited.

    Jian
  • Great, thanks Jian.

    The driver is picking a 1280x800 mode, which is actually the native resolution of the display (this is a small display for a medical device). I still don't know why it defaulted to RGB565 instead of RGB888, though. The transmitter is capable of it, and so is the display, though I should state here that I do not have a very good datasheet for the display - it's a third-party custom part for us, and the datasheet is more focused on mechanical and low-level electrical signals than on the HDMI receiver/LCD display characteristics.

    I can try to get our manufacturing guys to get a better datasheet from the third party if we need to.

    Edited to add:

    In case the driver guys want to know how I'm determining which mode the display is in, I've done this in three different ways (the following results are for HDMI, while LVDS shows RGB888):

    • printk output from drm driver's drm_fb_get_bpp_depth function shows this: 

    int that holds the current pixel order/bits per pixel = DRM_FORMAT_RGB565 = RG16 little-endian (0x36314752)
    depth = 16
    bpp = 16

    • fbset -v -i shows:

    rgba 5/11,6/5,5/0,0/0

    • When I dump the frame buffer (/dev/fb0) with the R, G, and B pixels set to recognizable 8-bit hex values (e.g. R=0xaa, G=0xcc, B=0xdd in my test GUI via Qt calls), I don't see those values in the frame buffer (I do when running the LVDS display at RGB888), which I assume is because they've been truncated or packed in some way to fit RGB565. (A small C cross-check sketch follows this list.)
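
    For reference, here is a minimal sketch of the cross-check mentioned in the last bullet. It assumes /dev/fb0 and just uses the standard linux/fb.h ioctl, so it should report the same layout fbset does:

        /* Minimal sketch: read the current framebuffer pixel layout via the
         * standard FBIOGET_VSCREENINFO ioctl, as a cross-check of the fbset
         * and driver printk observations above. Assumes /dev/fb0. */
        #include <fcntl.h>
        #include <linux/fb.h>
        #include <stdio.h>
        #include <sys/ioctl.h>
        #include <unistd.h>

        int main(void)
        {
                struct fb_var_screeninfo var;
                int fd = open("/dev/fb0", O_RDONLY);

                if (fd < 0 || ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0) {
                        perror("fb0");
                        return 1;
                }
                printf("%ux%u, %u bpp\n", var.xres, var.yres, var.bits_per_pixel);
                printf("red   %u/%u\n", var.red.length,   var.red.offset);
                printf("green %u/%u\n", var.green.length, var.green.offset);
                printf("blue  %u/%u\n", var.blue.length,  var.blue.offset);
                /* RGB565 shows up as red 5/11, green 6/5, blue 5/0,
                 * matching the "rgba 5/11,6/5,5/0,0/0" line from fbset. */
                close(fd);
                return 0;
        }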

  • Jeff:
    Thanks for the details. I think this is good enough for driver team to get a read. I will update you tomorrow.
    Jian
  • Jeff:

    see my questions to the driver team and answers. 

    Can you check which kernel you are using?

    thanks

    Jian

    1. There is an active Errata on the AM335x LCD that says the color pins will be swapped if the LCD is in RGB888 mode: http://www.ti.com/lit/er/sprz360i/sprz360i.pdf. But the customer is seeing the opposite behavior. The question is: "Does the driver automatically swap the Red and Blue data?"

    [A]Different kernel versions have dealt with the HW errata in different ways. The more recent kernels do it correctly: in the device tree data you need to define whether your board has the color lines swapped or not. This then affects which pixel formats the DRM driver supports: either RGB565 + BGR888, or BGR565 + RGB888. Note that there are many applications that don't support BGR formats, so even if the driver says it supports BGR888, it may be that the userspace refuses to use it. In earlier kernels, if I recall right, the driver just said it supports RGB565 and RGB888, even if the other one was actually BGR.

    2. For HDMI display, they configured device tree for RGB888, but EDID overwrote the configuration and put driver to RGB565. Is this expected? Can we disable the EDID overwrite?          

    [A] EDID doesn't affect the pixel format, so it can't "overwrite" it.

    3. Related to #2 above, for their LVDS display, they have to run the “modetest” first, before they can see LVDS video. Are there parameters not initialized in the actual driver?

    [A]I don't know what's causing this. If they have fbconsole enabled, they should get an image on the LVDS display when the display driver is loaded.

    I suggest looking at the mainline kernel, and how AM3 beaglebone and AM3 EVM do things. If I recall right, beaglebone has HDMI in rgb565 setup, and AM3 EVM has LCD in rgb888 setup.

  • Jeff:

    Further Q&A on HDMI driver behavior; these questions were posed to get confirmation/clarification from the driver team:

    1). Resolution and frame rates will be automatically selected by driver via EDID?
    [A] The drivers often don't do anything automatically. It's the applications that should decide what to do. That said, there's the framebuffer console, created by the kernel based on the EDID (if EDID is available). But applications don't usually use that fb; it's mostly for boot messages and such.

    2). EDID will overwrite device tree specifications for resolution and frame rate?
    [A] This is up to the HDMI encoder driver. But if it supports EDID, I find it odd that it would also support getting the data from device tree.

    3). Pixel format, e.g. bpp, will not be determined by the EDID; instead, it is specified in the device tree?
    [A] Pixel format is not defined in the device tree, it's up to the application to allocate the buffer with the pixel format it wants to use (limited, of course, by the formats supported by the driver).

    Jian
    Jian, my responses to both of your posts are inline below.

    jian35385 said:

    1. There is an active Errata on the AM335x LCD that says the color pins will be swapped if the LCD is in RGB888 mode: http://www.ti.com/lit/er/sprz360i/sprz360i.pdf. But the customer is seeing the opposite behavior. The question is: "Does the driver automatically swap the Red and Blue data?"

    [A]Different kernel versions have dealt with the HW errata in different ways. The more recent kernels do it correctly: in the device tree data you need to define whether your board has the color lines swapped or not. This then affects which pixel formats the DRM driver supports: either RGB565 + BGR888, or BGR565 + RGB888. Note that there are many applications that don't support BGR formats, so even if the driver says it supports BGR888, it may be that the userspace refuses to use it. In earlier kernels, if I recall right, the driver just said it supports RGB565 and RGB888, even if the other one was actually BGR.

    I actually came across this patch to the device tree and driver online the other day: https://patchwork.kernel.org/patch/9308611/

    Is this what they're talking about? I was going to try to apply the patch as an experiment, but unfortunately, it looks like our kernel is of an older version that won't work with the patch, as the patch modifies a certain function that is not present in our version of the driver. We're currently at AM335x Processor SDK 2.00.00.00 (here is the "uname -a" output: Linux DRXC-12 4.1.6-g52c4aa7 #44 Wed Jun 7 11:00:47 EDT 2017 armv7l GNU/Linux)

    Do they agree that this kernel is too old to support the patch?

    jian35385 said:

    2. For HDMI display, they configured device tree for RGB888, but EDID overwrote the configuration and put driver to RGB565. Is this expected? Can we disable the EDID overwrite?          

    [A] EDID doesn't affect the pixel format, so it can't "overwrite" it.


    Right - apologies if I wasn't clear on this, but when using HDMI, the device tree doesn't support setting bits per pixel.

    jian35385 said:

    3. Related to #2 above, for their LVDS display, they have to run the “modetest” first, before they can see LVDS video. Are there parameters not initialized in the actual driver?

    [A]I don't know what's causing this. If they have fbconsole enabled, they should get an image on the LVDS display when the display driver is loaded.

    I suggest looking at the mainline kernel, and how AM3 beaglebone and AM3 EVM do things. If I recall right, beaglebone has HDMI in rgb565 setup, and AM3 EVM has LCD in rgb888 setup.

    I'm not sure what fbconsole is - I'll research this, but if they can tell me how to enable it, that would be helpful.

    I did indeed base our device tree entry for LVDS on the EVM device tree (or maybe it was the starter kit EVM).


    1). Resolution and frame rates will be automatically selected by driver via EDID?

    [A] The drivers often don't do anything automatically. It's the applications that should decide what to do. That said, there's the framebuffer console, created by the kernel based on the EDID (if EDID is available). But applications don't usually use that fb; it's mostly for boot messages and such.

    So we're using psplash for a splash screen during boot, and then a Qt GUI app after that. I'm not totally familiar with our GUI app, but I'm not aware of it requesting anything specific; we set these Qt environment variables before starting the GUI:

    export QWS_SIZE=1280x800 (this is for HDMI; it's 800x600 for LVDS)

    export QWS_DISPLAY="LinuxFb"

    and we pass -display LinuxFb to the gui as arguments (and pass them on to the Qt application class).

    I don't think, when using HDMI, that our application asks for any specific frame rate or bits per pixel - we've just been using what the driver selected by default. Again, I'm not familiar with fbconsole - frankly, I've never had to delve into the Linux video subsystem; when we got HDMI going two years ago, it just worked once we had the device tree configured for it. So I appreciate you bearing with me.

    2). EDID will overwrite device tree specifications for resolution and frame rate?
    [A] This is up to the HDMI encoder driver. But if it supports EDID, I find it odd that it would also support getting the data from device tree.

    Yes, this is correct. Our device tree when using HDMI does not do anything to specify resolution, frame rate, display timing, or bpp. It's only when using LVDS that the device tree contains this information.

    3). Pixel format, e.g. bpp, will not be determined by the EDID; instead, it is specified in the device tree?
    [A] Pixel format is not defined in the device tree, it's up to the application to allocate the buffer with the pixel format it wants to use (limited, of course, by the formats supported by the driver).

    Okay, this sounds like the meat of the issue when it comes to getting the HDMI into RGB888. As stated above, we're using Qt, so based on this answer, it seems Qt must be asking for RGB565? Are they aware of how to use Qt to ask for an RGB888 mode?

    At this point, I just want to restate the issues to keep us focused: We're interested in getting HDMI into RGB888 for test purposes only. Our schematic seems to show that we've wired the LCD_DATA pins per Figure 2 in the errata, which means that when using HDMI in RGB565, red and blue should be swapped. But they're not, so the reason for putting HDMI into RGB888 is to see whether that swaps red and blue.

    As for LVDS, we must use it in RGB888, since the transmitter will not support RGB565. But when it's in RGB888, red and blue are swapped, and we're only getting 6 bits of pixel range, as opposed to 8.

    So I'm going to research how to use Qt to set a color mode of RGB888 when using HDMI so that we can see whether red and blue are swapped. We also have to consider how we're going to get LVDS into BGR888, as it seems that may be the ultimate solution here. Will it require that driver patch? If so, will we need the latest TI SDK version? And will Qt even support doing so?

    Thanks again to you and the driver team.

    Jeff

  • Jeff Fuller said:

    We're currently at AM335x Processor SDK 2.00.00.00 (here is the "uname -a" output: Linux DRXC-12 4.1.6-g52c4aa7 #44 Wed Jun 7 11:00:47 EDT 2017 armv7l GNU/Linux)

    Do they agree that this kernel is too old to support the patch?

    Yes, this PSDK version is too old. We recommend migrating to the latest PSDK (version 3.3) for better support. The PSDK 2.0 version doesn't support configuration of the RGB mode via the dts file.

  • Hi Manisha,

    Okay, so I want to be clear. Moving to 3.3 will allow us to put the driver in BGR888 when using LVDS, and that will fix the swapping of red/blue?

    When using LVDS, we also have the problem where it seems like our dynamic range is only 6-bits per color despite the driver being in an 888 mode. If this persists upon updating the SDK and switching to BGR888, we'll still have color problems. Do we expect updating to SDK 3.3 will fix this as well?

    And finally, do you recommend we just move ahead with the update and not bother doing further tests that involve trying to get the driver into RGB888 when using HDMI?

    Thanks!

    Jeff

    Late Edit:

    Another question for you, unrelated to our video problem. I believe SDK 3.3 has Qt 5.6 in it. Do you guys know if/when you'll release an SDK with Qt 5.8 or 5.9 (5.9 being the Qt version we'd prefer)? Even a rough time frame would be helpful. Thanks!

  • Jeff Fuller said:
    Okay, so I want to be clear. Moving to 3.3 will allow us to put the driver in BGR888 when using LVDS, and that will fix the swapping of red/blue?

    Moving to 3.3 will give you control to tell the driver which wiring scheme you have. I can't comment on whether it will fix the problem unless the problem is a wrong instruction to the driver.

    Jeff Fuller said:
    When using LVDS, we also have the problem where it seems like our dynamic range is only 6-bits per color despite the driver being in an 888 mode. If this persists upon updating the SDK and switching to BGR888, we'll still have color problems. Do we expect updating to SDK 3.3 will fix this as well?

    I would say, please check the Qt app to see what color scheme it is using for rendering.

    Jeff Fuller said:
    And finally, do you recommend we just move ahead with the update and not bother doing further tests that involve trying to get the driver into RGB888 when using HDMI?

    That's up to you. If your usecase doesn't require RGB888 for HDMI, then you can skip testing it. 

    Jeff Fuller said:
    Another question for you, unrelated to our video problem. I believe SDK 3.3 has Qt 5.6 in it. Do you guys know if/when you'll release an SDK with Qt 5.8 or 5.9 (5.9 being the Qt version we'd prefer)? Even a rough time frame would be helpful. Thanks!

    We will migrate to Qt 5.9 in 2Q 2018. You can use the master branch of OE/Yocto and use whatever version is the latest there. The current master has Qt 5.8 and is about to switch to Qt 5.9.

    Okay, thanks. We're going to bring down the 3.3 SDK. I'll post back after we're up and running with that.

    Best,

    Jeff

  • I’ve migrated our project to TI Processor SDK v3.03.00.04.

    By placing blue-and-red-wiring = "crossed" into the device tree, our blue and red are no longer swapped. Excellent! Thanks for recommending that.

    Unfortunately, we still seem to only have 6 bits of color range, not 8.

    The ramping and resetting of R, G, and B flat fields at boundaries of 64 that I described in my earlier posts is still occurring – it seems like we're losing the two MSBs of each color.

    Here’s some information I’ve gathered that may or may not help you help me. At the end of my post, I discuss one other error I’m seeing.

    DRM_MODE as reported from driver:

    I've got a printk in drm_fb_get_bpp_depth. With bpp in the device tree set to 24, and blue-and-red-wiring = "crossed", the DRM format is reported as XRGB8888. I was expecting it to be RGB888, but I assume the X is just the unused byte of the 32-bit word each pixel is stored in?

    Modetest lists supported modes as BG16 RG24 XR24

    As a side note, I did originally start with blue-and-red-wiring set to "straight" with bpp set to 24 – that resulted in a DRM format of RG16; this also swaps our red and blue channels, and, as would be expected, modetest lists the supported modes as: RG16 BG24 XB24.
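
    For reference, here is a small sketch (it assumes libdrm's drm_fourcc.h is on the include path, e.g. via pkg-config --cflags libdrm) that relates the fourcc codes modetest prints to the DRM_FORMAT_* names seen in the driver printks, including the "X" in the 32-bit format:

        /* Sketch only: decode the fourcc strings that modetest prints into the
         * DRM_FORMAT_* names used by the driver. */
        #include <drm_fourcc.h>
        #include <stdint.h>
        #include <stdio.h>

        static void show(const char *name, uint32_t f)
        {
                printf("%-22s fourcc \"%c%c%c%c\" (0x%08x)\n", name,
                       f & 0xff, (f >> 8) & 0xff, (f >> 16) & 0xff,
                       (f >> 24) & 0xff, f);
        }

        int main(void)
        {
                show("DRM_FORMAT_BGR565",   DRM_FORMAT_BGR565);   /* "BG16" */
                show("DRM_FORMAT_RGB888",   DRM_FORMAT_RGB888);   /* "RG24", packed 24-bit */
                show("DRM_FORMAT_XRGB8888", DRM_FORMAT_XRGB8888); /* "XR24", 32-bit, X byte unused */
                show("DRM_FORMAT_RGB565",   DRM_FORMAT_RGB565);   /* "RG16" */
                return 0;
        }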

    Modetest:

    Speaking of modetest, I no longer need to run modetest before I can see video on the LVDS display, but this could be because Weston is running at boot. I haven't figured out how to disable Weston's automatic start yet (right now, I'm killing it after boot, though the loss of two bits of color happens whether I kill Weston or not). Note that I see XRGB8888 from the printk output early in boot, before Weston can run, so it's not Weston, or at least not Weston alone, that's putting the driver in XRGB instead of RGB.

    Device tree:

    The device tree bindings help files (…/Documentation/devicetree/bindings) for tilcdc have changed from what they were in SDK V2. I did not try to update my device tree (except for adding the blue-and-red-wiring field) to match these changes; the fields are the same, it’s just some of the containing nodes have changed. I don’t know if this is relevant to the 6-bit problem or not, but everything else in the tree seems to be working correctly, with the display’s timings being set up correctly, etc….

    One exception to the above. During boot, I’m getting a pinmux error that I wasn’t getting before. Not sure how to explain that as I did not modify the device tree in any way for SDK V3 (again, excepting the blue-and-red-wiring field), and the error I’m getting is not related to any of the tilcdc nodes, but instead to the lcd pins used by those nodes.

    Here is the error:

    [    1.056359] pinctrl-single 44e10800.pinmux: pin 44e108a0.0 already requested by panel; cannot claim for 4830e000.lcdc
    [    1.067074] pinctrl-single 44e10800.pinmux: pin-40 (4830e000.lcdc) status -22
    [    1.074258] pinctrl-single 44e10800.pinmux: could not request pin 40 (44e108a0.0) from group lcd_pins_default on device pinctre
    [    1.086492] tilcdc 4830e000.lcdc: Error applying setting, reverse things back

    I’m not sure exactly how to determine which pin in my lcd_pins_default pins node this error message is referring to. I can try putting some more prints into the driver to figure it out, but if you guys could give me guidance on this, I’d appreciate it.

    Thanks,

    Jeff

  • Jeff,

    The conflict seems to be over the lcd_data0 pin (refer to TRM Control Module Registers, offset 8A0h). I don't have both SDK 2 and SDK 3 versions handy to compare the entries, but it looks to me like in the new SDK the pinmux for the LCD has already been configured, so your pin control entry contradicts the default.
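
    For what it's worth, the pin number in the log follows directly from the addresses it prints: pinctrl-single numbers pins by register offset divided by 4. A small sketch of that arithmetic, using the values from the log above:

        /* Sketch of the arithmetic: pinctrl-single numbers pins by register
         * offset / 4, and the AM335x pad registers live at Control Module
         * offset 0x800 plus that same offset. Values taken from the log. */
        #include <stdio.h>

        int main(void)
        {
                unsigned long pinmux_base = 0x44e10800;   /* from the log */
                unsigned long pin_addr    = 0x44e108a0;   /* "pin 44e108a0" */
                unsigned long pin_index   = (pin_addr - pinmux_base) / 4;
                unsigned long ctrl_offset = 0x800 + (pin_addr - pinmux_base);

                /* Prints: pin-40, Control Module offset 0x8a0 (conf_lcd_data0) */
                printf("pin-%lu, Control Module offset 0x%lx\n",
                       pin_index, ctrl_offset);
                return 0;
        }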

    regards
    Michael
    I finally got back to looking at the 6-bits-of-color-range problem today. I went to put a scope on the LVDS connector pins and noticed two pins were damaged. I switched to a different board, and the color is now correct.

    So to sum up: we needed the version 3 SDK to fix the red/blue color swap, but the 6-bit color problem was due to board damage. Next time, I'll be checking pins right off the bat.

    Thanks to everyone for the help.

    Jeff

  • Thanks for the update. Glad to know that your problem is resolved now.