
DM365 display problem with offset

Other Parts Discussed in Thread: THS8200

hi, everyone:

      I'm using an Appro DM365 IPNC to display the demo source "davincieffect.mpeg4" at 1280x720, and I get a stream with an offset, as shown in the attached image:

 

     You can see the colored lines at the top and left of the picture, and the top and left portions are cut off when I display the picture on a TV over YPbPr (YUV420PSEMI, V4L2). I have traced the source code and haven't found any offset applied to the display parameters, so I am confused. Has anyone met this before, and how could it happen? Note that we use dvsdk_2_10_01_18; any suggestion will be appreciated.

  • This seems like an offset problem, overscanning by the display (it is common for displays to overscan and leave out a few pixels around the border), or a combination of both.

  • Juan:

           As you say, but how did it happen? I mean, I have traced the code and haven't found any offset being set. Could you give any possible reason this happened?

  • Juan:

           I have learned something about overscan, and I don't think this is an overscan problem, because the image is cut off only at the top and left, not all around. If this were an offset problem there should be some black area at the top and left, but instead I get a stretched image filling the whole screen. I have tested this on 3 TVs and get the same cutoff on all of them. Any suggestions?

  • I am not familiar with the IPNC software stack, as this is supported by the third party that sells this solution... have you tried contacting them?

    From my end, I am afraid that all I may be able to do is offer advice at the very low level of register offsets. If you consider things at the lowest level, you basically have two parts (the source, being the DM365, and the sink, whatever part is inside the display) which need to speak the same language. This means that both need to support the same resolution (including blanking sizes) and refresh rates in order for things to work as expected.

    From a DM365 IPNC perspective, I am guessing you have a fixed hardware clock that can produce a small set of pixel clocks through PLLs; furthermore, it is quite possible the IPNC software uses a single pixel clock configuration. The pixel clock is key because once you define your resolution and blanking sizes, the pixel clock determines what refresh rate you can support; the more pixel clock choices you have, the more refresh rates you can support. On the display side, you likely have a more sophisticated pixel clock hierarchy, since displays normally have to adapt to the video source being provided; however, some display manufacturers will specify the required pixel clock and the refresh rate ranges supported for each resolution. If the source (DM365) defines a longer horizontal or vertical blanking period than the display expects or can handle (no longer talking the same language), this could also cause the valid video portion to appear to have an offset.

  • Hello,

    Do you mean you are decoding the file on the IPNC and trying to display it? The IPNC has only a D1 (standard definition) output, and hence we just crop the central part of the 720P frame and send it for display. We do not resize 720P to D1 before display.

    This is how the IPNC was specified and designed, but it can surely be changed to resize 720P to D1 and then display it.

     

    Regards,

    Anshuman

  • hi,

          As you said, have you used the IPNC before? How do you know about this limitation, and what do you think causes it? They all use the DM365 to do everything, from decode to display, and I can't find any difference in the VPBE display configuration.

                                                                                                                                                                                Eric

  • Juan:

           Thank you for your reply. I do think there may be some problem with the pixel clock, but I don't understand how it works. We use a fixed PLL to supply a 74.25 MHz clock, and as far as I can see, the DM365 EVM board supplies the same clock. What confuses me is how it works well enough to support many different resolutions, like 720P and 1080I; I haven't seen any different EXTCLK configuration between these two resolutions, so how does it adapt? Also, I want to know how to specify "left_margin" and similar fields in the V4L2 structure. I have set xpos, ypos = 0, but still get the picture above.

  • Eric,

    There are industry standards (e.g. SMPTE-xxxx) that dictate the pixel clock and vertical/horizontal blanking requirements for various video standards such as 720p and 1080i. Fortunately, the people that put these industry standards together were pretty smart, and they adjusted the blanking area of the video to ensure 720p and 1080i can operate with the same pixel clock. I touch a little on this in the following App Note: http://focus.ti.com/general/docs/litabsmultiplefilelist.tsp?literatureNumber=spraan0

    In addition, we do have hardware features (e.g. PLLs and PCLK registers) that can give the user some flexibility in varying the VPBE pixel clock...

    Since you mention the pixel clock is at 74.25 MHz (used for 720p or 1080i), what Anshuman suggests makes sense. It appears he has some experience with the IPNC as well, so he may be able to provide good insight. Personally, I have not even seen a DM365-based IPNC, though I know they are out there :). Since I have little insight into the IPNC hardware or software, I can help you at the hardware level and tell you which registers affect what... but I could not tell you what type of support is present in their V4L2 driver (my understanding is they modified ours or re-wrote it to add proper sensor support).

  • I, and probably others who are new to DaVinci and do not have a background with the DM6x platforms, would like to know if documents like spraan0 and spraah0a and others are or will be made available specifically for the DM365. There are some learning-curve inclines that are pretty steep, and I believe these documents could make it a bit easier. Any pointers to documents would be appreciated. I am currently struggling to do OSD work, specifically in the encodedecode demo as well as the video_loopback_blend apps.

     

  • Hi Hijn,

    Over the last year or so, we have moved from creating App Notes such as spraan0 and spraah0a to creating wiki articles available at http://wiki.davincidsp.com/index.php?title=Main_Page (click on your product in the Embedded Processor section, and you will be taken to wiki articles specific to your product).

    The reason for this is that our App Notes require a lengthier approval process, and changes in our software architecture can often render App Notes obsolete; wiki articles can be edited in a few minutes, allowing us to get our customers the information quicker. For example, I wrote spraan0 when we did not have digital output support in our DM6446 drivers, as a result of customer demand for that feature. Eventually, our software team which focuses on drivers implemented this feature in our DVSDK drivers, and the App Note, though still a good reference point for some video basics, is obsolete from the source code standpoint. The same could be said for spraah0a; the architecture has changed so much that it would need a rewrite to apply to today's DVSDK. Our wiki approach allows us to quickly update articles so we can reduce the amount of outdated info.

    In practice, just as we did with App Notes, our wiki articles are created based on customer demand... when we see a common question/request come up several times, we take action to address it in the form of a wiki article. That said, we really appreciate your feedback and will consider wiki articles around the topics covered by the App Notes you mentioned above; you may want to keep the wiki link above handy.

     

  • Juan,

    Thanks for the reply. Maybe this is a complaint born of my ignorance, especially with all things Linux, and from not working with DSPs for a long time (only 8-bit micros for the last 10 years), but the whole development environment with its millions of files (probably the Linux thing I refer to) is a bit overwhelming; it feels like being thrown into the deep end with my head pushed under as well. For instance, when I set up my environment I used the wiki "Linux host configuration - Ubuntu" because I am using Ubuntu (which was/is also a bit new to me), but I have been doing a lot of reading for the last 2 months with a constant gut feeling that something is missing, especially for the app I am busy with, which does data insertion on the video signal; I had thought that using the OSD would make it a breeze.

    I saw last week that the GSG DM365 DVEVM Software Setup wiki actually shows a bit of the installation of the PSP, which you and others referred to many times but which I never had, leaving me wondering and searching for where and what it is and does. Doing the installation I found the link to some other examples on OSD, and only then stumbled across /home/gvi/workdir/lsp/ti-davinci/linux-2.6.18_pro500/drivers/media/video/davinci/davinci_osd.c, which might have been the gut-feel item that was always missing: the "encodedecode" demo, which uses the OSD in a specific manner, and blend.c (from the DMAI APIs), which is only for the DM6467, kept me searching for an OSD API that might still not be available in the DMAI for the DM365. My suggestion is some level of documentation/matrix that shows what is available, what it is supported by, and where to find it, so top-down operators like me and newbies can find things a bit more easily.

    Along with that suggestion, I also want to ask for some guidelines on doing the video data insertion at a somewhat higher level (i.e. being able to use mostly APIs without having to go too low level) - maybe something like SPRAAD7, which you did for the 6443, but for the DM365. At this stage I am working on evaluation of the DVEVM365, but also on demonstrating the insertion of bit-encoded data on the left edge of the visible screen, probably 11 or 12 pixels wide for now, to conform to the way it is done now with PIC microcontrollers. I would like to know which APIs I can use to modify the encodedecode demo so that, instead of updating the output buffer with the output of the encoder, I can just "blend" my edge data into the buffer as a bitmap OSD and display it.

    Sorry for the long story but hope it adds some value also.

  • Hi Hijn,

    I certainly understand where you are coming from, and we are working hard to improve our offering based on feedback from our customers. Part of the challenge is that we do not control all the software needed to work on these platforms (e.g. open source kernels, file systems, graphics libraries, third-party MontaVista kernels, third-party development tools); many of these external software components have industry-defined APIs (the V4L2 video capture/display Linux standard, the Frame Buffer driver Linux standard for OSDs, the GStreamer multimedia framework, X-Windows APIs, QT graphics APIs...). I am not trying to justify our offering, but perhaps to confirm that much of the software is neither created nor documented by TI, and if the software seems fragmented it is because, to a large extent, it is. In my humble opinion this is the reality of working with Linux (open, flexible, but complex), and for some that have been doing this for a while, it is second nature. We are diligently working on improving our collateral to make it more user friendly; in the meantime, we are doing our best to educate customers on industry-standardized APIs and at least point them in the right direction when we do not have the answers... Your feedback is valuable to us, so please keep it coming.

    With regards to your work at hand, DM6467 and DM365 have very different OSD hardware... actually, I believe DM6467 is not even advertised as having a hardware OSD, because it has very limited capabilities in this area. Therefore, trying to use a graphics-based DM6467 application on DM365 is probably not going to go well. If you are trying to overwrite pixel data around the left edge of the video buffer, you can either overwrite this data using DMA (or the ARM CPU), or have this data present on a separate OSD buffer and set blending to 0%, which means your OSD buffer will replace your video buffer for display purposes. If you define your OSD window to be of smaller size (only covering the left edge of the video), then only those pixels will get overwritten. The driver you want to become familiar with is the Frame Buffer driver; this is a standardized Linux driver with standard APIs, hence we do not document usage of these APIs, though I am sure you can find documentation online (search for 'Linux Frame Buffer driver' or 'FBdev').

  • Hi Juan,

    Thanks for the information. I was talking to one of the more experienced Linux guys yesterday, and he confirmed what you said about the Linux environment - or vice versa. I think you understand my confusion with the whole open setup, and I can only wish you well with a matrix of some sort, as I can see how many things there are out there to keep track of. Maybe we should have a separate thread for newbies, as we are going a bit off course in this one?

    After "discovering" the PSP I have been diving into FBDev since yesterday, and I am glad you confirmed it is the way to go to use the OSD on the dm365. These sorts of pointers in the right direction are what I need just to keep me on the right track, i.e. what documentation to look at, what drivers to use, and what demos/examples might help, after I explain in a sentence or two what I am trying to achieve (the edge insertion). My plan was to use the OSD as you proposed, initially for evaluation and demonstration purposes, but later to insert my data directly into the relevant display buffer using DMA, as you mentioned, and to use the OSD functionality for text annotations and other user characters that I need to display, e.g. a cross-hair for the operator to aim his camera at a calibration board. I will probably have a question when the time comes to blend my OSD data and will start a new thread on it. Thanks again for the help!

  • Hi, Juan:

             Regarding the problem I mentioned above, I have some bad news. I tested this on the DM365 EVM board and got the same problem. I should add that my co-worker erased the flash completely, we re-burned it using the source code, and we also ran from NFS without burning a flash image, yet got the same problem. I have also contacted your agent SEED INTERNATIONAL LTD., where we got the board, and used their board and their environment, but got the same problem. So, haven't you seen this before?

        Maybe people haven't noticed this, because it is only about 40 pixels on the left and 20 on the top when you just look at the image; but if you display a stream with the OSD, the problem shows very clearly, because half of the font is missing on the left. So, Juan, please test this and tell me what you find; I really need your help and suggestions!

  • Hi, Juan:

             I'm still working on this problem and haven't got any ideas. Has anyone met this before? Any suggestion will be appreciated!

  • I am using the fbdev_loopback example code and importing a bitmap into OSD0 as per SPRAAD7. The bitmap is 720x576 pixels in a chequered format. It is definitely cut off on the left by at least 15 pixels, and the top line is only displayed on video line 31 of my waveform monitor. I have just switched off to go home and the numbers could be off a bit, but I will give exact numbers tomorrow when I switch on again. My application will use bit-encoded data in the video on the left edge of the screen, so I am investigating whether there is a problem; I can always put my "bargraph" at an offset, but I would like it right on the edge of the buffer, displaying on the edge of the screen. Till tomorrow with some accurate numbers on the offsets I am seeing.

  • Eric, I might be on a different track and not seeing an offset, but just the "visible" portion of my test bitmap, which is 720x576 pixels in OSD0. It is displayed with its first line on Field 1 line 26 or Field 2 line 338 on my waveform monitor. I am constantly upgrading my test map and probably don't have the perfect test map yet. On the left, pixel 16 or 17 is the first one visible, and on the right about 21 pixels are lost. On the bottom, 7 pixel lines are not shown. I am using a Sony monitor that can underscan, so I can see everything that comes out of the dm365.

    Hope this is on the correct track; I am learning lots about video and am probably still missing stuff that I will come across in the datasheets soon.

  • Hi, Hijn:

               Thank you for your reply. I should say we are doing different things but may have the same problem - "pixel loss". When I run the decode demo on the EVM-365, I get a picture that has lost its left 40 pixels and top 20 pixels. I have tried to change its start point and found the lost area hasn't gone; it is just not in the "visible portion". Maybe you could run the decode demo and see if there is a problem.

  • Hi, everyone:

            I have contacted the TI FAE in Shenzhen, China, and they have confirmed the problem but haven't given a solution. I wonder why so many people here haven't seen this problem; also, nobody has discussed it with me for a long time. Around this problem, I want to know: when the stream is encoded by the internal HW, how is the timing sync provided for the TV to decode and display? As I can see from the source code, they just set BASEX and BASEY for left_margin and right_margin, but per the standard, 720P should be 1650x750, which means the blanking range should be 370x30, not just left_margin x right_margin. Also, there are registers ETMG2 and ETMG3 for setting the timing sync, but how could I set a blanking offset like 260 in a 4-bit field? 4 bits can only express up to 15 in decimal. Waiting for your help!

     {
      .name = VID_ENC_STD_720P_60,
      .std = 1,
      .if_type = VID_ENC_IF_INT,
      .interlaced = 0,
      .xres = 1280,
      .yres = 720,
      .fps = {60, 1},
      .left_margin = 300,
      .right_margin = 70,
      .upper_margin = 26,
      .lower_margin = 3,
      .hsync_len = 80,
      .vsync_len = 5,
      .flags = 0},

  • Eric,

    Please note that you should be looking at the HSTART and VSTART registers instead of ETMG2 and ETMG3. If you refer to the App Note link I sent earlier (http://focus.ti.com/general/docs/litabsmultiplefilelist.tsp?literatureNumber=spraan0), you should see:

    dispc_reg_out(VENC_HSPLS, BASEX720P);
    dispc_reg_out(VENC_VSPLS, BASEY720P);
    dispc_reg_out(VENC_HINT, 1649);
    dispc_reg_out(VENC_HSTART, 300);
    dispc_reg_out(VENC_HVALID, DISP_XRES720P);
    dispc_reg_out(VENC_VINT, 749);
    dispc_reg_out(VENC_VSTART, 26);
    dispc_reg_out(VENC_VVALID, DISP_YRES720P);
    dispc_reg_out(VENC_HSDLY, 0);
    dispc_reg_out(VENC_VSDLY, 0);
    dispc_reg_out(VENC_YCCCTL, 0);
    dispc_reg_out(VENC_VSTARTA, 0);

    Please note that the app note and these register settings apply to DM6446, but DM365 should be very similar. An HINT value of 1649 (0-based) tells the VENC the line length is 1650 (0-1649), and HSTART tells the VENC the valid video area starts 300 clocks after HSYNC. HVALID defines the length of the valid video area (1280 for 720p). Therefore, the VENC is smart enough to calculate the back porch (1650 - 300 - 1280 = 70). You can play with HSTART to change the value of the front porch (blanking area before valid video) and back porch (blanking area after valid video), and you should see your valid video move to the left and to the right as you change HSTART. The only other thing to keep in mind is HSPLS (horizontal pulse width), as this may also affect the way the hardware works (rising vs. falling edge trigger, valid data not allowed during the sync pulse...).

    Our DM365 video ports are very flexible, and the user can program a video timing frame (see Figure 39 in the VPBE User Guide) to meet their needs. If you are seeing your video cut off at the edges and you are sure this is not due to overscanning performed by the display you are using, then you certainly have the flexibility to adjust your video timing to correct the problem. As I suggested earlier, I am not familiar with the IPNC software or what register settings they used, but the DM365 demos included with our DVSDK display 720p video fine on all displays I have used thus far (disregarding overscanning performed by some displays).

     

     

     

  • Juan:

          I have to say you made a mistake: what we are discussing here is the DM365 internal HW output, not a connection to the THS8200. With the internal output, the sync is embedded in the luma channel, not transmitted from the HSYNC/VSYNC pins, and what you suggest above applies to the latter case. As I understand from the VPBE datasheet, we should configure this timing sync in ETMG2/ETMG3, but how can I fit a big number into a 4-bit field?

  • Maybe I missed something; I did not see 'embedded sync' mentioned in this thread, nor did I realize you were working with analog video output using the built-in DACs. This certainly changes things.