This thread has been locked.


DM365 encodedecode demo: Failed to detect Video Standard (edit: problem with monochrome input)

Other Parts Discussed in Thread: TVP5146

EDIT: For anyone happening upon this thread: the problem turned out to be that monochrome input was not properly recognized (the initial subject is rather vague).  A fix was posted below that involves a small change to one of the kernel drivers (commenting out two occurrences of a status bit in the tvp514x driver).

I recently set up a DM365 evaluation board with DVSDK 3.10, and I've been trying to get some of the demo applications working.  Unfortunately, I've been running into problems with the encodedecode demo.

 

I run ./loadmodules.sh, then ./encodedecode  (I've also tried ./loadmodules_hd.sh and had the same issue)

and the screen remains black and I get the following output:

root@dm365-evm:/opt# ./encodedecode
Encodedecode demo started.
davinci_v4l2 davinci_v4l2.1: Before finishing with S_FMT:
layer.pix_fmt.bytesperline = 736,
 layer.pix_fmt.width = 720,
 layer.pix_fmt.height = 480,
 layer.pix_fmt.sizeimage =529920
davinci_v4l2 davinci_v4l2.1: pixfmt->width = 720,
 layer->layer_info.config.line_length= 736
EVM: switch to tvp5146 SD video input
tvp514x 1-005d: tvp5146 (Version - 0x03) found at 0xba (DaVinci I2C adapter)
Error: Failed to detect video standard, video input connected?
Error: Failed to create video decoder: mpeg4dec

 

Yes, the video input is connected - I have a composite camera and a viewing device connected to the Composite In and Video Out connections, respectively.

I've also tried running ./interface after the loadmodules step, but all that appears to do is start execution and then hang.

Does anybody know what the issue is here?

Further information:
I'm connecting to a file system on my computer over NFS, and I'm using the Arago file system.  I'm using the following bootargs:
bootargs=console=ttyS0,115200n8 noinitrd rw ip=192.168.1.5 root=/dev/nfs nfsroot=192.168.1.12:/dvsdk/filesys2,nolock mem=60M video=davincifb:vid0=OFF:vid1=OFF:osd0=720x576x16,4050K dm365_imp.oper_mode=0 vpfe_capture.interface=0

  • Hi Joel,

    I have seen this error before and have fixed it. It happens because Capture_detectVideoStd() in the dmai_2_10_00_12/packages/ti/sdo/dmai/linux/Capture.c file returns with an error, and many things can lead to that. The best way to fix it is to have a look at this function yourself. To enable all the debug statements you see in the code, run the demo with CE_DEBUG=2 prefixed on the command line.
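    A one-off run with tracing enabled might look like the following (a minimal sketch; the demo path and the DMAI_DEBUG variable used later in this thread are assumptions about your setup):

```shell
# Hypothetical location of the demo binary on the target; adjust as needed.
DEMO=./encodedecode

# Prefixing the command with variable assignments enables Codec Engine /
# DMAI tracing for this one invocation only; normal runs stay quiet.
# Guarded so the snippet is safe to paste on a host without the binary.
if [ -x "$DEMO" ]; then
    CE_DEBUG=2 DMAI_DEBUG=2 "$DEMO"
else
    echo "demo not found: $DEMO (run this on the EVM)"
fi
```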

    You probably already know not to trust the release beyond a certain extent.

    Good luck.

    Cheers,

    N

  • Thanks for the idea where to look.  It turned out that it was specifically the one video input we were using that didn't work; using a different video source (a Nintendo Wii) worked just fine.  So we worked on other things, and are now coming back to try to figure out why our original video source didn't work (as we need to use this video source).  So here are some clarified details about the setup now that I've come back to this and tried to work on it again:

    Input: Watec WAT-902H Ultimate monochrome camera.  It outputs a composite NTSC (or PAL; we aren't sure but one test described later makes me think NTSC) signal to a 75-ohm BNC cable.  We then have a BNC female to RCA male adapter so we can plug it into the DM365 EVM (Rev F).

    What makes me think the camera is NTSC and not PAL is that if we start a demo (like 'encodedecode') with the Wii connected (which is NTSC), leave the demo running, and then swap the Wii connection for the camera connection, we can see the camera's output from the demo just fine.  But when we try to start the demo with the camera initially plugged in, the demo fails to start.

    By running the demos with DMAI_DEBUG=2, I've figured out the specific line that is failing when the camera is plugged in at the start.  In Capture.c (in dvsdk_3_10_00_19/dmai_2_10_00_12/packages/ti/sdo/dmai/linux) in the function Capture_detectVideoStd(), there's the following segment:

    if (ioctl(fd, VIDIOC_S_INPUT, &input) == -1) {
        Dmai_err2("Failed to set video input to %d (%s)\n", input, strerror(errno));
        return Dmai_EFAIL;
    }

    From my searching and trying to figure this out, this call asks the V4L2 driver to select video input 0 (the composite input), and it gets into an area I know almost nothing about.  I'm befuddled, though, because I modified the code to print 'input' before that call, and in both cases (the Wii connected or the camera connected) input is 0.  But with the camera connected, ioctl() fails and gives me the error message "Failed to set video input to 0 (Invalid argument)."
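    To poke at this outside the demo, a standalone probe can reproduce just the failing call. This is a sketch under assumptions: the device node /dev/video0 and input index 0 come from the log above, and VIDIOC_S_INPUT is redefined so it builds without kernel headers.

```c
#include <errno.h>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>

/* V4L2's VIDIOC_S_INPUT request number, redefined here so this probe
 * builds without kernel headers installed ('V', 39 is the value
 * linux/videodev2.h uses). */
#ifndef VIDIOC_S_INPUT
#define VIDIOC_S_INPUT _IOWR('V', 39, int)
#endif

/* Try to select capture input 'input' on device 'dev'.
 * Returns 0 on success, otherwise the errno from open()/ioctl(). */
int try_set_input(const char *dev, int input)
{
    int fd = open(dev, O_RDWR);
    if (fd < 0)
        return errno;

    int err = 0;
    if (ioctl(fd, VIDIOC_S_INPUT, &input) < 0)
        err = errno;    /* EINVAL here would match the demo's failure */

    close(fd);
    return err;
}
```

    On the EVM, try_set_input("/dev/video0", 0) returning EINVAL with the camera connected (and 0 with the Wii) would confirm the failure happens inside the driver's S_INPUT path rather than in DMAI itself.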

    Here's what printed out when I ran the encode demo with the camera initially connected and DMAI_DEBUG=2 (a few lines are my own changes printing out info):

    Encode demo started.
    @0x000017d3:[T:0x4001fce0] ti.sdo.dmai - [Dmai] Dmai log level set to '2'. Note that calling CERuntime_init after this point may cause unexpected change to DMAI tracing behavior.
    @0x0004a4af:[T:0x4001fce0] ti.sdo.dmai - [Display] Found width=720 height=480, yres_virtual=480,xres_virtual=720, line_length=384
    @0x0004a8b9:[T:0x4001fce0] ti.sdo.dmai - [Display] Setting width=720 height=480, yres_virtual=480, xres_virtual=720
    @0x0004ac32:[T:0x4001fce0] ti.sdo.dmai - [Display] New width=720, height=480, yres_virtual=480,xres_virtual=720, line_length=384
    @0x0004ae8c:[T:0x4001fce0] ti.sdo.dmai - [BufTab] Allocating BufTab for 1 buffers
    @0x0004b0ec:[T:0x4001fce0] ti.sdo.dmai - [Buffer] Set user pointer 0x403d8000 (physical 0x82300000)
    @0x0004b673:[T:0x4001fce0] ti.sdo.dmai - [Display] Display buffer 0 mapped to 0x403d8000 has physical address 0x201f0
    @0x00052fb3:[T:0x40c04490] ti.sdo.dmai - [BufTab] Allocating BufTab for 3 buffers
    @0x000531f5:[T:0x40c04490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 529920 at 0x40c05000 (0x87f06000 phys)
    @0x000534e9:[T:0x40c04490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 529920 at 0x40ca7000 (0x86b2d000 phys)
    @0x00053732:[T:0x40c04490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 529920 at 0x40d49000 (0x86c71000 phys)
    @0x00053c72:[T:0x40c04490] ti.sdo.dmai - [Resize] Successfully set mode to continuous in resizer
    davinci_resizer davinci_resizer.2: RSZ_G_CONFIG:0:1:124
    @0x00053f01:[T:0x40c04490] ti.sdo.dmai - [Resize] Resizer initialized
    davinci_previewer davinci_previewer.2: ipipe_set_preview_config
    @0x000540a8:[T:0x40c04490] ti.sdo.dmai - [Resize] Operating mode changed successfully to continuous in previewer@0x00054199:[T:0x40c04490] ti.sdo.dmai - [Resize] Previewer initialized
    vpfe-capture vpfe-capture: IPIPE Chained
    vpfe-capture vpfe-capture: Resizer present
    @0x000545cd:[T:0x40c04490] ti.sdo.dmai - [Capture] Composite input selected
    Current input Composite supports:        // <-- I added some code to list the video standards supported by the "current input"
    NTSC
    NTSC-M
    NTSC-M-JP
    NTSC-M-KR
    PAL
    PAL-BG
    PAL-H
    PAL-I
    PAL-DK
    Capture.c.626: input = 0    // <-- I printed the value of 'input' right before the ioctl() call that fails
    EVM: switch to tvp@0x000567f2:[T:0x415ea490] ti.sdo.dmai - [Venc1] Creating encoder h264enc for max 720x480 bitrate 0 ratectrl 4
    @0x0006b05d:[T:0x415ea490] ti.sdo.dmai - [Venc1] Setting dynParams size 720x480 bitrate 0
    @0x0006b339:[T:0x415ea490] ti.sdo.dmai - [Venc1] Made XDM_SETPARAMS control call
    @0x0006b52f:[T:0x415ea490] ti.sdo.dmai - [BufTab] Allocating BufTab for 2 buffers
    @0x0006b7c2:[T:0x415ea490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x419d7000 (0x86dba000 phys)
    @0x0006ba2c:[T:0x415ea490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x41a56000 (0x86d13000 phys)
    @0x0007a17d:[T:0x422f7490] ti.sdo.dmai - [BufTab] Allocating BufTab for 9 buffers
    @0x0007a409:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x422f8000 (0x86bcf000 phys)
    @0x0007a70e:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x4239a000 (0x873c9000 phys)
    @0x0007a96e:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x42457000 (0x8730c000 phys)
    @0x0007abd2:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x42514000 (0x8724f000 phys)
    @0x0007ae5f:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x425d1000 (0x87192000 phys)
    @0x0007b0e0:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x4268e000 (0x870d5000 phys)
    @0x0007b36a:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x4274b000 (0x87018000 phys)
    @0x0007b66d:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x42808000 (0x86f5b000 phys)
    @0x0007b8da:[T:0x422f7490] ti.sdo.dmai - [Buffer] Alloc Buffer of size 518400 at 0x428c5000 (0x86e9e000 phys)
    @0x001bf660:[T:0x40c04490] ti.sdo.dmai - [Capture] Failed to set video input to 0 (Invalid argument)
    Error: Failed to create capture device

    The output when the Wii is connected at start is identical, other than the timestamps and the failure.

     

    Anyone have ideas on why this is failing to run when the camera is initially connected?  Is there some way I can modify the code so that it sets everything up the same way as if the Wii had been connected even if the camera is connected from the very beginning (since switching from Wii to camera during execution worked)?

    One other thought - it's possible that the BNC to RCA adapter we got is meant for a 50-ohm impedance cable and not 75-ohm impedance (we bought it off Amazon and it didn't say).  Could this possible mismatch somehow affect whatever the ioctl call is trying to do?

    Thanks for any help/suggestions, and happy new year to everyone!

  • Still haven't figured this one out, though I tried a few more things and thought I'd throw up a few more points of data I've gathered:

    There's an earlier ioctl call in the code that uses VIDIOC_ENUMINPUT.  I printed out the struct it filled in, and the same information appeared whether the Wii or the camera was connected.
    From the v4l2_input struct used with the ENUMINPUT call:
    index = 0
    name = "Composite"
    type = 2 (for camera)
    std = 45311 = 0xB0FF  [supports PAL (B, B1, G, H, I, D, D1, K) and NTSC (M, M-JP, M-KR)]
    status = 0
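    For reference, that std bitmask can be checked against the V4L2 standard bits directly. A minimal sketch (the bit values below are the ones linux/videodev2.h assigns to these standards, copied in so the snippet stands alone):

```c
/* V4L2 analog-standard bits for the standards the driver listed
 * (values as in linux/videodev2.h, repeated here so this compiles
 * without kernel headers). */
#define V4L2_STD_PAL_B     0x0001ULL
#define V4L2_STD_PAL_B1    0x0002ULL
#define V4L2_STD_PAL_G     0x0004ULL
#define V4L2_STD_PAL_H     0x0008ULL
#define V4L2_STD_PAL_I     0x0010ULL
#define V4L2_STD_PAL_D     0x0020ULL
#define V4L2_STD_PAL_D1    0x0040ULL
#define V4L2_STD_PAL_K     0x0080ULL
#define V4L2_STD_NTSC_M    0x1000ULL
#define V4L2_STD_NTSC_M_JP 0x2000ULL
#define V4L2_STD_NTSC_M_KR 0x8000ULL

/* Does a reported v4l2_input.std mask include any NTSC variant? */
int supports_ntsc(unsigned long long std)
{
    return (std & (V4L2_STD_NTSC_M | V4L2_STD_NTSC_M_JP |
                   V4L2_STD_NTSC_M_KR)) != 0;
}
```

    0xB0FF is exactly the eight PAL bits (B through K) plus NTSC M/M-JP/M-KR, which matches the bracketed decoding above, so the input enumeration itself looks healthy with either source.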

    Also, another thought... could the fact that the camera is monochrome somehow be part of the problem?

  • Joel Carlson said:
    Also, another thought... could the fact that the camera is monochrome somehow be part of the problem?

    This may be the problem.  The V4L2 capture driver is only tested with a few color input streams, so it is quite possible that the monochrome signal being fed to the TVP5146 video decoder is coming up as an unrecognized format.

    The fact that you can get the video through by connecting up the Wii at the start and then swapping in the camera stream is good news, since it means the TVP5146 is able to successfully capture the video from the monochrome camera.  The problem is in software, not in hardware.

    What I suspect is happening is that when you start the demo, it queries the V4L2 driver about the video format being captured, and if the driver does not return a proper value it kicks out an error.  At a low level, this means the V4L2 driver is sending an I2C query to the TVP5146 to read a status register; if that status register happens to be different because of the monochrome camera, the unexpected value ends up being passed up through the software stack to your application and causes the program to exit (even though the video capture is legitimately working).

    There are a couple of ways to try to fix this.  One would be to go into the V4L2 driver itself and modify it so that when it queries the TVP5146 and finds the monochrome camera's value, it returns something that DMAI and your application can deal with and continue on.  The other would be to modify DMAI to ignore the monochrome camera value and continue, since it seems to work if you start with the Wii anyway.  To start you could have it always pass and try to run, though for the long term you would probably want to accept only the new monochrome status rather than any unknown value, so that the program will still know when there is no video input connected, or when the format is one it cannot properly deal with.
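    The second (DMAI-side) approach has a simple shape; a minimal sketch with illustrative stand-in names, not DMAI's actual symbols:

```c
typedef unsigned long long StdMask;   /* stand-in for a v4l2_std_id-style mask */

enum DetectResult { DETECT_OK, DETECT_FAILED };

/* Hedged sketch: when standard autodetection fails, fall back to a
 * caller-supplied default (e.g. NTSC for this monochrome camera) instead
 * of aborting the capture setup. */
StdMask detect_or_default(enum DetectResult r, StdMask detected,
                          StdMask fallback)
{
    return (r == DETECT_OK) ? detected : fallback;
}
```

    The long-term version would match only the specific status the monochrome camera produces, so genuinely disconnected inputs still fail loudly.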

  • For a monochrome camera, change this section of tvp514x.c:

        lock_mask = /*STATUS_CLR_SUBCAR_LOCK_BIT |*/
                    STATUS_HORZ_SYNC_LOCK_BIT |
                    STATUS_VIRT_SYNC_LOCK_BIT;
        break;
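    To see why dropping that bit helps: the driver declares the signal locked only when every bit in lock_mask is set in the TVP5146 status register, and a monochrome signal locks horizontal and vertical sync but never the colour subcarrier. A minimal illustration (the bit positions below are stand-ins chosen for illustration, not the driver's actual register map):

```c
/* Illustrative bit positions only; the real values live in the
 * tvp514x driver's register definitions. */
#define STATUS_HORZ_SYNC_LOCK_BIT  (1 << 1)
#define STATUS_VIRT_SYNC_LOCK_BIT  (1 << 2)
#define STATUS_CLR_SUBCAR_LOCK_BIT (1 << 3)

/* The driver's lock test: every bit demanded by lock_mask must be set
 * in the status register read from the chip. */
int signal_locked(unsigned char status, unsigned char lock_mask)
{
    return (status & lock_mask) == lock_mask;
}
```

    With the subcarrier bit left in the mask, a monochrome source that has H/V sync locked still reads as "no signal", which is consistent with the standard-detection failure at the start of this thread.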

     

    Thank you very much, Mikhail.  That did the trick.  We had put this issue on hold and had been working on other things; it's a relief not to have to work this one out ourselves.

    And for reference for anyone else who needs this: the necessary file is %KERNEL_PATH%/drivers/media/video/tvp514x.c.  There are also two different occurrences of STATUS_CLR_SUBCAR_LOCK_BIT that need to be commented out (the one we commented out first didn't fix it on its own; we then found and commented out the second occurrence as well).