
VIDIOC_DQBUF Hangs

Other Parts Discussed in Thread: TVP7002

Hello,

I am working with a DM365 running MontaVista 5 and DVSDK 2.10.1.18.  I am trying to add support for QVGA from our camera.  So far, I have added enough to the drivers that the camera is successfully configured to 320x240 (as far as I can tell).  However, when I try to run our program, it hangs at the VIDIOC_DQBUF ioctl.  It is in the videobuf_waiton function; it gets as far as calling schedule().  The program runs fine when configured for VGA (640x480).  Do you have any thoughts on what could be missing or failing to complete?

 

Thanks,

Nate M

  • Hi Nate,

    I assume you have correctly modified the sensor driver to capture 320x240. A few things you can check:

    1. Polarity of the HD and VD signals. The fact that your 640x480 resolution works suggests the polarity is already correct.

    2. Try inverting the polarity of PCLK.

    3. Configuration of the VDINT0 and VDINT1 registers. The register value should be less than 240. Processing of the buffer queue happens on this interrupt; if the register is programmed with more lines than your input device actually delivers, the interrupt never fires and DQBUF hangs.
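
    The rule in point 3 amounts to a one-line check. This is a hypothetical helper for illustration only, not part of the driver:

    ```c
    #include <stdio.h>

    /* Hypothetical sanity check, not a driver API: the line number
     * programmed into VDINT0 must fall inside the active frame,
     * otherwise the interrupt never fires and VIDIOC_DQBUF blocks
     * forever waiting for a buffer to complete. */
    static int vdint_will_fire(unsigned int frame_lines, unsigned int vdint0)
    {
        return vdint0 < frame_lines;
    }

    int main(void)
    {
        /* 320x240 capture with VDINT0 programmed below 240 is fine: */
        printf("%d\n", vdint_will_fire(240, 239));
        /* VDINT0 left at a VGA-era value never matches a 240-line frame: */
        printf("%d\n", vdint_will_fire(240, 479));
        return 0;
    }
    ```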

    BTW, I want to understand a few things.

    1. What are the changes in the driver that you did for getting 320x240 output?

    2. Is your input device (sensor or analog input) configured to give out 320x240 only?

    3. If your input device gives say 640x480 and you want 320x240 from it, then either you can crop or you can resize the input to get your specified resolution. Is this not your usecase?

    4. I would like to understand your data flow and how you have configured the capture driver (like single shot mode or continuous mode etc).

    Regards,

    Anshuman

  • Hi Anshuman,

    Let me start by answering your questions.

    1. What are the changes in the driver that you did for getting 320x240 output?

    The changes I have made so far were to add a 320x240 mode in the mt9v034.c driver.  I added entries in mt9v034_format_parameters and mt9v034_standards.  I've also added an entry in ch0_params in ccdc_common.c.  (Corresponding modifications were also made to the header files so everything compiles.)

    2. Is your input device (sensor or analog input) configured to give out 320x240 only?

    The camera is set up using the parameters I added in the driver; I call the VPFE_S_CCDC_PARAMS and VIDIOC_S_STD ioctls, where the standards are set.

    3. If your input device gives say 640x480 and you want 320x240 from it, then either you can crop or you can resize the input to get your specified resolution. Is this not your usecase?

    This is what we were doing initially.  However, we wanted to achieve a higher frame rate.  The camera is limited to 60 FPS at VGA resolution, so if we cut down the number of pixels being read out, we can increase our frame rate.
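
    The expected gain can be roughed out like this (a back-of-the-envelope sketch that ignores blanking intervals and sensor-specific limits, so treat the result as an upper bound):

    ```c
    #include <stdio.h>

    /* Rough estimate only: readout time scales with the number of rows
     * read out, so the maximum frame rate scales inversely with the
     * vertical resolution.  Blanking and sensor limits are ignored. */
    static double est_max_fps(double base_fps, int base_rows, int new_rows)
    {
        return base_fps * (double)base_rows / (double)new_rows;
    }

    int main(void)
    {
        /* 60 FPS at VGA (480 rows) suggests roughly this at QVGA (240 rows): */
        printf("%.0f FPS\n", est_max_fps(60.0, 480, 240));
        return 0;
    }
    ```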

    4. I would like to understand your data flow and how you have configured the capture driver (like single shot mode or continuous mode etc).

    The camera is in continuous mode.  Currently the previewer is chained with the resizer to convert the RAW Bayer data to YUV.


    Let me know if there is any other info you think would help.  Here is a snippet of the printout from the program as it is setting up:

    Camera: opened
    Camera: found RAW-1 source at index 0
    mt9v034 chip version reg = 1324
    mt9v034_set_format_params height = 1008, width = 752
    CCD: Linearizion enabled
    CCD: Configured
    mt9v034_set_format_params height = 240, width = 320
    vpfe ccdc capture vpfe ccdc capture.1: hpitch = 320, vpitch = 240, bpp = 2
    Camera: Configured

    As for your suggestions, I tried increasing the frame height in the camera driver as well as reducing the frame height out (from which VDINT0 is set).

    I inverted PCLK as well, but neither of these changes made a difference.

    Thanks,

    Nate

     

  • Hi Anshuman,

     

    After playing around in the driver some more, I did get this ioctl to complete successfully.  The problem did appear to be that the image sensor's output was set too small and VDINT0 wasn't firing.  I set the vertical resolution all the way back up to 480 and worked it back down to 260, and it still works.  I have a few more tweaks to make before the performance is what we want, but it isn't hanging anymore.

     

    Thanks again for your help.

    Nate M.

  • Nate,

    Good to know that you could proceed further. As we discussed earlier, this issue of not getting an interrupt, and consequently never returning from VIDIOC_DQBUF, happens because VDINTx may be set to more than the actual number of lines in the VD interval.

    Regards,

    Anshuman

    PS: Please mark this post as verified, if you think it has answered your question. Thanks.

  • Hi,

    We are using TI8168 EVM, running ti-ezsdk_5_05_02_00.


    Trying to run saLoopBack application for Capture and Display (v4l2).

     

    Capturing on "/dev/video0", which is connected to VIP0 Port A, which in turn is connected to one of the TVP7002s at I2C address 0x5D.


    When our application starts the capture-display while loop, it gets stuck in the VIDIOC_DQBUF ioctl.

    To see what might be causing it, we opened the device with O_NONBLOCK and found that errno = 11 (EAGAIN) is returned.
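
    An EAGAIN from a non-blocking DQBUF just means no buffer has been completed yet; polling with select() and a timeout makes the difference between "not yet" and "never" explicit instead of hanging. A minimal sketch of that waiting pattern follows (a pipe stands in for the capture fd so the sketch runs without hardware; in the real loop, fd would be the /dev/video0 handle opened with O_NONBLOCK):

    ```c
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/select.h>

    /* Wait up to timeout_ms for fd to become readable.  Returns >0 if a
     * buffer should be ready to dequeue, 0 on timeout (the driver is not
     * completing frames), -1 on error. */
    static int wait_for_frame(int fd, int timeout_ms)
    {
        fd_set rfds;
        struct timeval tv;

        tv.tv_sec  = timeout_ms / 1000;
        tv.tv_usec = (timeout_ms % 1000) * 1000;
        FD_ZERO(&rfds);
        FD_SET(fd, &rfds);
        return select(fd + 1, &rfds, NULL, NULL, &tv);
    }

    int main(void)
    {
        int pipefd[2];

        if (pipe(pipefd) != 0)
            return 1;

        /* Nothing is ever written, mirroring a driver that never
         * completes a frame, so select() times out instead of hanging: */
        if (wait_for_frame(pipefd[0], 100) == 0)
            printf("timeout: no frame completed, DQBUF would return EAGAIN\n");
        return 0;
    }
    ```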

     

    Next, we tried printing the state of the buffers inside the vidioc_dqbuf(...) function in ti81xxvin_main.c using:

    buf_obj->buffer_queue.bufs[i]->state


    The state printed is 2, which corresponds to VIDEOBUF_QUEUED.
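
    For reference, that numeric state decodes against the videobuf state enum (values as in the old videobuf framework's videobuf-core.h, redeclared here so the snippet stands alone; check your kernel's header if in doubt). A buffer stuck at 2 (QUEUED) means the hardware never wrote a frame into it, so it never advances to 4 (DONE), which is the state DQBUF waits for:

    ```c
    #include <stdio.h>

    /* Buffer states from the old videobuf framework
     * (include/media/videobuf-core.h in kernels of that era); the
     * values are redeclared here only so the snippet is self-contained. */
    static const char *videobuf_state_name(int state)
    {
        switch (state) {
        case 0:  return "VIDEOBUF_NEEDS_INIT";
        case 1:  return "VIDEOBUF_PREPARED";
        case 2:  return "VIDEOBUF_QUEUED";
        case 3:  return "VIDEOBUF_ACTIVE";
        case 4:  return "VIDEOBUF_DONE";
        case 5:  return "VIDEOBUF_ERROR";
        case 6:  return "VIDEOBUF_IDLE";
        default: return "unknown";
        }
    }

    int main(void)
    {
        /* State 2 printed from ti81xxvin_main.c decodes as: */
        printf("%s\n", videobuf_state_name(2));
        /* VIDIOC_DQBUF only returns once a buffer reaches state 4: */
        printf("%s\n", videobuf_state_name(4));
        return 0;
    }
    ```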

     

    Our interpretation is that, for some reason, video frames are not moving from VIP0 Port A into the queued buffers, so there is nothing to dequeue.

     

    saLoopBack detects video on the TVP7002, and using a CRO we were able to determine, at least roughly, that frames are reaching the VPSS (VIP0 Port A).

     

    Can anyone please suggest what might be the problem here? What are we missing? Could it have anything to do with VDINTx, and if so, how do we figure that out?


    One last thing: until a few days ago the application used to complete the whole while loop (10,000 iterations, the default); it was just that the video was an all-yellow screen. Then this VIDIOC_DQBUF error suddenly started appearing.


    Please Suggest ...

     

    Thanks and regards

    Tushar Nautiyal