Hi,
I'm trying to capture 576-line, 50 Hz, YUV 4:2:2 video using the encode demo. To do so, I modified capture.c by replacing every VideoStd_720P value with a defined constant, CAPTURE_STANDARD, which acts as a wildcard so I can change the standard in a single place. To pin down the failing code, I also modified main.c to call captureThrFxn() directly instead of spawning it as a thread, so I could rule out multithreaded debugging issues, which are still difficult for me. After browsing the code for a long time, I believe this last change does not affect the inner workings of capture.c, since the capture thread is fairly independent of the other video threads launched by main.c.
The hardware settings for the trials are:
- The DVEVM board's J13 connector is connected to the composite video source.
- The composite video source is a Grass Valley Turbo iDDR video player, playing a PAL (576 lines, 50 Hz) YUV 4:2:2 video sequence.
- The DVEVM is connected to the computer via Ethernet only.
The results of the trials are as follows:
When using VideoStd_576P, execution stalls at Capture.c line 670, in Capture_detectVideoStd(), where ioctl(fd, VIDIOC_S_STD, &std) is called. The system call fails to set the video standard. Why?
When using VideoStd_D1_PAL, the result is the same. (Why, again?)
This leaves me with several questions:
1- The video standards declared as an enumeration within the DMAI code: which interface do they refer to? That is, are they the standards expected at the board's video input, or standards expected somewhere on the interface between chips on the board?
2- Are the V4L2 drivers in the git Linux kernel limited to 1080i30, 1080p60 and 720p?
3- Can this be solved by proper use of the '-r' argument? I've read that the encode demo can use the detected standard's resolution as a default, but that would not work in my case, because it is the detection of the standard itself that causes the trouble.