Hi
I am capturing 10-bit BT.656 from a TVP5147 on a custom DM365-based board. After the joy of verifying that I was in fact receiving correct video, the realization that nothing downstream was working sank in.
When capturing 10-bit BT.656, each Y, Cb, and Cr component is packed into 16 bits, versus 8 bits for 8-bit BT.656. So one now needs 32 bits per pixel instead of 16.
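To make that concrete, here is how I picture a pair of pixels in memory (the assumption that the 10 significant bits land right-justified in each 16-bit word is mine; the exact justification may depend on the CCDC setup):

/*
 * 8-bit BT.656 (UYVY), 16 bits/pixel:
 *   Cb0  Y0  Cr0  Y1  ...     one byte per sample
 *
 * 10-bit BT.656 as captured, 32 bits/pixel:
 *   Cb0  Y0  Cr0  Y1  ...     one 16-bit word per sample,
 *                             10 significant bits each
 *                             (assumed right-justified)
 *
 * So a 720-pixel line grows from 1440 bytes to 2880 bytes.
 */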
As far as I can tell, the VPBE will not display this format; every mention of the YUV422 format indicates it expects 16 bits per pixel, so the VPBE simply can't display the captured data correctly. If I use the CPU to shave off the two LSBs and repack the data into 16 bits per pixel, the display looks fine, but doing it on the CPU is obviously slow.
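For reference, the repacking I'm doing looks roughly like this minimal sketch (the function name is a placeholder, and it again assumes the 10 significant bits are right-justified in each 16-bit word):

#include <stdint.h>
#include <stddef.h>

/* Repack captured 10-bit BT.656 (one sample per 16-bit word, assumed
 * right-justified) into ordinary 8-bit YUV422 by dropping the two LSBs.
 * For 4:2:2 data, nsamples = 2 * width * height. */
static void repack_10bit_to_8bit(const uint16_t *src, uint8_t *dst,
                                 size_t nsamples)
{
    size_t i;

    for (i = 0; i < nsamples; i++)
        dst[i] = (uint8_t)(src[i] >> 2);  /* keep the 8 MSBs of each sample */
}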
I've also heard that the codecs won't accept 32 bits per pixel.
I'm perplexed as to why 10-bit BT.656 is even an option if nothing downstream can make use of it.
Can anyone shed any light on any of the following?
Does anything on the encoder or display side actually work with 32 bits per pixel, or is it in fact limited to 16 bits per pixel?
If it can be made to work, any hints would be appreciated :)
If it is a given that nothing downstream works with 32-bit pixels, is there a way on the front end to shave off the two LSBs? Otherwise we will have to pursue hardware changes.
Thanks,
Paul