I'm a little confused about the video output format. We have implemented a design that connects VOUT[0] to an AD9889B HDMI transmitter, and we want to run at 720p60. VOUT[0]_G_Y_YC[9:2] from the DM8167 is connected to the Y[7:0] input of the AD9889B, and VOUT[0]_B_CB_C[9:2] is connected to the Cb/Cr[7:0] input. The AD9889B is configured for "YCbCr 422 Format, 16 bits with Embedded Syncs". Is this a valid output format from the DM8167? We do not have the code up and running yet to test this, but I want to be ahead of the curve.
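For reference, here is a rough sketch of the AD9889B setup we have in mind, written against the Linux userspace i2c-dev interface. The 0x39 slave address, the register numbers, and the values below are our reading of the AD9889B programming guide, not tested code, and need to be verified against the datasheet before use.

/*
 * Sketch of the planned AD9889B init for "YCbCr 4:2:2, 16-bit, embedded syncs".
 * All register addresses and values are assumptions to be checked against the
 * AD9889B programming guide.
 */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

#define AD9889B_I2C_ADDR 0x39   /* 7-bit address; depends on board strapping */

/* Write one 8-bit register as a plain 2-byte I2C transfer of {reg, value}. */
static int ad9889b_write(int fd, unsigned char reg, unsigned char val)
{
    unsigned char buf[2] = { reg, val };
    return (write(fd, buf, 2) == 2) ? 0 : -1;
}

int main(void)
{
    int fd = open("/dev/i2c-1", O_RDWR);   /* bus number is board-specific */
    if (fd < 0 || ioctl(fd, I2C_SLAVE, AD9889B_I2C_ADDR) < 0) {
        perror("AD9889B i2c");
        return 1;
    }

    /* Take the transmitter out of power-down (register/value assumed). */
    ad9889b_write(fd, 0x41, 0x10);

    /* Input video ID: YCbCr 4:2:2 with embedded syncs (value assumed). */
    ad9889b_write(fd, 0x15, 0x02);

    /* The input style / color depth register (16-bit bus) would be set the
     * same way, with the field values taken from the programming guide. */

    close(fd);
    return 0;
}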
It seems like the "more correct" or better-quality path would be to run these devices in 20-bit mode, but it also seems like the internal DM8167 processing is only done on 16 bits; is this correct? Would the video quality be better if we ran the output of the DM8167 at 20 bits versus 16 bits, or are those extra 4 bits just "created"? The AD9889B can be configured to run in 20-bit mode as well.
Thank you.