Hello,
I'm replacing a legacy PCI video capture card (Leutron brand) with a custom USB video capture unit built around a TVP5147PFP. I realise this part is now NRND, but I originally thought I had to stick with it for square-pixel support. I now see that a patch for the TVP5147M enables square-pixel support, so I will probably migrate to that part in future.
My primary query concerns the image quality obtained. Visually the capture looks perfectly good: a grayscale PAL 768x576 image from an analog camera. As part of our validation process, however, a colleague put together a program that compares the PCI capture and my USB version side by side with real-time histograms, showing the number of pixels at each grayscale level.
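For reference, the per-level comparison amounts to building a 256-bin histogram over the 8-bit luma values from each capture path. A minimal sketch (the function name and sample frame data are illustrative, not from the actual validation program):

```python
def gray_histogram(pixels):
    """Count how many pixels fall on each of the 256 grayscale levels."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    return hist

# Illustrative 8-bit pixel data, not real capture output.
frame = [0, 128, 128, 255, 64]
hist = gray_histogram(frame)
print(hist[128])  # 2
```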
While the histograms from the PCI card and my unit look basically the same (once contrast, gain and brightness have been adjusted to give an equivalent picture), we're seeing a strange combing effect in the histogram: at regular intervals, individual gray levels spike up or down from the rest of the curve. It looks like some kind of harmonic effect enhancing or degrading individual gray levels, although the levels affected change as settings (primarily contrast) are changed. There's no visible noise in the picture, so I'm not sure where the spikes are coming from. Each spike's position and direction (up or down) is constant, and changes only when contrast, brightness or gain are changed.
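For what it's worth, one well-known way this kind of combing arises is requantization: if levels are multiplied by a non-integer digital gain somewhere in the chain and then rounded back to 8 bits, certain output codes collect double counts (or none at all) at regular intervals, and the affected codes move as the gain changes, which matches the behaviour described above. A minimal sketch of the effect (the gain value is illustrative, not a TVP5147 register setting):

```python
def combed_histogram(gain):
    """Histogram of round(v * gain) over a flat 8-bit ramp (one count per input level)."""
    hist = [0] * 256
    for v in range(256):
        q = min(255, int(v * gain + 0.5))  # round half up, clip to 8 bits
        hist[q] += 1
    return hist

# With gain < 1, pairs of adjacent input levels collapse onto single
# output codes at regular intervals: periodic spikes in the histogram.
h = combed_histogram(0.8)
print(h[:10])  # [1, 1, 2, 1, 1, 1, 2, 1, 1, 1]
```

With a gain above 1 the opposite happens: regularly spaced output codes receive no counts at all, which shows up as downward spikes.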
I've attached a few images of these histograms that should illustrate the issue much better.
Is this a known outcome of the way the filtering in the TVP5147 operates? I'm currently using it in CVBS mode, although I've tried setting it to S-Video with my signal on the luma input, and that just makes the combing worse. I've also tried changing the comb and trap filter settings with no visible change. The chip is set to 20-bit mode and I'm grabbing just the 8 most significant bits from the luma outputs, ignoring the chroma outputs.
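To be clear about the bit extraction: taking the 8 MSBs of each 10-bit luma word is a plain truncation, as sketched below (assuming 10-bit samples; the function names are mine). Truncation maps exactly four 10-bit codes onto each 8-bit code, so by itself it shouldn't produce periodic combing; rounding instead of truncating shifts some codes by one but is equally uniform:

```python
def luma_msb_truncate(y10):
    """Keep the 8 most significant bits of a 10-bit luma sample."""
    return (y10 >> 2) & 0xFF

def luma_msb_round(y10):
    """Round to 8 bits instead of truncating, clipped at full scale."""
    return min(0xFF, (y10 + 2) >> 2)

print(luma_msb_truncate(0x3FF))  # 255
print(luma_msb_round(0x201))     # 128
```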
Can you make any suggestions on how to eliminate this?
Thanks,
Andrew