Hi,
I am using OpenVG to provide a graphics solution for analogue TVs. My problem is that I see interlace flicker when the video is output to a TV, because the intensity of a pixel on field 0 is significantly greater or less than the intensity of the corresponding pixel on field 1. I realise this is a 'feature' of interlaced displays, but having seen broadcast graphics and DVD-player menus that look perfectly stable, it must be possible to produce a clear image free from interlace flicker.
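To illustrate the kind of thing I imagine might help: a vertical low-pass ("flicker") filter applied to the rendered frame before output, so that adjacent scanlines (which end up in different fields) cannot differ too much in intensity. Below is only a rough sketch operating on a single 8-bit channel; the function name, buffer layout and dimensions are placeholders, not anything from my actual code or from OpenVG itself:

#include <stdint.h>
#include <stddef.h>

/* Vertical "flicker filter" sketch: each output line is a 1-2-1 weighted
 * average of the line above, itself, and the line below. This limits the
 * intensity difference between adjacent scanlines, which is what the two
 * interlaced fields actually display. src/dst are full-frame, single-channel
 * 8-bit buffers of width*height pixels. */
void flicker_filter(const uint8_t *src, uint8_t *dst,
                    size_t width, size_t height)
{
    for (size_t y = 0; y < height; ++y) {
        /* Clamp at the top and bottom edges of the frame. */
        const uint8_t *above = src + (y == 0 ? y : y - 1) * width;
        const uint8_t *cur   = src + y * width;
        const uint8_t *below = src + (y == height - 1 ? y : y + 1) * width;
        for (size_t x = 0; x < width; ++x) {
            dst[y * width + x] =
                (uint8_t)((above[x] + 2 * cur[x] + below[x]) / 4);
        }
    }
}

I have not tried this yet, and I am unsure whether filtering the whole frame like this is the usual approach or whether there is something better (e.g. avoiding one-pixel-high features in the artwork in the first place).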
Does anyone have any ideas on how to achieve this?