
TLV320AIC34: ADC Architecture and Idle tones vs. Temperature

Part Number: TLV320AIC34


Is there any documentation or test data describing limit cycles and idle tone generation in the AIC34's ADC result? I can observe frequency tones in the ADC conversion result (sampled and reported at 96 kHz) that are not present at the ADC input to the codec. The frequencies change depending on the common-mode DC level of the differential inputs: with the ADC inputs shorted together and left floating or tied to various DC voltages, the value of the DC offset changes the observed frequencies.

If I apply a time-varying DC offset to the shorted differential ADC inputs (my approach to dithering the quantizer), the frequency tones go away. From delta-sigma ADC literature this can be considered normal behavior; however, if I knew more about the design of the ADC (the order of the modulator and how many levels (bits) the quantizer has), I think I could predict when and where the idle tones would appear in the frequency spectrum.
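To make the question concrete, here is a minimal simulation of the kind of model I have in mind. It assumes a first-order modulator with a 1-bit quantizer purely for illustration; the AIC34's actual modulator order and quantizer levels are exactly what I'm asking about, so the real tone locations will differ:

```python
import numpy as np

FS = 96000.0  # ADC output rate used in my measurements
N = 4096

def dsm1(x, n=N):
    """Toy first-order delta-sigma modulator with a 1-bit (+/-1) quantizer.
    An assumption for illustration only, not the AIC34's actual design."""
    s, out = 0.0, np.empty(n)
    for i in range(n):
        y = 1.0 if s >= 0.0 else -1.0  # 1-bit quantizer
        out[i] = y
        s += x - y                     # integrator with feedback
    return out

def dominant_tone(bits, fs=FS):
    """Frequency of the strongest spectral line (DC removed)."""
    spec = np.abs(np.fft.rfft(bits - bits.mean()))
    return int(np.argmax(spec)) * fs / len(bits)

print(dominant_tone(dsm1(0.0)))    # 48000.0 : idle channel tones at fs/2
print(dominant_tone(dsm1(0.125)))  # 42000.0 : a DC offset moves the tone
```

With a constant DC input the output bit pattern is periodic, so discrete tones appear, and their location tracks the offset; knowing the real order and quantizer levels would let me do this prediction for the actual part.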

On a related note, if I put the codec and the PCB it is installed on in a thermal chamber, with the common-mode DC signal supplied from a quiet benchtop supply outside the chamber (perfectly stable over temperature), I get a very repeatable change in the frequency of the idle tones. I suspect this can be attributed to a slight change over temperature in the 3.3 V analog supply connected to DRVDD and AVDD_DAC. Would you expect a reference voltage change on the order of a couple of mV to impact the frequency of the idle tones observed in the ADC conversion result?

I was under the impression that if the DC offset is near 0 V, the idle tones should appear at low frequency. If DRVDD is 3.3 V, should I aim for a DC offset near 0 V or near 1.65 V, the middle of the differential input range? The critical frequency range in which I need to guarantee the absence of idle tones is 5 kHz to 17 kHz.
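For what it's worth, the same kind of toy model (again assuming first order and a 1-bit quantizer, which is likely not what the AIC34 actually uses) can at least flag which DC offsets would put a tone inside my critical band:

```python
import numpy as np

FS, N = 96000.0, 8192

def dsm1(x, n=N):
    # Toy first-order, 1-bit modulator; the AIC34's real architecture is unknown to me.
    s, out = 0.0, np.empty(n)
    for i in range(n):
        y = 1.0 if s >= 0.0 else -1.0
        out[i] = y
        s += x - y
    return out

def band_peak(bits, f_lo=5000.0, f_hi=17000.0, fs=FS):
    """Largest spectral magnitude inside the critical 5-17 kHz band."""
    spec = np.abs(np.fft.rfft(bits - bits.mean()))
    freqs = np.fft.rfftfreq(len(bits), d=1.0 / fs)
    return spec[(freqs >= f_lo) & (freqs <= f_hi)].max()

for x in (0.0, 0.1, 0.5, 0.75):
    print(f"offset {x:+.2f}: in-band peak {band_peak(dsm1(x)):10.1f}")
```

In this sketch a zero offset leaves the 5-17 kHz band clean (the idle tone sits at fs/2), while some larger offsets land a tone squarely in-band, which is why I'd like to know the real modulator details before picking an operating point.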

Thanks for any thoughts!

  • Hello Curtis,

    I'm going to loop in our Design/systems team on this. Can you share a little more about your application? It sounds like you are not using coupling capacitors on the inputs, correct?

    Best regards,

    -Steve Wilson
  • Hi Steve,

    Thanks for the response. I can share a bit of application information, but the tests that I was describing in my original post are relatively generic, using waveform generators, power supplies and jumpers.

    The issue was discovered after connecting a custom front-end ASIC to the codec ADC inputs. The ASIC output pins tie directly to the codec input pins (no coupling caps) and actively hold a DC common-mode bias at 1.65V. The noise floor of the ASIC is lower than that of the codec, allowing us to get the most out of the codec's dynamic range. It seems to be the combination of this low-noise front-end circuit, the DC bias stability, and the variation of the codec's bandgap reference over temperature that allows us to see the idle tones in the codec output.

    In my current configuration, I have one of the codec's differential input pairs shorted together, with all power supplies located outside the thermal chamber. In this configuration, I still see the idle tones change frequency over temperature. I am able to significantly change the idle tone frequencies by changing the common-mode DC level of the shorted differential inputs.

    One "fix" that I have right now is to tie unused codec ADC inputs together, enable weak biasing of the unused inputs to the ADC common mode, then add the MIC3L signal (tied to the other unused, biased inputs) to the differential input via the codec's adder block. From what I can tell, this applies some DC noise around the ADC's common mode, preventing idle tones from being generated. Does this seem like a viable solution? I'm concerned that I'm going to be injecting wideband noise into my real signal, but it is currently the lesser evil compared to the idle tones.
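    As a sanity check on this dither approach, a toy first-order, 1-bit modulator (my assumption; TI has not published the AIC34's actual architecture) shows the same effect: a small wideband dither added to a near-zero DC input breaks up the limit cycle at the cost of a raised noise floor:

```python
import numpy as np

def dsm1(x_seq):
    """Toy first-order delta-sigma modulator with a 1-bit quantizer
    (an assumption; the AIC34's real modulator is not documented)."""
    s, out = 0.0, np.empty(len(x_seq))
    for i, x in enumerate(x_seq):
        y = 1.0 if s >= 0.0 else -1.0
        out[i] = y
        s += x - y
    return out

def tone_to_noise(bits):
    """Peak spectral bin over the mean bin: large when a discrete
    idle tone dominates, small when the spectrum is noise-like."""
    spec = np.abs(np.fft.rfft(bits - bits.mean()))[1:]
    return spec.max() / spec.mean()

n = 1 << 14
rng = np.random.default_rng(0)
static = np.full(n, 0.01)                        # near-zero DC input
dithered = static + rng.uniform(-0.05, 0.05, n)  # plus small wideband dither

print(tone_to_noise(dsm1(static)))    # large: strong discrete idle tone
print(tone_to_noise(dsm1(dithered)))  # much smaller: tone broken up
```

    The average of the output still tracks the DC input, so the dither mostly trades the discrete tones for wideband noise, which matches what I observe on the bench.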

    Thanks and I look forward to your thoughts,

    Curt

  • Curt,

    I apologize for the delayed response.
    Here is the design team's response:

    "Idle tones are expected given the ADC architecture for a near-zero differential input.

    Option 1: For each ADC, a value of input offset can be found which negates the offset of the ADC and removes the idle tones. However, this will change with temperature, so it is not a good solution.
    Option 2: Add a significant offset, maybe 100 mV or so; I would expect the idle tones to go away or become smaller. However, this is usable only if the customer is not using much gain in the PGA (otherwise the offset will saturate the PGA).
    Option 3: The smaller the offset, the lower the frequency of the idle tone. It is not clear from the customer whether he is seeing any idle tone in the 5 kHz-17 kHz range. If he is not seeing it across devices, it is better not to do anything."


    Best regards,
    -Steve Wilson