
ADC12DJ3200: strange behavior vs temperature

Part Number: ADC12DJ3200
Other Parts Discussed in Thread: DAC38RF89

hi all,

we are observing a strange behavior on a design involving the DAC38RF89 and ADC12DJ3200.

We setup a system in such a way that only DAC and ADC are involved in the analog path.

The problem is that when, AND ONLY WHEN (!), we heat the system to a board temperature of 75°C or above and then cool it down, we get a disturbance at around 8°C board temperature that results in a few uncorrected errors in our modem.

In all the tests we have run so far, this behavior has been reproducible whenever we perform the sequence above.

The resulting disturbance also has a fairly consistent signature. Do these devices monitor the die temperature and algorithmically adapt their internal operation at certain thresholds?

The behavior looks as if an internal parameter changes as the result of a temperature-dependent decision, creating a one-time, instantaneous disturbance at the output (for the DAC) or input (for the ADC).

If this scenario holds, is there any way to "smooth" this effect?

thanks a lot in advance

KR

Vincenzo

  • Hi Vincenzo
    I will provide a more detailed response later today.
    Jim B
  • Hi Vincenzo
    I'm not aware of any thermally related issues for the DAC38RF89 other than the requirements in the Recommended Operating Conditions and Absolute Maximum Ratings sections of the datasheet.
    The ADC12DJ3200 has similar limitations listed in those sections of that datasheet.
    For either device, operation above the maximum rated junction temperature can increase the failure-in-time (FIT) rate and/or affect reliability.
    The ADC12DJ3200 performance (primarily linearity) can be degraded by significant changes in device temperature.
    For applications where continuous usage is required (such as a modem) we recommend customers configure the device into one of the Background Calibration modes. In these modes the device linearity is continually optimized without interrupting device functionality.
    If the application is sensitive to shifts in the linearity performance of the ADC, and Background Calibration is not currently being used, then I suggest you try using that mode instead of Foreground Calibration mode. This may eliminate the issue you are seeing.
    I hope this is helpful.
    Best regards,
    Jim B
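To make the background-calibration suggestion above concrete, here is a minimal sketch of composing and writing a calibration-config register over SPI. The register address and bit positions below are illustrative placeholders, NOT taken from the ADC12DJ3200 register map; consult the datasheet's calibration configuration section for the real fields.

```python
# Sketch: switching the ADC from foreground to background calibration.
# NOTE: CAL_CFG_ADDR, CAL_EN_BIT, and CAL_BG_BIT are assumed placeholder
# values, not the device's actual register map.

CAL_CFG_ADDR = 0x061   # assumed address of a calibration config register
CAL_EN_BIT   = 0       # assumed: enable calibration
CAL_BG_BIT   = 1       # assumed: select background (vs foreground) mode

def cal_cfg_value(enable: bool, background: bool) -> int:
    """Build the calibration-config byte from the two assumed bit fields."""
    value = 0
    if enable:
        value |= 1 << CAL_EN_BIT
    if background:
        value |= 1 << CAL_BG_BIT
    return value

def spi_write(addr: int, value: int) -> None:
    """Placeholder for the board's actual SPI register-write routine."""
    print(f"SPI write: reg 0x{addr:03X} <= 0x{value:02X}")

# Enable background calibration (assumed bit layout):
spi_write(CAL_CFG_ADDR, cal_cfg_value(enable=True, background=True))
```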
  • hi Jim

    thanks a lot...

    we used background calibration in the past, then disabled it as it caused more problems than benefits...

    we will give another try...

    will keep you posted

    KR

    Vincenzo

  • Hi Vincenzo
    If you can quantify in any way what type of degradation is encountered in the ADC or DAC performance as the temperature changes, that would be helpful. For example, is the performance just slightly worse, or do some samples appear to be completely corrupted?
    With the ADC in foreground calibration mode the performance won't have any sudden changes versus temperature, there will just be a gradual degradation in the linearity as the device temperature is moved farther away from the temperature at which it was calibrated.
    I would like to close this thread for now, but please respond or start a new thread if you have additional information to discuss.
    Best regards,
    Jim B
  • hi Jim

    The problem appears as an instantaneous corruption of one or very few samples.

    KR
    Vincenzo
  • Hi Vincenzo
    If the ADC is operating in foreground calibration mode I wouldn't expect to see anything like that.
    In background calibration mode there can be small glitches in the data when the just-calibrated ADC core and the core to be calibrated next are swapped. See Figures 60 and 61 in the datasheet for the expected behavior.
    It would be helpful if you can isolate the problem to the ADC or the DAC so we can focus on one of the devices.
    It would also be useful to know what board or device temperature this occurs at if the issue is quite repeatable.
    In the meantime I will discuss this again with my DAC colleagues to get their opinion.
    Best regards,
    Jim B
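One way to help isolate the problem to the ADC or DAC, as requested above, is to flag isolated corrupted samples in captured records taken at different points in the chain. A minimal sketch (the threshold is an assumed figure; tune it to the signal's normal sample-to-sample slew):

```python
# Sketch: flagging isolated corrupted samples in a captured ADC record so
# the disturbance can be localized in time and attributed to a device.

def find_glitches(samples, threshold):
    """Return indices where a sample jumps away from BOTH neighbours by
    more than `threshold` -- the signature of a single corrupted sample."""
    glitches = []
    for i in range(1, len(samples) - 1):
        d_prev = abs(samples[i] - samples[i - 1])
        d_next = abs(samples[i] - samples[i + 1])
        if d_prev > threshold and d_next > threshold:
            glitches.append(i)
    return glitches

# Example: a slow ramp with one corrupted sample at index 4
record = [0, 1, 2, 3, 500, 5, 6, 7]
print(find_glitches(record, threshold=100))  # -> [4]
```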
  • hi Jim,

    the "tricky" part is that when we connected two systems, one inside an oven and the other outside, the problem appeared at the receivers of both systems when the temperature of the system inside the oven crossed the specific point where the problem happens (8°C board temperature). The only supply common to both converters is the 1.85 V rail, which is toward the high limit for the DAC (1.8 V nominal, 1.89 V max) and toward the low limit for the ADC (1.9 V nominal, 1.8 V min).
    One of the tests we plan to do is to power this rail from a separate external voltage regulator.

    KR
    Vincenzo
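The supply-margin concern above can be quantified with a quick check against the limits quoted in the post (DAC: 1.89 V max; ADC: 1.8 V min). The ±2% regulator tolerance is an assumed figure for illustration; it shows how little headroom a shared 1.85 V rail leaves on both sides.

```python
# Sketch: margin of a shared 1.85 V rail against the DAC's upper limit
# (1.89 V) and the ADC's lower limit (1.80 V). The +/-2% regulator
# tolerance is an assumption, not a measured value.

def rail_window(nominal, tol):
    """Min/max rail voltage over the regulator tolerance band."""
    return nominal * (1 - tol), nominal * (1 + tol)

def within(lo, hi, limit_lo, limit_hi):
    """True if the rail stays inside [limit_lo, limit_hi] across tolerance."""
    return lo >= limit_lo and hi <= limit_hi

lo, hi = rail_window(1.85, 0.02)                 # ~1.813 .. 1.887 V
dac_ok = within(lo, hi, 0.0, 1.89)               # DAC upper limit 1.89 V
adc_ok = within(lo, hi, 1.80, float("inf"))      # ADC lower limit 1.80 V
print(f"rail {lo:.3f}..{hi:.3f} V  DAC ok: {dac_ok}  ADC ok: {adc_ok}")
```

Even at ±2% both limits are only just met (a few mV of margin on the DAC side), which supports trying a separate, tighter regulator for each converter.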
  • Hi Vincenzo

    I've discussed this more with a few experts and we have a few theories and some debugging steps to try.

    1) It's possible the RBD (receive buffer delay) setting in one of the JESD204B receivers has insufficient margin to handle timing variation across temperature. At a certain point this would cause a brief interruption of the link. Please try increasing the RBD settings by 1 or 2 to see if that has any impact.

    2) Are any JESD204B error flags in the receivers set at the time that the error occurs?

    3) Do the JESD204B links reinitialize or restart?

    4) Are there any alarms being set in the ADC or DAC (e.g. LINK_ALM, REALIGNED_ALM, or CLK_ALM in the ADC12DJ3200)? Please note that the ADC alarms may be set initially and should be cleared once the link is up and operating.
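The alarm check in step 4 could be automated by logging decoded alarm flags around the temperature crossing. A minimal sketch, assuming placeholder bit positions (the real LINK_ALM / REALIGNED_ALM / CLK_ALM bit locations are in the ADC12DJ3200 register map):

```python
# Sketch: decoding alarm flags from a status-register read.
# The bit -> flag mapping below is an assumed placeholder, NOT the
# device's actual register layout.

ALARM_BITS = {
    0: "CLK_ALM",
    1: "REALIGNED_ALM",
    2: "LINK_ALM",
}

def decode_alarms(status: int):
    """Return the names of all alarm flags set in `status`."""
    return [name for bit, name in sorted(ALARM_BITS.items())
            if status & (1 << bit)]

print(decode_alarms(0b101))  # -> ['CLK_ALM', 'LINK_ALM']
```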

    Best regards,

    Jim B
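The RBD adjustment in step 1 amounts to increasing the receiver's elastic-buffer release delay while staying within the valid range. A sketch, where K (frame clocks per multiframe) is an assumed link parameter for the example:

```python
# Sketch: bumping the JESD204B receive-buffer-delay (RBD) setting by a
# small step, clamped to the valid 1..K range, where K is the number of
# frames per multiframe configured on the link.

def increase_rbd(current_rbd: int, step: int, k: int) -> int:
    """Increase RBD by `step`, clamped so it never exceeds K."""
    if not 1 <= current_rbd <= k:
        raise ValueError("current RBD outside valid range")
    return min(current_rbd + step, k)

# Example with an assumed multiframe length K = 32:
print(increase_rbd(current_rbd=9, step=2, k=32))   # -> 11
print(increase_rbd(current_rbd=31, step=2, k=32))  # -> 32 (clamped)
```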

  • Hi Vincenzo
    I haven't heard any response to my previous post so I'm hoping this item has been resolved.
    If the issue is still open please respond to let us know if any additional information is available.
    Best regards,
    Jim B