ADS1258: Question on Table 11, Figure 56 Start Condition to First Data

Part Number: ADS1258

I am creating a bus functional model of the ADS1258 in VHDL. In modeling the initial delay from the rising edge of conversion start to the first stable data, versus the subsequent conversion periods, my model shows the time to first data as shorter than the continuous-conversion period, which seems wrong.

I think the problem is that the DLY[2:0] (switch-time delay) is not incorporated into the values in Table 11, since that table assumes DLY[2:0] = 000.

I am using an external 12.5 MHz clock in auto-scan mode with DR = "10" and DLY = "011", so my data-rate calculation is:

12.5 MHz / (128 * (4^(3-2) + 4.265625 + 4)) = 7961.78 Hz; the denominator is 1570 tCLKs (12.265625 * 128 tCLKs).
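The arithmetic above can be sketched as follows. All constants here are taken from the formula as written in this post (the 4.265625 term and the "+4" for DLY = "011"), not re-derived from the datasheet:

```python
# Sketch of the data-rate arithmetic from the post (poster's constants, not
# independently verified against the ADS1258 datasheet).
F_CLK = 12.5e6                 # external clock, Hz
DR = 2                         # DR = "10"
SINC_TERM = 4 ** (3 - DR)      # sinc-filter term, in 128-tCLK units
FIXED_TERM = 4.265625          # fixed term from the formula above
DLY_TERM = 4                   # switch-time delay for DLY = "011", per the post

period_tclks = 128 * (SINC_TERM + FIXED_TERM + DLY_TERM)
data_rate = F_CLK / period_tclks

print(period_tclks)            # 1570.0 tCLKs per conversion
print(round(data_rate, 2))     # 7961.78 Hz
```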

But my value from Table 11 for DR = "10" and IDLMOD = 0 (wake from standby) is 1092 tCLKs.

1092 < 1570 means my initial delay would be shorter than my continuous-conversion period, which contradicts the drawing in Figure 56.

Perhaps I am missing something big here, such as the Table 11 value needing to be multiplied by 128? Please advise. Thanks!

  • Hi Susan Beiter78,

    The switch delay time is described on page 21, and states: "The ADS1258 provides a switch time delay feature which automatically provides a delay after channel switching to allow the channel to settle before taking a reading."

    This is a minor detail, but it highlights the fact that any switch delay time only occurs after the channel switches. Therefore, no switch delay time occurs before the first conversion.

    So the initial delay in Table 11 is different from the switch-time delay in Table 7. If you calculate the data rate as a number of tCLKs for your settings, but assume tD = 0 (no switch-time delay), the result is 1057 tCLKs.
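    To illustrate the point, here is the same formula evaluated with and without the switch-time delay term. The constants are the ones from the poster's formula, so the no-delay result (1058 tCLKs) differs slightly from the 1057 tCLK datasheet figure quoted above; the key observation is only the inequality:

    ```python
    # Same conversion-period formula, with and without the switch-time delay.
    # Constants are from the poster's formula, so treat the exact numbers as
    # approximate; the point is that the first conversion skips the delay.
    DR = 2                         # DR = "10"
    SINC_TERM = 4 ** (3 - DR)
    FIXED_TERM = 4.265625
    DLY_TERM = 4                   # DLY = "011", in 128-tCLK units (poster's value)

    with_delay = 128 * (SINC_TERM + FIXED_TERM + DLY_TERM)   # 1570.0 tCLKs
    no_delay = 128 * (SINC_TERM + FIXED_TERM)                # 1058.0 tCLKs

    # The first conversion occurs before any channel switch, so it is shorter:
    assert no_delay < with_delay
    ```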

    Also note that in auto-scan mode, the ADC effectively waits the initial delay each time a new conversion is started. You can see this in Table 6, where the data rate in auto-scan mode is 23739 SPS when DR = 11b (whereas in fixed-channel mode it is 125 kSPS). Basically, each time the sequencer indexes to the next channel, the ADC resets the digital filter and then automatically waits for settled data. This requires the input signal to propagate through the sinc5 filter, which typically takes ~5 conversion periods (note that the max data rate in auto-scan mode is ~125 kSPS / 5).
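    A quick sanity check of the "~5 conversion periods to settle" rule of thumb, using only the Table 6 numbers quoted above:

    ```python
    # Ratio of fixed-channel rate to auto-scan rate at DR = 11b (Table 6 values).
    fixed_rate = 125000     # SPS, fixed-channel mode
    autoscan_rate = 23739   # SPS, auto-scan mode

    settling_periods = fixed_rate / autoscan_rate
    print(round(settling_periods, 2))   # ~5.27 conversion periods per settled output
    ```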

    Ultimately, the data output latency depends on the conversion mode you use.

    You can also check out this app note for additional information on delta-sigma ADC conversion latency: https://www.ti.com/lit/sbaa535

    -Bryan