
DAC5681(z) FIFO_OFFSET register function seems not to match description

Other Parts Discussed in Thread: DAC5681, DAC3484, ADS5400, DAC5681Z

Hello,

The DAC5681 has an 8-deep FIFO in the datapath.

Register "OFFSET1" allows one to select the reset position of that FIFO pointer in bits 2:0 of that register. The description in the datasheet says:

FIFO_offset(2:0) Offset
011               +3
010               +2
001               +1
000               0
111               –1
110               –2
101               –3
100               –4

You would expect this FIFO_OFFSET register to have a direct impact on the DAC output delay.

And that seems to be the case... more or less: when cycling through the 8 values, I see only 4 distinct output delays.

Apparently only the two LSBs of that register have any effect: "+2" (010) and "-2" (110) cause the same delay at the output, just as +1 and -3 do, and so on.
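To make the observation concrete, here is a small sketch (my own illustration, not TI code): the first function decodes the 3-bit FIFO_offset field as two's complement per the datasheet table, and the second models the behaviour I am seeing, where the MSB is ignored and delays repeat modulo 4.

```python
def decode_fifo_offset(bits):
    """Datasheet interpretation: 3-bit two's-complement offset (-4..+3)."""
    return bits - 8 if bits & 0b100 else bits

def observed_delay_step(bits):
    """Observed behaviour: the MSB seems ignored, so only bits 1:0 matter."""
    return bits & 0b011

for bits in range(8):
    print(f"{bits:03b}: datasheet offset {decode_fifo_offset(bits):+d}, "
          f"observed delay step {observed_delay_step(bits)}")
```

For example, 010 (+2) and 110 (-2) both map to observed step 2, which matches the four distinct delays I measure.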

This is regardless of the interpolation setting (and I have the DAC_delay bits at a constant zero).

Note that I am using a synchronous data interface (DAC data input clock == DAC output clock), so the FIFO pointer never moves after I program it with the FIFO_OFFSET register and then trigger a SYNC event.

I need to be able to shift the DAC output in at least 8 steps of one input sample in order to do per-pixel positioning of the DAC output. My controlling FPGA has a by-8 interface (8 samples per FPGA clock cycle), so I cannot do pixel-accurate positioning inside the FPGA without resorting to those horrible barrel shifters...
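For reference, the barrel shifter I would like to avoid boils down to this behavioural sketch (my own model, not the actual FPGA code): delay a stream of 8-wide sample words by 0..7 sample positions, zero-padding the front.

```python
def delay_stream(words, shift):
    """Delay a stream of 8-samples-per-clock words by `shift` samples (0..7),
    padding the front of the stream with zeros."""
    flat = [s for w in words for s in w]              # unpack 8-wide words
    flat = [0] * shift + flat[: len(flat) - shift]    # shift right by `shift`
    return [flat[i:i + 8] for i in range(0, len(flat), 8)]  # repack

print(delay_stream([[0, 1, 2, 3, 4, 5, 6, 7],
                    [8, 9, 10, 11, 12, 13, 14, 15]], 3))
```

In hardware this means muxing every output sample from one of two adjacent input words, which is exactly the barrel-shifter structure I was hoping the DAC FIFO would save me from.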

Is this a known issue with the DAC?

Or is this a "user error"?

Koen.

  • Hi Koen,

    Yes, I expect the FIFO_OFFSET register to have an impact on the DAC output delay and I would expect 8 values. I haven't tried this to verify it. However, I think there is more that we should discuss beforehand.

    I do not think it is a good idea to use the FIFO offset to shift the data. Although it should work in theory, there is a synchronization issue that will introduce some uncertainty into the actual FIFO position. The issue arises with the way the sync signal is handed off from the data clock domain to the DAC clock domain. Due to the unknown relationship between the data clock and DAC clock, the results of resetting the FIFO pointer could vary by +/- 1 clock cycle, especially over temperature and voltage variations, but the variation is also possible at fixed temperature and voltage.

    Newer DACs, such as the DAC3484, have a sync signal that is synchronous to the DAC clock domain that could be used to perform this shift. The ostrobe signal was added to these DACs to fix this issue when trying to synchronize multiple DACs.

    My recommendation is to perform the shift in the FPGA. You will be able to precisely control the delay (under all operating conditions) without worrying about synchronization issues.

    Regards,
    Matt Guibord

  • Hello Matt,

    Thanks a lot for looking into this (and jumping right into the matter instead of asking pre-programmed first-line support questions like "have you plugged in the system"...)

    I understand your recommendation to shift the data in the FPGA, but I don't quite see how that will make any significant difference: you claim that the exact (FIFO) delay cannot be guaranteed or predicted due to the "asynchronous" nature of the SYNC function.

    If I cannot rely on FIFO_OFFSET tweaking to tune the DAC delay, then implementing a delay outside the DAC will not make a difference, will it? The unpredictable nature of the FIFO_OFFSET (even when it is modified to tune delay) will make the total delay unpredictable.

    In my application, the DAC clock and the data clock are perfectly synchronous (from the same 1GHz source). In fact I don't need a FIFO at all due to that. I did not find a switch to bypass it or disable it.

    Does the synchronization issue still hold if DAC clock and data clock are essentially the same (1GHz DAC clock and 500 MHz DDR data clock from same source)?

    The background of my query is our 4DSP FMC110 DAC/ADC board: it uses a TI DAC (DAC5681Z) + a TI ADC (ADS5400). We send the DAC output through a system and want to sample the analog signal using the ADC. The FMC110 is connected to an FPGA eval board for data processing and control. DAC and ADC each have a separate data buffer to hold the sample data.

    For our application it is essential that we can control the exact position where the ADC starts to sample data coming from the DAC through the system. So we need to compensate for the FPGA+DAC+system+ADC+FPGA delay to make sure sample 1 from the ADC sample buffer matches sample 1 from the DAC sample buffer.

    Given the 1G sample rate, the FPGA runs at 1/8 of the sample rate (8 samples per clock cycle), so we implement a programmable sample delay in the FPGA with 8-sample accuracy (no need for barrel shifters yet).

    The only place in the whole system where it should be easy to control the precise sample delay (below the 8-sample accuracy of the FPGA) is in the DAC: either by using the FIFO_OFFSET and/or the DAC_delay.

    DAC_delay is a bit of a nightmare because (a) it only has 4 steps (we need 8), and (b) we also want to use the interpolator in some situations. When the interpolator is enabled, the DAC_delay register (which acts AFTER the interpolator) steps in sub-sample increments (with an interpolation factor of 4, the 4 DAC_delay steps are 1/4-sample positions).
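    The granularity problem is easy to see with a bit of arithmetic (my own sketch of the numbers above, not a register model): DAC_delay acts after the interpolator, so its step size shrinks with the interpolation factor, and the 4 available steps never cover the 8 input-sample range we need.

```python
def dac_delay_step(interp_factor):
    """DAC_delay step size in input-sample periods, given that the delay
    acts after the interpolator (assumption from the discussion above)."""
    return 1.0 / interp_factor

for interp in (1, 2, 4):
    step = dac_delay_step(interp)
    span = 4 * step  # only 4 DAC_delay steps are available
    print(f"interp x{interp}: step = {step} input samples, "
          f"total range = {span} samples (we need 8)")
```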

    So all that remains is FIFO_OFFSET, which should have 8 possible positions (exactly what we need), and since we run the whole system off the same clock, there should be (almost?) no uncertainty due to asynchronous clock domains.

    The trouble is that only 4 out of 8 positions of FIFO_OFFSET seem to work. The MSB of that 3-bit register has no measurable effect.

    So I would like to discuss two separate issues here:

    • In our synchronous clocking environment, does the FIFO_OFFSET uncertainty still apply?
    • How many discrete steps should FIFO_OFFSET have? 4 or 8? Why can we see only 4?

    Best regards,

    Koen Gadeyne.

  • Hi Koen,

    I'm still trying to verify this myself, but yes, I think the total delay through the part will be within a +/- 1 DAC clock cycle tolerance. Now, if you are able to keep your DAC clock and data clock at a constant known phase relationship, I think you will probably get the same latency every time a sync event occurs, unless the internal timing is right on the edge. However, keep in mind that the internal delay between the clocks (data and DAC clocks) will change versus temperature, so even if the external alignment is held constant, the delays inside the part will change.

    Another point is that even if your data and DAC clocks are frequency synchronized, it does not mean that their phases are aligned. In order to pass data from the data clock domain to the DAC clock domain, a certain setup and hold time is required which cannot be guaranteed without known clock path delays. Since the phase between the two could vary, especially since the delays through the part vary vs PVT, a FIFO (or other mechanism) is necessary to pass data from one domain to the next.

    Now for your application, it sounds like you want to know the exact delay through the DAC so that you can characterize the delay through the system. Would this be done once and assumed to be accurate over time and temperature? Or is it possible your system delays could change versus time and temperature, and therefore calibration is needed?

    Lastly, is this a product planned for production or more of a research project? Have you been in contact with your local TI sales team?

    Regards,
    Matt Guibord

  • Hi Matt,

    Understood about the clock timing inside the DAC. There are two clock domains even if they are fed from the same clock, and then PVT variation will create one cycle of uncertainty.

    Yes I want to know the exact delay through the DAC -- and be able to control it over 8 input sample periods.

    The ideal goal is to do that once (at system init). I would like to avoid recalibration if possible.

    What we have now is just a research/evaluation platform. What will go into products is to be decided. I have not been in contact with local sales people -- mainly because I wanted technical support, not sales support :-)

    Cheers,

    Koen.

  • Hi Koen,

    The design of the FIFO in this DAC is a little strange. Shifting the FIFO Pointer does not simply change the starting location of the FIFO input or output counter (as intuition would suggest) but rather shifts the transition time of the data from the data clock domain to the DAC clock domain.

    I think your best bet is to get the DAC running with the default FIFO pointer location, perform your system calibration to determine the DAC latency, then use the FPGA to shift the samples based on the latency of the DAC. Although not ideal, does this sound feasible?
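    One common way to do the system calibration described above (a sketch under my own assumptions, not a TI-specified procedure): play a known pattern through the DAC, capture it with the ADC, and estimate the round-trip latency in samples by cross-correlating the captured stream against the sent pattern.

```python
def estimate_latency(sent, captured):
    """Return the sample lag that best aligns `captured` with `sent`,
    using a simple brute-force cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(captured) - len(sent) + 1):
        # Correlation score of `sent` against the captured data at this lag
        score = sum(s * c for s, c in zip(sent, captured[lag:lag + len(sent)]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

    The FPGA can then apply the measured lag as the sample shift Matt suggests. In practice the sent pattern should have a sharp autocorrelation peak (e.g. a PRBS) so the estimate is unambiguous.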

    Make sure you follow the "Recommended Startup Sequence" in the datasheet. I think you may be able to get more consistent results by replacing step 8 in the startup procedure with the "Recommended Multi-DAC Synchronization Procedure". This extra step first synchronizes the clock dividers so that every time you synchronize the FIFO the divided down clocks start with the same phase. These divided down clocks are used during synchronization. Then you will synchronize the FIFO after the clocks are synchronized. Otherwise, the clock dividers and FIFO are synchronized at the same time, so the clock edges could be changing while trying to synchronize the FIFO.

    Regards,
    Matt Guibord

  • Thanks Matt,

    It is now clear to me that I will have to bite the bullet and implement the sample-accurate delay inside the FPGA and not rely on the DAC FIFO.

    Thanks a bundle for the support,

    Koen Gadeyne.