DAC5687 spur performance

Other Parts Discussed in Thread: DAC5687

I presently have six DAC5687s running at 96 MHz. With the NCO commanded to 12 MHz all is well, as all spurs fold back under the carrier. When I command 12.04 MHz, the DACs have modulation at 11.06 MHz, approximately -83 dB down, which is no surprise. What I've seen is that if I re-run a LabVIEW application which resets and commands the frequency, the spurs will sometimes drop to -90 dB. Any thoughts on how to achieve the -90 dB all the time, and what is actually happening when I reset? Is it possible that the 90-degree delta from the sine/cosine generator has a slight imbalance, and sometimes I'm catching it at exactly 90 degrees? I realize that -83 is good, but my customer has seen the -90 dB and wants it.

 

Thanks,

Bill

  • Hi Bill,

    The first step would be to figure out the source of the spur (if you don't know it already).  An easy way to do this is to shift the NCO frequency around and see how the spur frequency changes.  If the change in spur frequency is an integer multiple of the change in NCO frequency, it is most likely a harmonic spur.  Your best bet would then be to look at how the linearity of your external circuitry could be improved.  If the spur is non-harmonic, it may be caused by other effects such as clock mixing.  This seems more likely to me, since you seem to be getting better performance after resetting the NCO.
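    For a concrete illustration (with made-up numbers): step the NCO from 12.04 MHz to 12.14 MHz, a 100-kHz change.  If the spur moves by 200 kHz or 300 kHz, it is tracking the second or third harmonic of the NCO output; if it moves by some non-integer multiple, or does not move at all, it is non-harmonic and clock-related mechanisms become the prime suspect.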

    Clock-related spurs can be reduced by running the DAC clocks or NCOs out of phase.  You can control the initial NCO phase by writing the desired offset value into registers NCO_PHASE_0 (0x0D) and NCO_PHASE_1 (0x0E).  Once each NCO is assigned a different phase offset, you have to make sure that they are all synchronized.  This can be done by transitioning each DAC's PHSTR input from low to high simultaneously.  Note that in order for PHSTR transitions to sync the NCO, bit 6 of register SYNC_CNTL (0x05) must be written high.
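    As a rough illustration, here is a minimal C-style sketch of that sequence.  The register addresses (0x0D, 0x0E, 0x05) and the SYNC_CNTL bit 6 requirement come from the discussion above; the spi_write() and setup_nco_sync() names, the low-byte/high-byte split of the phase offset across the two registers, and the evenly spread phase values are placeholders that you should check against the DAC5687 datasheet and adapt to your own SPI driver.

    #include <stdio.h>

    /* Placeholder register-write helper: in a real system this would be an
     * SPI transaction to the selected DAC5687; here it just logs the write. */
    static void spi_write(int dac_index, unsigned char addr, unsigned char value)
    {
        printf("DAC %d: reg 0x%02X <= 0x%02X\n", dac_index, addr, value);
    }

    /* Give one DAC its own NCO phase offset and enable syncing from PHSTR.
     * Assumes the 16-bit offset is split low byte / high byte across
     * NCO_PHASE_0 (0x0D) and NCO_PHASE_1 (0x0E); check the datasheet for the
     * actual byte order.  Writing 0x40 sets bit 6 of SYNC_CNTL (0x05); in a
     * real driver use a read-modify-write so the other bits are preserved. */
    static void setup_nco_sync(int dac_index, unsigned short phase_offset)
    {
        spi_write(dac_index, 0x0D, (unsigned char)(phase_offset & 0xFF));
        spi_write(dac_index, 0x0E, (unsigned char)((phase_offset >> 8) & 0xFF));
        spi_write(dac_index, 0x05, 0x40);   /* bit 6: sync NCO from PHSTR */
    }

    int main(void)
    {
        /* Example only: spread six DACs evenly over the 16-bit phase circle.
         * The optimal set of offsets depends on your system. */
        for (int dac = 0; dac < 6; dac++)
            setup_nco_sync(dac, (unsigned short)(dac * (65536 / 6)));

        /* After this, take PHSTR from low to high on all DACs from one common
         * hardware signal so the programmed offsets take effect together. */
        return 0;
    }

    Once every DAC has its phase offset programmed and the sync-enable bit set, drive all six PHSTR inputs from the same hardware signal so they see one low-to-high edge at the same time; that single edge is what fixes the phase relationship between the NCOs at the values you programmed.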

    In addition to being synchronized by PHSTR, the NCO is also synchronized any time the frequency is set.  This sounds like what is currently happening in your system.  The NCOs are being synchronized by register writes (via your LabVIEW application), but the timing is not controlled.  You end up having a different phase relationship between your NCOs every time the application runs.  Sometimes it will be optimal and you will get 90-dB SFDR, and other times it will be less than optimal and you will get 83 dB.  For this reason, it is better to use the PHSTR input if possible.

    Best regards,
    Max Robertson
    Applications Engineer
    High Speed Data Converters