I have a couple of LM97593 tuner chips on prototype boards that I'm trying to interface with.
I have a microcontroller issuing command and address bytes to the tuner over a parallel bus, and some settings clearly get applied: I can change the serial output rate and observe the correct period, change the output word width and get the correct number of clocks, and change the MUX mode, SFS mode, etc. Since I can get this far and the tuner produces output clocks, it appears the chip is alive and my configuration words are reaching it.
But I never get the output sample period (OSP) the data sheet predicts. With decimation set to maximum (longest OSP), I expect output bursts at about 610 Hz from a 10 MHz crystal, but I measure 500 Hz instead. Worse, cutting the decimation in half leaves the OSP unchanged (still 500 Hz / 2 ms), and reducing it further destroys the regularity: I miss several samples, then get a burst, and some words within the burst have fewer serial output clocks than others.
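For reference, this is the sanity check behind my 610 Hz expectation. The total decimation of 16384 at maximum settings is my reading of the data sheet, not a confirmed value, so treat it as an assumption:

```python
# Expected output sample period (OSP) at maximum decimation.
# Assumption: total decimation at max settings is 16384 (my reading
# of the data sheet); f_clk is the 10 MHz crystal.
f_clk = 10_000_000          # Hz
total_decimation = 16_384   # assumed maximum (all stages combined)

f_out = f_clk / total_decimation   # expected output burst rate, Hz
osp = 1.0 / f_out                  # expected output sample period, s

print(f"expected: {f_out:.1f} Hz, {osp * 1e3:.3f} ms")
# What I actually measure is 500 Hz, i.e. exactly 20,000 clocks
# per output (10e6 / 500), which doesn't match any decimation
# setting I've intentionally programmed.
```

The round 20,000-clock period of the measured 500 Hz is part of what makes me suspect my decimation settings aren't actually taking effect.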
Even at maximum decimation, where the OSP is at least consistent, the output data doesn't make sense. I see what look like 'B' outputs on the AOUT line even with MUX off. And whether MUX is on or off, the values mostly flip between zero and full scale. I do see the 'counting' I expect in the less significant bits of the quadrature words, but only in 24-bit mode, and the in-phase words are only ever zero or full scale.
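One failure mode I've been checking on my side (this is speculation about my own deserializer, not something from the data sheet): if the serial capture has the bit order or framing wrong, a small counting ramp lands in the MSBs and reads as values flipping between zero and near full scale, much like what I'm seeing. A quick illustration:

```python
def reverse_bits(word, width=16):
    """Reinterpret a word as if captured with the bit order reversed
    (e.g. an LSB-first shift of an MSB-first serial stream)."""
    return sum(((word >> i) & 1) << (width - 1 - i) for i in range(width))

# A small counting ramp, re-read with reversed bit order:
for w in (0, 1, 2, 3):
    print(f"{w:#06x} -> {reverse_bits(w):#06x}")
# 0 stays 0, but 1 becomes 0x8000, 3 becomes 0xC000 -- i.e. the
# output appears to jump between zero and near full scale.
```

I haven't ruled this out for my capture logic, but it wouldn't explain the wrong OSP, which is why I think the OSP is the more fundamental issue.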
The OSP seems to be the fundamental problem, and the data issues are probably a symptom of it.
Any ideas on specific settings to check (I've fiddled with many), or specific hardware components to look at?