Hello,
We are using a microcontroller to interface with the AIC3254. We generate the word clock via a timer interrupt and a general-purpose output pin, and we generate the bit clock and data using the SPI port.
Generally this works fine, but there is a delay of about 13 us between the word-clock toggle and the start of the SPI-generated bit clock. We noticed that if this delay changes we get occasional skipped samples at the codec output: when we play a sine wave through the digital audio interface and probe the analog output with an oscilloscope, we can see the missed samples. We therefore disabled all other interrupts in the system to keep the delay constant, which eliminates the problem.
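For reference, here is a simplified sketch of how we generate the clocks (the register names and the spi_write16 helper are illustrative placeholders, not our actual code):

#include <stdint.h>

#define WCLK_PIN  (1u << 3)          /* hypothetical GPIO bit driving WCLK      */
extern volatile uint32_t GPIO_OUT;   /* hypothetical GPIO output data register  */
extern void spi_write16(uint16_t w); /* hypothetical helper: loads the SPI TX
                                        register, which shifts out BCLK + data  */

static volatile uint16_t next_sample;

void timer_isr(void)                 /* fires once per word-clock edge          */
{
    GPIO_OUT ^= WCLK_PIN;            /* toggle WCLK to select the L/R channel   */
    spi_write16(next_sample);        /* BCLK/data start ~13 us after the toggle,
                                        set by ISR latency and the line above   */
}

The ~13 us delay is simply the time from the GPIO toggle to the SPI transmit register being loaded, so any other interrupt that preempts or delays this ISR changes that spacing.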
Is this a robust solution? The datasheet doesn’t mention anything about this sensitivity. We are using Left-Justified mode with a 16-bit word length.
Also, the datasheet states that “for Left-Justified mode, the number of bit-clocks per frame should be greater than twice the programmed word-length of the data.” We currently output 32 bit clocks per frame (16 bits per channel), which is equal to, not greater than, twice the programmed word length. Outputting 48 bit clocks per frame instead (24 bits per Left/Right channel) didn’t seem to change the skipped-samples behavior at all. Is our 32-bit-clock configuration actually acceptable, and is the datasheet requirement stricter than necessary?
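As a sanity check of the arithmetic (a throwaway snippet, not production code):

#include <stdio.h>

int main(void)
{
    const int word_length = 16;          /* programmed word length        */
    const int configs[]   = { 32, 48 };  /* BCLKs per frame we have tried */

    for (int i = 0; i < 2; i++) {
        /* datasheet rule: BCLKs per frame > 2 * word length */
        printf("%2d BCLKs/frame: %s the datasheet rule\n", configs[i],
               (configs[i] > 2 * word_length) ? "satisfies" : "violates");
    }
    return 0;
}

So 32 BCLKs/frame violates the rule as written and 48 satisfies it, yet we see no behavioral difference between the two configurations.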
On which clock (bit clock or word clock) is the data transferred from the digital audio interface to the internal DAC signal processor? How does clock jitter affect this?
Thanks,
Kurt