
Confirm accuracy of input frequency to AIC3254

Other Parts Discussed in Thread: TLV320AIC3254

Hi community members,

I would like to confirm the following question.

[Question]

Does this device (AIC3254) have any limitation on the accuracy of its input clock frequency?

In other words, could you please explain the input frequency requirements in detail?

* I know that the maximum frequencies are described in SLAA408 under "Maximum TLV320AIC3254 Clock Frequencies".

[Background]

Currently, the customer uses a clock signal generated by the USB oscillator block in their DSP.

They would like to switch to a signal generated by a 12 MHz crystal.

So, they would like to know whether there is any limitation on using this signal.

If you have any questions, please let me know.

Best regards.

Kaka

  • Hi,

    According to the Recommended Operating Conditions table in the data sheet (page 6), MCLK has a maximum input frequency of 50 MHz at DVDD > 1.65 V and 25 MHz at DVDD > 1.26 V. So the only thing they would need to do for 12 MHz is recalculate the clock dividers.
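    To illustrate the recalculation, here is a minimal sketch of the clock-tree arithmetic from the data sheet (PLL_CLK = PLL_CLKIN x R x J.D / P, and DAC_FS = CODEC_CLKIN / (NDAC x MDAC x DOSR)). The divider values below are one common choice for 12 MHz MCLK and 48 kHz sample rate, shown only as an assumption to verify against the data sheet, not the only valid setting:

    ```c
    /* Sketch of the AIC3254 clock-tree arithmetic for a 12 MHz MCLK,
       following the data sheet relations
           PLL_CLK = PLL_CLKIN * R * J.D / P
           DAC_FS  = CODEC_CLKIN / (NDAC * MDAC * DOSR)
       The divider values are assumptions (one common 48 kHz choice). */
    #include <stdio.h>

    int main(void)
    {
        const double mclk = 12000000.0;      /* 12 MHz crystal */
        const int P = 1, R = 1, J = 7;
        const int D = 1680;                  /* fractional part: J.D = 7.1680 */
        const int NDAC = 2, MDAC = 7, DOSR = 128;

        double jd      = J + D / 10000.0;    /* J.D as a real number */
        double pll_clk = mclk * R * jd / P;  /* = 86.016 MHz         */
        double dac_fs  = pll_clk / (NDAC * MDAC * DOSR);

        printf("PLL_CLK = %.3f MHz\n", pll_clk / 1e6);
        printf("DAC_FS  = %.1f Hz\n", dac_fs);  /* expect 48000.0 */
        return 0;
    }
    ```

    Any divider combination that stays within the data sheet clock limits and hits the target DAC_FS/ADC_FS will work; this is just one example.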

    David

  • Hi David,

    Thank you for your comment.

    Regarding your comment: the only thing they need to do for 12 MHz is recalculate the clock dividers, so the accuracy of the input clock does not matter.

    In other words, there is no problem operating the AIC3254 with an input clock whose frequency stability is +/-5000 ppm.

    Is my understanding correct?

    If not, would you please tell me the required specification of the input clock for the 12 MHz case?

    Best regards.

    Kaka

  • Hi,

    Would anyone please provide an answer to my question by the end of today, US time?

    I must inform the customer of this ASAP, so it would be very helpful if anyone could answer my question.

    Best regards.

    Kaka

  • Hi,

    Would anyone please provide an answer to my question?

    I must inform the customer of this ASAP, so it would be very helpful if anyone could answer my question.

    Best regards.

    Kaka

  • Kaka,

    If there is jitter in the clock, it might degrade the SNR performance of the device; it can tolerate up to ~100 ps RMS jitter. However, if the PLL is used, it will tolerate much larger jitter. If there is a frequency shift in MCLK, it will show up as a shift in the output frequency.
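    To make the last point concrete, a fixed divider chain passes any fractional MCLK error straight through to the sample rate. A quick back-of-the-envelope sketch, assuming a 48 kHz nominal rate and the +/-5000 ppm stability figure asked about above:

    ```c
    /* Back-of-the-envelope check: a fractional error in MCLK shows up
       unchanged in the output sample rate when the dividers are fixed. */
    #include <stdio.h>

    int main(void)
    {
        const double fs_nominal = 48000.0;  /* nominal sample rate, Hz */
        const double ppm        = 5000.0;   /* MCLK frequency error    */

        double fs_low  = fs_nominal * (1.0 - ppm / 1e6);  /* 47760 Hz */
        double fs_high = fs_nominal * (1.0 + ppm / 1e6);  /* 48240 Hz */

        printf("fs range: %.0f Hz to %.0f Hz\n", fs_low, fs_high);
        return 0;
    }
    ```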

  • Hi Vins,

    Thank you for your response.

    Could you please tell me the jitter tolerance of MCLK when the internal PLL of the AIC3254 is used?

    If you do not have this data, reference data would also be acceptable.

    * The customer's board will generate the system clock from the input MCLK using the internal PLL of the AIC3254.

    Best regards.

    Kaka

  • Kaka, 

    I do not have the information on the PLL's jitter tolerance. If the customer insists, we can dig in and provide it, but only toward the end of next week.

  • Hi Vins,

    OK. The customer has requested that I provide this data.

    So, would you please run the evaluation to get the data?

    Best regards.

    Kaka

  • Kaka,

    I consulted the R&D team on this, and the recommendation is to use a clock source with a maximum jitter of 100 ps, with or without the PLL enabled.

  • Vins,

    Thank you for confirming.

    I will pass this information on to the customer as below. Just in case, would you please check whether there is any problem with my understanding (comment)?

    [My understanding]

    To keep the AIC3254 within its data sheet specifications, regardless of whether the internal PLL is used, they need to provide a clock source with a maximum jitter of 100 ps.

    Also, if I provide the specification of the clock source the customer plans to use, would you please confirm whether there is any problem with using it as the input to the AIC3254?

    Best regards.

    Kaka

  • Hi Vins,

    Would you please provide your comment?

    Best regards.

    Kaka

  • Kaka, your understanding is correct.

  • Vins,

    Thank you so much for your cooperation.

    Best regards.

    Kaka