MSP430G2201 DCO calibration


 

We are looking at using the MSP430G2201 Value Line device for hardware protocol decoding. As you know, these devices store DCO calibration constants for the different operating clock speeds in the Info A segment. The G2201 units I've been working with only had the 1 MHz constants flashed from the factory, and it was no problem for me to calibrate them for 16 MHz before use.

My question is how we handle this in production. Can the G2201 be purchased with the 16 MHz DCO constants pre-calibrated? What is your suggested way to do this during manufacture? Can Arrow program this value along with our program code so we receive the devices ready to stuff onto the boards?

 

Thanks,

 

Bryan Busacco
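
As a point of reference, here is a minimal sketch of how application firmware might pick up 16 MHz constants at start-up once they have been programmed somewhere in Info A. The G2201 header may define only the factory 1 MHz pair, so the 0x10F8/0x10F9 addresses used by the fully calibrated G2xx parts, and the USER_* macro names, are assumptions to verify against the device documentation.

```c
#include <msp430.h>

// Sketch only: apply user-programmed 16 MHz constants at start-up.
// The Info A addresses below follow the layout of fully calibrated
// G2xx parts and are an assumption for the G2201.
#define USER_CALDCO_16MHZ  (*(const unsigned char *)0x10F8)
#define USER_CALBC1_16MHZ  (*(const unsigned char *)0x10F9)

int main(void)
{
    WDTCTL = WDTPW + WDTHOLD;            // Stop the watchdog

    if (USER_CALBC1_16MHZ == 0xFF)       // Slot still erased: the constants were
        while (1);                       // never programmed, so trap here rather
                                         // than run at an undefined frequency
    BCSCTL1 = USER_CALBC1_16MHZ;         // Set DCO range (RSEL)
    DCOCTL  = USER_CALDCO_16MHZ;         // Set DCO tap and modulation

    // ... protocol-decoding application runs with MCLK/SMCLK at ~16 MHz ...
    for (;;);
}
```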

  • It may be possible to do runtime calibration to 16 MHz using the VLO as a transfer reference: first measure the number of 1 MHz clocks in one ~12 kHz VLO cycle, then trim the DCO until you reach the corresponding count for 16 MHz.

    Peter
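
A rough sketch of this transfer-reference idea, for illustration only. It assumes the factory 1 MHz constants are present, that ACLK can be sourced from the VLO via LFXT1S_2, and that Timer_A's CCI0B capture input is wired to ACLK on the G2201 (check the datasheet); the function names are made up for the example.

```c
#include <msp430.h>

// Sketch only: trim the DCO to roughly 16x the factory-calibrated 1 MHz
// setting, using the VLO as a transfer reference. Assumes the watchdog is
// stopped and interrupts are disabled while this runs.

#define TRIM_TOLERANCE 4                 // Counts of slack; VLO jitter makes an
                                         // exact match unrealistic

// Count SMCLK (= DCO) ticks in one ACLK (= VLO) period using Timer_A capture.
static unsigned int DcoTicksPerVloCycle(void)
{
    unsigned int first, second;

    TACCTL0 = CM_1 + CCIS_1 + CAP;       // Capture rising edges of CCI0B (ACLK)
    TACTL   = TASSEL_2 + MC_2 + TACLR;   // Timer_A from SMCLK, continuous mode

    while (!(TACCTL0 & CCIFG));          // First ACLK edge
    TACCTL0 &= ~CCIFG;
    first = TACCR0;

    while (!(TACCTL0 & CCIFG));          // Next ACLK edge
    TACCTL0 &= ~CCIFG;
    second = TACCR0;

    TACCTL0 = 0;                         // Stop capture
    TACTL   = 0;                         // Stop Timer_A

    return second - first;               // DCO ticks in one VLO period
}

void TrimDcoTo16MHzViaVlo(void)
{
    unsigned int target, measured;

    BCSCTL3 |= LFXT1S_2;                 // ACLK = VLO (nominally ~12 kHz)

    BCSCTL1 = CALBC1_1MHZ;               // Start from the factory 1 MHz constants
    DCOCTL  = CALDCO_1MHZ;
    target = DcoTicksPerVloCycle();      // VLO period measured in 1 MHz ticks
    target <<= 4;                        // 16 MHz means 16x as many ticks per
                                         // cycle (assumes the VLO holds still
                                         // between reference and trim passes)
    for (;;)
    {
        measured = DcoTicksPerVloCycle();
        if (measured + TRIM_TOLERANCE >= target &&
            measured <= target + TRIM_TOLERANCE)
            break;                       // Within tolerance of the 16 MHz target
        if (measured > target)
        {
            DCOCTL--;                    // Too fast: step the DCO down
            if (DCOCTL == 0xFF && (BCSCTL1 & 0x0F))
                BCSCTL1--;               // Rolled under: drop to the next RSEL
        }
        else
        {
            DCOCTL++;                    // Too slow: step the DCO up
            if (DCOCTL == 0x00 && (BCSCTL1 & 0x0F) != 0x0F)
                BCSCTL1++;               // Rolled over: raise RSEL
        }
    }
}
```

Stepping up from the 1 MHz setting one DCOCTL count at a time is slow (many hundreds of capture periods), so a real routine would probably jump to a coarse 16 MHz starting point first; and, as noted in the reply below, the VLO's drift limits the accuracy you can expect from this in any case.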

  • I suggest Elprotronic Inc.

  • Peter Dvorak said:
    It may be possible to do runtime calibration to 16MHz using the VLO as a transfer reference.

    Unfortunately, the VLO frequency is inaccurate and unstable. It can range from 4 to 20 kHz and drifts heavily with temperature and supply voltage. So even if you measure it first with a precision frequency meter, it likely won't be running at the same speed anymore when you start the calibration. Besides, at that point you could just as well check and adjust the 16 MHz clock directly :)

    The 5x series' REFO is much better suited for this, but the best option for frequency calibration is an external precision reference, such as a watch crystal or an external oscillator attached to XT1 in bypass mode.
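
To tie this back to the original production question, here is a sketch of how a result trimmed against a 32.768 kHz watch crystal could be stored in the factory-blank 16 MHz slots of Info A, so firmware can then load it like a factory constant. The capture/step loop is the same as in the VLO sketch above, just with ACLK driven by LFXT1 (with ACLK = 32768/8 = 4096 Hz, the 16 MHz target is 16,000,000 / 4096 = 3906 ticks per ACLK period). The 0x10F8/0x10F9 addresses, the flash-clock divider, and the function name are assumptions to check against the G2201 datasheet and header.

```c
#include <msp430.h>

// Sketch only: store a trimmed 16 MHz setting in the blank Info A slots so it
// can be used like a factory constant. Assumes MCLK is ~16 MHz when called,
// the watchdog is stopped, and interrupts are disabled.
#define INFOA_CALDCO_16MHZ  (*(volatile unsigned char *)0x10F8)
#define INFOA_CALBC1_16MHZ  (*(volatile unsigned char *)0x10F9)

void StoreDcoConstants16MHz(unsigned char caldco, unsigned char calbc1)
{
    // Because these two bytes left the factory erased (0xFF), they can be
    // programmed without erasing Info A, preserving the factory 1 MHz
    // constants and the rest of the segment.
    FCTL2 = FWKEY + FSSEL_1 + FN5 + FN2 + FN1 + FN0; // Flash clock = MCLK/40
                                                     // (~400 kHz at 16 MHz MCLK)
    FCTL3 = FWKEY + LOCKA;               // Clear LOCK, toggle LOCKA to unlock
                                         // Info A (LOCKA is set after reset)
    FCTL1 = FWKEY + WRT;                 // Enable byte/word write
    INFOA_CALDCO_16MHZ = caldco;         // Trimmed DCOCTL value
    INFOA_CALBC1_16MHZ = calbc1;         // Trimmed BCSCTL1 value (DIVA cleared)
    FCTL1 = FWKEY;                       // Disable write
    FCTL3 = FWKEY + LOCKA + LOCK;        // Toggle LOCKA back and set LOCK
}
```

After the trim loop finishes, this would be called as StoreDcoConstants16MHz(DCOCTL, BCSCTL1 & ~DIVA_3). For production, having the programming house or a board-test step run a routine like this once per unit against a known reference is likely more dependable than relying on the VLO.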
