
MSP430FE427 Calibration?


Hi...

I am working on calibration. I tried my board without calibrating and recorded the values measured by the MSP and by a single-phase digital electric meter.

The values are like this:

While the MSP reached 0.088 kWh, the reading on the meter was 0.075 kWh, so I need to calibrate. This point is clear.
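
As a worked illustration (the numbers come from the readings above; the assumption of a pure gain error is mine), a single multiplicative correction factor can be derived from the two energy readings:

    /* Illustration: deriving a gain correction factor from the two
     * energy readings above, assuming the error is a pure scale error. */
    #include <stdio.h>

    int main(void)
    {
        const double e_meter = 0.075;  /* reference meter (kWh)     */
        const double e_msp   = 0.088;  /* uncalibrated MSP (kWh)    */
        double k = e_meter / e_msp;    /* multiply MSP results by k */
        printf("correction factor k = %.3f\n", k);  /* ~0.852 */
        return 0;
    }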

  Value of single measurement rev 1.0

  4:  Voltage (V)      : 228.755
  5:  Current (A)      : 0.154
  6:  Peak Voltage (V) : 315.946
  7:  Peak Current (A) : 0.222
  8:  Frequency (Hz)   : 50.008
  9:  CosPhi           : 0.997
  10: Power (kW)       : 0.000000
  11: Energy (kWh)     : 0.088

 

We have two analog inputs on the MSP: one voltage input and one current input (V1+, V1-, I1+, I1-).

The MSP knows the voltage divider resistor values and the shunt resistor value from the parameter.h file.

This information should be enough for correct measurement and calculation.
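
For context, here is a minimal sketch of what such a parameter.h could look like; the macro names and resistor values below are my illustrative assumptions, not the actual header contents:

    /* Hypothetical sketch of a parameter.h; macro names and
     * resistor values are illustrative assumptions only. */
    #ifndef PARAMETER_H
    #define PARAMETER_H

    /* Voltage divider: nominal resistor values in ohms */
    #define R_DIV_HIGH   1000000.0   /* series resistor to the line  */
    #define R_DIV_LOW    1000.0      /* resistor across the ADC pins */

    /* Current shunt: nominal value in ohms */
    #define R_SHUNT      0.002

    /* Nominal scale factors derived from the schematic values */
    #define V_SCALE  ((R_DIV_HIGH + R_DIV_LOW) / R_DIV_LOW)
    #define I_SCALE  (1.0 / R_SHUNT)

    #endif /* PARAMETER_H */

The crux of the calibration question is that the real parts never match these nominal numbers exactly, as the reply below explains.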

Why do we have to calibrate the MSP430FE427?  

 

My second question is:

For example, I produce 100 or more energy meter cards. They are completely identical products from serial production.

Do I have to calibrate every card one by one?

Or can I calibrate one of them, save the offset etc. values, and then load that executable file into all the MSPs?

 

In fact, these questions come down to: what kind of procedure should I use during serial production?

 

Thanks....

  • omeraygor said:
    The MSP knows the voltage divider resistor values and the shunt resistor value from the parameter.h file.

    No, the MSP knows the intended values. The real ones vary depending on factory tolerance as well as temperature change.

    Therefore each and every device needs to be calibrated separately.

    This can be automated a bit. At least for the voltage divider (which usually has the greater variation) this is simple: after building the device, apply a known (reference-generated) voltage and measure it. You can simply calculate the calibration factor. For the current channel, you can apply a current source with a known value and likewise calculate the correction factor.

    You can build this calibration process directly into the firmware, e.g. by checking a 'calibration pin' at device startup. If it is set, the device enters calibration mode, measures V and I, and calculates the difference to the expected values. (A plausibility check is a good idea: the measured values need to fall within a certain window so the software can safely assume that the calibration voltage and current are actually applied.) The resulting factors are then stored in flash.

    Then, on the next device start, these values can be used to correct the readings.
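
    As a minimal sketch of that flow, assuming a hypothetical calibration pin on P1.0, hypothetical measure_voltage()/measure_current() helpers, and the generic MSP430 flash controller sequence (the segment address and flash clock divider need checking against the FE427 datasheet):

        /* Sketch of the startup calibration flow described above.
         * The pin choice (P1.0), the helper functions and the
         * info-flash address are illustrative assumptions only. */
        #include <msp430.h>

        #define CAL_VOLTAGE  230.0f   /* reference voltage applied */
        #define CAL_CURRENT  5.0f     /* reference current applied */
        #define CAL_WINDOW   0.10f    /* plausibility: +/-10%      */

        /* Info-flash segment; check the datasheet for the actual
         * information-memory layout of the device. */
        #define CAL_ADDR     ((float *)0x1000)

        extern float measure_voltage(void);  /* hypothetical */
        extern float measure_current(void);  /* hypothetical */

        static int in_window(float meas, float expect)
        {
            return meas > expect * (1.0f - CAL_WINDOW) &&
                   meas < expect * (1.0f + CAL_WINDOW);
        }

        static void store_cal(float kv, float ki)
        {
            float *a = CAL_ADDR;
            FCTL2 = FWKEY + FSSEL0 + FN1; /* flash clock = MCLK/3   */
            FCTL3 = FWKEY;                /* unlock flash           */
            FCTL1 = FWKEY + ERASE;
            *(char *)a = 0;               /* dummy write: erase seg */
            FCTL1 = FWKEY + WRT;
            a[0] = kv;                    /* voltage factor         */
            a[1] = ki;                    /* current factor         */
            FCTL1 = FWKEY;
            FCTL3 = FWKEY + LOCK;         /* lock flash again       */
        }

        void calibrate_if_requested(void)
        {
            if (P1IN & BIT0) {            /* 'calibration pin' set? */
                float v = measure_voltage();
                float i = measure_current();
                /* Only accept if the reference source is evidently
                 * connected (plausibility window). */
                if (in_window(v, CAL_VOLTAGE) &&
                    in_window(i, CAL_CURRENT))
                    store_cal(CAL_VOLTAGE / v, CAL_CURRENT / i);
            }
        }

    At the next start the firmware would read the two factors back from the info segment and multiply the raw readings by them.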

    If you use a current transformer rather than a shunt resistor for the current, then calibration of the current channel is more difficult. Current transformers also usually have a larger variation than shunt resistors. Not to mention that a current transformer requires a shunt resistor on the board anyway, which has some tolerance of its own.

    And to add to this, current transformers add a phase error to the CosPhi. Only a bit, but it has some impact on the power calculation.
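
    To get a feel for the size of that effect (the numbers below are illustrative assumptions): with true active power P = V * I * cos(phi), a small CT phase error dphi makes the meter compute cos(phi + dphi) instead:

        /* Illustration with assumed numbers: power error caused by
         * a small current-transformer phase error. */
        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            double v = 230.0, i = 5.0;        /* RMS values       */
            double phi  = acos(0.5);          /* cos(phi) = 0.5   */
            double dphi = 0.5 * 3.14159265358979 / 180.0;
                                              /* 0.5 deg CT error */
            double p_true = v * i * cos(phi);
            double p_meas = v * i * cos(phi + dphi);

            /* About -1.5% at cos(phi) = 0.5; the error grows as
             * the power factor drops. */
            printf("%.2f %%\n", 100.0 * (p_meas - p_true) / p_true);
            return 0;
        }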

    Due to the lack of a reference generator, our first series of energy meters was connected to a load for half an hour. Then the meter reading was compared to the reading of a calibrated meter, and the correction factor was flashed into the device. Quite a time-consuming process, but since these energy meters were for industrial use and not cheap consumer products... Well, this way we had an implicit burn-in for free, which is important for industrial usage.

     
