
MSP430F6779A Energy meter : class 0.2 issues

Other Parts Discussed in Thread: MSP430F6779A

We are designing an energy meter for Class 0.2 accuracy. We are using the SLAA577G reference design from TI for the hardware design and have downloaded the energy library files (SLAC488) from TI. We are using the MSP430F6779A microcontroller. We have built the hardware as per the reference design. The analog voltage front-end circuitry is the same as in the TI reference design, and we are getting correct voltage readings on the display. In the analog current front-end circuitry, we are using a current transformer (ratio 1:2500), and we have selected resistor values to get 625 mV for 5 A. Our nominal current is 5 A and Imax is 6 A.

 

We are using a CT with a turns ratio of 1:2500 and are getting 625 mV for 5 A, but the problem is that for 5 A the calibration software reads 35 A while our meter display shows 100 A. For 2.5 A, both the calibration software and our meter display show 50 A. Both places show the same value for 2.5 A and below, but for 5 A they show different values. Regarding this, I have some queries:

1) Why does it show different values for 5 A between the calibration software and our meter display?

2) What are the Current (high), Active (high), and Phase (high) parameters, and why are they used? In the calibration software's meter calibration window, only the scaling factor values for Current (low) and Active (low) are shown. I have attached a snapshot of that window.

3) In the energy library files, I don't understand how some of the defined constants below are used:

   i) DEFAULT_P_SCALE_FACTOR_A_HIGH

   ii) DEFAULT_P_SCALE_FACTOR_A_LOW

   iii) DEFAULT_I_RMS_SCALE_FACTOR_A

   iv) DEFAULT_I_RMS_LIMP_SCALE_FACTOR

   v) DEFAULT_V_RMS_LIMP_SCALE_FACTOR_A

   vi) DEFAULT_V_RMS_SCALE_FACTOR_A

  • Hello Hemant,

    The issue observed is because the EVM has a maximum current of 100 Amps. As a result, whenever the chip sees an RMS input of about 0.65 V, it associates this with 100 Amps. You will have to modify the shift amounts in the foreground's functions for calculating the metrology parameters. This changes the mapping from a particular input RMS voltage to the reported RMS current. For fine-tuning the input-to-output mapping, the meter should then be calibrated after making this change.
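    To illustrate the association Ryan describes, here is a proportional model only (assumed full-scale numbers, not the library's actual code): the stock firmware maps the full-scale RMS input to the EVM's 100 A maximum, so any input is reported on a 100 A scale regardless of the CT ratio in the customer's hardware.

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Sketch: reported current is proportional to the RMS input voltage,
     * with the full-scale input mapped to the EVM's 100 A maximum. */
    static double reported_amps(double input_v_rms)
    {
        const double full_scale_v = 0.65;  /* assumed full-scale RMS input */
        const double full_scale_a = 100.0; /* EVM maximum current          */
        return input_v_rms / full_scale_v * full_scale_a;
    }

    int main(void)
    {
        printf("0.650 V -> %.0f A\n", reported_amps(0.65));  /* 100 A */
        printf("0.325 V -> %.0f A\n", reported_amps(0.325)); /*  50 A */
        return 0;
    }
    ```

    This is why a 625 mV input from a 5 A line reads as roughly 100 A until the shift amounts are adjusted.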

    As a quick note, please also note that it is recommended that the peak of the voltage fed into the ADC (RMS value * 1.414) stays within ±930 mV. At 6 Amps, your system may go outside this recommended range.
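    For the numbers in this thread, that headroom check works out as follows (a quick sketch using the values stated above):

    ```c
    #include <assert.h>
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* 625 mV RMS at 5 A scales linearly to 750 mV RMS at the 6 A
         * maximum; the peak is sqrt(2) times the RMS value. */
        double rms_mv_5a = 625.0;
        double rms_mv_6a = rms_mv_5a * 6.0 / 5.0; /* 750 mV        */
        double peak_mv   = rms_mv_6a * sqrt(2.0); /* about 1061 mV */

        printf("peak at 6 A: %.0f mV (recommended limit: 930 mV)\n", peak_mv);
        return 0;
    }
    ```

    So at 6 A the peak input would exceed the recommended ±930 mV range.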

    Lastly, the limp scale factors and “HIGH” scale factors are not used in the code.

    Regards,
    Ryan
  • Hello Ryan,

    Thanks for your help.

    Can you tell me briefly how to modify the shift amounts in the foreground's functions for calculating the metrology parameters? Which functions are involved, and how do they work?

    It would be a great help.

    Thanks

  • Hi Hemant,

    You should change the functions that depend on current, such as the ones used for calculating RMS current, active power, and reactive power. In these functions, various shift operations are performed to get the final value of the metrology parameters in real-world units. Given that your maximum current is 6 Amps while the reference design's was 100 Amps, you should increase the shift amounts in these functions by 4 (i.e. divide by 16). From there, the system should be calibrated using the normal procedure described in the application note.
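    As a concrete sketch of the shift adjustment (with made-up names and shift values; the real ones are in the library's foreground code):

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical conversion shaped like the library's foreground math:
     * an accumulated value times a scale factor, right-shifted into
     * real-world units. */
    #define ORIG_SHIFT 12               /* assumed original shift amount  */
    #define NEW_SHIFT  (ORIG_SHIFT + 4) /* +4 bits = divide result by 16  */

    static uint32_t to_real_units(uint64_t acc, uint32_t scale, int shift)
    {
        return (uint32_t)((acc * scale) >> shift);
    }

    int main(void)
    {
        uint64_t acc   = 4096; /* example accumulated reading */
        uint32_t scale = 4096; /* example scale factor        */

        printf("100 A scaling: %u\n", to_real_units(acc, scale, ORIG_SHIFT));
        printf("  6 A scaling: %u\n", to_real_units(acc, scale, NEW_SHIFT));
        return 0;
    }
    ```

    The same kind of adjustment applies to each current-dependent calculation: RMS current, active power, and reactive power.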

    Regards,
    Ryan
  • Hi Ryan,

    Thanks for your help. I have made the changes as per your suggestion.

    But now the problem is that the active power and energy values are not showing correctly on my meter display or in the calibration software.

    I want to test my design as-is, with the same energy library files and no changes except one: I am using a 7-segment LED display instead of an LCD display. I have added the code for the 7-segment LED display to the energy library files; otherwise the program is exactly as provided by TI. I should therefore get correct values for all parameters on my meter display and in the calibration software. On my meter display I am getting correct values for voltage and current, but the active power and energy values are not correct, and the calibration software also shows different values.

    Another issue with the calibration software: if I test at 5 A (note: as you know, for 5 A I get a 100 A reading, which is expected), my meter display shows 100 A, but the calibration software shows roughly 30-40 A. If I reduce the current by 50%, i.e. to 2.5 A, both my meter display and the calibration software show the same value of 50 A. So for currents below 50% I get matching values in both places, but above 50% the calibration software shows wrong values of roughly 30-40 A.

    I don't understand why this is happening. I should get correct values on my meter display and in the calibration software, because I am using the same hardware and software provided by TI.

    Thanks... awaiting your reply.

  • Hemant,

    Did you use the same procedure to adjust the current-dependent calculations for active power and reactive power? I am confused by your response: you say the current is correct, but also that 100 A is still displayed when 5 Amps should be displayed. If the system is adjusted so that 5/6 Amps corresponds to the maximum, the mismatch in readings between the GUI and the LCD at maximum current should go away.

    Regards,
    Ryan
  • Ryan,

    We have designed the hardware for 5 A (basic current) and 6 A (maximum current), but the original TI design is for 100 A (basic current).

    Hence we are getting 100 A on the display for a 5 A input current.

    The main issue is that the power does not match the voltage and current values. For example, if the voltage is 40 V and the current is 2 A, the power should be 80 W, but the display shows a different value, and the energy is also mismatched. This is the main issue.

    regards,

    Hemant

  • Hemant,

    If the meter is showing 100 Amps when 5 Amps is applied then I don't think the requested changes have been implemented correctly. Otherwise the meter would read near 5 Amps instead of 100 Amps. In addition, with these changes made the GUI and display should show the same value for currents when 5 Amps is applied.

    In regard to the power: for a given voltage, current, and power factor, what is the value of the measured active power? Similar to the current, the function used to calculate active power may need to be modified to correspond to the new current range.

    Regards,
    Ryan
  • Hi Ryan,

    Which changes have to be implemented? Can you tell me a little more about that?

    If the voltage = 230 V, current = 10 A, and PF = 1, then the measured power is 2100 W, which lags the actual value of 2300 W.

    The energy does not match the power.

    Which functions have to be modified for active power?

    Thanks...

    Regards

    Hemant

  • Hi Hemant,

    Since the measured and actual active power are close, no further changes to the active power function are needed. Performing the regular calibration routine described in the design guide should take care of getting the meter to display the correct value of 2300 W.
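    For reference, the error and the implied gain correction for those numbers work out as follows (a sketch of the arithmetic only; the actual correction is applied through the calibration GUI):

    ```c
    #include <assert.h>
    #include <stdio.h>

    int main(void)
    {
        double measured = 2100.0; /* meter reading, W    */
        double actual   = 2300.0; /* reference value, W  */

        double pct_error = (measured - actual) / actual * 100.0; /* ~ -8.7% */
        double gain      = actual / measured;                    /* ~ 1.095 */

        printf("error: %.2f %%\n", pct_error);
        printf("gain correction: %.4f\n", gain);
        return 0;
    }
    ```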

    Can you tell me what you mean by the energy not matching the power? Is the % error calculated from the reference meter and the EVM's pulse output different from the % error calculated from the GUI readings versus the actual power? Or do you mean something else?

    Regards,
    Ryan
