We are designing an energy meter to Class 0.2 accuracy. We are using TI's SLAA577G reference design for the hardware and have downloaded the energy library files (SLAC488) from TI. The microcontroller is the MSP430F6779A. We built the hardware as per the reference design. The analog voltage front-end circuitry is the same as in the TI reference design, and we are getting correct voltage readings on the display. In the analog current front end, we are using a current transformer (ratio 1:2500), and we selected the burden resistor values to get 625 mV at 5 A. Our nominal current is 5 A and Imax is 6 A.
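For reference, a quick sanity check of the CT front-end arithmetic described above (values taken from this post; variable names are illustrative only):

```python
# Check the current-transformer burden arithmetic from the post.
CT_RATIO = 2500        # 1:2500 current transformer
I_PRIMARY = 5.0        # nominal primary current, A
V_TARGET = 0.625       # desired burden voltage at 5 A, V

i_secondary = I_PRIMARY / CT_RATIO      # secondary current = 2 mA
r_burden = V_TARGET / i_secondary       # burden resistance = 312.5 ohm

print(f"secondary current: {i_secondary * 1000:.1f} mA")
print(f"burden resistor:   {r_burden:.1f} ohm")
```

So a total burden of about 312.5 ohm is implied by the 625 mV / 5 A figures quoted above.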
We are using a CT with a turns ratio of 1:2500 and we get 625 mV for 5 A, but the problem is that for 5 A the calibration software shows 35 A and our meter display shows 100 A. For 2.5 A, both the calibration software and the meter display show 50 A. At 2.5 A and below, both places show the same value, but at 5 A they show different values. Regarding this, I have the following queries:
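One thing worth noting from the numbers above: if the only problem were a single wrong scale factor (a pure gain error), the ratio of reading to applied current would be the same at every test point. A small check on the quoted values (my interpretation of the readings, not a diagnosis):

```python
# Consistency check: reading/applied ratio at each quoted test point.
# A constant gain error would give the same ratio everywhere.
readings = {2.5: 50.0, 5.0: 35.0}  # applied A -> calibration-software reading

for applied, shown in readings.items():
    print(f"{applied} A -> {shown} A (ratio = {shown / applied:.1f}x)")
# 2.5 A reads 20x high, but 5 A reads only 7x high, so the response
# is not a constant gain error; something non-linear (e.g. input
# clipping or a low/high range switchover) may be happening above 2.5 A.
```

This may help narrow down whether the issue is in the scale factors or in the analog signal level at 5 A.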
1) Why do the calibration software and the meter display show different values for 5 A?
2) What are the Current (high), Active (high), and Phase (high) parameters, and why are they used? In the calibration software's meter calibration window, only the scaling factor values for Current (low) and Active (low) are shown. I have attached a snapshot of that window.
3) In the energy library files, I don't understand the purpose of the following defined constants (they appear to be #define macros rather than functions):
i) DEFAULT_P_SCALE_FACTOR_A_HIGH
ii) DEFAULT_P_SCALE_FACTOR_A_LOW
iii) DEFAULT_I_RMS_SCALE_FACTOR_A
iv) DEFAULT_I_RMS_LIMP_SCALE_FACTOR
v) DEFAULT_V_RMS_LIMP_SCALE_FACTOR_A
vi) DEFAULT_V_RMS_SCALE_FACTOR_A
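For context on why such defaults exist at all: in metering firmware generally, a scale factor converts a raw RMS accumulator value into engineering units, and calibration replaces the default with a value computed from a known applied current or voltage. A generic illustration (this is NOT TI's actual library code; the function name and numbers are hypothetical):

```python
# Generic sketch of how a default scale factor is typically used.
# Not TI library code; apply_scale and the raw value are hypothetical.
def apply_scale(raw_rms, scale_factor):
    """Convert a raw RMS register value to amps/volts via a scale factor."""
    return raw_rms * scale_factor

raw = 12800                 # hypothetical raw RMS reading at 5 A applied
scale = 5.0 / raw           # calibration computes scale = actual / raw
print(apply_scale(raw, scale))  # approximately 5.0 A after calibration
```

If the default scale factor does not match your CT ratio and burden resistor, the reading will be off by a constant ratio until the meter is calibrated, which is why these defaults normally need to be recalculated for a custom front end.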