Hi.
In the API Guide, the description of USS_calibrateSignalGain(USS_SW_Library_configuration *config) is as follows:
USS_message_code USS_calibrateSignalGain(USS_SW_Library_configuration *config)
The following API runs the SDHS Signal Gain calibration routine. If no errors are encountered during Signal Gain calibration, the config->captureConfig->gainRange value will be configured; otherwise it will retain its original value. Please refer to the agcConstant parameter description in the USS_Capture_Configuration structure for more information regarding the configuration of this signal gain calibration parameter.
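For context, this is roughly how I call the API. This is only a sketch of my understanding: the success code name USS_message_code_no_error and the header name are my assumptions from the SDK, so please correct me if they are wrong.

```c
/* Hedged sketch of calling the quoted API.
 * Assumptions (not confirmed by the guide): the header is ussSwLib.h and
 * the success return code is USS_message_code_no_error. */
#include "ussSwLib.h"

void runGainCalibration(USS_SW_Library_configuration *config)
{
    USS_message_code code = USS_calibrateSignalGain(config);

    if (code == USS_message_code_no_error) {
        /* Per the guide: calibration succeeded, so
         * config->captureConfig->gainRange now holds the calibrated value. */
    } else {
        /* Per the guide: gainRange keeps its original value;
         * handle the error code here. */
    }
}
```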
uint16_t USS_Capture_Configuration::agcConstant
This value is used to calculate the optimal Gain Amplifier setting. It can be determined with the following formula:

agcConstant = floor(20*log10(adcNom)) = floor(20*log10(ADCmax * 10^(-B/20)))

where ADCmax = +/-2^(12-1) for a 12-bit ADC, and B is the number of dB by which the ADC output should be backed off (typically 3 dB). For the USS module with B = 3: agcConstant = floor(20*log10((2^11) * 10^(-3/20))) = 63
I did not fully understand this explanation. Can you explain it in more detail?
What is the purpose of the USS_calibrateSignalGain function, and what does the agcConstant variable mean?