LAUNCHXL-F28379D: DAC Output Calibration

Part Number: LAUNCHXL-F28379D

I’m new to using the F28379D controller and am currently testing the accuracy of the DAC outputs. Using Simulink, I set both DAC-A and DAC-B to output the same values. When measuring with a multimeter and an oscilloscope, I found that DAC-A outputs accurately over the full 0-3 V range, but DAC-B consistently shows a positive offset of 30 mV. Is this behavior normal? How can I calibrate it?

  • Hi Yuejun,

    Thanks for your patience.

    As shown in section 6.10.3.1, "Buffered DAC Electrical Data and Timing," of the datasheet, the offset and gain error specifications are 10 mV and 2.5%, respectively. Considering these errors, the value you are observing on the pin is within the expected range.
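
    For example, assuming the output is near the top of the 0-3 V span, the 2.5% gain error alone allows up to roughly 75 mV of deviation at full scale, and the 10 mV offset specification adds to that, so a constant 30 mV offset on DAC-B sits well within the datasheet limits.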

    Here are the steps for calibrating DAC-A/B using the ADC module:

    1. Set the DAC-A input to 10% of FSR (equivalent to 0.3 V) and then to 90% of FSR (equivalent to VDDA - 0.3 V), and measure the output voltage with the ADC. These measurements are used to calculate the offset and full-scale errors.
    2. Calculate the actual LSB size by dividing the span by the number of possible codes (4096 in this case).
    3. Multiply the digital input to the DAC by the reciprocal of the gain calibration coefficient (GCC):

    GCC = (H_Code_DAC - L_Code_DAC) / (H_Code_ADC - L_Code_ADC)

    4. The offset calibration coefficient (OCC) can be measured and removed by adding or subtracting an equivalent digital number to the DAC input. Extrapolate the end-point-fit straight line and calculate the output voltage corresponding to the 0d code, which gives the offset error:

    OCC = (GCC × H_Code_DAC) - H_Code_ADC

    5. Calculate the corrected DAC input needed to produce the desired voltage using the GCC and OCC obtained above; a short code sketch of this correction is given after these steps.
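
    To make the procedure concrete, below is a minimal C sketch of the end-point-fit correction these steps describe. It works directly from the two measured points (DAC code written versus ADC code read back, scaled to the same 12-bit code units) instead of explicitly forming GCC and OCC, and everything in it that is not stated above, including the example measurement values, is a placeholder rather than TI reference code.

    /*
     * Minimal sketch of the two-point (end-point-fit) correction described in
     * the steps above. The measured values below are hypothetical placeholders;
     * replace them with the ADC readings (converted to DAC code units) captured
     * at the 10% and 90% points on your own board.
     */
    #include <stdint.h>
    #include <stdio.h>

    #define DAC_NUM_CODES  4096U            /* 12-bit DAC: codes 0..4095 */

    /* DAC codes written at the two calibration points (about 10% and 90% of FSR). */
    static const float L_Code_DAC = 410.0f;
    static const float H_Code_DAC = 3686.0f;

    /* ADC readback at those two points, scaled to the same 12-bit code units.
     * Hypothetical example values -- measure these on your own board. */
    static const float L_Code_ADC = 422.0f;
    static const float H_Code_ADC = 3710.0f;

    /* End-point fit: measured = slope * written + intercept, so the corrected
     * code to write for a desired (ideal) code is (desired - intercept) / slope. */
    static uint16_t DAC_correctedCode(uint16_t desiredCode)
    {
        float slope     = (H_Code_ADC - L_Code_ADC) / (H_Code_DAC - L_Code_DAC);
        float intercept = L_Code_ADC - (slope * L_Code_DAC);   /* offset error in codes */
        float corrected = ((float)desiredCode - intercept) / slope;

        /* Clamp to the valid 12-bit range before writing the value to DACVALS. */
        if (corrected < 0.0f)
        {
            corrected = 0.0f;
        }
        else if (corrected > (float)(DAC_NUM_CODES - 1U))
        {
            corrected = (float)(DAC_NUM_CODES - 1U);
        }
        return (uint16_t)(corrected + 0.5f);
    }

    int main(void)
    {
        /* Example: mid-scale code (about 1.5 V with a 3.0 V span). */
        uint16_t ideal = 2048U;
        printf("ideal code %u -> corrected code %u\n",
               (unsigned)ideal, (unsigned)DAC_correctedCode(ideal));
        return 0;
    }

    Writing the corrected code to DACVALS in place of the ideal code removes most of the static gain and offset error; the residual accuracy is then limited by the ADC itself, so averaging several ADC readings at each calibration point is worthwhile.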

  • Thank you Hadi, you solved my issue.