IWR1443: Full Calibration, Partial Calibration, and Temperature Changes

Part Number: IWR1443

Afternoon folks:

  I'm working on an application in which the IWR1443 will be powered up every once in a while to perform a single measurement (likely a single frame) and then shut down (the IWR1443 will be controlled by a master processor). It will likely be running for much less than 100 ms each time. Since the power budget is limited, I plan to perform a full calibration at the factory so that the device only has to do partial calibrations in the field, reducing power usage (it will load the calibration data from QSPI flash and perform any appropriate boot-time calibrations, per the TI labs and suggestions).
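
  For reference, here is a rough sketch of the flow I have in mind. It assumes the mmWaveLink calibration-data APIs from the mmWave SDK (rlRfInit, rlRfCalibDataStore, rlRfCalibDataRestore, rlRfInitCalibConfig); the QSPI helpers are placeholders of my own, and the device map, call order, and calibEnMask bits would need to be confirmed against the ICD.

  /* Rough sketch only: factory-side full calibration + store, and field-side
   * restore with the heavy boot-time calibrations disabled. The QSPI helpers
   * are hypothetical; mask bits and mandatory calibrations come from the ICD. */
  #include <ti/control/mmwavelink/mmwavelink.h>

  extern int qspi_write_cal(const rlCalibrationData_t *data);  /* hypothetical flash helper */
  extern int qspi_read_cal(rlCalibrationData_t *data);         /* hypothetical flash helper */

  /* Factory: run the full boot-time calibration once and save the results. */
  int factory_calibrate_and_store(void)
  {
      rlCalibrationData_t calData = { 0 };

      if (rlRfInit(RL_DEVICE_MAP_INTERNAL_BSS) != RL_RET_CODE_OK)
      {
          return -1;                       /* full boot-time calibration failed */
      }
      if (rlRfCalibDataStore(RL_DEVICE_MAP_INTERNAL_BSS, &calData) != RL_RET_CODE_OK)
      {
          return -1;                       /* could not read back calibration data */
      }
      return qspi_write_cal(&calData);     /* keep for every field boot */
  }

  /* Field boot: restore the factory data and skip re-running the boot cals. */
  int field_boot_restore(void)
  {
      rlCalibrationData_t calData = { 0 };
      rlRfInitCalConf_t   calCfg  = { 0 };

      if (qspi_read_cal(&calData) != 0)
      {
          return -1;
      }
      /* calibEnMask selects which boot-time calibrations rlRfInit re-runs;
       * with restored data most can be disabled (exact/mandatory bits: ICD). */
      calCfg.calibEnMask = 0x0U;
      if (rlRfInitCalibConfig(RL_DEVICE_MAP_INTERNAL_BSS, &calCfg) != RL_RET_CODE_OK)
      {
          return -1;
      }
      if (rlRfCalibDataRestore(RL_DEVICE_MAP_INTERNAL_BSS, &calData) != RL_RET_CODE_OK)
      {
          return -1;
      }
      return (rlRfInit(RL_DEVICE_MAP_INTERNAL_BSS) == RL_RET_CODE_OK) ? 0 : -1;
  }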

  Since the IWR1443's operation will be extremely intermittent, Scheduling of Periodic Runtime Calibration and Monitoring via the RfInit API is not appropriate for this application. With respect to the "Self-Calibration in TI's mmWave Radar Devices" document (SPRACF4A), which of the calibrations should be performed in the field due to temperature changes? Text on page 10 of that document states,

"Every CALIBRATION_PERIODICITY, the processor reads the temperature and performs a calibration update if
needed. This update is done only if the temperature deviates by ±10 degrees compared to the temperature when
the last calibration was done. LO Distribution calibration updates are done only if the temperature deviates by
±20 degrees from the temperature at last update."

  It is clear from this text that the LO Distribution calibration should be performed if the temperature deviates by 20 degrees. However, it also implies that the other calibrations should be performed if the temperature deviates by more than 10 degrees. Given that, which calibrations should be repeated if the temperature deviates by more than 10 degrees?

  In general, I'd like the IWR1443 to operate as follows. A full calibration will be performed at the factory, with the data stored in QSPI flash. Every time the IWR1443 is brought up by the master processor in the field, it will load the factory calibration data from QSPI flash and perform any appropriate boot-time calibrations. The master processor will keep track of temperature changes and will command the IWR1443 to perform additional calibrations when it determines that a temperature change has exceeded the thresholds. For that particular situation, I need to know which calibrations need to be performed in the field based on temperature deltas.
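
  To make the question concrete, this is the decision I want the master processor to make, based on the ±10/±20 degree rule quoted above. It is only a minimal sketch with placeholder struct and helper names; the grouping of calibrations into the 10-degree and 20-degree categories is exactly what I am asking to have confirmed.

  /* Host-side sketch of the quoted temperature rule: most signal-quality
   * calibrations on a ~10 degree deviation, LO Distribution on a ~20 degree
   * deviation. All names here are placeholders of my own. */
  #include <stdbool.h>
  #include <stdlib.h>

  typedef struct
  {
      int lastCalTempC;      /* temperature when the signal-quality cals last ran */
      int lastLoDistTempC;   /* temperature when the LO Distribution cal last ran */
  } cal_state_t;

  typedef struct
  {
      bool runSignalQualityCals;  /* e.g. TX power, RX gain, IQMM, ... */
      bool runLoDistCal;          /* LO Distribution                   */
  } cal_decision_t;

  /* Decide which calibrations the master processor should command, given the
   * temperature read from the radar at this power-up. */
  cal_decision_t decide_recalibration(const cal_state_t *st, int currentTempC)
  {
      cal_decision_t d = { false, false };

      if (abs(currentTempC - st->lastCalTempC) >= 10)
      {
          d.runSignalQualityCals = true;
      }
      if (abs(currentTempC - st->lastLoDistTempC) >= 20)
      {
          d.runLoDistCal = true;
      }
      return d;
  }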

  Thanks! -Michael

  • Hello

    Please be aware that the calibrations adjust the signal quality to bring it within the tolerance levels of the device specifications. As such, we would want you to run all of the signal-quality recalibrations.

    Have you looked at the following document: Self-Calibration in TI's mmWave Radar Devices (Rev. A)? It covers the available knobs and gives insight into the calibration operation.

    Thank you,

    Vaibhav

  • Afternoon Vaibhav:

    "Please be aware that signal quality will be adjusted to bring it within the tolerance levels of the device specifications. As such we would want you to run all the signal quality  recalibariton.."

      This does not answer my question(s).

      Yes, I have reviewed the Self-Calibration document several times. In fact, I referenced that document in my post. The document does not answer my questions, either. That is why I am asking the question here on the forum.

      Regards, Michael

  • Hi Michael,

    The answer to your question depends on how much signal-quality variation the application can handle. We recommend performing all of the calibrations so that the device performs within its specifications, as suggested:

    As such, we would want you to run all of the signal-quality recalibrations.

    Have you tried checking whether the calibration-related changes make a tangible difference for your application?

    Regards

    Vaibhav

  • Hi Vaibhav:

      As mentioned, my application is power-constrained. I cannot afford the time and energy required to do a full calibration on every power-up/power-down cycle. This is why I want to perform a full calibration at the factory, but then perform only specific calibrations in the field when they are necessary. This approach has been discussed in your documentation and here on the forums, but without the details I need.

      Are you saying that all of the calibrations (except APLL and Synth VCO, which are "automatic") should be performed if the temperature changes? That is not the impression I get from reading the calibration document. For instance, the document says the LO Distribution calibration only needs to be repeated when the temperature changes by 20 degrees. I'm trying to get enough information to determine when each of the specific calibrations needs to be performed based on temperature changes, because I don't have the power budget to run them on every power cycle. In other words, I'm looking for details on the conditions under which each specific calibration needs to be re-run.

      Thanks, Michael

  • Hello

    Are you planning to restore all calibration values from the factory settings at boot time and then trigger a one-shot calibration in addition, but want to select which calibrations to re-trigger?

    Or

    Are you trying to decide at boot time which ones to restore vs. which ones to re-trigger?

    Thank you,

    Vaibhav

  • Morning Vaibhav:

    "Are you planning to restore all calibration values from factory setting at boottime  and then trigger a one shot calibration in addition to that but want to select which ones to retrigger?"

      Yes, this is what I'd like to do.
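
      Roughly, after restoring the factory data at boot, I would trigger something like the following when the master processor decides the temperature delta calls for it. This assumes the one-shot path of rlRfRunTimeCalibConfig (oneTimeCalibEnMask); the field names and mask bit assignments would be taken from the ICD, so please correct me if that is not the intended mechanism.

      /* Sketch of the selective one-shot re-trigger, assuming the
       * rlRfRunTimeCalibConfig API and its oneTimeCalibEnMask field. The
       * caller builds oneTimeCalMask from the bit definitions in the ICD
       * (e.g. the LO Distribution bit only on a 20-degree delta, the other
       * signal-quality bits on a 10-degree delta). */
      #include <ti/control/mmwavelink/mmwavelink.h>

      int selective_recalibrate(rlUInt32_t oneTimeCalMask)
      {
          rlRunTimeCalibConf_t cfg = { 0 };

          if (oneTimeCalMask == 0U)
          {
              return 0;                     /* temperature delta small enough: skip */
          }

          cfg.oneTimeCalibEnMask  = oneTimeCalMask;  /* run these once, right now    */
          cfg.periodicCalibEnMask = 0U;              /* no periodic scheduling       */
          cfg.reportEn            = 1U;              /* request a calibration report */

          return (rlRfRunTimeCalibConfig(RL_DEVICE_MAP_INTERNAL_BSS, &cfg)
                  == RL_RET_CODE_OK) ? 0 : -1;
      }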

      -Michael