
IWR1443: Signal Amplitude Suddenly Drops at Low Temperatures

Part Number: IWR1443

We are trying to understand a problem we are seeing in which signal amplitudes suddenly drop significantly at low temperatures. In general, we bring the hardware down to -40c from room temperature and hold it there for a while. Then we begin to raise the temperature. When the system gets near 5c, we often experience a significant drop in signal amplitude. If we perform a full (~factory) calibration at this point, the amplitude returns to the correct level, but then drops again within a few seconds. If we continue to raise temperature, the signal amplitude gradually returns to normal by the time we get back to room temperature (where we started the testing).

Note that we rarely see any amplitude problems when bringing the temperature down from room temperature to -40c. Rather, we only tend to see the (sudden) amplitude drop after we start raising the temperature (from -40c) and get to around 5c.

Here is some information on how our system operates that may be helpful.

Our system is power-constrained. Therefore, we power up the IWR to perform a measurement (one frame with ~4 chirps) and then shut it down. We power up the IWR every 1 to 3 seconds to perform a measurement. When the IWR powers up, it loads full (~factory) calibration data from flash, performs a partial calibration, performs a measurement, and then shuts down. If we force the IWR to perform a full calibration when it is experiencing low amplitudes around 5c, we usually see the signal amplitude return to normal on the next measurement, but then gradually decrease back down to where it was after a few measurement cycles (a few seconds). It is as if the full calibration "temporarily" solves the amplitude problem. However, we also see that sometimes forcing a full (~factory) calibration in this situation only partially restores the signal amplitude before it drifts back down after a few measurement cycles.

We would appreciate any insight/ideas regarding this situation.

Best regards, Michael

  • Thanks for reaching out to Texas Instruments, Michael. Please allow us some time to look into your query and get back to you.

    -Shareef

  •   Thanks, Ehtesham.

      Please also note the following forum thread that describes a situation that is somewhat similar to what I am describing. Unfortunately, the thread ends without resolution.

    IWR1443: strange adc data when temperature is ultra low (-40 degree) (TI E2E Sensors forum thread)

      -Michael

  • Hello Michael,

    Do you see the problem disappear when you disable the partial calibration?

    -Shareef

  • Morning Shareef:

      No, disabling the partial calibration does not solve the problem.

      Thanks, Michael

  • Hello Michael,

    How much of a power drop are you seeing?

    Could you clarify what data you are saving as part of the factory calibration, and what partial calibrations you are doing? Have you enabled run-time calibrations?

    Does your power management take care of the peak current requirements needed for IWR1443?

    regards,
    vivek

  • Hello Vivek:

    How much of a power drop are you seeing?

    Our configuration is such that the maximum amplitude is 32767. When the amplitude drops around 5c, it typically falls from about 16300 to about 2400, but it varies. I should also reiterate that this drop in amplitude is sudden, not gradual.

     

    Could you clarify what data you are saving as part of the factory calibration, and what partial calibrations you are doing?

    The following calibrations are included in the factory (“full”) calibration. The entire calibration data set is saved when a factory calibration is performed and is restored on power-up.

    IWR_CAL_APLL_TUNING     

    IWR_CAL_SYNTH_VCO1_TUNING

    IWR_CAL_SYNTH_VCO2_TUNING

    IWR_CAL_LODIST          

    IWR_CAL_RX_ADC_DC_OFFSET

    IWR_CAL_HPF_CUTOFF      

    IWR_CAL_LPF_CUTOFF      

    IWR_CAL_PEAK_DETECTOR   

    IWR_CAL_TX_POWER        

    IWR_CAL_RX_GAIN

    The following calibrations are specified for the partial calibration.

    IWR_CAL_LODIST

    IWR_CAL_RX_GAIN

    However, it should be noted that the IWR calibration document makes the following statement about the IWR_CAL_TX_POWER calibration (as well as some other calibrations). “TX power calibration is carried out at boot time for all enabled TXs, and can be carried out again at runtime.” In other words, even though we are not specifying IWR_CAL_TX_POWER for the partial calibration, it (along with some others) is already being performed automatically by the system at boot time. Also, in the case of both factory and partial calibrations, the calibration status bits are checked to ensure that each part of the calibration that is specified is successful.

     

    Have you enabled run time calibrations?

    No. Run time calibrations are of no use to us because our run time is too short (essentially zero). Per my prior post, we essentially power-up, perform 1 frame (~4 chirps), and then shut down. The IWR is only running for about 100 ms and only executes 1 frame (~ 1 measurement). Here is my prior description of IWR system operation.

    “Our system is power-constrained. Therefore, we power up the IWR to perform a measurement (one frame with ~4 chirps) and then shut it down. We power up the IWR every 1 to 3 seconds to perform a measurement. When the IWR powers up, it loads full (~factory) calibration data from flash, performs a partial calibration, performs a measurement, and then shuts down.”

     

    Does your power management take care of the peak current requirements needed for IWR1443?

    We do have concerns that one of the PMIC outputs may be drooping during the measurement cycle at lower temperatures and that this may be part of the problem. We are evaluating this hypothesis. In fact, we are looking at everything we can think of with respect to this problem. We use a capacitor bank to supply the IWR board during the measurement cycle. The capacitor bank is charged in-between measurement cycles.

     

    I previously mentioned another problem that we had seen once, but now this problem has recurred during testing. It is also a low-temperature related issue. When operating below -38c, the radar scene sometimes changes to a bunch of noise that does not reflect the physical radar scene. Even though the radar scene is completely useless at this point, the IWR does not report any errors.

    Here is a link to someone else’s post describing what appears to be the very same problem of noise near -40c, but there is no resolution. How was this problem resolved?

    IWR1443: strange adc data when temperature is ultra low (-40 degree) (TI E2E Sensors forum thread)

     

    Hope you have a nice weekend!

      Thanks, Michael

  • Hello Michael,

    I would recommend you try feeding the device from a clean external supply with sufficient current capacity and see if that resolves the issue.

    The following calibrations are specified for the partial calibration.

    IWR_CAL_LODIST

    IWR_CAL_RX_GAIN

    Could you clarify what you mean by "partial calibrations"? For the boot-time calibrations, you are restoring the complete calibration file that was saved from the IWR device, is that correct? When you perform the RFinit(), do you disable all the boot-time calibrations so that they are not run again and are used only from the restored calibration file?

     No. Run time calibrations are of no use to us because our run time is too short (essentially zero). Per my prior post, we essentially power-up, perform 1 frame (~4 chirps), and then shut down. The IWR is only running for about 100 ms and only executes 1 frame (~ 1 measurement). Here is my prior description of IWR system operation.

    You would need to perform the run-time calibrations once just before taking the radar measurement (in one-shot mode). The boot-time/factory calibration is done at one temperature, so when you are using the sensor at other temperatures you would need to perform the run-time calibration once to compensate for the temperature change.

    regards,
    vivek

  • Morning Vivek:

      Please see my responses/questions below each quoted question.

    I would recommend you try to feed external clean supply with sufficient current capacity and see if that resolves the issue.

    Agreed. I’ve suggested that we consider testing by monitoring the voltages and/or supplying the PMIC with known-good, stable power to see if this improves the low-temperature behavior. At this point, however, we have already completed some testing and are not seeing any problems with the PMIC output voltages when the IWR is running.

    The following calibrations are specified for the partial calibration.

    IWR_CAL_LODIST

    IWR_CAL_RX_GAIN

    Could you clarify what you mean my "partial calibrations" ?

    “Partial calibration” is terminology I took from one of Zigang Yang’s forum threads. The idea of factory (~full) calibration and partial calibration for power-constrained IWR applications is outlined in the following forum thread.

    https://e2e.ti.com/support/sensors-group/sensors/f/sensors-forum/1006053/iwr1443-under-what-circumstances-should-we-re-run-a-full-calibration-when-partial-calibration-is-usually-used?tisearch=e2e-quicksearch&keymatch=IWR1443%20partial%20calibration

    When we power-up the IWR, it is commanded to follow one of two possible startup calibration sequences before making a measurement. We refer to the first sequence as factory (full) calibration, and the other sequence as partial calibration. These sequences are based on the calibration logic found in the High Accuracy Range Measurement Lab.

    Factory (Full) Calibration Startup Sequence:

    • Power on
    • Call rlRfInitCalibConfig() with the following calibration enable mask: IWR_CAL_APLL_TUNING, IWR_CAL_SYNTH_VCO1_TUNING, IWR_CAL_SYNTH_VCO2_TUNING, IWR_CAL_LODIST, IWR_CAL_RX_ADC_DC_OFFSET, IWR_CAL_HPF_CUTOFF, IWR_CAL_LPF_CUTOFF, IWR_CAL_PEAK_DETECTOR, IWR_CAL_TX_POWER, IWR_CAL_RX_GAIN
    • Call rlRfInit()
    • Save factory (full) calibration data by calling rlRfCalibDataStore() and then writing to flash
    • Perform single measurement (1 frame of 4 chirps)
    • Power off

    Partial Calibration Startup Sequence:

    • Power on
    • Call rlRfInitCalibConfig() with the following calibration enable mask: IWR_CAL_LODIST, IWR_CAL_RX_GAIN
    • Call rlRfCalibDataRestore() to restore/load the full calibration data that was retrieved from flash
    • Call rlRfInit()
    • Perform single measurement (1 frame of 4 chirps)
    • Power off

    For the boot time calibrations you are restoring the complete calibration file that was saved from the IWR device, is that correct?

    Yes, I believe we are (see the above sequences). I didn’t know it was possible to restore anything less than the complete calibration file. How and why would this be done?

    When you perform the RFinit(), do you disable all the boot-time calibrations so that they are not run again and are used only from the restored calibration file?

    I don’t fully understand your question. I do not see a function named RFinit() in the libraries. However, I am using a function named rlRfInit(), but I don’t see any options to disable calibrations. Please fully explain what you are asking and provide references and/or examples for this functionality and why I might need to use it.

    No. Run time calibrations are of no use to us because our run time is too short (essentially zero). Per my prior post, we essentially power-up, perform 1 frame (~4 chirps), and then shut down. The IWR is only running for about 100 ms and only executes 1 frame (~ 1 measurement). Here is my prior description of IWR system operation.

    You would need to perform the run-time calibrations once just before taking the radar measurement (in one-shot mode). The boot-time/factory calibration is done at one temperature, so when you are using the sensor at other temperatures you would need to perform the run-time calibration once to compensate for the temperature change.

    Please clarify what you mean by “run time calibrations”.

    As you can see from above, our goal is to perform the factory (full) calibration once, and use the factory calibration data, along with the partial calibration when our device is measuring in the field. Also, if necessary, we have implemented logic that can perform factory (full) calibrations in the field, and save the data to flash, if the temperature changes too much from the initial factory (full) calibration (for instance, when temperature changes by 10c from last factory (full) calibration). If you feel the approach we are taking is not adequate or correct, please advise what needs to be changed and how to do so. We have done our best to utilize knowledge from the calibration document and the High Accuracy Range Measurement Lab, but important details are often not clear (in those cases we try to base our approach on the lab logic).

      Thanks, Michael

  • Hello Michael,

    Thanks for sharing the details. Please find my response to some of the questions below:

    I don’t fully understand your question. I do not see a function named RFinit() in the libraries. However, I am using a function named rlRfInit(), but I don’t see any options to disable calibrations. Please fully explain what you are asking and provide references and/or examples for this functionality and why I might need to use it.

    I was referring to disabling calibrations in the rlRfInitCalibConfig() API, which is used to select the set of calibrations that are executed in RfInit(), i.e. rlRfInit() in the mmWave API. The sequence you have shared above answers my question related to this.

    When you see the dip in the RX level, is it seen on all 4 RX channels at the same time? Do you observe this behavior in all/most of your sensors or is it seen on only one of the sensors?

    regards,
    Vivek

  • Also, do let us know if you did manage to perform the check with the clean voltage source with very short wires. The short wires are needed to ensure low inductance.

    Regards,

    Vivek

  • Hi Vivek:

    "When you see the dip in the RX level, is it seen on all 4 RX channels at the same time?"

    Our board only brings out one RX and one TX channel. Therefore, we don't have the ability to test what happens on the other channels.

    "Do you observe this behavior in all/most of your sensors or is it seen on only one of the sensors?"

    The reduced-amplitude behavior at 5c is not completely consistent (it does not always happen), but we have seen it across multiple IWRs.

    "Also, do let us know if you did manage to perform the check with the clean voltage source with very short wires. The short wires are needed to ensure low inductance."

    Yes, we will let you know. However, at this point we have monitored the PMIC output voltages when the reduced amplitude behavior is occurring and we are not seeing any problems. At the moment we are performing testing to rule out condensation as a potential cause.

      Thanks, Michael