We are trying to understand a problem in which signal amplitude suddenly drops significantly at low temperatures. In general, we bring the hardware down from room temperature to -40 °C and hold it there for a while. Then we begin to raise the temperature. When the system gets near +5 °C, we often see a significant drop in signal amplitude. If we perform a full (~factory) calibration at this point, the amplitude returns to the correct level but drops again within a few seconds. If we continue to raise the temperature, the signal amplitude gradually returns to normal by the time we get back to room temperature (where the testing started).
Note that we rarely see any amplitude problems while bringing the temperature down from room temperature to -40 °C. We only tend to see the sudden amplitude drop after we start raising the temperature from -40 °C and reach roughly +5 °C.
Here is some information on how our system operates that may be helpful.
Our system is power-constrained, so we power up the IWR to perform a measurement (one frame with ~4 chirps) and then shut it down, repeating this every 1 to 3 seconds. When the IWR powers up, it loads full (~factory) calibration data from flash, performs a partial calibration, performs the measurement, and then shuts down; this cycle is sketched below.

If we force the IWR to perform a full calibration while it is experiencing the low amplitudes around +5 °C, we usually see the signal amplitude return to normal on the next measurement and then gradually decrease back to where it was after a few measurement cycles (a few seconds). It is as if the full calibration only "temporarily" solves the amplitude problem. We also see that sometimes forcing a full (~factory) calibration in this situation only partially restores the signal amplitude before it drifts back down after a few measurement cycles.
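For reference, here is a minimal sketch of our per-measurement power cycle. The function names (power_up_front_end, restore_factory_calibration_from_flash, run_partial_calibration, run_full_calibration, capture_frame, power_down_front_end) are placeholders for our own wrappers around the mmWave API, not actual SDK calls, and the timing is approximate.

```c
/* Sketch of our per-measurement power cycle (placeholder wrappers,
 * not actual mmWave SDK calls). One cycle runs every 1-3 seconds. */

#include <stdbool.h>
#include <unistd.h>

/* Placeholder wrappers around our board-support / mmWave API layer. */
static void power_up_front_end(void)                     { /* enable IWR supply rails */ }
static void restore_factory_calibration_from_flash(void) { /* push stored full-cal data to the device */ }
static void run_partial_calibration(void)                { /* quick run-time calibration only */ }
static void run_full_calibration(void)                   { /* full (~factory) calibration, forced on demand */ }
static void capture_frame(void)                          { /* one frame, ~4 chirps */ }
static void power_down_front_end(void)                   { /* cut power until the next cycle */ }

static void measurement_cycle(bool force_full_calibration)
{
    power_up_front_end();

    if (force_full_calibration) {
        /* Around +5 C this restores amplitude, but only for a few cycles. */
        run_full_calibration();
    } else {
        restore_factory_calibration_from_flash();
        run_partial_calibration();
    }

    capture_frame();
    power_down_front_end();
}

int main(void)
{
    for (;;) {
        measurement_cycle(false);   /* normal cycle */
        sleep(2);                   /* 1 to 3 s between measurements */
    }
    return 0;
}
```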
We would appreciate any insight/ideas regarding this situation.
Best regards,
Michael