Hello everyone,
I am working on a project using the TI mmWave IWR6843ISK-ODS. For my application I am also using the MMWAVEICBOOST carrier board and the DCA1000 for data capture. I was running a simple test of the radar by measuring a stationary scene; there is no change in the scene throughout the entire measurement. Despite this, when I look at the measured ADC data for all the chirps taken during the measurement, the amplitude tapers off significantly over time (see the attached image below).
To my understanding, the measured data should not vary this much over time, and I am wondering what could explain this behavior. Another important note: when I take the range FFT, I do see peaks where objects are expected, so the data appears to be at least partially valid.
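For reference, this is roughly how I am reading the capture and producing the plots above. It is only a minimal sketch, not my exact script: the file name, the chirp/RX/sample counts, and the assumption of TI's complex non-interleaved I/Q ordering (pairs of I samples followed by the matching Q samples) are placeholders for my actual capture configuration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder capture parameters -- substitute the actual profile/frame config
NUM_CHIRPS = 128 * 100   # chirps per frame * frames captured (example values)
NUM_RX = 4
NUM_SAMPLES = 256

# Raw DCA1000 capture, int16 samples (file name is a placeholder)
raw = np.fromfile("adc_data.bin", dtype=np.int16)

# Reassemble complex samples assuming the I0, I1, Q0, Q1 ordering used by
# TI's complex non-interleaved format; this ordering is an assumption here
raw = raw.reshape(-1, 4)
iq = np.empty((raw.shape[0], 2), dtype=np.complex64)
iq[:, 0] = raw[:, 0] + 1j * raw[:, 2]
iq[:, 1] = raw[:, 1] + 1j * raw[:, 3]
iq = iq.reshape(NUM_CHIRPS, NUM_RX, NUM_SAMPLES)

# Mean ADC magnitude per chirp on RX0 -- this is the quantity that
# tapers off over the course of the capture in my data
chirp_amplitude = np.abs(iq[:, 0, :]).mean(axis=1)

# Range FFT of the first chirp -- peaks still appear where I expect objects
range_profile = np.abs(np.fft.fft(iq[0, 0, :] * np.hanning(NUM_SAMPLES)))

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(8, 6))
ax1.plot(chirp_amplitude)
ax1.set_xlabel("Chirp index")
ax1.set_ylabel("Mean |ADC|")
ax2.plot(range_profile[: NUM_SAMPLES // 2])
ax2.set_xlabel("Range bin")
ax2.set_ylabel("Magnitude")
plt.tight_layout()
plt.show()
```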
Thank you,
Cole Harlow