I am attempting to measure the intensity of returns from an object with the goal of predicting the intensity of a return from the same object at a larger distance. My experiments have given results that don't match theory.
I am recording data from an xWR1443 using ROS. Once I have the recorded data, I process it by manually selecting which points belong to my object and summing the energy of those points. Because the energy reported by ROS is on a log scale, I convert each point to a linear scale before summing, then convert the sum back (for the energy x of each point associated with the object: Sum = Sum + 10^(x/10); TotalEnergy = 10*log10(Sum)). Is this an appropriate way to determine the amount of RF energy being reflected from a target?
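In code, the summation I am doing looks roughly like this (a sketch with illustrative names, using NumPy):

```python
import numpy as np

def total_energy_db(intensities_db):
    """Sum per-point intensities reported on a dB scale by converting to
    the linear domain, summing, and converting the total back to dB."""
    linear = 10.0 ** (np.asarray(intensities_db) / 10.0)  # dB -> linear power
    return 10.0 * np.log10(linear.sum())                  # linear sum -> dB

# Two equal 30 dB returns combine to ~33 dB (a 3 dB increase), not 60 dB.
print(total_energy_db([30.0, 30.0]))  # ~33.01
```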
I have performed an experiment where I measure the intensity of the returns from a 6" trihedral corner reflector at various ranges, and I noticed that the TotalEnergy associated with the object does not decrease as I would expect from the radar range equation, Energy = (Pt * Gt * Gr * λ^2 * σ) / ((4π)^3 * R^4). Based on this equation I would expect TotalEnergy to decrease at a rate of 1/R^4, but I am seeing a decrease that more closely matches 1/R^2.
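For reference, here is a quick sketch of the 1/R^4 scaling I expected (unit placeholder values for Pt, Gt, Gr, and σ; λ ≈ 3.9 mm at 77 GHz):

```python
import numpy as np

def received_power(pt, gt, gr, wavelength, rcs, r):
    """Radar range equation: Pr = Pt*Gt*Gr*lambda^2*sigma / ((4*pi)^3 * R^4)."""
    return pt * gt * gr * wavelength**2 * rcs / ((4 * np.pi) ** 3 * r**4)

# Doubling the range should drop the return by 10*log10(2^4) ~= 12 dB.
p1 = received_power(1.0, 1.0, 1.0, 0.0039, 1.0, 1.0)
p2 = received_power(1.0, 1.0, 1.0, 0.0039, 1.0, 2.0)
print(10 * np.log10(p1 / p2))  # ~12.04 dB
```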
What reasons could there be for the sensor behaving this way? I realize that some energy may not be represented because some returns are filtered out, but my cfarCfg threshold is set to 854 (10 dB), so I am hoping not much energy is being removed that way. Still, I am unsure why the range dependence I am finding differs from what theory predicts.
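To quantify the trend, I fit the measured TotalEnergy against 10*log10(R) and read the exponent off the slope (a sketch with synthetic data; `range_exponent` is an illustrative name):

```python
import numpy as np

def range_exponent(ranges_m, energies_db):
    """Fit TotalEnergy(dB) = a - 10*n*log10(R) and return the estimated
    exponent n; the radar range equation predicts n = 4."""
    x = 10.0 * np.log10(np.asarray(ranges_m))
    y = np.asarray(energies_db)
    slope, _ = np.polyfit(x, y, 1)  # linear fit in the log-log domain
    return -slope

# Synthetic 1/R^2 data recovers n = 2.
r = np.array([1.0, 2.0, 4.0, 8.0])
print(range_exponent(r, -20.0 * np.log10(r)))  # ~2.0
```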
Thank you,
Andrew