My interest is in the IWR1443 because that is the device we are working with directly, but I'm sure the answer will apply to all the mmWave devices.
The question is simple: does the linearity of the chirp generated by this device degrade at all as the frequency slope is reduced? Do you have any direct measurement data to quantify the chirp linearity (and particularly as a function of slope)?
I ask because it is sometimes stated that some loss of linearity at low slopes is a necessary consequence of using a PLL-based chirp synthesiser (at least for some PLL designs). We are in a situation where we are considering coming down to quite low chirp rates, and so I was wondering if there is a trade-off between rate and linearity.
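For reference, the kind of metric I have in mind is the RMS deviation of the instantaneous frequency from a best-fit linear ramp, expressed as a fraction of the swept bandwidth. The sketch below is only meant to illustrate that definition; the sample rate, slope, ramp length and the injected perturbation are all assumed numbers for illustration, not data from the IWR1443.

```python
import numpy as np

# Minimal sketch: quantify chirp linearity from a sampled instantaneous-frequency
# record (e.g. derived from a de-ramped loopback capture). All values below are
# placeholders, not device measurements.

fs = 10e6                       # sample rate of the frequency record, Hz (assumed)
slope = 5e12                    # nominal chirp slope, Hz/s (low-slope example)
t = np.arange(0, 50e-6, 1.0 / fs)   # 50 us ramp (assumed)

# f_inst would come from measurement; here a synthetic ramp with a small
# sinusoidal perturbation stands in for synthesiser-induced nonlinearity.
f_inst = slope * t + 30e3 * np.sin(2 * np.pi * 40e3 * t)

# Fit the ideal linear ramp and examine the residual.
coeffs = np.polyfit(t, f_inst, 1)
residual = f_inst - np.polyval(coeffs, t)

rms_error_hz = np.sqrt(np.mean(residual ** 2))
bandwidth_hz = coeffs[0] * (t[-1] - t[0])
print(f"fitted slope:        {coeffs[0]:.3e} Hz/s")
print(f"RMS frequency error: {rms_error_hz / 1e3:.1f} kHz")
print(f"linearity:           {100 * rms_error_hz / bandwidth_hz:.4f} % of swept bandwidth")
```

If you have measured linearity figures in this form (or any equivalent metric) across different slopes, that would answer the question directly.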
Your thoughts on this would be much appreciated.
Regards
Duncan