My interest is in the IWR1443 because that is the device we are working with directly, but I'm sure the answer will apply to all the mmWave devices.
The question is simple: does the linearity of the chirp generated by this device degrade at all as the frequency slope is reduced? Do you have any direct measurement data to quantify the chirp linearity (and particularly as a function of rate)?
I ask because it is sometimes stated that this is a necessary consequence of using a PLL-based chirp synthesiser (at least for some PLL designs). We are in a situation where we are considering coming down to quite low chirp rates, so I was wondering if there is a trade-off between rate and linearity.
Your thoughts on this would be much appreciated.
Regards
Duncan
Hi Duncan,
We are looking into your query and will get back to you soon.
Regards
Ankit
Hi Duncan,
The linearity of the chirp generated by this device improves as the frequency slope is reduced. A higher frequency slope (higher bandwidth) requires more idle time for settling than a lower bandwidth. If the device is configured with a short idle time, a high slope can lead to non-linearity due to insufficient settling.
Users can customise the ramp timing configuration in mmWave Studio to find the settling percentage vs. ADC start time. To maximise linearity, we want maximum settling, which requires longer idle times and longer ADC start times. Therefore, a lower frequency slope will result in better linearity over the same time period.
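For concreteness, here is a minimal sketch (in Python) of the timing arithmetic involved. The parameter names mirror typical mmWave Studio chirp profile fields, but the ChirpProfile class and all numeric values are made up for illustration; the actual settling percentage must be read from the mmWave Studio ramp timing calculator, not computed from this sketch.

```python
# Illustrative chirp-timing arithmetic comparing a steep and a shallow slope.
# All values are hypothetical; consult the mmWave Studio ramp timing
# calculator for real settling figures on your device.

from dataclasses import dataclass

@dataclass
class ChirpProfile:
    freq_slope_mhz_us: float   # frequency slope (MHz/us)
    idle_time_us: float        # idle time between chirps (us)
    adc_start_time_us: float   # delay from ramp start to ADC capture (us)
    ramp_end_time_us: float    # total ramp duration (us)

    def total_bandwidth_mhz(self) -> float:
        # Bandwidth swept over the full ramp: slope * ramp duration.
        return self.freq_slope_mhz_us * self.ramp_end_time_us

    def settling_window_us(self) -> float:
        # Time available for the synthesiser to settle before the ADC
        # starts sampling: idle time plus ADC start delay.
        return self.idle_time_us + self.adc_start_time_us

# Two profiles sweeping the same 4 GHz, one steep and one shallow.
steep   = ChirpProfile(freq_slope_mhz_us=100.0, idle_time_us=7.0,
                       adc_start_time_us=6.0, ramp_end_time_us=40.0)
shallow = ChirpProfile(freq_slope_mhz_us=25.0, idle_time_us=7.0,
                       adc_start_time_us=6.0, ramp_end_time_us=160.0)

for name, p in (("steep", steep), ("shallow", shallow)):
    print(f"{name:8s} slope={p.freq_slope_mhz_us:6.1f} MHz/us  "
          f"BW={p.total_bandwidth_mhz():6.0f} MHz  "
          f"settling window={p.settling_window_us():.1f} us")
```

With the same idle time and ADC start time, both profiles get the same settling window, but the shallow profile traverses far less frequency per microsecond, so the synthesiser has an easier settling task and the chirp stays more linear.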
Regards
Ankit