Hi everyone!
I am building a radar based on the IWR1642 IC, and I program the following chirp parameters (see image #1; a cross-check of these values is sketched right after the list):
Frequency start: 1425452105 (76.49546177 GHz);
Idle Time: 400 (4 usec);
TX Start Time: -400 (-4 usec);
ADC Valid Start Time: 200 (2 usec);
ADC Sampling Time: 2 usec;
Ramp End Time: 6 usec;
tsr: 0.25 usec;
BWsw: 4 MHz.
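For reference, here is a sketch of how I read these values as a mmWaveLink profile configuration. The derived fields (numAdcSamples, digOutSampleRate, freqSlopeConst) and the rxGain placeholder are my own interpretation of the codes above, not copied from my firmware, so please treat them as assumptions:

#include <ti/control/mmwavelink/mmwavelink.h>

/* Sketch only: the chirp profile as I understand it from the values above. */
static rlProfileCfg_t profileCfg =
{
    .profileId         = 0U,
    .startFreqConst    = 1425452105U, /* ~76.4955 GHz, as listed above        */
    .idleTimeConst     = 400U,        /* 4 us   (1 LSB = 10 ns)               */
    .adcStartTimeConst = 200U,        /* 2 us   (1 LSB = 10 ns)               */
    .rampEndTime       = 600U,        /* 6 us   (1 LSB = 10 ns)               */
    .txStartTime       = -400,        /* -4 us  (1 LSB = 10 ns, signed)       */
    .freqSlopeConst    = 41,          /* ~2 MHz/us, assuming ~48.28 kHz/us per LSB */
    .numAdcSamples     = 8U,          /* 2 us sampling / 0.25 us per sample   */
    .digOutSampleRate  = 4000U,       /* tsr = 0.25 us -> 4 Msps, in ksps     */
    .hpfCornerFreq1    = 0U,
    .hpfCornerFreq2    = 0U,
    .rxGain            = 30U          /* placeholder, not from my design      */
};

/* Called after the usual power-on and channel/ADC configuration steps. */
static rlReturnVal_t configureProfile(void)
{
    return rlSetProfileConfig(RL_DEVICE_MAP_CASCADED_1, 1U, &profileCfg);
}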
Image #2 shows the downconverted chirp frequency vs. time. It can be seen that the sawtooth is nonlinear in the working area, and my application is sensitive to this. How can the sawtooth be made more linear? Increasing the guard intervals does not correct the situation.
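To put a number on "nonlinear", I fit a straight line to frequency-vs-time points read off the analyzer trace inside the ADC window and look at the worst residual. The points below are made-up placeholders, not my measurement; the method is an ordinary least-squares fit:

#include <stdio.h>
#include <math.h>

/* Placeholder (time, frequency) points from the trace; replace with real data.
   Time in us, frequency in MHz offset from the ramp start.                   */
static const double t_us[]  = { 2.0, 2.5, 3.0, 3.5, 4.0 };
static const double f_mhz[] = { 0.0, 1.0, 2.1, 2.9, 4.0 };

int main(void)
{
    const int n = (int)(sizeof(t_us) / sizeof(t_us[0]));
    double st = 0.0, sf = 0.0, stt = 0.0, stf = 0.0;

    for (int i = 0; i < n; i++) {
        st  += t_us[i];
        sf  += f_mhz[i];
        stt += t_us[i] * t_us[i];
        stf += t_us[i] * f_mhz[i];
    }

    /* Least-squares line f = a*t + b over the ADC window only */
    double a = (n * stf - st * sf) / (n * stt - st * st);
    double b = (sf - a * st) / n;

    double worst = 0.0;
    for (int i = 0; i < n; i++) {
        double r = fabs(f_mhz[i] - (a * t_us[i] + b));
        if (r > worst) worst = r;
    }

    printf("fitted slope: %.3f MHz/us, worst deviation: %.3f MHz\n", a, worst);
    return 0;
}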
The measurement was done on custom hardware (developed from the IWR1642BOOST reference design). The radar chip is clocked from a DSX321SH 40 MHz crystal. A block diagram of the experiment is attached:
I tried increasing the chirp duration (Ramp End Time = 9 usec), and it helps. What is the recommended minimum value for this parameter (for a slope of ~2-3 MHz/usec)?
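To show why I expected 6 usec to be enough, here is the timing budget I am working from. The assumption that the ADC window should sit entirely on the settled part of the ramp is mine:

#include <stdio.h>

int main(void)
{
    /* Chirp timing from my configuration, in microseconds */
    const double adc_start_us  = 2.0;            /* ADC Valid Start Time        */
    const double adc_window_us = 2.0;            /* ADC Sampling Time (8 samples at 4 Msps) */
    const double ramp_end_us[] = { 6.0, 9.0 };   /* original vs. extended ramp  */

    for (int i = 0; i < 2; i++) {
        double margin = ramp_end_us[i] - (adc_start_us + adc_window_us);
        printf("Ramp End Time %.0f us -> margin after ADC window: %.0f us\n",
               ramp_end_us[i], margin);
    }
    return 0;
}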
Thanks for your reply.
A small addition…
I attach pictures of the frequency-vs-time dependence for Ramp End Time = 9 usec, measured with spectrum analyzer filter bandwidths of 10 MHz and 1 MHz:
As well as the frequency-power-time dependence for Filter Bandwidth = 10 MHz:
Do I understand correctly that the PLL frequency step is ~100 kHz? And is the ripple on the frequency-vs-time curve caused by the stepwise PLL locking process?
I would appreciate your clarification.