Hello,
After going through the introductory videos on mmWave radar technology, I learned that the chirp time (T_chirp) plays an important role in velocity calculations, for example in the velocity resolution formula:
V_res = lambda / (2 * N * T_chirp)
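For concreteness, here is a small numeric check of that formula in Python; all numbers below are placeholder assumptions (77 GHz carrier, 128 chirps, 60 us chirp time), not taken from my actual config:

```python
# Illustrative check of V_res = lambda / (2 * N * T_chirp).
# All values below are placeholder assumptions, not from my actual config.
c = 3e8                  # speed of light [m/s]
f_c = 77e9               # carrier frequency [Hz] (assumed)
wavelength = c / f_c     # ~3.9 mm

N = 128                  # number of chirps per frame (assumed)
T_chirp = 60e-6          # chirp time [s] -- the parameter in question

v_res = wavelength / (2 * N * T_chirp)
print(f"V_res = {v_res:.3f} m/s")   # ~0.254 m/s with these numbers
```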
Looking at the CFG file, there are several timing parameters (idle time, ADC start time, ramp end time) that seem to define the aforementioned T_chirp.
At the theoretical level of how velocities are calculated, there is no idle time or any other hardware-specific parameter to consider, so I am trying to figure out which of the CFG parameters define the theoretical T_chirp and how they affect the velocity calculation:
1. Would it be correct to assume the chirp time is the time during which ADC sampling is performed, i.e. T_chirp = <ramp end time>? Or is it the sum <idle time> + <ramp end time>? (Both options are also spelled out in the sketch after these questions.)
2. How does the idle time enter the theoretical formula for the velocity calculation? Does it play any role at all?
3. What about the radar equation, in which SNR is directly proportional to "T_meas" (measurement time)? Is T_meas also N * T_chirp, and does it include the idle time, e.g. so that SNR could be improved by increasing the idle time? Or does it include only the ramp end time, so that we have to work with that value directly?
4. Could you also help me understand the term "active chirp time" used in the mmWave SDK user guide ("active chirp time should be <= 50% of frame period")? Does "active" mean the time during which sampling is performed? Is it correct to assume the frame time is given by T_frame = N_chirp * T_chirp, where T_chirp also includes the idle time?
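To make questions 1, 3 and 4 concrete, below is a small Python sketch of how I currently think the CFG timings combine. All numeric values are placeholders (not from my actual config), and the relationships themselves are exactly what I am asking about, so please correct it wherever my guess is wrong:

```python
# My current guess at how the CFG timings combine.
# Values are placeholders; the relationships are exactly what I'm asking about.
c = 3e8
f_c = 77e9
wavelength = c / f_c

# Timing values as I read them from the profile configuration (microseconds)
idle_time_us = 100.0       # <idle time>      (assumed)
adc_start_time_us = 6.0    # <ADC start time> (assumed)
ramp_end_time_us = 60.0    # <ramp end time>  (assumed)

n_chirps = 128             # chirps per frame (assumed)
frame_period_ms = 50.0     # frame periodicity (assumed)

# Question 1: which of these is the "theoretical" T_chirp?
T_chirp_ramp_only = ramp_end_time_us * 1e-6                    # option A
T_chirp_with_idle = (idle_time_us + ramp_end_time_us) * 1e-6   # option B

for label, T_chirp in [("ramp end only", T_chirp_ramp_only),
                       ("idle + ramp end", T_chirp_with_idle)]:
    v_res = wavelength / (2 * n_chirps * T_chirp)
    print(f"T_chirp = {label}: V_res = {v_res:.3f} m/s")

# Question 3: is the measurement time in the radar equation N * T_chirp
# (including idle time) or only N * <ramp end time>?
T_meas_with_idle = n_chirps * T_chirp_with_idle
T_meas_ramp_only = n_chirps * T_chirp_ramp_only

# Question 4: my assumed reading of "active chirp time <= 50% of frame period"
active_chirp_time = n_chirps * T_chirp_with_idle   # does this include idle time?
duty_cycle = active_chirp_time / (frame_period_ms * 1e-3)
print(f"Assumed active chirp time: {active_chirp_time * 1e3:.2f} ms "
      f"({duty_cycle * 100:.1f}% of frame period)")
```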