Tool/software:
Hi,
I have noticed that the velocity resolution in the default HECR demo (the value reported by the automotive visualizer) is computed as:
dopplerStep = 0.106392793
I wonder how this quantity is computed. The standard formula delta_v = lambda_start / (2 * N_chirps * T_chirp) = c / (2 * N_chirps * f_start * T_chirp) yields the following when using the default HECR values for the legacy frame:
delta_v = (3e8) / (2 * (128 * 6) * 77e9 * (3.5 + 17.33)e-6) = 0.121481129 m/s
Similarly, the maximum unambiguous velocity v_unambiguous = lambda_start / (4 * T_chirp) yields:
v_unambiguous = lambda_start / (4 * T_chirp) = c / (4 * T_chirp * f_start) = 3e8 / (4 * (3.5 + 17.33)e-6 * 77e9) = 46.76 m/s
This is more than the value hardcoded in the chirp_design_HECR.h file, which has: HECR_GTRACK_MAX_RAD_VEL 40.8781
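
For reference, here is the exact arithmetic I am doing, as a small standalone C program. The parameter values (f_start, idle + ramp time, 128 chirp loops x 6 TX) are my reading of the default legacy-frame chirp configuration, so please correct me if any of them are wrong:

#include <stdio.h>

int main(void)
{
    const double c        = 3.0e8;                 /* speed of light [m/s]            */
    const double f_start  = 77.0e9;                /* chirp start frequency [Hz]      */
    const double T_chirp  = (3.5 + 17.33) * 1e-6;  /* idle time + ramp time [s]       */
    const double n_chirps = 128.0 * 6.0;           /* chirp loops per frame x num TX  */

    /* velocity resolution: delta_v = c / (2 * N_chirps * f_start * T_chirp) */
    double delta_v = c / (2.0 * n_chirps * f_start * T_chirp);

    /* max unambiguous velocity (TDM-style): v_unamb = c / (4 * f_start * T_chirp) */
    double v_unamb = c / (4.0 * f_start * T_chirp);

    printf("delta_v       = %.6f m/s\n", delta_v);  /* ~0.122 m/s, not 0.106392793 */
    printf("v_unambiguous = %.2f m/s\n", v_unamb);  /* ~46.76 m/s, not 40.8781     */
    return 0;
}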
I do understand that the demo runs in DDMA mode, so these quantities may be computed differently than in standard TDM. Either way, please let me know how to properly compute these quantities for DDMA, so that I understand where the values used in the demo come from.
Thanks in advance,
Mark