Does the minimum detection distance change as the maximum detection distance of the millimeter wave radar increases?
How is the minimum detection distance determined?
In a mmWave radar system, distance is detected through FFT processing performed on the ADC-sampled reflected chirp signal.
So, for example, if we have a maximum range of 100 m and use 256 samples, each range bin covers 100 m / 256 ≈ 0.4 m, so the detection accuracy will be about 0.4 m.
The absolute minimum distance that can be measured is defined by the laws of physics and is around 3 cm. However, this can't be achieved with a 100 m max-range configuration.
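The arithmetic above can be sketched in a few lines. This is an illustrative snippet, not TI code; the function name `range_bin_size` is my own, and it simply divides the configured maximum range by the number of ADC samples per chirp to get the size of one range bin.

```python
def range_bin_size(max_range_m: float, num_samples: int) -> float:
    """Range covered by one FFT bin for a given max range and sample count.

    This is the range resolution of the configuration described above:
    targets closer together than one bin fall into the same FFT bin.
    """
    return max_range_m / num_samples

# Example from the thread: 100 m max range, 256 samples per chirp
print(range_bin_size(100.0, 256))  # ≈ 0.39 m per bin, i.e. roughly 0.4 m
```

Note that this bin size limits how finely range can be resolved; the ~3 cm physical floor mentioned above would require a much shorter max range (i.e. a wider sweep bandwidth).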
There are many threads related to this topic.
For further information, I recommend using a Google site search as follows:
site:e2e.ti.com awr minimum range
site:e2e.ti.com iwr minimum range