This thread has been locked.

AWR1243BOOST: Why is angle resolution so high?

Part Number: AWR1243BOOST
Other Parts Discussed in Thread: AWR1642

According to the MIMO radar application note, the mmWave training series, and the "Fundamentals of FMCW Radar" white paper, the angle resolution of the radar is equal to 2/N where N=Nrx*Ntx. 

I just completed a test where I configured 2 TX and 4 RX antennas, so theoretically my resolution should be 0.25 radians ≈ 14 degrees (for a 180 degree field of view). However, after collecting my data, I found object detections that were less than 14 degrees apart from each other. How is this possible? The radar should only have 8 "bins" to group the angle of arrival into, so how could it possibly detect 3 objects at 0.436 m that are only ~1 degree apart from each other? Even if velocity is able to distinguish these 3 objects, how does that change the angle calculation?

It just doesn't seem like there can ever be more than 8 angle bins, so I'm confused why the number of angle bins is programmed to be 64 when there aren't 64 virtual antennas. 
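To make my arithmetic concrete, this is the back-of-the-envelope calculation I am using (my own sketch, not TI code):

```python
import numpy as np

# Rule-of-thumb beamforming angle resolution for a MIMO radar:
# theta_res ~ 2 / N radians at boresight, where N = Ntx * Nrx virtual
# antennas (the values below are my test configuration).
n_tx, n_rx = 2, 4
n_virtual = n_tx * n_rx                 # 8 virtual antennas

theta_res_rad = 2.0 / n_virtual         # 0.25 rad
theta_res_deg = np.degrees(theta_res_rad)

print(f"{n_virtual} virtual antennas -> {theta_res_deg:.1f} deg resolution")
```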

  • Hi,

    It is possible that the detection is not correct. How did you determine that the objects were detected?

    Thank you
    Cesar
  • I did not determine that the objects were detected; I was simply reading the point cloud that the AWR1642 transmitted to me via UART. While it does not transmit angle information, it transmits the range and the X/Y coordinates, so it is trivial to calculate the angle of arrival for every detected object. In one frame, there were three detections at the same range but at three angles that were very close to each other:

    The leftmost column is the time index of the frame; this is the fourth frame (with a frame periodicity of 100ms) so the unit of the leftmost column is seconds. The second column is range in meters (converted from the range index that the AWR1642 provided to me based on the chirp parameters), the third column is velocity, the fourth column is the peak value, the fifth and sixth are X and Y position, and the seventh is angle. 

    Therefore, you can see the angles of the detected objects were 0 (boresight), 1.7979 degrees, and 3.5923 degrees. 

    In your code, you specify the number of angle bins to be 64, but then pad the azimuthIn[] array with zeros after performing Doppler compensation. I don't understand how the angle can be this accurate. 
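To make my confusion concrete, here is how I understand the zero padding: a 64-point FFT over the 8 virtual-antenna samples. This is just my own sketch, not the actual demo code, and the target angle is made up:

```python
import numpy as np

# 8 virtual-antenna samples for a single target, zero-padded out to a
# 64-point FFT (the way I read the azimuthIn[] padding). The output has
# 64 bins, but the physical aperture is still only 8 elements.
n_virtual, n_fft = 8, 64

sin_theta = 0.3                                   # hypothetical target angle
x = np.exp(1j * np.pi * sin_theta * np.arange(n_virtual))

spectrum = np.abs(np.fft.fft(x, n_fft))           # implicit zero padding
k_peak = int(np.argmax(spectrum))
# map the FFT bin back to sin(theta); upper bins wrap to negative angles
k_signed = k_peak if k_peak < n_fft // 2 else k_peak - n_fft
sin_est = 2.0 * k_signed / n_fft

print(f"true angle : {np.degrees(np.arcsin(sin_theta)):.2f} deg")
print(f"estimated  : {np.degrees(np.arcsin(sin_est)):.2f} deg")
```

Interestingly, the spacing between adjacent 64-point bins near boresight works out to arcsin(2/64) ≈ 1.79 degrees, which looks close to the ~1.8 degree spacing between my detections above.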

  • I think these are reflections from the same object.

    Cesar

  • >According to the MIMO radar application note, the mmWave training series, and the "Fundamentals of FMCW Radar" white paper, the angle resolution of the radar is equal to 2/N where N=Nrx*Ntx. 

    I think the confusion may be that 2/N does not take further processing into account. In some demos, a Direction of Arrival algorithm is used to estimate object azimuth. In others, a third azimuth FFT is performed. The "data path object" (the global) usually contains a variable that computes the angle resolution. You can see how this is calculated to determine the method being used.

      -dave

  • Resolution, in the radar context, is defined as the ability to separate two simultaneous objects in the domain in question. Angle calculations are performed on detected bins in the range-Doppler matrix, which means angle resolution comes into play when we try to evaluate two objects whose energy falls in the same range-Doppler bin. That is, if two objects were present at the same radial distance with the same velocity but at different angles, they could not be resolved if they were less than the angular resolution apart (14 degrees for azimuth here).

    Accuracy, on the other hand, is how close the estimated angle is to the true value for a single object, and it can be improved through interpolation, which is what the 64-bin FFT achieves. So you could not see two objects in the same range-Doppler bin separated by less than 14 degrees (they would appear as one object, probably at an angle somewhere between the two), but if there were only one object in a range-Doppler bin, you could estimate its angle with better accuracy than the resolution because of the interpolation. The accuracy is ultimately limited by SNR [there are formulas for this in the academic literature].

    FYI, there are angular estimation techniques that can improve resolution beyond the beamforming case (the 2/N limit) if certain assumptions are satisfied, but we are doing beamforming in our implementation. The resolution jargon can be confusing because, in the FFT context, people use the word "resolution" for a single FFT bin; but when you interpolate (zero padding), you cannot arbitrarily improve the true resolution in the sense of separating simultaneous objects. You are only improving accuracy.

    In light of the above explanations, you can re-evaluate your data and report back with any surprises.
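As a quick numerical sketch of this distinction (illustrative only, not the device's actual processing chain): two targets closer together than the ~14 degree resolution merge into one lobe of the 64-point angle spectrum despite the interpolation, while well-separated targets produce two distinct peaks.

```python
import numpy as np

# Resolution vs. accuracy for an 8-element virtual array with a
# 64-point (zero-padded) angle FFT. Idealized, noise-free example.
n_virtual, n_fft = 8, 64

def steering(sin_theta):
    """Ideal half-wavelength-spaced array response for one target."""
    return np.exp(1j * np.pi * sin_theta * np.arange(n_virtual))

def angle_spectrum(x):
    return np.abs(np.fft.fft(x, n_fft))       # zero-padded to 64 bins

def count_peaks(spec):
    """Count local maxima above half the global maximum (circular)."""
    left, right = np.roll(spec, 1), np.roll(spec, -1)
    return int(np.sum((spec > left) & (spec > right) & (spec > spec.max() / 2)))

# two equal targets 7 deg apart (closer than the ~14 deg resolution):
# they merge into a single lobe, despite the 64 interpolated bins
two_close = steering(np.sin(np.radians(-3.5))) + steering(np.sin(np.radians(3.5)))
print("targets 7 deg apart :", count_peaks(angle_spectrum(two_close)), "peak(s)")

# two targets 30 deg apart (well beyond the resolution): two distinct peaks
two_far = steering(np.sin(np.radians(-15.0))) + steering(np.sin(np.radians(15.0)))
print("targets 30 deg apart:", count_peaks(angle_spectrum(two_far)), "peak(s)")
```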

  • Thank you all for the answers. I will take another look at the angle estimation with your inputs in mind. I will let you know if I have any further questions.