IWRL6432BOOST: Human / Non-human classification queries

Part Number: IWRL6432BOOST

Hi team,

I tested the Human/Non-human Classification lab from radar_toolbox_2_10_00_04 with the xWRL6432BOOST EVM. The radar sensor is mounted at a height of 1.7 m with a 10-degree downward tilt.

Case 1: The radar initially detects a table fan, placed 1 m away, as Non-human. However, when a human walks past the fan, the fan's label changes from Non-human to Human and stays Human until the sensor or the fan is turned off.

Can you please explain the fan-detection behavior in this scenario?

Case 2: If we move an object with uneven doppler (e.g., a box) in front of the radar sensor, it is detected and labeled as Human. Please explain the classification behavior in this case.

Thanks in advance

  • Hello, 

    1. We have seen similar issues with the classifier labeling a fan as "human" when a human comes too close to the fan. Fundamentally this is a limitation of the tracker implementation: when the two point clusters come too close together, they are merged into a single tracked object. This throws off the classifier's result, and due to some logic in the visualizer, the label does not reset to non-human afterwards. A toy example of how the merging happens is sketched below.
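    For illustration only, here is a minimal sketch of distance-based gating (the gate radius, point coordinates, and association rule are all illustrative assumptions, not the actual group-tracker implementation) showing how two nearby point clusters can fall inside one track's gate and be absorbed into a single tracked object:

        import math

        GATE_RADIUS = 1.0  # m; illustrative association gate, not the tracker's real value

        track_pos = (1.0, 0.0)                      # existing track centered on the fan
        fan_points = [(1.0, 0.05), (0.95, -0.05)]   # point cloud from the fan blades
        person_points = [(1.4, 0.2), (1.5, 0.25)]   # person walking past the fan

        def in_gate(point, track, radius=GATE_RADIUS):
            # A point is associated to a track if it lies inside the gate
            return math.dist(point, track) <= radius

        # Every point from both objects falls inside the fan track's gate, so
        # all of them are absorbed into one tracked object. The classifier then
        # sees the merged point cloud, which contains human motion, and tags
        # the whole track as "human".
        merged = [p for p in fan_points + person_points if in_gate(p, track_pos)]
        print(merged)  # all four points associate to the single fan track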

    You can modify the visualizer code so that the label does not stay "human" after a track has been incorrectly tagged. The modification would be needed around line 66 of radar_toolbox_2_10_00_04\tools\visualizers\Applications_Visualizer\common\Demo_Classes\out_of_box_x432.py. Please refer to the Applications Visualizer user guide for information on running the visualizer from the Python source code.
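    To give a rough idea of the change (this is only a hypothetical sketch; the identifiers and structure below are assumptions, not the actual code in out_of_box_x432.py), the label update is gated so that a stored tag persists across frames, and relaxing that gate lets the tag follow the latest classifier output:

        # Hypothetical shape of the per-frame label post-processing;
        # names are illustrative, not the real identifiers in the file.
        def update_label(track, classifier_tag):
            # Original-style behavior: the stored tag is only overwritten
            # under certain conditions, so once a track is tagged "human"
            # it can stay "human" indefinitely.
            if classifier_tag == "human":
                track.label = "human"
            # Relaxed alternative: always adopt the latest classifier
            # output so a wrong tag can recover on later frames:
            # track.label = classifier_tag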

    2. Can you help me better understand this scenario? How is the motion of the box generated?

    Best Regards,

    Josh

  • Hi Josh, 

    Thank you for your quick response.
    As per your suggestion, I altered the visualizer code (I removed the entire if condition so that the tag is assigned in every scenario).

    With this change, the fan's label switches from non-human to human when the human crosses beside the radar and, as expected, returns to non-human when the human moves away. Could you please clarify whether this modification impacts detection performance in any other way?

  • Hi, 

    One part of the conditional statement that you removed checks the track velocity. This check was added to the post-processing of the classifier data because it was observed that tracks of people moving very slowly were sometimes classified as non-human. Other than losing that safeguard, there shouldn't be much impact. If you want to keep the velocity check while still allowing a wrong tag to recover, one option is sketched below.
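    As a possible compromise (again only a sketch; the names, threshold, and debounce length are assumptions, not the shipped code), you could keep the velocity check but add a short debounce so a slow track only reverts to non-human after several consistent frames:

        import math

        MIN_SPEED = 0.1      # m/s; assumed "moving" threshold
        STATIC_FRAMES = 10   # assumed debounce length in frames

        def update_label(track, classifier_tag):
            speed = math.hypot(track.vx, track.vy)
            if classifier_tag == "human":
                track.label = "human"
                track.slow_count = 0
            elif speed > MIN_SPEED:
                # Moving track called non-human: trust the classifier.
                track.label = "non-human"
                track.slow_count = 0
            else:
                # Slow track: require several consecutive non-human results
                # before reverting, so a slowly walking person is not
                # flipped while a static, mis-tagged fan can still recover.
                track.slow_count += 1
                if track.slow_count >= STATIC_FRAMES:
                    track.label = "non-human"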

    Regards,

    Josh