
IWR6843AOPEVM: Demo for outdoor mobile robot

Part Number: IWR6843AOPEVM
Other Parts Discussed in Thread: IWR6843AOP


Hi

I have previously been testing the IWR6843AOPEVM for a mobile robot application. I opened this thread: https://e2e.ti.com/support/sensors-group/sensors/f/sensors-forum/1405841/iwr6843-mmwave-vs-ultrasonic-proximity-sensors-for-mobile-robots/5409032?tisearch=e2e-sitesearch&keymatch=%20user%3A620822#5409032 and have been trying to test with the 3D people tracking demo.

We had some issues, so we decided to go different ways. However, we have now picked this project up again and have run some tests with the 3D people tracking demo. It seems very difficult to obtain any results with it, and the detection tuning guide makes several assumptions about a static mount and a dynamic scene. Our situation is the opposite: the mount is dynamic, since the robot drives around (in terrain), and the scene contains both dynamic and static objects. Are you sure we can use this demo as a starting point? And if so, do you have a good idea of where to start when tuning the detection layer parameters?

Best regards

Jens

  • As mentioned in the previous post, the robot is driving at 2 m/s.

  • Hello,

    What kind of targets are you trying to detect, and at what ranges? You are correct that radar sensors inherently lack odometry: the sensor cannot tell whether it is moving or everything around it is moving. However, the key to this, which is how radar ends up being quite prevalent in automotive applications, is obtaining the platform's velocity, either from a high-precision accelerometer/IMU or from the radar's own data. For example, we have the True Ground Speed example demo, where the sensor estimates how fast the platform is moving from the radial velocity of points on the floor. Every point in the point cloud carries a radial velocity, and clusters of points (which we call tracks) carry velocities as well. With this information, we can subtract/offset the known platform velocity to recover the true velocities of moving objects, and we can discard points as static when their detected velocity is near or exactly equal to the velocity of the platform itself.

    Best Regards,

    Pedrhom
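    The ego-velocity compensation described above can be sketched in a few lines. This is a hedged illustration, not code from any TI demo: the function name, the static-point threshold, and the forward-facing sensor geometry are all assumptions introduced here. It shows the core idea that a static point seen from a platform moving at `v_ego` appears with a predictable radial velocity depending on its angle off boresight, so subtracting that expected component leaves the point's true velocity.

    ```python
    import math

    # Illustrative tolerance (m/s) for declaring a point static after compensation.
    # An assumption for this sketch; a real demo would tune this to the sensor.
    STATIC_THRESHOLD = 0.25

    def compensate_point(radial_velocity, azimuth_rad, elevation_rad, v_ego):
        """Return (true_radial_velocity, is_static) for one radar point.

        radial_velocity: measured Doppler velocity (m/s, positive = moving away)
        azimuth_rad/elevation_rad: point direction relative to sensor boresight
        v_ego: platform speed along boresight (m/s), e.g. from wheel odometry
        """
        # A static point appears to approach a forward-moving sensor, so its
        # expected radial velocity is -v_ego projected onto the point direction.
        expected_static = -v_ego * math.cos(azimuth_rad) * math.cos(elevation_rad)
        # Subtract the expected component to get the point's true radial velocity.
        true_velocity = radial_velocity - expected_static
        return true_velocity, abs(true_velocity) < STATIC_THRESHOLD

    # Example: platform at 2 m/s, point straight ahead measured at -2 m/s.
    # Compensated velocity is 0 m/s, so the point is static ground clutter.
    v_true, is_static = compensate_point(-2.0, 0.0, 0.0, 2.0)
    ```

    A point whose compensated velocity is clearly nonzero would then be passed on as a genuine moving detection, which matches the subtract/offset step described in the reply above.
    
    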

  • Dear Pedrhom,

    Thank you for your answer. This is mainly for human targets at long range (>6 m); at shorter range (4 m) we need to detect a plastic pole with a diameter of 70 mm. The platform moves at 2 m/s through grass. Do you have a demo suitable for the IWR6843AOP so we can get some idea of the achievable performance?

    Thank you

  • We do have quite accurate odometry information on the robot, so maybe that can be used.