
IWR1443BOOST: Can IWR1443Boost do ORBSLAM or tell its location in plane wth point cloud generation?

Part Number: IWR1443BOOST
Other Parts Discussed in Thread: DCA1000EVM

Dear team,

I ran ORB_SLAM with my camera sensor to generate the 3D coordinates of my camera.

I was checking whether something similar can be done with my IWR1443BOOST. I have looked at the point cloud generation lab, so it would be great if the sensor could report its own coordinates using that.

Thanks.

  • Hello,

    We have an example that uses the IWR1443BOOST for object detection for mapping and navigation (collision avoidance) on an autonomous robot, available on the TI Resource Explorer at the following link.

    Autonomous Robotics with ROS for mmWave lab:
    dev.ti.com/.../

    It relies on the mmWave ROS driver included with the ROS Point Cloud Visualizer lab:

    ROS Point Cloud Visualizer lab:
    dev.ti.com/.../

    We do not currently have any examples or experiments of localization (SLAM) using mmWave. You could use the mmWave ROS driver to get the mmWave detected object data into ROS and try out pre-packaged algorithms in existing ROS packages or any type of localization algorithms you create. Another alternative is to capture the raw Radar ADC data from the sensor using the DCA1000EVM capture card and try out any desired post-processing algorithms offline.
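    As a small illustration of the offline post-processing route, the detected objects reported by the mmWave demo can be converted from spherical form (range, azimuth, elevation) into Cartesian points before being fed to any mapping or localization algorithm. This is only a sketch: the field names and the axis convention (x right, y forward, z up) are assumptions for illustration, not the exact demo output format.

```python
import math

def spherical_to_cartesian(r, azimuth, elevation):
    """Convert one detected point from (range [m], azimuth, elevation)
    with angles in radians to Cartesian (x, y, z) in the sensor frame.
    Convention assumed here: x right, y forward (boresight), z up."""
    x = r * math.cos(elevation) * math.sin(azimuth)
    y = r * math.cos(elevation) * math.cos(azimuth)
    z = r * math.sin(elevation)
    return (x, y, z)

# Example: a detected object 2 m away, 30 degrees to the left,
# 10 degrees above the horizontal plane.
pt = spherical_to_cartesian(2.0, math.radians(-30), math.radians(10))
```

    A batch of detected objects converted this way forms one point cloud frame, which is the input any ROS or custom localization algorithm would consume.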

    Please mark the thread as answered if your question is resolved, or reply if more support is required.

    Regards,
    John
  • Hi John,
    Thanks for the reply.

    I tested the example lab you mentioned above a few days back. I got stuck while providing the TurtleBot parameters in the visualizer and so was not able to run the demo. I was also told that I need those parameters to run this lab.

    What I want is to use the mmWave sensor as a standalone system that can be moved by hand and generate a point cloud of the environment (similar to the above-mentioned lab, but excluding the motion part, since I am not using a TurtleBot). Is it possible to do that? Can it provide its own coordinates through that procedure?
    Thanks.
  • Hi,

    The autonomous robot lab uses the wheel encoders and accelerometer in the robot to know how the sensor is moving. We currently have no examples or experiments using the mmWave radar data to estimate the 3D movement of the sensor. You could try out such an experiment by creating your own post-processing algorithm based on the detected object data from the sensor (or the raw ADC radar data using the DCA1000EVM capture card).
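    As a toy illustration of the kind of post-processing algorithm described above, one could estimate sensor translation between two frames by comparing the centroids of consecutive detected-object point clouds. This sketch assumes a static scene, pure translation, and the same objects detected in both frames; a real implementation would need a proper registration algorithm (e.g., ICP) and is far more involved.

```python
def centroid(points):
    """Mean of a list of (x, y, z) tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def estimate_translation(prev_points, curr_points):
    """Estimate sensor motion between two point cloud frames.
    With a static scene, the world appears to shift by minus the
    sensor motion, so: motion = centroid(prev) - centroid(curr)."""
    c_prev = centroid(prev_points)
    c_curr = centroid(curr_points)
    return tuple(c_prev[i] - c_curr[i] for i in range(3))

# Toy data: every point appears shifted by (-0.1, -0.2, 0.0) between
# frames, i.e. the sensor moved +0.1 m in x and +0.2 m in y.
prev = [(1.0, 2.0, 0.0), (2.0, 3.0, 0.5)]
curr = [(0.9, 1.8, 0.0), (1.9, 2.8, 0.5)]
motion = estimate_translation(prev, curr)
```

    In practice, the sparsity and noise of mmWave detected objects make frame-to-frame registration much harder than with camera or lidar data, which is why capturing raw ADC data with the DCA1000EVM for offline experimentation may be the better starting point.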

    Regards,
    John
  • Ok, thanks John.