IWR6843ISK-ODS: Offline debugging

Part Number: IWR6843ISK-ODS

Hello,

The radar board we are currently using is the IWR6843ISK-ODS, and the bin file we flashed is people_counting.

We want to get the raw point cloud data from the radar and process it ourselves to track people and recognize gestures. However, the point cloud produced by the demo's default configuration does not meet our requirements, so we want to tune the CFAR parameters in the configuration file to obtain a more useful point cloud. Modifying the configuration file and observing the effect in real time is very inefficient. Our plan is therefore to set all CFAR thresholds in the configuration file to the minimum (i.e., no filtering), save the point cloud offline, and then replay the saved data to apply CFAR filtering offline and tune the CFAR parameters there.
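Concretely, the offline tuning loop we have in mind looks something like the sketch below (a minimal Python sketch, assuming we could log a pre-detection range profile for every frame; the file name and the simple 1D cell-averaging CFAR are placeholders, not the demo's actual implementation):

```python
import numpy as np

def ca_cfar_1d(power, guard=2, train=8, threshold_scale=4.0):
    """Simple 1D cell-averaging CFAR over one range profile (linear power)."""
    detections = []
    for cut in range(train + guard, len(power) - train - guard):
        # Average the training cells on both sides, skipping the guard cells
        left = power[cut - guard - train : cut - guard]
        right = power[cut + guard + 1 : cut + guard + 1 + train]
        noise = np.mean(np.concatenate((left, right)))
        if power[cut] > threshold_scale * noise:
            detections.append(cut)
    return detections

# Replay saved frames and sweep the threshold offline
frames = np.load("range_profiles.npy")   # hypothetical file, shape (num_frames, num_range_bins)
for scale in (2.0, 4.0, 8.0):
    counts = [len(ca_cfar_1d(frame, threshold_scale=scale)) for frame in frames]
    print(f"threshold_scale={scale}: mean detections/frame = {np.mean(counts):.1f}")
```

The idea is to sweep the threshold over the saved frames until the detections look reasonable, instead of re-flashing a new configuration for every trial.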

Thank you!

  • Hello

    The point cloud data from the People Counting demo may not be enough to derive gesture information.

    We would recommend using the raw data from the sensor and analyzing it to detect gesture signatures.

    Thank you,

    Vaibhav

    Sorry, maybe I didn't describe it clearly. We always use raw data from the sensor, but it doesn't meet our requirements (that is, too much noise or too many empty frames). I wonder if there is software that allows us to debug the CFAR parameters offline, that is, save the raw data offline (without filtering), then use the saved data to debug the CFAR parameters until the filtered point cloud meets our requirements.

    Thank you!

  • Hi,

    For gesture detection, there is a lab which you should check out to understand the gesture detection methodology. When you say raw data, do you mean point cloud or the raw data collected with the DCA1000?

    Regards,

    Justin

  • Hi,

    Thank you for your reply, Justin!

    1. By raw data, we mean the point cloud.

    2. The gestures we want to recognize include standing, sitting, lying down, and walking.

    Thank you!

  • Hi,

    Some clarifications here:

    1. Once you have point cloud data, you can't simulate CFAR. Point Cloud is the result of CFAR.
    2. TI provides a module (DCA1000) for collecting raw radar data. This is the data returned from the ADCs on the device. THIS IS NOT POINT CLOUD (see the sketch after this list for what that data looks like).
      1. If you are going to detect things like gestures, you will typically need to operate on data at this level.
    3. When we talk about gestures, our gesture detection demos detect gestures done with the hand, such as swiping left or swiping down, or twirling a finger.
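    As a rough illustration of point 2, the sketch below reads a DCA1000 capture and forms a range FFT, which is the kind of pre-detection data CFAR actually operates on. The file name, antenna/sample counts, and IQ interleaving are assumptions and will depend on your chirp configuration and capture settings (see TI's mmWave ADC raw data capture documentation), so treat this as a sketch rather than a reference implementation.

```python
import numpy as np

# Assumed capture configuration; adjust to match your chirp profile and channels
NUM_RX, NUM_ADC_SAMPLES, NUM_CHIRPS = 4, 256, 128

# DCA1000 writes raw 16-bit ADC samples; the file name and IQ ordering below are
# assumptions (the exact interleaving depends on the device and capture settings)
raw = np.fromfile("adc_data.bin", dtype=np.int16).astype(np.float32)
iq = raw[0::2] + 1j * raw[1::2]                       # assume alternating I, Q samples
samples_per_frame = NUM_CHIRPS * NUM_RX * NUM_ADC_SAMPLES
frame = iq[:samples_per_frame].reshape(NUM_CHIRPS, NUM_RX, NUM_ADC_SAMPLES)

# Range FFT per chirp and antenna: this is the pre-detection data CFAR runs on,
# and it cannot be recovered from the point cloud output of the demo
range_fft = np.fft.fft(frame, axis=-1)
range_power = np.abs(range_fft) ** 2
print(range_power.shape)                              # (chirps, rx antennas, range bins)
```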

    So if you want to detect standing, sitting, etc., you can do either of the following:

    1. Collect point cloud data, then use it to determine the differences in the point cloud when a person is sitting or standing, etc.
      1. You can collect point cloud data with a modified GUI application. The point cloud is streamed over the USB connection to the device, and the point cloud formats for all of our labs are documented in the user's guides (a rough parsing sketch follows after this list).
    2. Use raw data to build a classifier that detects transitions from one stance to another. 
      1. You would need DCA1000 for this.
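    For option 1, a rough idea of how the streamed output could be captured to a file and split into frames is sketched below. The magic word is the standard mmWave output header, but the frame header fields and point cloud TLV contents differ between labs and SDK versions, so the layout here is an assumption to be checked against the people counting user's guide; the capture file name is hypothetical.

```python
import struct

MAGIC = bytes((0x02, 0x01, 0x04, 0x03, 0x06, 0x05, 0x08, 0x07))  # mmWave output magic word

def split_frames(stream: bytes):
    """Split a captured UART byte stream into frames at each magic word."""
    return [MAGIC + chunk for chunk in stream.split(MAGIC)[1:]]

def parse_header(frame: bytes):
    # Frame header in the SDK out-of-box style (version, totalPacketLen, platform,
    # frameNumber, timeCpuCycles, numDetectedObj, numTLVs); the people counting
    # labs may differ, so confirm against the lab user's guide
    if len(frame) < 8 + 7 * 4:
        return None
    fields = struct.unpack_from("<7I", frame, 8)
    return {"frame": fields[3], "detected_points": fields[5], "num_tlvs": fields[6]}

with open("uart_capture.bin", "rb") as f:             # hypothetical capture file
    for frm in split_frames(f.read()):
        header = parse_header(frm)
        if header:
            print(header)
```

    Parsing of the individual point cloud TLVs would then follow the type/length entries documented in the user's guide for your specific build.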

    Finally, I want to understand why you want more points. Single-chip radar is not an imaging technology. The datasets available from non-cascaded devices will not provide enough definition to determine what a person looks like.

    Regards,

    Justin