
AWR1642BOOST: Which way to go to get better results?

Part Number: AWR1642BOOST
Other Parts Discussed in Thread: AWR1642, AWR1843BOOST
Hello! I tested the AWR1642BOOST evaluation board with the mmWave Demo Visualizer, but the results do not meet my needs.
The maximum detection range does not exceed 15 m.
Maybe I'm doing something wrong, but changing the configuration settings in the demo did not have a noticeable effect.

I understand that the Demo Visualizer is just a demo for exploring the radar.

Please tell me what I need to pay attention to in order to achieve better results.
I need to detect objects (obstacles) at the maximum possible distance, paired with a video stream so the camera can confirm the radar results.
  • Could you please clarify what "detect objects" means here:

    • Are the objects moving?
    • What type of objects?
    • Do you just want to see the objects on a GUI?

    There are several demos available. We need to understand what you are trying to achieve.

    thank you

    cesar

    • Are the objects moving? Both moving and static objects are desirable, but moving ones at a minimum.
    • What type of objects? People, cars, obstacles
    • Do you just want to see the objects on a GUI? Ideally I would like to both see the objects and retrieve their positions in space.

  • For the AWR1642, I would recommend these demos (all are found in the Resource Explorer within Code Composer Studio):

    In the mmWave Industrial toolbox:

    Lab13 - Traffic Monitoring

    Lab11 - People Counting

    In the mmWave Automotive toolbox:

    Lab03 - Vehicle Occupancy Detection

    Lab04 - Obstacle Detection (requires an AWR1642BOOST-ODS EVM for best results)

    Lab07 - Medium Range Radar (requires an AWR1843BOOST EVM).

    Two additional points that may help:

    1) This is not an imaging radar - most demos create and output a list of detected objects in XYZ space (a hypothetical sketch of such a point record appears after this reply). Some demos cluster the detected points together, which can be used to estimate a real-world object's size. Some demos (e.g. VOD) instead generate a heatmap (actually velocity data) to show position and rough size.

    2) It's all about RCS (radar cross section) - the bigger an object's RCS, the further away you can reliably detect it. You should be able to detect vehicles at 150 m, but not smaller objects like people at that range. There are several online resources that provide more detail than I can here; a rough back-of-the-envelope illustration of the scaling follows below.
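    As a minimal sketch of how RCS scales the numbers in point 2 (my own illustration, not TI data): the radar range equation makes the maximum detection range proportional to the fourth root of RCS, so with ballpark RCS values of about 10 m^2 for a car and 1 m^2 for a person, the usable range for a person is a bit more than half of that for a car. Actual ranges also depend heavily on the chirp configuration, antenna gain, and processing:

    ```c
    #include <math.h>
    #include <stdio.h>

    /* Radar range equation scaling: R_max is proportional to RCS^(1/4).
     * RCS values are rough, commonly quoted ballpark figures, not TI specs. */
    int main(void)
    {
        double rcs_car    = 10.0;   /* assumed RCS of a car, m^2                */
        double rcs_person = 1.0;    /* assumed RCS of a person, m^2             */
        double r_car      = 150.0;  /* vehicle detection range from the post, m */

        /* Scale the vehicle range by (rcs_person / rcs_car)^(1/4). */
        double r_person = r_car * pow(rcs_person / rcs_car, 0.25);

        printf("Theoretical person detection range: ~%.0f m\n", r_person); /* ~84 m */
        return 0;
    }
    ```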

      -dave
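    To make point 1 above concrete: the demos stream detected points over UART as TLVs, and each point is essentially a small record of Cartesian position and (usually) radial velocity. The struct below is a hypothetical illustration of such a record, not the SDK's actual definition; the real TLV layout differs between mmWave SDK versions and is documented with each demo:

    ```c
    #include <stdio.h>

    /* Hypothetical layout of one detected point, for illustration only.
     * The real structures are defined in the demo/SDK headers and differ
     * between SDK versions (fixed-point vs. float fields, extra metadata). */
    typedef struct {
        float x;        /* lateral position, m           */
        float y;        /* range (boresight) position, m */
        float z;        /* height, m                     */
        float velocity; /* radial velocity, m/s          */
    } DetectedPoint;

    int main(void)
    {
        /* Example: a point reported 12 m ahead, slightly to the left, moving away. */
        DetectedPoint p = { -0.5f, 12.0f, 0.2f, 1.3f };
        printf("x=%.2f y=%.2f z=%.2f v=%.2f m/s\n", p.x, p.y, p.z, p.velocity);
        return 0;
    }
    ```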

  • Good day, Dave!

    I have seen those labs in the TI toolboxes, but I am interested in something else: I want to build my own configuration and then work with the data myself.

    The solutions in the TI Resource Explorer do not suit me, or they require hardware I don't have.

  • So in that case, I would pick a demo that performs similar processing to what you want and experiment with the chirp configuration design using the Sensing Estimator: 

    https://dev.ti.com/mmWaveSensingEstimator
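    As a rough guide to what the estimator computes (my own sketch, not TI's tool): for an FMCW chirp, the maximum range is set by the highest IF/beat frequency you can sample and the chirp slope, and the range resolution by the swept bandwidth. The parameter values below are placeholders, not a validated AWR1642 configuration:

    ```c
    #include <stdio.h>

    /* Basic FMCW relations (sketch only):
     *   beat frequency      f_IF  = S * 2R / c   =>  R_max = f_IF_max * c / (2 * S)
     *   range resolution    d_res = c / (2 * B)
     * where S = chirp slope, B = swept bandwidth, c = speed of light.
     * The numbers below are illustrative placeholders. */
    int main(void)
    {
        const double c   = 3.0e8;    /* speed of light, m/s                   */
        double slope     = 30.0e12;  /* assumed chirp slope, Hz/s (30 MHz/us) */
        double if_max    = 5.0e6;    /* assumed usable IF/beat bandwidth, Hz  */
        double bandwidth = 1.5e9;    /* assumed swept RF bandwidth, Hz        */

        double r_max = if_max * c / (2.0 * slope);   /* ~25 m with these numbers   */
        double d_res = c / (2.0 * bandwidth);        /* ~0.10 m with these numbers */

        printf("R_max ~ %.1f m, range resolution ~ %.2f m\n", r_max, d_res);
        return 0;
    }
    ```

    Lowering the slope or raising the usable IF bandwidth pushes R_max out (at the cost of range resolution or a higher ADC rate), which is usually the first knob to turn when the out-of-box config tops out around 15 m.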

    Understand that no demo can support all possible chirp configurations. The first step is to get a chirp configuration that doesn't fail configuration with the RF front end (the BSS firmware). If MMWave_open, MMWave_config, or MMWave_start throws an error, it is important to find out whether the error comes from the BSS (i.e. a chirp mis-configuration) or from a check in the demo source; a sketch of that check is shown below. You can modify the demo source, but you must correct the configuration itself to make the BSS happy with your config. I recommend starting with a working config and modifying it as little as possible, step by step. Once you get the chirp running on the front end, you can make changes in the MSS and DSS code to make it work properly, starting with the 1D (range) FFTs that create the radar cube.
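    Here is a minimal sketch of that error check, modeled on how the TI demo MSS code typically handles a failed MMWave_config call. The include path, structure setup, and exact signatures vary between mmWave SDK versions, so treat this as an outline rather than drop-in code:

    ```c
    #include <stdio.h>
    #include <ti/control/mmwave/mmwave.h>  /* mmWave SDK control API; path may differ by SDK version */

    /* Sketch: configure the front end and decode any error to see whether it
     * came from the BSS (RF front end) or from the control layer/demo checks.
     * Handle creation and MMWave_CtrlCfg population are omitted. */
    static void configure_front_end(MMWave_Handle ctrlHandle, MMWave_CtrlCfg *ctrlCfg)
    {
        int32_t           errCode;
        MMWave_ErrorLevel errorLevel;
        int16_t           mmWaveError;
        int16_t           subsysError;

        if (MMWave_config(ctrlHandle, ctrlCfg, &errCode) < 0)
        {
            /* Split the packed error code into its parts. A non-zero subsystem
             * error generally means the BSS rejected the chirp/profile, while a
             * pure mmWave error points at the control layer or the demo's own
             * sanity checks. */
            MMWave_decodeError(errCode, &errorLevel, &mmWaveError, &subsysError);
            printf("MMWave_config failed: level=%d mmWaveError=%d subsysError=%d\n",
                   (int)errorLevel, (int)mmWaveError, (int)subsysError);
            return;
        }
        /* Configuration accepted; MMWave_start() can follow. */
    }
    ```

    If the decoded subsystem error is non-zero, fix the chirp/profile parameters first; only then start changing the MSS/DSS processing code.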

    There is a great deal of information related to mmWave radars on TI.com.  Here is one relevant document:

    Programming Chirp Parameters in TI Radar Devices