AWR1642BOOST: How to distinguish between objects in OOB demo

Part Number: AWR1642BOOST

Hi Team,

I'm working on the OOB demo on the AWR1642BOOST. I've already hard-coded the program, and I'm receiving UART data on pin 9 of connector J5.

Now, from the data received over UART, how can I distinguish whether a detected object is a pedestrian, a car, or a truck?

Is there any TLV that carries object-cloud details, as there is in the SRR demo?

Thanks in advance

Nithin

  • Hi Nithin,

    There is no TLV that identifies the type of a detected object. You can try using trackers and clusters.

    Regards,

    Samhitha

  • Hi Samhitha,

    Below is the enum that declares the TLV types in the OOB demo, but there is no TLV that includes details like trackers and clusters.

    typedef enum MmwDemo_output_message_type_e
    {
        /*! @brief List of detected points */
        MMWDEMO_OUTPUT_MSG_DETECTED_POINTS = 1,

        /*! @brief Range profile */
        MMWDEMO_OUTPUT_MSG_RANGE_PROFILE,

        /*! @brief Noise floor profile */
        MMWDEMO_OUTPUT_MSG_NOISE_PROFILE,

        /*! @brief Samples to calculate static azimuth heatmap */
        MMWDEMO_OUTPUT_MSG_AZIMUT_STATIC_HEAT_MAP,

        /*! @brief Range/Doppler detection matrix */
        MMWDEMO_OUTPUT_MSG_RANGE_DOPPLER_HEAT_MAP,

        /*! @brief Stats information */
        MMWDEMO_OUTPUT_MSG_STATS,

        /*! @brief List of detected points */
        MMWDEMO_OUTPUT_MSG_DETECTED_POINTS_SIDE_INFO,

        /*! @brief Samples to calculate static azimuth/elevation heatmap,
            (all virtual antennas exported) - unused in this demo */
        MMWDEMO_OUTPUT_MSG_AZIMUT_ELEVATION_STATIC_HEAT_MAP,

        /*! @brief temperature stats from Radar front end */
        MMWDEMO_OUTPUT_MSG_TEMPERATURE_STATS,

        MMWDEMO_OUTPUT_MSG_MAX
    } MmwDemo_output_message_type;

    Did you mean trackers and clusters in SRR demo?

    Thanks in advance

    Nithin

  • Hi Nithin,

    Below is the enum that declares the TLV types in the OOB demo, but there is no TLV that includes details like trackers and clusters.

    Yes, you are correct: there is no TLV that includes details like trackers and clusters. I meant that you can form clusters yourself and classify the data based on the size of each cluster.
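
    As a very rough illustration (this is not part of the OOB or SRR demo), a size-based labelling step could look like the sketch below. The ClusterExtent fields stand for whatever extents your own clustering step produces, and the thresholds are placeholders you would tune from your own measurements:

    #include <stdio.h>

    /* Hypothetical cluster extents in meters, e.g. produced by your own
     * clustering of the detected-points TLV. The struct and its field
     * names are illustrative, not from the demo. */
    typedef struct {
        float xSize;     /* cluster extent in x, meters */
        float ySize;     /* cluster extent in y, meters */
    } ClusterExtent;

    /* Very rough size-based labelling; the thresholds are placeholders
     * you would tune from your own measurements. */
    static const char *classifyBySize(const ClusterExtent *c)
    {
        float maxDim = (c->xSize > c->ySize) ? c->xSize : c->ySize;

        if (maxDim < 1.0f)
            return "pedestrian";
        if (maxDim < 5.0f)
            return "car";
        return "truck";
    }

    int main(void)
    {
        ClusterExtent c = { 0.6f, 0.4f };

        printf("label = %s\n", classifyBySize(&c));
        return 0;
    }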

    Did you mean trackers and clusters in SRR demo?

    Yes, you can refer to the SRR demo and its visualizer to understand how the objects are tracked and how clusters are formed.
    You can also refer to the E2E thread "AWR1642BOOST: Object Clustering of SRR demo on AWR1642boost" and let me know if this is helpful or not.

    Regards,

    Samhitha

  • Hi Samhitha,

    I referred to the link. So, in the GUI, we can enable the 'clustered output' of the USRR to see the results of the clustering algorithm.

    But we've already hard-coded the program, and in the output coming over the serial port there is no TLV with the clustering results, so how can I distinguish whether a detected object is a pedestrian, a car, or a truck?
     

    Thanks in advance

  • Hi Nithin,

    The objects are clustered using dBScan.

    But we've already hard-coded the program, and in the output coming over the serial port there is no TLV with the clustering results, so how can I distinguish whether a detected object is a pedestrian, a car, or a truck?

    Please refer to clusteringDBscan.c, located at short_range_radar/src/1642/dss, to understand the algorithm. Also refer to the MmwDemo_interFrameProcessing function in short_range_radar/src/1642/dss/dss_data_path.c to understand when clustering is performed.
    You can also refer to C:\ti\mmwave_sdk_03_06_00_00-LTS\packages\ti\alg\mmwavelib\test\lib_dbscanClustering_test.c.

    For more details, refer to the E2E thread "DBScan with AWR1843 OOB Demo".
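
    In case it helps while reading those files, below is a minimal, simplified sketch of the dBScan idea: grow a cluster from any point that has at least MIN_PTS neighbours within a radius EPS. It is only an illustration with placeholder parameters and toy data, not the fixed-point implementation used in clusteringDBscan.c:

    #include <math.h>
    #include <stdio.h>

    #define NPTS     6      /* number of detected points in this toy example     */
    #define EPS      1.0f   /* neighbourhood radius in meters (placeholder)      */
    #define MIN_PTS  2      /* minimum neighbours for a core point (placeholder) */

    /* Toy point cloud standing in for the detected-points output. */
    static const float px[NPTS] = { 0.0f, 0.2f, 0.3f, 5.0f, 5.1f, 9.0f };
    static const float py[NPTS] = { 0.0f, 0.1f, 0.2f, 5.0f, 5.2f, 9.0f };
    static int label[NPTS];     /* 0 = unvisited, -1 = noise, >0 = cluster id */

    /* Collect the indices of all points within EPS of point i. */
    static int neighbours(int i, int *out)
    {
        int n = 0;
        for (int j = 0; j < NPTS; j++) {
            float dx = px[i] - px[j], dy = py[i] - py[j];
            if (sqrtf(dx * dx + dy * dy) <= EPS)
                out[n++] = j;
        }
        return n;
    }

    int main(void)
    {
        int clusterId = 0;

        for (int i = 0; i < NPTS; i++) {
            if (label[i] != 0)
                continue;

            int nb[NPTS];
            if (neighbours(i, nb) < MIN_PTS) {
                label[i] = -1;                   /* too few neighbours: noise */
                continue;
            }

            /* Start a new cluster and grow it breadth-first from point i. */
            clusterId++;
            label[i] = clusterId;
            int queue[NPTS], head = 0, tail = 0;
            queue[tail++] = i;

            while (head < tail) {
                int j = queue[head++];
                int m = neighbours(j, nb);
                if (m < MIN_PTS)
                    continue;                    /* border point: keep label, don't expand */
                for (int k = 0; k < m; k++) {
                    int p = nb[k];
                    if (label[p] <= 0) {         /* unvisited, or previously marked noise */
                        if (label[p] == 0)
                            queue[tail++] = p;   /* each point is enqueued at most once */
                        label[p] = clusterId;
                    }
                }
            }
        }

        for (int i = 0; i < NPTS; i++)
            printf("point %d -> cluster %d\n", i, label[i]);
        return 0;
    }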

    Regards,

    Samhitha

  • Hi Samhitha,

    I've gone through MmwDemo_interFrameProcessing function in short_range_radar/src/1642/dss/dss_data_path.c. 

    1) Now my doubt is: how are the dBScan results sent out and parsed over the UART?

    Because in C:\ti\radar_toolbox_1_30_01_03\tools\visualizers\SRR_GUI\read_file_and_plot_object_location_with_persistance, there is a cluster TLV (mentioned as TLV 2), as below:

                   
            for tlvIdx = 1:Header.numTLVs
                [tlv, byteVecIdx] = getTlv(bytevec_cp_flt, byteVecIdx);
                switch tlv.type
                    case MMWDEMO_UART_MSG_DETECTED_POINTS
                        if tlv.length >= OBJ_STRUCT_SIZE_BYTES
                            [detObj, byteVecIdx] = getDetObj(bytevec_cp_flt, ...
                                    byteVecIdx, ...
                                    tlv.length);
                        end
                    case MMWDEMO_UART_MSG_CLUSTERS
                        if tlv.length >= CLUSTER_STRUCT_SIZE_BYTES
                            [clusterObj, byteVecIdx] = getClusters(bytevec_cp_flt, ...
                                    byteVecIdx, ...
                                    tlv.length);
                        end
                    case MMWDEMO_UART_MSG_TRACKED_OBJ
                        if tlv.length >= TRACKER_STRUCT_SIZE_BYTES
                            [trackedObj, byteVecIdx] = getTrackers(bytevec_cp_flt, byteVecIdx, tlv.length);
                        end
                       
                    case MMWDEMO_UART_MSG_PARKING_ASSIST
                        [parkingAssistRangeBins, byteVecIdx] = getParkingAssistBins(bytevec_cp_flt, byteVecIdx, tlv.length);
                       
                    case MMWDEMO_UART_MSG_STATS
                        [StatsInfo, byteVecIdx] = getStatsInfo(bytevec_cp_flt, byteVecIdx);
                         %fprintf('StatsInfo: %d, %d, %d %d \n', StatsInfo.interFrameProcessingTime, StatsInfo.transmitOutputTime, StatsInfo.interFrameProcessingMargin, StatsInfo.interChirpProcessingMargin);
                         displayUpdateCntr = displayUpdateCntr + 1;
                         interFrameCPULoad = [interFrameCPULoad(2:end); StatsInfo.interFrameCPULoad];
                         activeFrameCPULoad = [activeFrameCPULoad(2:end); StatsInfo.activeFrameCPULoad];
                         guiCPULoad = [guiCPULoad(2:end); 100*guiProcTime/Params(1).frameCfg.framePeriodicity];
                         if displayUpdateCntr == 40
                            UpdateDisplayTable(Params);
                            displayUpdateCntr = 0;
                         end
                    otherwise
                end
            end

            byteVecIdx = Header.totalPacketLen;

    2) When I observed the hard-coded output, I'm getting 4 TLVs most of the time. I think the second TLV I'm getting contains the cluster details; am I right?

    Thanks in advance

    Nithin

  • Hi Nithin,

    To understand how the clustering output is sent over UART, please go through the code. You can check the SRR_DSS_SendProcessOutputToMSS function in dss_main.c.

    When I observed the hard-coded output, I'm getting 4 TLVs most of the time. I think the second TLV I'm getting contains the cluster details; am I right?

    Check the TLV type field to decide which TLV corresponds to MMWDEMO_UART_MSG_CLUSTERS; the type should be 2 in this case.
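
    If it helps, below is a rough host-side sketch of walking the TLVs in one received frame and picking out type 2. It assumes you already have one complete frame in a byte buffer and that each TLV starts with a 32-bit type and a 32-bit length; HEADER_LEN, and whether the length field includes the 8-byte TLV header, are assumptions you should confirm against the SRR demo source and the MATLAB parser:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Placeholder values: confirm both against the SRR demo sources. */
    #define HEADER_LEN          40u    /* size of the frame header that precedes the TLVs */
    #define UART_MSG_CLUSTERS    2u    /* TLV type observed for cluster output            */

    /* Generic TLV header: 32-bit type followed by 32-bit length. */
    typedef struct {
        uint32_t type;
        uint32_t length;
    } TlvHeader;

    /* Walk all TLVs in one frame buffer and report the cluster TLV if present.
     * numTLVs is taken from the frame header of that frame. */
    static void scanFrame(const uint8_t *frame, uint32_t frameLen, uint32_t numTLVs)
    {
        uint32_t offset = HEADER_LEN;

        for (uint32_t t = 0; t < numTLVs; t++) {
            TlvHeader tlv;
            if (offset + sizeof(tlv) > frameLen)
                break;                           /* truncated frame */
            memcpy(&tlv, frame + offset, sizeof(tlv));
            offset += sizeof(tlv);

            printf("TLV %u: type=%u length=%u\n",
                   (unsigned)t, (unsigned)tlv.type, (unsigned)tlv.length);
            if (tlv.type == UART_MSG_CLUSTERS) {
                /* frame + offset now points at the cluster payload */
                printf("  -> cluster TLV found\n");
            }

            /* NOTE: whether tlv.length already includes these 8 header bytes
             * differs between demos; confirm against the SRR source and the
             * MATLAB parser, and adjust this skip accordingly. */
            offset += tlv.length;
        }
    }

    int main(void)
    {
        /* Tiny synthetic frame: an all-zero header followed by one empty TLV of type 2. */
        uint8_t frame[HEADER_LEN + sizeof(TlvHeader)] = { 0 };
        TlvHeader tlv = { UART_MSG_CLUSTERS, 0u };

        memcpy(frame + HEADER_LEN, &tlv, sizeof(tlv));
        scanFrame(frame, sizeof(frame), 1u);
        return 0;
    }
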
    You can also see the Short Range Radar Reference Design Using AWR1642 (Rev. B) documentation on ti.com for more information regarding the SRR application.

    Regards,

    Samhitha

  • Hi Samhitha,

    I've gone through the SRR_DSS_SendProcessOutputToMSS function in dss_main.c and I found that:

    the TLV type corresponding to MMWDEMO_UART_MSG_CLUSTERS is 2,

    the TLV type corresponding to MMWDEMO_OUTPUT_MSG_PARKING_ASSIST is 4, and

    the TLV type corresponding to MMWDEMO_OUTPUT_MSG_TRACKED_OBJECTS is 3.

    While I'm testing with a moving car, I'm getting TLVs 1, 2 & 4.

    1) Now I want to find out whether there is any moving cluster inside the radar range. If I get TLV 3, it would be easy and convenient because it includes:

    the tracking output x coordinate,

    the tracking output y coordinate,

    velocity in the x direction,

    velocity in the y direction,

    cluster size (x direction), and

    cluster size (y direction).

    But when I inspected the time-stamped data, I'm not getting TLV 3. Is there any additional configuration required to get TLV 3 in the output?

  • Hi Nithin,

    Did you observe any trackers in the visualizer? If you are using the max-vel enhancement processing path, then you will get the tracked-objects TLV.
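
    For when TLV 3 does appear, below is a rough sketch of how one tracker report in its payload could be mirrored on the host. The field order follows the list in your previous post, but the int16_t widths and the Q7 (divide-by-128) scaling are assumptions; please confirm them against the tracker output structure in the SRR sources and the GUI's getTrackers parser:

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical host-side mirror of one tracker report in the TLV-3 payload.
     * The field order follows the list above; the int16_t widths and the Q7
     * (divide-by-128) scaling are assumptions to verify against the SRR demo
     * sources and the MATLAB getTrackers parser. */
    typedef struct {
        int16_t x;        /* tracked x coordinate, fixed point            */
        int16_t y;        /* tracked y coordinate, fixed point            */
        int16_t xd;       /* velocity in the x direction, fixed point     */
        int16_t yd;       /* velocity in the y direction, fixed point     */
        int16_t xSize;    /* cluster size in the x direction, fixed point */
        int16_t ySize;    /* cluster size in the y direction, fixed point */
    } TrackerReport;

    /* Convert a fixed-point field to meters (or m/s), assuming Q7 scaling. */
    static float toFloatQ7(int16_t v)
    {
        return (float)v / 128.0f;
    }

    int main(void)
    {
        TrackerReport r = { 256, 512, -64, 0, 128, 256 };    /* made-up values */

        printf("x=%.2f m, y=%.2f m, vx=%.2f m/s, vy=%.2f m/s, size=(%.2f x %.2f) m\n",
               toFloatQ7(r.x), toFloatQ7(r.y), toFloatQ7(r.xd), toFloatQ7(r.yd),
               toFloatQ7(r.xSize), toFloatQ7(r.ySize));
        return 0;
    }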

    Regards,

    Samhitha

  • Hi Samhitha,

    In the visualizer, I observed trackers by enabling them, but in the output from the hard-coded system I can't see any tracker TLV, i.e. TLV 3. Why?

    Also, what is the max-vel enhancement processing path?

    Thanks in advance

    Nithin

  • Hi Nithin,

    If you are able to see the trackers in the visualizer, then the TLV of type 3 is also transmitted over UART. Can you please check again? 

    Also, what is the max-vel enhancement processing path?

    The chirp designs of SRR80 and USRR20 are different, so two different 'processing paths' are used to process them: USRR20 is processed by the POINT_CLOUD_PROCESSING path, and SRR80 is processed by the MAX_VEL_ENH_PROCESSING path.

    Regards,

    Samhitha