
IWR6843: Counting number of people in the frame from the TLV packets

Part Number: IWR6843

I am using the IWR6843ISK EVM and have flashed the 3D people counting binary, which has a detection range of 14 m. I am able to visualize the number of people within 14 m using the demo visualizer.

My requirement is to send the number of people detected in each frame to the cloud. I am able to read the stream of UART data sent by the EVM and have tried to decode it according to the TLV data format described in the user's guide. But how do I calculate the number of people in the scene from the point cloud information? Is an object classification algorithm (detecting objects such as human beings from the point cloud) already running on the DSP of the IWR6843? If yes, is that information transmitted over UART? That would be ideal for me, because then I would only need to decode the UART data stream to count the number of people.

Please help.

Thanks and Regards,

Sritam Paltasingh.

  • Hello

    Information about tracks is included in the TLV data format.

    Please check the user's guide and let us know. Tracks are assigned per person based on the detected points. We would also like to know which documents you have looked at and why this information was not clear from them; your feedback will be helpful.

    Thank you,

    Vaibhav

  • Hi Vaibhav,

    I am using the link below to understand the TLV data format in order to count human beings:

    https://dev.ti.com/tirex/explore/node?node=AHoEHc6j2uX.XSbB65w0Tg__VLyFKFf__LATEST

    Among the TLV types there is a type called TARGET_LIST_3D.

    Does this TLV type (TARGET_LIST_3D) contain information about the various detected targets derived from the point cloud? My understanding is that a target is classified (say, as a human being) based on the tracking of a group of many points associated with that target. Please let me know whether this understanding is correct. In my case I am interested in counting human beings. Will this TLV type (TARGET_LIST_3D) help me count the number of human beings in the frame? If yes, I can do it by counting the number of unique track IDs (tid) in the received UART data packets.

    Another problem I face is that I cannot find the magicWord sync pattern in the stream of UART hex bytes that I receive. I have attached a CSV file containing all the bytes coming from the UART data port of the mmWave sensor; the capture ran for 3 minutes. I searched it for the magicWord sync pattern (02 01 04 03 06 05 08 07) but could not find it. My procedure is: first I start the EVM using the mmWave Demo Visualizer and then close it (UART data keeps arriving); then I run my simple Python code to read the UART data and store it in the CSV file; finally I analyse the CSV file looking for the magicWord sync pattern. Below is the code I use:

    import serial

    # Open the serial port the EVM enumerates on
    ser = serial.Serial(
        port='COM4',
        baudrate=115200,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        bytesize=serial.EIGHTBITS,
        timeout=None,
    )
    ser.flushInput()

    f = open("mmWaveLogData.csv", 'a')  # append in text mode
    counter = 0

    while True:
        try:
            in_hex = ser.read().hex()  # read one byte and format it as hex
            f.write(in_hex)
            f.write(" ")
            counter = counter + 1
            if counter >= 16:          # start a new line every 16 bytes
                f.write("\n")
                counter = 0
            print(in_hex)
        except KeyboardInterrupt:      # Ctrl+C ends the capture cleanly
            print("Keyboard Interrupt")
            f.close()
            ser.close()
            break
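    One thing worth checking: in the people counting demos the data (AUX) UART typically runs at 921600 baud, while 115200 is the rate of the configuration port, so a capture taken at the wrong baud rate will never contain the sync pattern. Assuming the capture is read back as raw bytes (rather than the hex text the script above writes), the magic word can be located with a sketch like this:

```python
# Magic word used by the mmWave demo output packets
MAGIC_WORD = bytes([0x02, 0x01, 0x04, 0x03, 0x06, 0x05, 0x08, 0x07])

def find_frames(stream: bytes):
    """Return the byte offsets of every frame start (magic word) in a raw capture."""
    offsets = []
    pos = stream.find(MAGIC_WORD)
    while pos != -1:
        offsets.append(pos)
        pos = stream.find(MAGIC_WORD, pos + 1)
    return offsets

# Synthetic example: two frames with filler bytes around them
capture = b'\x00\x11' + MAGIC_WORD + b'payload-1' + MAGIC_WORD + b'payload-2'
print(find_frames(capture))  # -> [2, 19]
```

    If no offsets come back from a real capture, re-checking the baud rate and reading the port in binary mode are the first things to try.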

    Looking forward to your help and guidance.

    Thanks and Regards,

    Sritam Paltasingh.

    mmWaveLogData.csv

  • Hi Sritam,

    The tracker code running on the device already sends the target information, which is used to construct the TLV in the first place. The TLV is constructed from the number of targets reported by the tracker in the function TrackerDemo_transmitProcessedOutput in mss_main.c, as shown below. You can read the length field of the TARGET_LIST TLV header and divide it by the size of the target structure to get the number of targets; you can do this in your visualizer. An even simpler way is to increment a loop counter while parsing this TLV, and that count is your number of targets:

    The TLV header is constructed at line 1714:

        if(result->numTargets) {
    
            /* Point cloud */
            tl[tlvIdx].type = 7;//TRACKERPROC_OUTPUT_TARGET_LIST;
            tl[tlvIdx].length = sizeof(trackerProc_Target) * result->numTargets;
            packetLen += sizeof(MmwDemo_output_message_tl) + tl[tlvIdx].length;
            outputMessage.numTLVs += 1;
            tlvIdx++;
        }

    and the TLV is sent out (including the header above) below

        if(result->numTargets > 0) {
            /* If any targets tracked, send send target List TLV  */
            UART_writePolling (uartHandle,
                               (uint8_t*)&tl[tlvIdx],
                               sizeof(MmwDemo_output_message_tl));
            Task_sleep(1);
    
            UART_writePolling (uartHandle, (uint8_t *)(tList),
                        sizeof(trackerProc_Target) * result->numTargets);
            
            tlvIdx++;
            
            Task_sleep(1);
        }
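    On the host side, the same arithmetic takes only a few lines. A sketch (the TLV type value 7 comes from the code above; the 40-byte target size is an assumption and must be checked against the trackerProc_Target definition in the demo source you are actually running, since the struct differs between demo versions):

```python
import struct

# TLV type 7 matches the code above; the struct size is an ASSUMPTION --
# verify sizeof(trackerProc_Target) for your SDK/lab version.
TLV_TARGET_LIST = 7
TARGET_STRUCT_SIZE = 40

def count_targets(payload: bytes) -> int:
    """Walk the TLVs in one frame payload and count tracked targets."""
    n_targets = 0
    pos = 0
    while pos + 8 <= len(payload):
        # Each TLV starts with a type/length header (two uint32, little-endian);
        # the length field excludes the header, as in the C code above.
        tlv_type, tlv_len = struct.unpack_from('<II', payload, pos)
        if tlv_type == TLV_TARGET_LIST:
            n_targets += tlv_len // TARGET_STRUCT_SIZE
        pos += 8 + tlv_len  # skip header + value
    return n_targets

# Synthetic example: one target-list TLV carrying two (zero-filled) targets
body = bytes(2 * TARGET_STRUCT_SIZE)
frame = struct.pack('<II', TLV_TARGET_LIST, len(body)) + body
print(count_targets(frame))  # -> 2
```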

    You can also refer to, or re-use, the source of the people counting Python visualizer, which works with the long range lab as well (make sure to select the correct demo type in the drop-down):

    C:\ti\mmwave_industrial_toolbox_4_4_1\labs\people_counting\visualizer

    Regards

    -Nitin

  • Hi Nitin,

    Thank you for the valuable information. I now use the visualizer Python code on my PC to read the TARGET_LIST_3D TLV in order to detect the number of human beings in the frame. I am able to count them and also track their positions. I use oob_parser.py to read the TARGET_LIST_3D TLV and get the number of human beings detected in the frame.

    I browsed the mss_main.c taken from "C:\ti\mmwave_industrial_toolbox_4_4_1\labs\people_counting\68xx_3D_people_counting\src\mss\mss_main.c" and could not find the code lines you indicated in your previous reply, but I found something similar in which the header is created and the tracker output is sent: the detected objects are sent from line 671 and the tracker output from line 682. I am attaching my copy of mss_main.c so that you can refer to it.

    I have the following technical doubts:

    1. Is the number of simultaneously tracked human beings restricted to 20 per frame? We are doing this project for a municipality that wants to count people in an outdoor area; they would like the mmWave sensor's range to be 50 metres, and there can certainly be more than 20 people in the frame.

    2. I tested the 3D people counting lab demo inside my room and it works pretty well. Then I flashed "long_range_people_det_68xx_demo.bin" onto the IWR6843ISK and loaded the matching config file "people_detection_and_tracking_50m_2D.cfg" from the visualizer, but the results were quite bad: I was the only person in the room, yet the visualizer showed more than 4 people detected. Will my results improve if I take the sensor outdoors rather than testing it indoors with "long_range_people_det_68xx_demo.bin"? Please advise me on how to use the long range people detection lab so that it correctly counts and tracks the humans in the frame.

    Thanks and Regards,

    Sritam Paltasingh

    5518.mss_main.c

  • Hello

    For #1) Please see the cfg file parameters specific to this demo. One of the arguments of trackingCfg sets the track limit. Always make sure enough memory is allocated to accommodate the tracks.

    For #2) Yes, please try it in an open scenario to get results similar to what we have demonstrated.

    Thank you,

    Vaibhav

  • Hello Vaibhav,

    Thank you for the guidance. For indoor people counting I use "ISK_14m_extended.cfg" (3D_people_count_68xx_demo.bin flashed to the EVM) and for outdoor people counting I use "people_detection_and_tracking_50m_2D.cfg" (long_range_people_det_68xx_demo.bin flashed to the EVM). I am attaching the config files for your reference.

    I have the following doubts:

    1. In the config files there is a parameter called "trackingCfg", and its fourth argument has the value 20 in both of the config files I have attached. Is that the argument (argument number 4) I need to set to 50 in order to track 50 people simultaneously in the frame?

    2. How do I make sure that enough memory is allocated to accommodate the tracks? Is there a parameter in the config file that I have to set, or do I need to modify the MSS or DSS source code?

    3. Is there any document which explains and describes the meanings of the parameters and their arguments present in the config files?

    4. For the long range outdoor people detection and tracking there are 2 config files named "people_detection_and_tracking_50m_2D.cfg" and "people_detection_and_tracking_50m_3D.cfg". Which one should I use when I test my sensor setup outdoors?

    Please help us; this project and the mmWave sensors are of great value to us.

    Thanks and Regards,

    Sritam Paltasingh.

    3644.ISK_14m_extended.cfg

    5545.people_detection_and_tracking_50m_2D.cfg

  • Hi Sritam,

    1. In the config files there is a parameter called "trackingCfg", and its fourth argument has the value 20 in both of the config files I have attached. Is that the argument (argument number 4) I need to set to 50 in order to track 50 people simultaneously in the frame?

    2. How do I make sure that enough memory is allocated to accommodate the tracks? Is there a parameter in the config file that I have to set, or do I need to modify the MSS or DSS source code?

    Yes, this value configures the maximum number of tracks allocated by the tracker. Based on this value, the code also allocates the required data structures (memory) at start-up. We have not tested setting this value to 50 tracks and cannot comment on whether it will work from a memory, MIPS, or tracking perspective.
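    For illustration, a hedged sketch of adjusting that fourth argument programmatically (the numeric values in the example line are placeholders, not taken from a real config; only the position of the max-tracks argument follows from the answer above):

```python
def set_max_tracks(cfg_line: str, max_tracks: int) -> str:
    """Rewrite the fourth argument of a trackingCfg line (the track limit)."""
    parts = cfg_line.split()
    if parts[0] != 'trackingCfg':
        raise ValueError('not a trackingCfg line')
    parts[4] = str(max_tracks)  # parts[0] is the command name, so argument 4 is index 4
    return ' '.join(parts)

# Placeholder argument values -- check your own cfg file for the real ones
line = 'trackingCfg 1 2 800 20 400 160 50'
print(set_max_tracks(line, 50))  # -> trackingCfg 1 2 800 50 400 160 50
```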

    3. Is there any document which explains and describes the meanings of the parameters and their arguments present in the config files?

    Please refer to the following documents for a general description of the tracker parameters. Note that some of the commands have changed over time (for example, sceneryParams has been replaced by boundaryBox) and new commands have been added, so it is best to refer to the CLI source in the particular demo.

    People Counting and Tracking Reference Design Using mmWave Radar Sensor (Refer to the design guide)

    Tracker PDF Document provided in C:\ti\mmwave_sdk_03_04_00_03\packages\ti\alg\gtrack\docs

    4. For the long range outdoor people detection and tracking there are 2 config files named "people_detection_and_tracking_50m_2D.cfg" and "people_detection_and_tracking_50m_3D.cfg". Which one should I use when I test my sensor setup outdoors?

    The results for both the 2D and 3D configurations are provided in the long range lab user's guide. For the best range, you should use the TX beamforming chirp, which uses all 3 transmitters, i.e. people_detection_and_tracking_100m_2D_advanced.cfg. Please refer to the table in the lab user's guide for more details.

    Regards

    -Nitin