IWR6843AOPEVM: IWR6843AOPEVM

Part Number: IWR6843AOPEVM

Hi,

I am performing a building occupancy experiment using an IWR6843EVM ceiling mount.

When I installed the board in the open area, it performed as expected and gave good results.

But when I installed the same board in a doorway next to a glass door, I think the glass door caused multiple reflections, and because of this we are getting errors in the entry and exit counts.

I have tried increasing snrThre and pointsThre, but sometimes I get double counts for entry and exit, and sometimes it misses the entry or exit altogether.

So if anyone has dealt with this kind of scenario, kindly help.

  • Hi,

    Which binary have you flashed the device with? Do you mind sharing the .cfg file you are sending to the device? In the meantime, I would take a look at the 3D People Counting Tracker Tuning Guide:

    https://dev.ti.com/tirex/explore/content/radar_toolbox_1_00_01_07/source/ti/examples/People_Counting/docs/3D_people_counting_tracker_layer_tuning_guide.pdf 

    Thanks,

    Tim

  • Hi Tim,

    I have flashed the device with low bandwidth overhead_3d_people_counting binary from industrial toolbox 4.10.1.
    (C:\ti\mmwave_industrial_toolbox_4_10_1\labs\People_Counting\overhead_3d_people_counting\prebuilt_binaries\low_bandwidth)

    The configuration sent to the device is:

    sensorStop
    flushCfg
    dfeDataOutputMode 1
    channelCfg 15 7 0
    adcCfg 2 1
    adcbufCfg -1 0 1 1 1
    lowPower 0 0
    profileCfg 0 61.2 60.00 17.00 50 328965 0 55.27 1 64 2000.00 2 1 36
    chirpCfg 0 0 0 0 0 0 0 1
    chirpCfg 1 1 0 0 0 0 0 2
    chirpCfg 2 2 0 0 0 0 0 4
    frameCfg 0 2 224 0 110.00 1 0
    dynamicRACfarCfg -1 10 1 1 1 8 8 6 4 4.00 6.00 0.50 1 1
    staticRACfarCfg -1 4 4 2 2 8 16 4 6 6.00 13.00 0.50 0 0
    dynamicRangeAngleCfg -1 7.000 0.0010 2 0
    dynamic2DAngleCfg -1 5 1 1 1.00 15.00 2
    staticRangeAngleCfg -1 0 1 1
    %classifierCfg 1 1 1 500 0.6 1.0 0.95 10
    antGeometry0 0 0 -1 -1 -2 -2 -3 -3 -2 -2 -3 -3
    antGeometry1 0 -1 -1 0 0 -1 -1 0 -2 -3 -3 -2
    antPhaseRot 1 -1 -1 1 1 -1 -1 1 1 -1 -1 1
    fovCfg -1 64.0 64.0
    compRangeBiasAndRxChanPhase 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0
    boundaryBox -2 2 -3 3 -0.5 3
    presenceBoundaryBox -2.5 2.5 -3 3 0.5 2.5
    staticBoundaryBox -1.75 1.75 -2.5 2.5 -0.5 3.0
    sensorPosition 2.7 0 90
    maxAcceleration 1 1 1
    trackingCfg 1 4 800 20 37 33 110 1
    gatingParam 3 1.5 1.5 2 4
    stateParam 3 3 6 20 3 50
    allocationParam 20 20 0.05 15 1.5 20
    sensorStart

    Thanks and regards,
    Samidha

  • Hi Samidha,

    The three parameters you will want to focus on for entry and exit are gatingParam, stateParam, and allocationParam.

    For gatingParam: I noticed you changed the width/depth/height of the target. Are you trying to track humans? If so, I would recommend changing these back to the default values.

    For stateParam: I see you lowered the sleep2freeThre. I would also recommend lowering the active/static/exit2freeThre values.

    For allocationParam: Modifying parameters 1, 3, and 5 here may also improve performance.

    Further information about these parameters can be found in the guide mentioned above.
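
    For quick reference, here is how I read those three lines from your config (the field names are my interpretation of the tracker tuning guide, so treat the comments as a guide rather than as authoritative):

    % gatingParam <gain> <max width m> <max depth m> <max height m> <max velocity m/s>
    gatingParam 3 1.5 1.5 2 4
    % stateParam <det2actThre> <det2freeThre> <active2freeThre> <static2freeThre> <exit2freeThre> <sleep2freeThre>
    stateParam 3 3 6 20 3 50
    % allocationParam <snrThre> <snrThreObscured> <velocityThre m/s> <pointsThre> <maxDistanceThre m> <maxVelThre m/s>
    allocationParam 20 20 0.05 15 1.5 20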

    Hope this helps,

    Tim

  • Hi Tim,

    I have modified the configuration as per your suggestion, but I am still not getting satisfactory results.
    Whenever the door is closed, I am not getting point cloud data on the entry side; the waves seem to reflect back off the door, which is why the OUT count is incremented in place of the IN count.

  • Hi Samidha,

    Could you provide a picture of your current setup so I can better understand the space you're working in?

    Also, have you adjusted the boundaryBox parameters in the tracking configuration? In reflective environments like this, you may get better results by adjusting the boundary box to be a little smaller than the actual physical dimensions of the space. You can also experiment with making the box a little larger.
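
    As a rough illustration only (the numbers below are placeholders derived from your current line, not a recommendation), pulling each x/y edge in by about 0.25 m would look like this:

    % boundaryBox <xMin> <xMax> <yMin> <yMax> <zMin> <zMax>  (meters)
    % current:
    boundaryBox -2 2 -3 3 -0.5 3
    % example, slightly smaller than the physical space:
    boundaryBox -1.75 1.75 -2.75 2.75 -0.5 3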

    Regards,

    Tim 

  • Hi Tim,

    Please find the video and photos of my current setup below.




    My observations:
    Scenario 1 - Whenever we include the glass door in our FOV, there is ghosting when we open the door and the actual target disappears.
    Scenario 2 - Whenever we set the FOV inside the door (i.e. excluding the door), we get the Exit but the Entry is missing.

    My current configuration (2nd scenario, FOV excluding the door):

                "sensorStop""sensorStop \n",
                "flushCfg""flushCfg \n",
                "dfeDataOutputMode""dfeDataOutputMode 1 \n",
                "channelCfg""channelCfg 15 7 0 \n",
                "adcCfg""adcCfg 2 1 \n",
                "adcbufCfg""adcbufCfg -1 0 1 1 1 \n",
                "lowPower""lowPower 0 0 \n",
                "profileCfg""profileCfg 0 61.2 60.00 17.00 50 328965 0 55.27 1 64 2000.00 2 1 36 \n",
                "chirpCfg1""chirpCfg 0 0 0 0 0 0 0 1 \n",
                "chirpCfg2""chirpCfg 1 1 0 0 0 0 0 2 \n",
                "chirpCfg3""chirpCfg 2 2 0 0 0 0 0 4 \n",
                "frameCfg""frameCfg 0 2 224 0 110.00 1 0 \n",
                "dynamicRACfarCfg""dynamicRACfarCfg -1 10 1 1 1 8 8 6 4 4.00 6.00 0.50 1 1 \n",
                "staticRACfarCfg""staticRACfarCfg -1 4 4 2 2 8 16 4 6 6.00 13.00 0.50 0 0 \n",
                "dynamicRangeAngleCfg""dynamicRangeAngleCfg -1 7.000 0.0010 2 0 \n",
                "dynamic2DAngleCfg""dynamic2DAngleCfg -1 5 1 1 1.00 15.00 2 \n",
                "staticRangeAngleCfg""staticRangeAngleCfg -1 0 1 1 \n",
                "classifierCfg""%classifierCfg 1 1 1 500 0.6 1.0 0.95 10",
                "antGeometry0""antGeometry0 0 0 -1 -1 -2 -2 -3 -3 -2 -2 -3 -3 \n",
                "antGeometry1""antGeometry1 0 -1 -1 0 0 -1 -1 0 -2 -3 -3 -2 \n",
                "antPhaseRot""antPhaseRot 1 -1 -1 1 1 -1 -1 1 1 -1 -1 1 \n",
                "fovCfg""fovCfg -1 64.0 64.0 \n",
                "compRangeBiasAndRxChanPhase""compRangeBiasAndRxChanPhase 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 \n",

                "boundaryBox""boundaryBox -0.4 0.8 -0.4 2 -0.5 3 \n",
                "presenceBoundaryBox""presenceBoundaryBox 0 0.6 0 2 0.5 2.5 \n",
                "staticBoundaryBox""staticBoundaryBox 0 0.6 0 2 -0.5 3.0 \n",
                "sensorPosition""sensorPosition 2.5 0 90 \n",
                
                "maxAcceleration""maxAcceleration 1 1 1 \n",
                "trackingCfg""trackingCfg 1 4 800 20 37 33 110 0 \n",
                "sensorStart""sensorStart \n",

                "gatingParam""gatingParam 3 1.5 1.5 2.5 4 \n",
                "stateParam""stateParam 1 1 6 20 2 50 \n",
                "allocationParam""allocationParam 10 20 0.05 7 1.5 20 \n"


    Thanks and Regards,
    Samidha

  • Hi Samidha,

    When you mention FOV, which parameters exactly are you changing? The boundary boxes? Have you tried changing the fovCfg parameter? Right now you are going out 64° in each direction in both azimuth and elevation. I would experiment with lowering that value gradually, testing the results incrementally.
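
    To make sure we are talking about the same knob, this is the line I mean; as I read it, the two angles are the azimuth and elevation FOV limits in degrees applied to each side of boresight, so something like the step below would be a reasonable first test before going lower:

    % fovCfg <subFrameIdx> <azimuth FOV deg> <elevation FOV deg>
    % current - detected points beyond +/-64 degrees are discarded:
    fovCfg -1 64.0 64.0
    % example first step, then re-test:
    fovCfg -1 60.0 60.0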

    If possible, it would be nice to see a video of the results in the visualizer to further see what issues you are having.

    Regards,

    Tim

  • Hi Tim,


    I have tried changing the fovCfg parameters, but it did not help much.

    With my present configuration, I have observed that when a person walks slower than normal walking speed, the sensor detects the Entry and Exit with better accuracy. As we gradually increase the walking speed, the accuracy deteriorates (more so for the Entry, when we open the door to come in).

    Present Configuration:

    "sensorStop""sensorStop \n",
                "flushCfg""flushCfg \n",
                "dfeDataOutputMode""dfeDataOutputMode 1 \n",
                "channelCfg""channelCfg 15 7 0 \n",
                "adcCfg""adcCfg 2 1 \n",
                "adcbufCfg""adcbufCfg -1 0 1 1 1 \n",
                "lowPower""lowPower 0 0 \n",
                "profileCfg""profileCfg 0 61.2 60.00 17.00 50 328965 0 55.27 1 64 2000.00 2 1 36 \n",
                "chirpCfg1""chirpCfg 0 0 0 0 0 0 0 1 \n",
                "chirpCfg2""chirpCfg 1 1 0 0 0 0 0 2 \n",
                "chirpCfg3""chirpCfg 2 2 0 0 0 0 0 4 \n",
                "dynamicRACfarCfg""dynamicRACfarCfg -1 10 1 1 1 8 8 6 4 4.00 6.00 0.50 1 1 \n",
                "staticRACfarCfg""staticRACfarCfg -1 4 4 2 2 8 16 4 6 6.00 13.00 0.50 0 0 \n",
                "dynamicRangeAngleCfg""dynamicRangeAngleCfg -1 7.000 0.0010 2 0 \n",
                "dynamic2DAngleCfg""dynamic2DAngleCfg -1 5 1 1 1.00 15.00 2 \n",
                "staticRangeAngleCfg""staticRangeAngleCfg -1 0 1 1 \n",
                "antGeometry0""antGeometry0 0 0 -1 -1 -2 -2 -3 -3 -2 -2 -3 -3 \n",
                "antGeometry1""antGeometry1 0 -1 -1 0 0 -1 -1 0 -2 -3 -3 -2 \n",
                "antPhaseRot""antPhaseRot 1 -1 -1 1 1 -1 -1 1 1 -1 -1 1 \n",
                "fovCfg""fovCfg -1 60.0 60.0 \n",
                "compRangeBiasAndRxChanPhase""compRangeBiasAndRxChanPhase 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 \n",

                "frameCfg""frameCfg 0 2 224 0 110.00 1 0 \n",

                "boundaryBox""boundaryBox -0.7 0.7 -1.5 2 -0.5 3 \n",
                "presenceBoundaryBox""presenceBoundaryBox -0.7 0.7 -1.5 2 0.5 2.5 \n",
                "staticBoundaryBox""staticBoundaryBox -0.5 0.5 -1 1.8 -0.5 3.0 \n",
                "sensorPosition""sensorPosition 2.4 0 90 \n",
                
                "maxAcceleration""maxAcceleration 1 1 1 \n",
                "trackingCfg""trackingCfg 1 4 800 20 37 33 110 0 \n",
                "sensorStart""sensorStart \n",

                "gatingParam""gatingParam 3 2 2 3 4 \n",
                "stateParam""stateParam 1 2 3 10 1 25 \n",
                "allocationParam""allocationParam 10 15 0.05 5 1 20 \n"


    I am still working on the visualization part and will send it to you once it is ready.

     

  • Hi,

    A video would be great to see how the device is performing. A couple of other things:

    1. In your config, sensorStart is not at the end - the device is likely missing the last 3 (important) parameters.

    2. One thing that might cause it to perform worse when walking at a higher speed is that it usually takes a little time to allocate the track. If you made the bounding box larger, as shown below in red, it might give the processor more time to allocate the track.
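
    As a back-of-the-envelope illustration (the 1.5 m/s walking speed is an assumption on my part, not a measurement):

    frame period        = 110 ms
    distance per frame  ~ 1.5 m/s x 0.110 s ≈ 0.17 m
    extra 0.5 m of box  ~ 0.5 / 0.17 ≈ 3 extra frames of point cloud

    Those extra frames give the allocation and detect-to-active logic a few more chances to pick the person up before they reach the doorway.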

      

    Regards,

    Tim

  • Hi Tim,

    The sequence of config parameters sent to the IWR processor is correct, so no parameters are missed.
    What I sent you in my previous reply is just the template from which my controller sends the parameters to the IWR sequentially, as shown below.

    As for enlarging the boundary box (the red part of your figure), that is not possible, since there is another glass door at that end; I have already set the maximum limit there.

    Is there any way we can decrease the time the processor takes to allocate the track, either by decreasing the frame periodicity or through some velocity parameter that I might be missing?

    With the current settings (mentioned in my previous reply), I have tried decreasing the frame periodicity from 110 to 60 to increase the frame rate. When I do that, I don't even get an output frame; it just stops working after accepting all the configuration commands.
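
    A rough timing check on my side (assuming each chirp takes roughly idleTime + rampEndTime from profileCfg, and ignoring inter-chirp and processing overheads) suggests 60 ms may simply be too short for this chirp configuration:

    chirps per frame   = 3 chirps/loop x 224 loops = 672
    time per chirp     ~ 60 us idle + 50 us ramp = 110 us
    active frame time  ~ 672 x 110 us ≈ 73.9 ms

    If that reasoning is correct, a 60 ms frame period is shorter than the ~74 ms of active chirping plus processing and UART time, so I would presumably also have to reduce the number of loops or the chirp time to go faster. Please correct me if I am wrong.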

    I would appreciate your assistance.

    Thank you.

    Regards,
    Samidha

    -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    resetRadar();

    TransmitCMD(configdata.sensorStop);
    TransmitCMD(configdata.flushCfg);
    TransmitCMD(configdata.dfeDataOutputMode);
    TransmitCMD(configdata.channelCfg);
    TransmitCMD(configdata.adcCfg);
    TransmitCMD(configdata.adcbufCfg);
    TransmitCMD(configdata.lowPower);

    TransmitCMD(configdata.profileCfg);
    TransmitCMD(configdata.chirpCfg1);
    TransmitCMD(configdata.chirpCfg2);
    TransmitCMD(configdata.chirpCfg3);
    TransmitCMD(configdata.frameCfg);
    TransmitCMD(configdata.dynamicRACfarCfg);
    TransmitCMD(configdata.staticRACfarCfg);
    TransmitCMD(configdata.dynamicRangeAngleCfg);
    TransmitCMD(configdata.dynamic2DAngleCfg);
    TransmitCMD(configdata.staticRangeAngleCfg);
    TransmitCMD(configdata.antGeometry0);
    TransmitCMD(configdata.antGeometry1);
    TransmitCMD(configdata.antPhaseRot);
    TransmitCMD(configdata.fovCfg);
    TransmitCMD(configdata.compRangeBiasAndRxChanPhase);

    TransmitCMD(configdata.staticBoundaryBox);
    TransmitCMD(configdata.boundaryBox);
    TransmitCMD(configdata.sensorPosition);
    TransmitCMD(configdata.gatingParam);
    TransmitCMD(configdata.stateParam);
    TransmitCMD(configdata.allocationParam);
    TransmitCMD(configdata.maxAcceleration);
    TransmitCMD(configdata.trackingCfg);
    TransmitCMD(configdata.presenceBoundaryBox);
    TransmitCMD(configdata.sensorStart);

  • Hi,

    Thanks for the prompt response. I see, yes, your order is correct. Again, some output from the visualizer would be great for further assistance.

    Another thing to look at is lowering your TX backoff; this could help you get more points off the walking person and detect them quicker.
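
    If I am reading the CLI correctly (please double-check this against the mmWave SDK user's guide before relying on it), the TX back-off is the 6th profileCfg argument, packed one byte per TX antenna in dB, so for example:

    % profileCfg <id> <startFreq> <idleTime> <adcStartTime> <rampEndTime> <txBackoffCode> ...
    % your current value 328965 = 0x50505, i.e. roughly 5 dB back-off on each of the three TX channels:
    profileCfg 0 61.2 60.00 17.00 50 328965 0 55.27 1 64 2000.00 2 1 36
    % 0 would mean no back-off (maximum TX power):
    profileCfg 0 61.2 60.00 17.00 50 0 0 55.27 1 64 2000.00 2 1 36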

    Also, be sure to put your chirp parameters into the Sensing Estimator to make sure you are using valid values and that the final parameters are good for your application.

    https://dev.ti.com/gallery/view/mmwave/mmWaveSensingEstimator/ 

  • Thanks Tim, I will definitely try out the mmWave Sensing Estimator and let you know.

    Meanwhile, do you have any documentation on how radar behaves when there is glass in its field of view (FOV)? Specifically, I'm interested in how glass affects the radio waves. Any application note or reference would be helpful.

    The sensor performs well when installed in an open area, but its performance deteriorates when there is a glass partition or glass door in the middle of its FOV. I'm unsure if this is due to refraction or signal strength.

    I would greatly appreciate any documents related to this topic.

    Warm regards,
    Samidha

  • Hi Tim,

    As shown in the images below, I have installed the IWR6843ODS board above the door at a 50-degree angle.

    (referring to https://dev.ti.com/tirex/explore/node?a=VLyFKFf__4.12.1&node=A__ANrhMJSr3bH7JapXg-20CQ__com.ti.mmwave_industrial_toolbox__VLyFKFf__4.12.1).

    Configuration parameters are mentioned below.

    I have recorded the video for single-person Entry and Exit (attached below).

    What I observed is that while exiting, the sensor detects and allocates a single track for the person, which is correct. However, while entering the field of view (FOV) by opening the glass door, it detects the person but allocates two tracks as he moves into the FOV after closing the door behind him.

    Please help me improve the Entry detection.

    Thanks and Regards,
    Samidha
    ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------




    ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Configuration file:

    sensorStop
    flushCfg
    dfeDataOutputMode 1
    channelCfg 15 7 0
    adcCfg 2 1
    adcbufCfg -1 0 1 1 1
    lowPower 0 0

    % Detection Layer Parameters
    % See the Detection Layer Tuning Guide for more information
    % "C:\ti\mmwave_industrial_toolbox_[VER]\labs\people_counting\docs\3D_people_counting_detection_layer_tuning_guide.pdf"
    profileCfg 0 61.2 60.00 17.00 50 657930 0 55.27 1 64 2000.00 2 1 36
    chirpCfg 0 0 0 0 0 0 0 1
    chirpCfg 1 1 0 0 0 0 0 2
    chirpCfg 2 2 0 0 0 0 0 4
    frameCfg 0 2 224 0 120.00 1 0
    dynamicRACfarCfg -1 10 1 1 1 8 8 6 4 4.00 6.00 0.50 1 1
    staticRACfarCfg -1 4 4 2 2 8 16 4 6 6.00 13.00 0.50 0 0
    dynamicRangeAngleCfg -1 7.000 0.0010 2 0
    dynamic2DAngleCfg -1 5 1 1 1.00 15.00 2
    staticRangeAngleCfg -1 0 1 1
    antGeometry0 0 0 -1 -1 -2 -2 -3 -3 -2 -2 -3 -3
    antGeometry1 0 -1 -1 0 0 -1 -1 0 -2 -3 -3 -2
    antPhaseRot 1 -1 -1 1 1 -1 -1 1 1 -1 -1 1
    fovCfg -1 64.0 64.0
    compRangeBiasAndRxChanPhase 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0

    % Tracker Layer Parameters
    % See the Tracking Layer Tuning Guide for more information
    % "C:\ti\mmwave_industrial_toolbox_[VER]\labs\people_counting\docs\3D_people_counting_tracker_layer_tuning_guide.pdf"
    staticBoundaryBox -0.5 0.5 0.2 3 -0.5 3
    boundaryBox -0.7 0.7 0.2 3.5 -0.5 3
    sensorPosition 2.4 0 50
    gatingParam 3 2 2 3 4
    stateParam 3 3 6 20 3 50
    allocationParam 20 20 0.05 20 1.5 20
    maxAcceleration 1 0.1 1
    trackingCfg 1 4 800 20 37 33 120 1
    presenceBoundaryBox -0.7 0.7 0.2 3.5 0.5 2.5
    sensorStart
  • Hi Samidha,

    Thanks for the video, that helps. A couple of other things to try: change the angle to face more away from the door so you do not pick it up as much (you can also play around with the fovCfg parameter to try to block it out of the FOV).
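
    Whatever physical tilt you settle on, remember to update sensorPosition to match, since (as I understand it) the tracker uses it to rotate the point cloud into room coordinates:

    % sensorPosition <height m> <azimuth tilt deg> <elevation tilt deg>
    % e.g. your current 2.4 m mount tilted 50 degrees down toward the room:
    sensorPosition 2.4 0 50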

    Another thing would be to shrink the boundary boxes even further, closer to the door.

    I think it's a good idea to switch from overhead to wall-mounted for this application.

    Let me know if this helps.

    Tim

  • Hi Tim,

    Per your suggestion, I have flashed the chip with a wall mount binary and installed it above the doorway with a tilt of 15 degrees.

    Wall mounting did help increase the accuracy of single-person entry and exit.
    But when more than one person enters or exits simultaneously, there is a lot of inaccuracy.

    Many times it counts 2 people as one, or sometimes it misses them altogether.

    I have tried reducing snrThreObscured, but it did not help me much.

    Is this a drawback of the wall mount, or am I missing something?
    -------------------------------------------------------------------------------------------------------------------------------------------------------

    "sensorStop""sensorStop \n",
                "flushCfg""flushCfg \n",
                "dfeDataOutputMode""dfeDataOutputMode 1 \n",
                "channelCfg""channelCfg 15 7 0 \n",
                "adcCfg""adcCfg 2 1 \n",
                "adcbufCfg""adcbufCfg -1 0 1 1 1 \n",
                "lowPower""lowPower 0 0 \n",
                "profileCfg""profileCfg 0 60.75 30.00 25.00 59.10 328965 0 54.71 1 96 2950.00 2 1 36 \n",
                "chirpCfg1""chirpCfg 0 0 0 0 0 0 0 1 \n",
                "chirpCfg2""chirpCfg 1 1 0 0 0 0 0 2 \n",
                "chirpCfg3""chirpCfg 2 2 0 0 0 0 0 4 \n",
                "dynamicRACfarCfg""dynamicRACfarCfg -1 4 4 2 2 8 12 4 12 5.00 8.00 0.40 1 1 \n",
                "staticRACfarCfg""staticRACfarCfg -1 6 2 2 2 8 8 6 4 8.00 15.00 0.30 0 0 \n",
                "dynamicRangeAngleCfg""dynamicRangeAngleCfg -1 0.75 0.0010 1 0 \n",
                "dynamic2DAngleCfg""dynamic2DAngleCfg -1 3.0 0.0300 1 0 1 0.30 0.85 8.00 \n",
                "staticRangeAngleCfg""staticRangeAngleCfg -1 0 8 8 \n",
                "antGeometry0""antGeometry0 0 0 -1 -1 -2 -2 -3 -3 -2 -2 -3 -3 \n",
                "antGeometry1""antGeometry1 0 -1 -1 0 0 -1 -1 0 -2 -3 -3 -2 \n",
                "antPhaseRot""antPhaseRot 1 -1 -1 1 1 -1 -1 1 1 -1 -1 1 \n",
                "fovCfg""fovCfg -1 70.0 70.0 \n",
                "compRangeBiasAndRxChanPhase""compRangeBiasAndRxChanPhase 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 \n",

                "frameCfg""frameCfg  0 2 96 0 55.00 1 0 \n",

                "boundaryBox""boundaryBox -0.75 0.75 0 4 0 3 \n",
                "presenceBoundaryBox""presenceBoundaryBox -0.7 0.7 0.5 3.5 0 3 \n",
                "staticBoundaryBox""staticBoundaryBox -0.7 0.7 0.5 3.5 0 3 \n",
                "sensorPosition""sensorPosition 2.5 0 15 \n",
                
                "maxAcceleration""maxAcceleration 0.2 0.2 0.2 \n",
                "trackingCfg""trackingCfg 1 2 800 20 46 33 55 \n",
                "sensorStart""sensorStart \n",

                "gatingParam""gatingParam 2.5 1.5 1.5 2.5 3.5 \n",
                "stateParam""stateParam 2 2 8 50 5 50 \n",
                "allocationParam""allocationParam 15 50 0.1 15 0.5 20 \n"



  • Hi,

    If you could provide some more video, that might be helpful. Are the people walking close to each other? When you look at the point cloud in the visualizer, are there two distinct "blobs"? How close are the people to the sensor? If there is a lot of obstruction of one person in front of the other, it might have a hard time distinguishing them. This would take some more configuring of the tracker.
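
    If two people are being merged into one track, the two knobs I would experiment with (per the tracker tuning guide - take this as a direction to test rather than a guaranteed fix) are the gating width/depth, which limit how far from a track's predicted position points can still be associated with it, and the 5th allocationParam, which limits how far apart points can be and still seed a single new track:

    % gatingParam <gain> <width m> <depth m> <height m> <velocity m/s>
    gatingParam 2.5 1.5 1.5 2.5 3.5
    % allocationParam <snrThre> <snrThreObscured> <velocityThre> <pointsThre> <maxDistanceThre m> <maxVelThre>
    allocationParam 15 50 0.1 15 0.5 20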

    Let me know some of the information above - the more I get, the better. In the meantime, keep playing around with the parameters using the tracker tuning guide.

    Regards,

    Tim