This thread has been locked.

IWR6843ISK: Beamsteering in close proximity

Part Number: IWR6843ISK
Other Parts Discussed in Thread: IWR6843AOP

Hi,

I would like to implement beamsteering to focus the sensor's view in a particular direction. For example, two people will be sitting 1 meter from the sensor, but at different angles with respect to it. I want the beam to point first at one person to collect data (to be used later for vital signs estimation) and then steer to the second person. I am using the standard mmWave SDK 03.05 firmware and the TI Visualizer to view the range-angle FFT output. I have looked at the Long Range People Detection lab for information about beamsteering with the IWR6843ISK, but all of the config files that lab provides are for long range, and I want to beamsteer at short range. I have therefore tried to combine parts of the Long Range lab config files with the config generated by the TI Visualizer, but with no luck so far. I have put together a config file that should point the beam 45 degrees to one side, but no matter at what angle I sit with respect to the sensor, I am still visible on the range-angle plot.

To put it concretely: I want a config file with which I can point the radar at an area of interest at short range (1-2 meters), so that if the radar is beamsteering in one direction, people in another direction are not visible. See below for the config file I have built so far.

I really hope you can help me, because I am quite stuck at this point. I should also mention that I am no expert in radar theory; I know the basics, but my expertise lies in the embedded systems field.

sensorStop                                                  
flushCfg
dfeDataOutputMode 3
channelCfg 15 7 0
adcCfg 2 1
adcbufCfg -1 0 1 1 1
lowPower 0 0
%
profileCfg 0 60 359 7 57.14 0 7123968 70 1 256 5209 0 0 158
%
chirpCfg 0 0 0 0 0 0 0 7
chirpCfg 1 1 0 0 0 0 0 7
%
advFrameCfg 1 0 0 1 0
subFrameCfg 0 0 0 2 16 1000 0 1 1 1000
%
guiMonitor 0 1 1 0 1 0 1
%
cfarCfg 0 0 2 8 4 3 0 15 1
cfarCfg 0 1 0 4 2 3 1 15 1
%
multiObjBeamForming 0 1 0.5
%
calibDcRangeSig 0 0 -5 8 256
%
aoaFovCfg -1 -90 90 -90 90
cfarFovCfg 0 1 -1 1
cfarFovCfg 0 0 0 8.92
%
clutterRemoval -1 0
% 
compRangeBiasAndRxChanPhase 0.0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0
measureRangeBiasAndRxChanPhase 0 1.5 0.2
%
bpmCfg -1 0 0 1
extendedMaxVelocity -1 0
%
CQRxSatMonitor 0 3 5 121 0
CQSigImgMonitor 0 127 4
analogMonitor 0 0
lvdsStreamCfg -1 0 0 0
%
calibData 0 0 0
sensorStart
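In case it helps others: below is a rough Python sketch of how a steering angle could be translated into the `profileCfg` `txPhaseShifter` value. It assumes λ/2 TX element spacing and the 6-bit phase shifter format with a 5.625° LSB that I understand the xWR68xx devices use, with one byte per TX and the code in bits [7:2] of each byte. Please verify the packing against the mmWave SDK / mmwavelink documentation before relying on it.

```python
import math

PS_LSB_DEG = 5.625  # assumed 6-bit phase shifter LSB (64 steps per 360 deg)

def tx_phase_codes(theta_deg, num_tx=3, d_over_lambda=0.5):
    """Per-TX phase shifter codes for a steering angle theta (degrees).

    Assumes a uniform linear TX array with d_over_lambda element spacing.
    Progressive phase per element: dphi = 360 * d/lambda * sin(theta).
    """
    dphi = 360.0 * d_over_lambda * math.sin(math.radians(theta_deg))
    codes = []
    for n in range(num_tx):
        phase = (n * dphi) % 360.0
        codes.append(round(phase / PS_LSB_DEG) % 64)
    return codes

def pack_tx_phase_shifter(codes):
    """Pack per-TX 6-bit codes into one txPhaseShifter word.

    Assumption (check against the mmwavelink docs for your device):
    one byte per TX, code in bits [7:2] of each byte, i.e.
    TX0 -> bits 2-7, TX1 -> bits 10-15, TX2 -> bits 18-23.
    """
    word = 0
    for n, code in enumerate(codes):
        word |= (code & 0x3F) << (8 * n + 2)
    return word

codes = tx_phase_codes(45.0)            # steer 45 degrees off boresight
print(codes, pack_tx_phase_shifter(codes))
```

With these assumptions, steering 45° off boresight gives per-TX codes [0, 23, 45]; the packed word then goes in the seventh `profileCfg` argument.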

  • Hi Dirk,

    I am going to continue to look into this for the next few days, but I believe the subjects are too close to the sensor for beamforming to completely cut the other person out of view if they are 1-2 meters away. The beam has two side lobes (seen in figure 1 here: https://dev.ti.com/tirex/explore/node?node=ADB-lBeMHspqQq8ks4cGHg__VLyFKFf__LATEST&r=VLyFKFf__4.8.0&r=VLyFKFf__4.9.0), and those lobes are most likely causing the detections you are seeing within the 1-2 meter range.

    Can you explain your application to me a bit more? Why do you need to get one person's vital signs at a time? How long are you trying to get vital signs for before you move to the next person? Are these people static?

    Thank you,

    Angie Mitchell
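    As a quick, illustrative check (not device-specific: it assumes an ideal uniform linear TX array with λ/2 spacing and ignores element patterns), you can compute how much a beam steered to 45° is attenuated toward boresight. If I have the ISK antenna layout right, only two of the three TX antennas share an azimuth row, so azimuth TX beamforming behaves roughly like a 2-element array:

```python
import cmath
import math

def array_factor_db(theta_deg, steer_deg, n_elem=2, d_over_lambda=0.5):
    """Normalized array factor (dB) of an ideal uniform linear array.

    Illustrative only -- ignores element patterns and the real EVM layout.
    """
    k = 2.0 * math.pi * d_over_lambda
    af = sum(
        cmath.exp(1j * n * k * (math.sin(math.radians(theta_deg))
                                - math.sin(math.radians(steer_deg))))
        for n in range(n_elem)
    )
    return 20.0 * math.log10(abs(af) / n_elem)

# Beam steered to 45 deg: response at the peak vs. toward a person at 0 deg
print(round(array_factor_db(45.0, 45.0), 1))  # peak: 0.0 dB
print(round(array_factor_db(0.0, 45.0), 1))   # roughly -7 dB at boresight
```

    With a 2-element array a person at 0° is only around 7 dB below the steered peak, so at 1 meter they would still be well above the detection threshold.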

  • Hi Angie,

    Thanks for your reply! I want to isolate the two people using beamsteering, because the vital signs are extracted from the phase data of the range-FFT. When the two people are at the same distance from the sensor, we cannot differentiate between them using the range-FFT alone, so I want to separate their data using beamsteering.

    I first want to get the locations of the two people from the range-angle FFT, then calculate the beamsteering angle for each person, record some data in the direction of one person to get the vital signs, and then do the same for the other person (switching between them every 5 seconds or so). This seemed to be the best solution in my head. For the first test the people can be static; the next step will be moving people, in which case the beamsteering angles can be updated by re-running the range-angle FFT periodically. I hope this clears things up a little bit. Feel free to ask more questions.

    Thanks for the help so far!
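    In pseudo-Python, the loop I have in mind looks roughly like this (`steer_beam` and `record_vitals` are hypothetical placeholders for reconfiguring the phase shifters and collecting range-FFT phase data; clustering the point cloud into one detection per person is assumed done elsewhere):

```python
import math

DWELL_S = 5.0  # record each person for ~5 s before switching

def angles_from_detections(detections):
    """Steering angles (deg) from (x, y) point-cloud detections.

    `detections` stands in for the output of the range-angle FFT /
    detection layer, one (x, y) point per person.
    """
    return [math.degrees(math.atan2(x, y)) for (x, y) in detections]

def run(detections, steer_beam, record_vitals, rounds=2):
    """Alternate the beam between the detected people.

    `steer_beam(angle)` and `record_vitals(seconds)` are hypothetical
    callbacks that would reconfigure the device (sensorStop /
    profileCfg / sensorStart) and collect the phase data.
    """
    angles = angles_from_detections(detections)
    for _ in range(rounds):
        for a in angles:
            steer_beam(a)
            record_vitals(DWELL_S)

# Example: two people about 1 m away, at roughly -30 and +30 degrees
people = [(-0.5, 0.87), (0.5, 0.87)]
print([round(a) for a in angles_from_detections(people)])  # [-30, 30]
```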

  • Hi Dirk,

    Thanks for all this information! This is helping me think about other potential solutions for switching between 2 people for vital signs. I have two more questions... 

    1. Which lab are you using? For this I would recommend our 3D People Counting with Vital Signs, which adds vital signs on top of our people counting lab. This is also our latest vital signs algorithm. However, it works with our IWR6843AOP EVM, and it sounds like you are using our IWR6843ISK?

    2. Are these two people in fixed positions or are they moving?

    Thanks,

    Angie

  • Hi Angie,

    Glad that I could help with the new insights! Here are the answers to your questions:

    1. I'm using the 68xx_vital_signs lab, for two reasons: I'm indeed using the IWR6843ISK as my sensor, and the source code for the 3D vital signs lab has not been released yet. To my knowledge, that 3D lab can also only handle one person.

    I'm also using pieces of the Long Range People Detection lab, because the beamforming works in that lab and I could not get it to work in the default 03.05 SDK firmware (the SDK documentation notes that only a value of '0' has been tested for the phase shifters). But as I said before, I'm looking for short range, not long range.

    2. For my project to succeed, the people can be in static positions. But these positions are not known beforehand, so I first want to locate them using a range-angle FFT.

    I hope this answers your questions. Thanks for the help so far!

  • Hi Dirk,

    Thanks for this information. If you contact your local TI sales representative they can help you with the process for accessing the 3D vital signs lab.

    I believe this would help your use case, as it would give you insight into how the people tracking algorithm from our 3D people counting lab and our vital signs algorithm work together to do what your project is trying to do, but for one person. This is also our most up-to-date vital signs algorithm.

    If you are unable to access the source code I would recommend looking into how our 3D people counting lab could work to detect two people in the instance you described. If you have any questions about how this might work feel free to ask me. 

    Thank you,

    Angie