IWR1443BOOST: Question about the data from the demo and gesture software

Part Number: IWR1443BOOST
Other Parts Discussed in Thread: IWR1443

Hello,

I started working with the IWR1443 a week ago to read out its data, and I am able to read everything out as it should be. My only question is about the object data that is being sent: I want to know what some of the values mean.

When you receive the data packet for the objects, you get multiple values. I know what some of them are; could you help me figure out the rest, and please correct me if I got anything wrong?

Rangeindex - this is the distance in meters.
Dopplerindex - this is the speed in m/s.
Peakvalue - I thought this was the dB value of the object, as shown in the range profile plot in the mmWave visualizer.
X - is this the distance in m or the angle?
Y - this is the same as the range, so the distance to the object.
Z - is this also in m, or just the angle?

My second question is about the gesture demo: is there a document that explains the code and what it does? I have read through the code multiple times, but I have a hard time figuring out what all the data means, and also how you determine whether the gesture is a swipe or a twirl.

  • Hello Vera,

    I can answer the first question. peakValue is actually a unit-less term that represents the returned signal strength from a detected object. You are correct that this is used to construct the range profile on the visualizer.

    The X, Y, and Z coordinates are all distances that are computed in meters.
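    As a rough sketch of how these fields relate to physical units: the range and Doppler values in the packet are FFT bin indices, so on the host side they are usually multiplied by the range and Doppler resolutions of your chirp configuration. The resolutions and bin count below are placeholder values, not ones taken from your config:

```python
# Hedged sketch: convert detected-object bin indices to physical units.
# RANGE_RES_M, DOPPLER_RES_MPS, and NUM_DOPPLER_BINS are placeholders --
# compute the real values from your own chirp configuration.

RANGE_RES_M = 0.044      # example range resolution, meters per bin
DOPPLER_RES_MPS = 0.13   # example Doppler resolution, m/s per bin
NUM_DOPPLER_BINS = 16    # example Doppler FFT size

def to_physical(obj):
    """Convert one detected object's indices to meters and m/s."""
    range_m = obj["rangeIdx"] * RANGE_RES_M
    # The Doppler index wraps around: indices above half the FFT size
    # represent negative (approaching) velocities.
    d = obj["dopplerIdx"]
    if d > NUM_DOPPLER_BINS // 2:
        d -= NUM_DOPPLER_BINS
    speed_mps = d * DOPPLER_RES_MPS
    return range_m, speed_mps

obj = {"rangeIdx": 10, "dopplerIdx": 15, "peakVal": 412}
print(to_physical(obj))  # roughly (0.44, -0.13)
```

    peakVal needs no conversion for relative comparisons; it is the unit-less signal strength mentioned above.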


    Lastly, I've asked our gesture recognition expert to look at your final query. They should have an answer for you in the next few days.


    Cheers,
    Akash
  • Hello Akash,

Thanks for the answer; I will be patiently waiting for the answer from the gesture recognition expert.
In the meantime, could you answer another question for me?
I have been reading out the points of an object that I placed in front of the radar; it is just a box. When I look at the data, it seems the X and Y directions are swapped. Could you verify for me that the Y direction is the distance from the radar to the object, and the X axis is the left/right position? Is this true, or is it the other way around?

    Thanks,
    Vera
  • Hello,

A week has passed and I don't have an answer to my question. I know this can take some time, but maybe I can clarify what I actually want. I am still figuring out how the gesture demo code works, but what I really want to know is what the code in gesture_swipe.c and gesture_twirl.c does. It doesn't need to be very exact; I just want to know what the data coming out of it means, i.e. the data that will be sent over to the GUI. If I know that, I can figure out what I need to do to detect a gesture from it. So could someone describe these files a bit for me?

    Thanks!
  • Former Member, in reply to Vera:
    Hello Vera,

    Regarding how gesture recognition works on mmWave sensors: the lab leverages the ability of radar to provide range, Doppler, and angle information. When a gesture is made, the combination of the range, Doppler, and angle signatures indicates whether the hand is present, along with its velocity, position, and orientation. The gesture processing chain is an add-on to the standard out-of-box demo chain. gesture_swipe.c and gesture_twirl.c compute the additional signatures needed for gesture recognition. In main_swipe.c and main_twirl.c there is a variable, gestureMetrics, an array that holds these data signatures.

    These signatures are used by the classifier to detect the gesture. For the swipe gesture, a handcrafted classifier was developed by performing the gesture numerous times and examining the gesture signatures for patterns, in order to determine thresholds and valid ranges. These requirements for a valid swipe gesture were implemented in the handcraftedclassifier.c file. For the twirl gesture, the data signatures are passed to MATLAB, where a neural network handles the classification.
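    To give a feel for what "handcrafted classifier" means here, the sketch below shows a purely illustrative threshold check of the kind described above. The feature names and thresholds are invented for this example; the real features are the values held in gestureMetrics, and the real conditions live in handcraftedclassifier.c:

```python
# Illustrative threshold-style gesture check (NOT the actual TI code).
# Feature names and threshold values are invented for the example; in the
# demo, the signatures come from the gestureMetrics array.

def is_swipe(metrics):
    """metrics: dict of per-frame gesture signatures (hypothetical names)."""
    return (
        metrics["energy"] > 50.0                 # enough reflected energy: a hand is present
        and abs(metrics["doppler_avg"]) > 0.3    # the hand is actually moving
        and 0.05 < metrics["range_avg"] < 0.5    # within a valid distance band
        and abs(metrics["azimuth_span"]) > 20.0  # sweeps across in angle, like a swipe
    )

frame = {"energy": 80.0, "doppler_avg": 0.6, "range_avg": 0.2, "azimuth_span": 35.0}
print(is_swipe(frame))  # True: all conditions hold for this frame
```

    The practical point is that every condition was tuned empirically by recording many swipes, which is why the thresholds are hard-coded rather than learned, in contrast to the MATLAB neural network used for the twirl.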

    Best,
    Amanda