This thread has been locked.

If you have a related question, please click the "Ask a related question" button in the top right corner. The newly created question will be automatically linked to this question.

IWR6843ISK: how to detect gesture behaviour in "64xx ISK - Multiple Gesture and Motion Detection" demo

Part Number: IWR6843ISK


Dear support team,

I am investigating the "64xx ISK - Multiple Gesture and Motion Detection" demo by comparing it with the "68xx AoP_ODS - Multiple Gesture and Motion Detection" demo.

It seems the "86xx ISK" version supports only 2 kinds of gestures, "Right to Left" and "Left to Right", which are realized not by machine learning but by a tracking algorithm.

Am I right? If I am wrong, please kindly advise.

Best regards,

Taka

  • Hi, 

    There is another thread where discussions are ongoing with the same customer. Please let me know if this is for the same group or a different one.

    https://e2e.ti.com/support/sensors-group/sensors/f/sensors-forum/1029571/iwr6843-what-is-the-value-range-profile-of-relative-power-of-mmwave-demo-visualizer-in-demo-code

    Regards, 

    Sudharshan K N 

  • Hello Sudharshan KN,

    He is my colleague, but the purpose of his question is different from mine.

    Best regards,

    Taka

  • Hi, 

    The model is based on a neural network (NN) and implements an inference model. The model parameters are defined in C:\ti\mmwave_industrial_toolbox_4_7_0\labs\gesture_recognition\68xx_multi_gesture_and_motion_det\src\common\ANN

    float gestureInferenceProcessing (float * featureVecLog, int32_t gestureCnt[], int32_t gestureCntPrev[])
    {
        float gestureOut;
        int32_t maxCheckFlag;

        /* Augment to the feature vectors with some filtered features. These new features
         * note the change in angle (in both azimuth and elevation) around the peak doppler. */
        angleDelta(&featureVecLog[AZIM_DELTA_IDX],                           // AzimDelta, the first new feature.
                   &featureVecLog[ELEV_DELTA_IDX],                           // ElevDelta, the second new feature.
                   &featureVecLog[MSS_FEATURE_IDX_WTDOPPLER*FEATURE_LENGTH], // WtDoppler[]
                   &featureVecLog[MSS_FEATURE_IDX_WTAZIM*FEATURE_LENGTH],    // WtAzim[]
                   &featureVecLog[MSS_FEATURE_IDX_WTELEV*FEATURE_LENGTH]);   // WtElev[]

        maxCheckFlag = maxCheck(&featureVecLog[MSS_FEATURE_IDX_WTDOPPLER*FEATURE_LENGTH]);

        /* Perform inference. */
        Inference(&featureVecLog[0], &gANN_struct_t);

        /* Some additional heuristics on the inferred probabilities. */
        gestureOut = postProcessing(gANN_struct_t.prob, gestureProbabilityLog, gestureTimeStamp,
                                    gestureCnt, gestureCntPrev, maxCheckFlag);

        return gestureOut;
    }
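
    To make the Inference() step concrete, below is a minimal, self-contained sketch of the kind of single-hidden-layer feed-forward pass such an inference model typically performs. All layer sizes, weights, and names here are illustrative placeholders, not the actual parameters shipped under src\common\ANN.

    ```c
    /* Hypothetical sketch of a feed-forward NN inference pass.
     * Sizes and weights are placeholders, not the demo's real parameters. */
    #include <math.h>
    #include <stddef.h>

    #define IN_DIM  4   /* feature-vector length (illustrative) */
    #define HID_DIM 3   /* hidden-layer width (illustrative) */
    #define OUT_DIM 2   /* one probability per gesture class (illustrative) */

    /* Softmax over the output logits so the results sum to 1. */
    static void softmax(float *v, size_t n)
    {
        float maxv = v[0], sum = 0.0f;
        for (size_t i = 1; i < n; i++)
            if (v[i] > maxv) maxv = v[i];
        for (size_t i = 0; i < n; i++) {
            v[i] = expf(v[i] - maxv);
            sum += v[i];
        }
        for (size_t i = 0; i < n; i++)
            v[i] /= sum;
    }

    /* Feed-forward pass: input -> tanh hidden layer -> softmax output. */
    static void inference_sketch(const float in[IN_DIM], float prob[OUT_DIM])
    {
        /* Placeholder weights; a trained model would supply these. */
        static const float w1[HID_DIM][IN_DIM] = {
            { 0.1f, -0.2f,  0.3f,  0.0f },
            { 0.2f,  0.1f, -0.1f,  0.4f },
            {-0.3f,  0.2f,  0.1f,  0.1f },
        };
        static const float b1[HID_DIM] = { 0.0f, 0.1f, -0.1f };
        static const float w2[OUT_DIM][HID_DIM] = {
            { 0.5f, -0.4f,  0.2f },
            {-0.1f,  0.3f,  0.6f },
        };
        static const float b2[OUT_DIM] = { 0.0f, 0.0f };

        float hid[HID_DIM];
        for (size_t i = 0; i < HID_DIM; i++) {
            float acc = b1[i];
            for (size_t j = 0; j < IN_DIM; j++)
                acc += w1[i][j] * in[j];
            hid[i] = tanhf(acc);
        }
        for (size_t i = 0; i < OUT_DIM; i++) {
            float acc = b2[i];
            for (size_t j = 0; j < HID_DIM; j++)
                acc += w2[i][j] * hid[j];
            prob[i] = acc;
        }
        softmax(prob, OUT_DIM);
    }
    ```

    The per-class probabilities produced this way are what the postProcessing() heuristics then filter before a gesture is reported.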

    Regards, 

    Sudharshan K N 

  • Hello Sudharshan K N,

    Excuse me, my query was wrong. Let me correct it as below:

    wrong> It seems the "86xx ISK" version supports only 2 kinds of gestures, "Right to Left" and "Left to Right", which are realized not by machine learning but by a tracking algorithm.

    corrected> It seems the "64xx ISK" version supports only 2 kinds of gestures, "Right to Left" and "Left to Right", which are realized not by machine learning but by a tracking algorithm.

    I suppose you replied about the ANN used in the "68xx AoP_ODS - Multiple Gesture and Motion Detection" demo, but I would like to know about the "64xx ISK - Multiple Gesture and Motion Detection" demo.

    Best regards,

    Taka

  • Hi, 

    Yes, for the IWR64xx the implementation doesn't seem to use an NN approach.
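
    For reference, a tracking-based approach can separate the two swipes from the sign of the net azimuth change of the tracked hand across frames. The sketch below is a hypothetical illustration of that idea only; the function name, threshold, and frame handling are assumptions, not the demo's actual code.

    ```c
    /* Hypothetical tracking-based swipe classifier. Names and the
     * threshold are illustrative, not taken from the 64xx ISK demo. */
    #include <stddef.h>

    typedef enum { GESTURE_NONE, GESTURE_L2R, GESTURE_R2L } gesture_t;

    /* Classify a swipe from per-frame azimuth estimates (degrees) of the
     * tracked hand. A net azimuth change beyond min_span_deg in either
     * direction is reported as a swipe; anything else as no gesture. */
    static gesture_t classify_swipe(const float *azim_deg, size_t n_frames,
                                    float min_span_deg)
    {
        if (n_frames < 2)
            return GESTURE_NONE;
        float span = azim_deg[n_frames - 1] - azim_deg[0];
        if (span >= min_span_deg)
            return GESTURE_L2R;   /* azimuth increasing: left to right */
        if (span <= -min_span_deg)
            return GESTURE_R2L;   /* azimuth decreasing: right to left */
        return GESTURE_NONE;
    }
    ```

    This is why such an approach naturally covers only the two directional swipes: it keys on a single monotonic trend in the track rather than on learned feature patterns.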

    Regards, 

    Sudharshan K N