Other Parts Discussed in Thread: MMWAVE-SDK
Tool/software: Code Composer Studio
Hey,
I'm trying to port the gesture_swipe demo from the mmWave Industrial Toolbox 3.3 to the newest SDK. I am aware that the current version of the gesture demo targets SDK 1.2, and I'm trying to make it run on SDK 2.1.
Right now I'm trying to understand the process by which gestures are interpreted. Here's what I found; please correct me if I'm wrong:
Comparing main_swipe.c in the gesture demo with main.c in the newest visualizer demo, apart from the differences that come from them targeting different SDK versions, the only difference is that in 'MmwDemo_dataPathTask' the gesture demo has these additional lines:
//Gesture_findNumDetections(dataPathObj);
//numDetectedObjects = Gesture_findNumDetections(4000, dataPathObj);
//counterstart = Pmu_getCount(0);
Gesture_findNumDetections(dataPathObj->numRangeBins, dataPathObj->numDopplerBins, dataPathObj->rangeDopplerLogMagMatrix, 8000, gestureMetrics, maxIndices);
//counterend = Pmu_getCount(0);
counterstart = Pmu_getCount(0);
angleIdx = Gesture_angleEst(dataPathObj, maxIndices);
counterend = Pmu_getCount(0);
...
The calls above invoke functions implemented in "gesture_swipe.c".
There is also a file called "handCraftedClassifier.c", which appears to classify the detected gestures.
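To make my understanding concrete, here is a minimal, self-contained sketch of how I currently picture the per-frame flow. This is not the demo's actual code: the struct, the feature names, the window length, and all thresholds are my own assumptions, and the classifier rule is only a placeholder for whatever handCraftedClassifier.c really does.

/*
 * Hypothetical sketch of my current reading of the pipeline, NOT the
 * demo's code: every frame, features are computed from the range-Doppler
 * map (Gesture_findNumDetections / Gesture_angleEst), pushed into a
 * sliding window, and a hand-crafted rule-based classifier looks at the
 * window to decide whether a swipe occurred. All names, thresholds and
 * the window length below are assumptions.
 */
#include <stdint.h>
#include <stdbool.h>

#define GESTURE_WINDOW_LEN 16U            /* assumed number of frames inspected */

typedef struct
{
    float wtDoppler;     /* assumed: energy-weighted Doppler of detections     */
    float wtRange;       /* assumed: energy-weighted range of detections       */
    float numDetections; /* assumed: cells above the magnitude threshold (8000)*/
    float azimuth;       /* assumed: angle index from Gesture_angleEst()       */
} GestureFeatures;

typedef enum { GESTURE_NONE, GESTURE_SWIPE_LEFT, GESTURE_SWIPE_RIGHT } GestureId;

static GestureFeatures featWindow[GESTURE_WINDOW_LEN];
static uint32_t featIdx = 0U;

/* Called once per frame after range-Doppler processing is done. */
void gestureFramePush(const GestureFeatures *f)
{
    featWindow[featIdx % GESTURE_WINDOW_LEN] = *f;
    featIdx++;
}

/* Placeholder hand-crafted classifier: declare a swipe when enough frames
 * in the window show motion and the azimuth trend goes from one side to
 * the other. The real classifier in handCraftedClassifier.c may use
 * different features and rules. */
GestureId gestureClassify(void)
{
    uint32_t activeFrames = 0U;
    float azFirst = 0.0f, azLast = 0.0f;
    bool haveFirst = false;

    for (uint32_t k = 0U; k < GESTURE_WINDOW_LEN; k++)
    {
        /* walk the circular buffer from oldest to newest frame */
        const GestureFeatures *f = &featWindow[(featIdx + k) % GESTURE_WINDOW_LEN];
        if ((f->numDetections > 2.0f) && (f->wtDoppler != 0.0f))
        {
            if (!haveFirst) { azFirst = f->azimuth; haveFirst = true; }
            azLast = f->azimuth;
            activeFrames++;
        }
    }

    if (activeFrames < 6U)                /* assumed minimum duration of a swipe */
    {
        return GESTURE_NONE;
    }
    return (azLast > azFirst) ? GESTURE_SWIPE_RIGHT : GESTURE_SWIPE_LEFT;
}

Is this roughly the right picture (per-frame features from the range-Doppler map, accumulated over a window of frames, then threshold rules), or does the classifier work differently?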
That is my initial reading of the code. If it is correct, my question is: how does the gesture detection pipeline work, and how does it distinguish between different gestures? Making progress with this lab has been difficult because of the lack of comments in the code.
I would much appreciate it if someone could help me understand this lab.
Many thanks in advance.
Ziheng