Hi Team,
May I ask for your help? Please see our customer's concern below:
"I am an engineering student working on my bachelor’s thesis. My task is to recognize gestures with radar through a wall of ceramic tiles. The goal is to program everything in python.
To do so, I am using your mmWave “IWR6843ISK-ODS” evaluation kit.
My chosen approach is to calculate features that I can feed into a neural network. I am using a Python framework which directly provides a Range Doppler Heat Map and a Range Azimuth Heat Map, similar to your online tool “mmWave Demo Visualizer”.
Inspired by the demo software “68xx AoP-ODS – Multiple Gesture and Motion Detection” on your Resource Explorer, I managed to implement the first few RDI-based features. From the Range Doppler Heat Map, I calculate all of the features also used in the demo software, like “wtSum”, “dopplerAvg”, “rangeAvg” and so on.
However, when it comes to the DOA-based features, I struggle to understand the concept behind the data processing used in the demo software. How is it possible to calculate features like “wtMeanAzim” or “wtElevSq” from the Range Azimuth Heat Map? Would it be possible to get these features from the raw sensor data? If so, which data do I need?
A conceptual and concise explanation would really help me to understand the problem and accomplish this task.
Thanks in advance for your reply."
I hope you can help us with this. Thank you so much.
Kind regards,
Gerald