I have been trying to add tracking to the SDK 3.4.0.3 demo project so that, in addition to the detection point cloud, target tracking information is output. As a tracking example, I have been using the Overhead 3D People Counting demo. In that example, detections are provided in polar coordinates, and since the tracker expects its input in polar format, that works.
In the out-of-box demo, however, detections are in Cartesian format, and adding a tracker to this project has proven very complicated. Apart from having to convert the detection format, the lab and demo project architectures are very different, and integrating functions such as the tracker configuration (DPU_TrackerProc_config) is hard, since it does not fit easily into the demo structure.
What is the project structure philosophy that should be followed? The gtrack library is included in SDK 3.4.0.3, but I see no examples of how it is meant to be used, and the demo project's Cartesian-format detections make it difficult to integrate. Which architecture will be supported in the future? Should I use the 3D people counting lab as a starting point and stick with it? It seems that the Industrial Toolbox lab examples do not persist for very long, while the SDK demo evolves over time. I would prefer an architecture that TI is also following, because I would like to upgrade parts of the project as new SDK versions are released.

In short, I am looking for a recommendation on which project to use as a starting point for developing a people tracking application on the IWR6843, or some tips on how to add the provided tracking library to the SDK demo. I currently have the IWR6843 ODS device but would like to migrate to the AOP with ES2.0 silicon.
Thank you.