
MMWAVE-SDK: Radar Toolbox: Applying CaponBF to the full image (including static clutter)

Part Number: MMWAVE-SDK

Starting from the People Counting demo (Overhead 3D People Counting, AOP), I would like to know whether it is theoretically possible to apply processing similar to what is done for moving/static objects (i.e. zoomed-in CaponBF 3D AoA estimation, or at least a fine range-azimuth heatmap) to the whole scene, including the static clutter / background.
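To be concrete about what I mean, here is a minimal sketch (my own, not taken from the demo source) of a Capon/MVDR range-azimuth heatmap computed over the full radar cube, with no static clutter removal, so that background returns remain in the covariance estimate. The cube ordering [range, chirp, virtual antenna], half-wavelength spacing, and the function name are all assumptions for illustration:

```python
import numpy as np

def capon_range_azimuth_heatmap(radar_cube, angles_deg, d=0.5, diag_load=0.03):
    """Capon (MVDR) range-azimuth heatmap over the whole scene (sketch).

    No static clutter removal is applied, so background returns stay in the
    covariance estimate and show up in the heatmap.

    radar_cube : complex array, shape [num_range_bins, num_chirps, num_virt_rx]
                 (range-FFT output; this ordering is an assumption)
    angles_deg : azimuth angles to scan, in degrees
    d          : virtual element spacing in wavelengths (0.5 assumed)
    """
    num_range, num_chirps, num_ant = radar_cube.shape

    # Steering matrix: one column per scanned azimuth angle
    sin_t = np.sin(np.deg2rad(angles_deg))
    steering = np.exp(1j * 2.0 * np.pi * d * np.outer(np.arange(num_ant), sin_t))

    heatmap = np.zeros((num_range, len(angles_deg)))
    for r in range(num_range):
        x = radar_cube[r].T                      # [num_ant, num_chirps] snapshots
        # Spatial covariance over chirps; the zero-Doppler (static) energy is
        # deliberately NOT subtracted here
        R = x @ x.conj().T / num_chirps
        # Diagonal loading for robustness with few snapshots
        R += diag_load * np.trace(R).real / num_ant * np.eye(num_ant)
        R_inv = np.linalg.inv(R)
        # Capon spectrum: P(theta) = 1 / (a(theta)^H R^-1 a(theta))
        denom = np.einsum('at,ab,bt->t', steering.conj(), R_inv, steering).real
        heatmap[r] = 1.0 / np.maximum(denom, 1e-12)
    return heatmap
```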

Based on the (incredibly helpful) demo implementation guide, it seems this could be done via the static processing chain, but without eliminating the scene background information.

From what I understand, static clutter removal is what removes the background scene information. However, since it does not rely on previous frames, it cannot distinguish between the background scene and static objects of interest. By that reasoning, static objects should also be removed by static clutter removal, which is clearly not the case, since static detection does work.
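To make my understanding explicit, this is the kind of per-frame operation I assume static clutter removal performs (a sketch of my assumption, not the actual demo code): subtracting the mean over chirps for every range bin and virtual antenna removes all zero-Doppler energy, regardless of whether it comes from the background or from a person standing still.

```python
import numpy as np

def remove_static_clutter(radar_cube):
    """Per-frame static clutter removal as I understand it (sketch only).

    radar_cube : complex array, shape [num_range_bins, num_chirps, num_virt_rx]

    Subtracting the mean over the chirp (slow-time) axis removes the
    zero-Doppler component of every range bin / antenna pair. Because only
    the current frame is used, the background scene and a newly placed static
    object are suppressed in exactly the same way.
    """
    dc = radar_cube.mean(axis=1, keepdims=True)   # zero-Doppler / DC estimate
    return radar_cube - dc
```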

How is the background scene separated from static objects that are added to it later? Is it possible to apply the static detection to a radar cube while keeping the background scene?