
IWR1443BOOST: Object Classification based on IWR1443BOOST

Part Number: IWR1443BOOST
Other Parts Discussed in Thread: IWR1642, IWR1443

Hi all,

I am happy with the quick responses I have received from your team on every mmWave device issue so far.

Now I want to know: is it possible to classify objects into different categories with the IWR1443BOOST device?

If yes, is there a demo available for that? If not, can I attempt it myself? How should I start?

Thanks.

  • Hi Rahul,

    The IWR1642, which includes a C674x DSP core, is a better candidate for running advanced algorithms such as clustering, tracking and classification. While we do not have a demo which shows object classification, we do have the People Counting Demo, which implements association and tracking, steps that are typically needed before classification.

    You may also want to look at the Water vs Ground classification demo, which is based on IWR1443 and differentiates between water (liquid) and ground using the displacement of the water surface caused by fine ripples. This is not object classification, but depending upon what you're trying to achieve, it could be of interest.

    Regards

    -Nitin

  • Hi Nitin,

    Thanks for the suggestion.

    If I go deeper into the concept: mmWave works on the delay between the transmitted signal and the signal received back from an object. Similarly, water-ground classification works on the disturbance due to fine ripples. So that means it doesn't work on the type of the reflected signal (correct me if I am wrong).

    Will this concept be sufficient for object classification? I am receiving all available data through UART on my PC and want to run object classification algorithms there.

    I need to know whether mmWave is capable of sending the data required for object classification.

    Thanks.

  • Hi Rahul,

    At a fundamental level, TI mmWave sensors provide you a point cloud where each point contains range, intensity, velocity and angle-of-arrival information. Different objects reflect the radar signal differently depending upon their shape, size, material, surface etc. The radar reflectivity of an object is quantified as its RCS (Radar Cross Section) value. For context, the typical RCS of a human is about 1 m², that of a car about 10 m², and that of a truck about 100 m².
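    For illustration, a single point from such a point cloud could be represented as below. This is a hypothetical sketch; the actual field names, units and packing depend on the demo's UART output format, so treat these as placeholders:

```python
from dataclasses import dataclass
import math

@dataclass
class RadarPoint:
    """One detected point from a radar point cloud (illustrative fields only)."""
    range_m: float       # radial distance to the reflector, meters
    velocity_mps: float  # radial (Doppler) velocity, m/s
    azimuth_rad: float   # angle of arrival, radians
    intensity_db: float  # reflected signal strength, dB

    def to_xy(self) -> tuple:
        """Project the polar (range, angle) measurement onto Cartesian x/y."""
        x = self.range_m * math.sin(self.azimuth_rad)
        y = self.range_m * math.cos(self.azimuth_rad)
        return (x, y)

# Example: a point 5 m away, directly ahead, approaching at 1 m/s
p = RadarPoint(range_m=5.0, velocity_mps=-1.0, azimuth_rad=0.0, intensity_db=32.0)
print(p.to_xy())  # (0.0, 5.0)
```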

    Besides the number of points, the shape and size of the point cloud, and the intensity of individual points, you also have a velocity associated with each point. Velocity can be used to identify characteristics such as the overall velocity of an object and/or relative velocity differences between the parts of an object, e.g. human arms move differently with respect to the rest of the body, and differently from the limbs of an animal. Such information can be used to classify between, say, a human and a dog to reduce false triggers in security systems.

    The more indicators an object provides from the above list, the better the chances of developing a classification algorithm based on them.

    Regards
    -Nitin

  • Hi Nitin,

    Regarding the RCS value you mentioned: does this value also come with the incoming data? It could help me explore material classification as well.

    Thanks.

  • Hi Rahul,

    The RCS value is not returned in the point cloud information. It is a characteristic of the reflecting body as a whole and is not associated with an individual reflection. This value is used in designing the front-end parameters of a radar system. You can refer to this app note to understand how it is used in the radar equation for designing chirp parameters for an FMCW radar: Programming Chirp Parameters in TI Radar devices
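    To illustrate how RCS enters front-end design, the classical radar range equation gives the maximum detectable range for a target of a given RCS. This is a generic textbook calculation with made-up example numbers, not TI-specific parameters:

```python
import math

def max_range_m(tx_power_w, gain_tx, gain_rx, wavelength_m, rcs_m2,
                noise_figure, snr_min, bandwidth_hz, temp_k=290.0):
    """Maximum detectable range from the classical radar range equation.

    Gains, noise figure and SNR are linear values (not dB).
    R_max = (Pt*Gtx*Grx*lambda^2*sigma / ((4*pi)^3*k*T*B*NF*SNRmin))^(1/4)
    """
    k = 1.380649e-23  # Boltzmann constant, J/K
    num = tx_power_w * gain_tx * gain_rx * wavelength_m**2 * rcs_m2
    den = (4 * math.pi)**3 * k * temp_k * bandwidth_hz * noise_figure * snr_min
    return (num / den) ** 0.25

# Example at 77 GHz (wavelength ~3.9 mm) with illustrative link parameters:
# compare a human (RCS ~1 m^2) with a truck (RCS ~100 m^2).
wl = 3e8 / 77e9
r_human = max_range_m(1e-3, 10**(12/10), 10**(12/10), wl, 1.0,
                      10**(15/10), 10**(12/10), 1e6)
r_truck = max_range_m(1e-3, 10**(12/10), 10**(12/10), wl, 100.0,
                      10**(15/10), 10**(12/10), 1e6)
print(round(r_truck / r_human, 2))  # 3.16
```

    Note that range scales with the fourth root of RCS, so a 100x larger RCS only buys roughly 3.16x the detection range.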

    There may be specialized techniques for determining the RCS of an object; you can search the web for literature on those.

    Thanks

    -Nitin