AWR1642: How to calculate the available volume of a container using an mmWave sensor

Part Number: AWR1642

Hello,

We are working on a use case of finding the available volume of a box container made of cardboard/plastic fibre. The procedure is described below.

1. The volume of the unfilled container is known beforehand. Divide this volume into uniform smaller units (cells).
2. Fill the container with materials such as cloth, paper, etc.
3. Capture point cloud data by illuminating the partially filled container with the mmWave sensor; reflections will come only from the filled regions, leaving the unfilled regions without points.
4. Calculate the filled volume by adding up the smaller units that contain point cloud detections.

Right now we are using only a 2-D point cloud to calculate the occupied area, which we will later extend to 3-D to calculate the volume; a minimal sketch of this cell-counting step is given below.
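
The sketch assumes the point cloud has already been exported as (x, y) coordinates in metres and that the container extents and cell size are chosen by us; the function and parameter names are only illustrative and are not part of any TI demo.

```python
import numpy as np

def occupied_area_m2(points_xy, x_range, y_range, cell_size):
    """Estimate the occupied floor area of the container from a 2-D point cloud.

    points_xy : iterable of (x, y) detections in metres (illustrative format)
    x_range, y_range : (min, max) extents of the container in metres
    cell_size : edge length of each uniform cell in metres
    """
    nx = int(np.ceil((x_range[1] - x_range[0]) / cell_size))
    ny = int(np.ceil((y_range[1] - y_range[0]) / cell_size))
    occupied = np.zeros((nx, ny), dtype=bool)

    for x, y in points_xy:
        # Ignore reflections that fall outside the container boundary
        if not (x_range[0] <= x < x_range[1] and y_range[0] <= y < y_range[1]):
            continue
        ix = int((x - x_range[0]) / cell_size)
        iy = int((y - y_range[0]) / cell_size)
        occupied[ix, iy] = True

    # Occupied area = number of cells with at least one detection * cell area
    return int(occupied.sum()) * cell_size ** 2
```

Extending this to 3-D is the same idea with a third axis (an occupancy grid of voxels); the available volume would then be the known container volume minus the occupied cell volume.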
We have the following queries based on the above description:

a) How can we make the point cloud dense and uniformly distributed across the box?
b) Is the point cloud affected if the target is stationary rather than moving?
c) How can we separate the reflections from the empty box boundary from the reflections from the material inside it?
d) Which mmWave demo configuration is best suited for this use case?

Thanks in advance

Vikas

  • On TI-Rex you can find many chirp configurations that may cater to your requirement.
    dev.ti.com/.../

    Another point to note is that the reflections from cloth/paper may not be strong enough for mmWave to detect, but you still need to experiment with the given configurations.
    Further, you can modify the mmWave Demo Visualizer as per your requirement (menu -> Help -> Clone Visualize).

    Regards,
    Jitendra
  • Hi Vikas,

    Please look at the following thread on a similar topic:

    IWR1443: 3D map from the top view

    As Jitendra pointed out, the reflections from cloth/paper may not be strong enough. Besides that, you also need to consider the field of view, range resolution, and angular resolution requirements based on the sub-regions in your container.

    Range resolution is defined as the smallest distance between two objects that allows them to be detected as separate objects. Similarly, angular resolution is defined as the smallest angular separation between two objects at the same range that allows them to be separated in the angular domain. Please refer to the following resources to understand the concepts of range and angular accuracy and resolution in more detail (a small worked example follows the links below):

    Whitepaper: The fundamentals of millimeter wave sensors

    E2E: Range, velocity, angle accuracy
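
    As a quick illustration, the standard FMCW relations give a range resolution of d_res = c / (2 * B), where B is the chirp bandwidth, and, for a uniform lambda/2 virtual array at boresight, an angular resolution of roughly 2 / N radians, where N is the number of virtual antennas (TX x RX). The numbers below are only an example and are not taken from a specific demo configuration.

    ```python
    import math

    C = 3e8  # speed of light, m/s

    def range_resolution_m(bandwidth_hz):
        """Range resolution of an FMCW chirp: d_res = c / (2 * B)."""
        return C / (2 * bandwidth_hz)

    def angular_resolution_deg(num_virtual_antennas):
        """Boresight angular resolution of a uniform lambda/2 array: ~2 / N radians."""
        return math.degrees(2.0 / num_virtual_antennas)

    # Example: a 4 GHz chirp bandwidth -> ~3.75 cm range resolution;
    # AWR1642 with 2 TX x 4 RX = 8 virtual antennas -> ~14.3 degree angular resolution.
    print(range_resolution_m(4e9))       # 0.0375 m
    print(angular_resolution_deg(8))     # ~14.32 degrees
    ```

    Roughly speaking, each cell of the container grid should be no finer than these resolution figures if you want reliable per-cell detections.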

    Regards

    -Nitin