
IWR6843ISK: 2D vs 3D people count

Part Number: IWR6843ISK
Other Parts Discussed in Thread: IWR6843

Hi

In the people counting lab (Industrial Toolbox 4.1.0), there are both 2D and 3D applications. What is the main difference between the two labs? Can I use the IWR6843ISK for both applications? I notice that the GUI in the 3D people counting lab has the option to run 2D people counting as well. Can I use this GUI for both the 2D and 3D people counting labs?

Thanks!

Kai 

  • Hello Kai,

    Please download the latest Toolbox 4.2.0.

    3D People Counting is the only supported version now, although it can be used in a 2D configuration.

    Would you be able to discuss your application?

    Cheers,

    Akash

  • Hi Akash

    Thanks for your quick reply.

    My application is to check the number of people in the room.

    Thanks!

    Kai

  • Hi Kai,

    The 2D version is our more mature offering; however, the 3D version provides elevation information, so it is your choice which one you want to use. You can use the 3D People Counting GUI for both.

    Regards,

    Justin

  • Hi Justin

    Can I use the IWR6843ISK to run the 68xx ISK_ODS 3D People Counting lab? I only found POINTCLOUD_2D in the output data format (see the following link). Where is the elevation information?

    http://dev.ti.com/tirex/explore/node?node=AAri63jHa5Rw4TNhkvql9A__VLyFKFf__LATEST

    Thanks!

    Kai 

  • Hi Kai,

    This lab supports 3D data. In the point cloud output, there is elevation, azimuth, and range, which is a 3D data set. Please make sure you have the latest version of the toolbox open.

    Regards,

    Justin

  • Hi Justin

    I am trying to parse the sensor output for 3D people counting in MATLAB based on the data format in the following link. The bin and cfg files I use are 3D_people_count_68xx_demo.bin and ISK_6m_default.cfg. I am having trouble parsing the first 48-byte frame header. I managed to develop a similar parser for the Out of Box demo and changed the data format part accordingly based on the data format for 3D people counting. I was wondering if the first 48 bytes of the sensor output are the frame header. If not, what is the output data format?

    dev.ti.com/.../node

    Thanks!

    Kai

  • Hi Kai,

    The first 48 bytes are the frame header. The output format is explained here: http://dev.ti.com/tirex/explore/node?node=AAri63jHa5Rw4TNhkvql9A__VLyFKFf__LATEST
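
    For reference, here is a minimal Python sketch of unpacking that 48-byte header with struct. The field names and ordering below are my reading of the documented format and should be treated as assumptions to verify against the output format documentation for your toolbox version.

    import struct

    # Assumed layout of the 48-byte 3D People Counting frame header:
    # an 8-byte magic word, nine uint32 fields, and two uint16 fields.
    # Verify the field order against the output format documentation above.
    FRAME_HEADER_FORMAT = '<Q9I2H'   # little-endian, 8 + 36 + 4 = 48 bytes
    FRAME_HEADER_SIZE = struct.calcsize(FRAME_HEADER_FORMAT)

    def parse_frame_header(buf):
        """Unpack the first 48 bytes of a frame into a small dictionary."""
        (magic, version, packet_length, platform, frame_number, sub_frame_number,
         chirp_margin, frame_margin, track_process_time, uart_sent_time,
         num_tlvs, checksum) = struct.unpack(FRAME_HEADER_FORMAT, buf[:FRAME_HEADER_SIZE])
        return {'packetLength': packet_length,
                'frameNumber': frame_number,
                'numTLVs': num_tlvs}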

    Regards,

    Justin

  • Hi Justin

    Is it possible to save the data (including the point cloud and tracked targets) from the GUI in the following location into an ASCII file?

    C:\ti\<mmwave_industrial_toolbox_install_dir>\labs\people_counting\68xx_3D_people_counting\gui

    Thanks!

    Kai

  • Hi Kai,

    See the visualizer documentation here: http://dev.ti.com/tirex/explore/node?node=AINQDBLWEvn7AE47v97xNg__VLyFKFf__LATEST

    Follow the steps in the Developer's Guide to set up and run the .py files (ensure you install 64-bit Python).

    Then do the following:

    1. In oob_parser.py, line 69, set "self.saveBinary" to 1.
    2. In the gui folder, create a folder called "binData" (make sure the spelling matches the folder name used in oob_parser.py, line 617).
    3. When running the visualizer, a binary file will be generated every 1000 frames containing all of the raw UART output from the device.
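
    If you want to post-process those binary files later, a rough Python sketch for splitting one back into frames by scanning for the frame header magic word is below; the exact magic byte sequence used here is an assumption, so confirm it against oob_parser.py.

    # Sketch: split a saved binData file back into individual frames.
    # The magic word bytes below are an assumption - confirm them against
    # the value used in oob_parser.py for your toolbox version.
    MAGIC = bytes([2, 1, 4, 3, 6, 5, 8, 7])

    def split_frames(path):
        """Return a list of byte strings, one raw frame each, from a UART dump."""
        with open(path, 'rb') as f:
            raw = f.read()
        starts = []
        idx = raw.find(MAGIC)
        while idx != -1:
            starts.append(idx)
            idx = raw.find(MAGIC, idx + len(MAGIC))
        # A frame runs from one magic word to the next; the last one may be partial.
        ends = starts[1:] + [len(raw)]
        return [raw[s:e] for s, e in zip(starts, ends)]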

    Regards,

    Justin

  • Hi Justin

    Thanks for your quick reply. I have additional questions on the GUI code.

    1. In oob_parser.py, line 634, why is the baud rate for 3D people counting doubled?

    2. In oob_parser.py, line 591, does the GUI read 4666 bytes of data at each frame? Why do we need to read such a large number of bytes?

    3. Why do we need to use the Unit values to calculate the parameters of the point cloud?

    Thanks!

    Kai

  • Hi Kai,

    1. The 3D People Counting firmware outputs at double the baud rate, so the visualizer is configured to match it.
    2. This is the maximum number of bytes the old 2D People Counting demo could send per frame. The pyserial function to read the whole UART buffer doesn't work reliably, so the visualizer just reads a large chunk to ensure it gets to the next frame header (it reads again if it doesn't have as much data as the frame header reports in packetLength).
    3. Each point is 5 floats, which is 20 bytes, and we send hundreds of points per frame. Because that is a lot of data, we compress each point to just a few bytes, then multiply by the point unit values to decompress the points; see the sketch below.
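
    To make point 3 concrete, here is a hedged Python sketch of decompressing one 8-byte point with the unit values. The field order and integer widths (elevation int8, azimuth int8, Doppler int16, range uint16, SNR uint16) are assumptions to verify against the output format documentation.

    import struct

    def decompress_point(point_bytes, units):
        """Expand one compressed 8-byte point using the 5-float Point Unit TLV.

        Assumed compressed layout: elevation (int8), azimuth (int8),
        doppler (int16), range (uint16), snr (uint16); 'units' holds the
        corresponding scale factors in the same order.
        """
        elev_i, azim_i, dopp_i, range_i, snr_i = struct.unpack('<bbhHH', point_bytes)
        elev_u, azim_u, dopp_u, range_u, snr_u = units
        return {'elevation': elev_i * elev_u,   # radians (assumed)
                'azimuth':   azim_i * azim_u,   # radians (assumed)
                'doppler':   dopp_i * dopp_u,   # m/s (assumed)
                'range':     range_i * range_u, # meters (assumed)
                'snr':       snr_i * snr_u}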

    Regards,

    Justin

  • Hi Justin

    On my second question, what is the size of the whole UART buffer - 64 kB per frame? I am trying to read the UART output using MATLAB, and I am not sure if the same issue exists for MATLAB as well.

    Thanks!

    Kai

  • Hi Kai,

    MATLAB should have a similar UART read function. Since we will have a different amount of data each frame, it is preferable to just read all of the data from the UART buffer. If you can't, read a large value and make sure you have the timeout set between 10 and 30 ms.
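
    For comparison, this is roughly how the Python visualizer side handles it with pyserial; the COM port name and baud rate below are placeholders to replace with your own values.

    import serial

    # Placeholder DATA COM port and baud rate - match the values in oob_parser.py.
    ser = serial.Serial('COM5', 921600, timeout=0.03)   # 30 ms timeout

    buffer = bytearray()
    for _ in range(200):              # bounded polling loop for this sketch
        chunk = ser.read(4096)        # read a large chunk; returns early on timeout
        buffer.extend(chunk)
        # scan 'buffer' for frame header magic words and strip out complete frames here
    ser.close()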

    Regards,

    Justin

  • Hi Justin

    What is the size of the UART buffer? If I read all the data in the UART buffer, the data could span multiple frames, correct?

    Thanks!

    Kai

  • Hi Kai,

    That is correct; you could read multiple frames at once, and it is likely that you will, especially if your visualizer falls behind the device.

    You set the size of the buffer on your local machine; make sure it is large enough. The most data the device could send in one frame would be:

    48 Bytes - frame header

    3*8 = 24 Bytes - 3 TLV headers

    1150*8 = 9200 Bytes - point cloud (all points)

    20 Bytes - point units

    20*40 = 800 Bytes - tracks

    1150 Bytes - point cloud indexes

    11242 Bytes total
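
    As a quick sanity check, the same budget can be computed for an arbitrary frame (byte sizes taken from the breakdown above):

    def packet_bytes(num_points, num_tracks):
        """Per-frame UART byte count using the sizes listed above."""
        return (48                  # frame header
                + 3 * 8             # up to 3 TLV headers
                + 20                # point units (5 floats)
                + num_points * 8    # compressed points
                + num_tracks * 40   # tracks
                + num_points)       # 1-byte point indexes

    print(packet_bytes(1150, 20))   # worst case: 11242
    print(packet_bytes(100, 10))    # 100 points, 10 tracks: 1392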

    Regards,

    Justin

  • Hi Justin

    I assume 1150 is the max number of points and 20 is the max number of tracks. If one frame only has 100 points and 10 tracks, the size of this frame should be 48+24+100*8+20+40*10+100 = 1392 bytes, and the 1393th byte should belong to the next frame, correct?

    Thanks!

    Kai

  • Hi Kai,

    That looks correct. You can also try the following if you are worried about reading too much data. The frame header struct contains a packet length value: if you read the frame header, you can parse out the packet length, then read exactly that much. Your code would look like:

    uart_read(frameHeaderSize) //48 bytes

    parseHeaderData() //extract packet length here

    uart_read(packetLength)

    We have tried this in the past, but it has been unnecessary for our applications.
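
    In pyserial terms that would look roughly like the sketch below (the byte offset of the packet length within the header is an assumption; the same two-read pattern applies with fread in MATLAB).

    import serial
    import struct

    FRAME_HEADER_SIZE = 48

    def read_one_frame(ser):
        """Read the fixed-size frame header first, then exactly the rest of the packet."""
        header = ser.read(FRAME_HEADER_SIZE)
        if len(header) < FRAME_HEADER_SIZE:
            return None, None                          # timed out mid-header
        # packetLength is assumed here to be the uint32 at byte offset 12,
        # i.e. after the 8-byte magic word and the 4-byte version field.
        packet_length = struct.unpack('<I', header[12:16])[0]
        payload = ser.read(packet_length - FRAME_HEADER_SIZE)
        return header, payload

    # Placeholder DATA COM port and baud rate.
    ser = serial.Serial('COM5', 921600, timeout=0.03)
    header, payload = read_one_frame(ser)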

    Regards,

    Justin

  • Hi Justin

    Yes, that is exactly what I am doing now.

    For some frames, the numTLVs I get is only 1 rather than 3, i.e., only the point cloud exists. Does that mean I should disregard this frame of data?

    By the way, I am able to send the cfg file to the board line by line via Tera Term, but I am not able to do the same using fprintf in MATLAB. I am following the MATLAB GUI example in the people counting lab of Industrial Toolbox 4.1.0. Any suggestions?

    Thanks!

    Kai

  • Hi Kai,

    Every frame, we send out the frame header. Then we determine whether we need to send out the Point Cloud, Tracker, and Tracker Index TLVs. In 3D People Counting, there is a static chain that always produces points, so we always have points and therefore always output the Point Cloud TLV. If the tracker is not tracking anything, there is no reason to output the Tracker or Tracker Index TLVs. So, in practice, you will receive frames with either 1 TLV or 3 TLVs.
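
    So a parser should loop over numTLVs rather than expecting a fixed set. A hedged Python sketch is below; the TLV type codes and the length convention are assumptions to verify against the output format documentation.

    import struct

    # Assumed TLV type codes - take the real values from the documentation.
    TLV_POINT_CLOUD = 6
    TLV_TARGET_LIST = 7
    TLV_TARGET_INDEX = 8

    def parse_tlvs(payload, num_tlvs):
        """Walk the TLV section of one frame; works for 1 TLV or 3 TLVs."""
        offset = 0
        tlvs = {}
        for _ in range(num_tlvs):
            tlv_type, tlv_length = struct.unpack('<II', payload[offset:offset + 8])
            # tlv_length is assumed here to include the 8-byte TLV header itself;
            # this convention differs between demos, so check the documentation.
            tlvs[tlv_type] = payload[offset + 8:offset + tlv_length]
            offset += tlv_length
        return tlvs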

    For the MATLAB parser, we wrote the GUI in MATLAB 2017a; if this doesn't match your version, you may need to implement it differently. Here is the 2019b example: https://www.mathworks.com/help/matlab/matlab_external/getting-started-with-serial-port-communication.html

    Regards,

    Justin

  • Hi Justin

    I am using 2018b, so I think fprintf and fgetl should still work, as readline and writeline were only introduced in 2019b.

    The problem I am having is that after I write each line of the configuration file using fprintf, I cannot read the echo from the UART the way the MATLAB GUI does using fgetl, and I always get a timeout warning when reading it. Does the UART output an echo? What terminator does the board use?

    Thanks!

    Kai

  • Hi Kai,

    When sending a command to the board over the UART (CLI) COM port, you will get an echo of the command, followed by one of:

    • Done - if the command is successful
    • Error message - if the command fails

    If you are not getting an echo, it is probably one of the following:

    1. You do not have a line-termination character at the end of the transmission. Make sure to end each command with \r.
    2. You are sending the commands to the DATA COM port, when you should be sending them to the UART COM port.
    3. The device did not boot properly, and the CLI is non-responsive.

    You can rule out 3 if the visualizer works. 

    You can rule out 2 by checking the UART and DATA COM Port numbers
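
    If it helps, here is a minimal pyserial sketch of sending the cfg line by line over the UART (CLI) COM port and printing the echo; the port name, baud rate, and cfg file name are placeholders.

    import serial

    # Placeholder CLI COM port, baud rate, and cfg file name.
    cli = serial.Serial('COM4', 115200, timeout=0.5)

    with open('ISK_6m_default.cfg') as cfg:
        for line in cfg:
            line = line.strip()
            if not line or line.startswith('%'):   # skip blanks and comment lines
                continue
            cli.write((line + '\r').encode())      # each command must end with \r
            echo = cli.read(256).decode(errors='ignore')
            print(echo)                            # expect the echo plus "Done" or an error
    cli.close()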

    Regards,

    Justin

  • Hi Justin

    You are right; I needed to add \r at the end of each command. Now I have another issue: MATLAB is able to send the commands to the sensor only the first time after the sensor is powered on. After that, I have to restart the sensor (power off and on); otherwise, I get the same timeout warning when reading the echo and "Done". I am sure both the UART and DATA COM ports are closed and deleted, but the sensor is still running, and the first command I send is sensorStop.

    Thanks!

    Kai 

  • Hi Kai,

    This is the expected behavior. The Capon algorithm needs a lot of memory to work, and the code base is also very large. Some of the initialization code is stored in L3 RAM; after the configuration is sent, this code is overwritten with radar data. This means the sensor no longer has the code required to change configuration after the first configuration is sent. This only applies to the People Counting labs based on Capon BF; the labs based on the Out of Box demo should allow you to resend the configuration.

    Regards,

    Justin

  • Hi Justin

    I see; it is true even for the Python GUI you developed. Now I am able to collect the data using the MATLAB GUI. I am doing a simple test where I put a metal plate in front of the sensor about 0.4 meters away, but the point cloud values I get are on the cm scale. I was wondering if it is because the point units I parsed are not correct, but I followed the Python code you provided.

    Thanks!

    Kai

  • Hi Kai,

    The results should not be on a cm scale. Are you saying that the values are around 40? Are you converting from polar to Cartesian coordinates?
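
    For reference, the conversion I mean is roughly the following; the axis convention here (y along boresight, z up) is an assumption, so match it to the convention used in the GUI code.

    import math

    def spherical_to_cartesian(rng, azimuth, elevation):
        """Convert a (range, azimuth, elevation) point to x/y/z in meters.

        Angles are assumed to be in radians; the axis mapping below is an
        assumption - match it to the convention used in the visualizer.
        """
        x = rng * math.cos(elevation) * math.sin(azimuth)
        y = rng * math.cos(elevation) * math.cos(azimuth)
        z = rng * math.sin(elevation)
        return x, y, z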

    Regards,

    Justin

  • Hi Justin

    I fixed this issue. The problem was that the point structure has a different data type for each parameter, but I parsed them all with a single-type array.

    Another question: the numberOfPoints in the Target Index TLV at frame N should be the same as the numberOfPoints in the Point Cloud TLV at frame N-1 (the previous frame), correct? I am asking because I don't see such a relation between the two variables.

    Thanks!

    Kai

  • Hi Kai,

    That is correct; the Target Index at frame N should correspond to the points at frame N-1.
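
    One way to handle that one-frame offset in a parser is to hold on to the previous frame's point cloud, as in this sketch:

    # Sketch: associate the Target Index TLV of frame N with the points of frame N-1.
    prev_points = []      # point cloud parsed from the previous frame

    def on_frame(points, target_indices):
        """points: this frame's point cloud; target_indices: this frame's index TLV."""
        global prev_points
        labeled = []
        if target_indices and len(target_indices) == len(prev_points):
            # index i labels prev_points[i] with a track ID (or a "not associated" code)
            labeled = list(zip(prev_points, target_indices))
        prev_points = points
        return labeled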

    Regards,

    Justin

  • Hi Justin

    What does it mean when I get a different number of points from the two TLVs at frames N and N-1?

    Thanks!

    Kai

  • Hi Kai,

    Are you using the toolbox 4.2.1 version? We previously found a race condition in the 4.2.0 version that meant the tracker could update during point cloud transmission time; if that happens, it can change the index TLV output.

    Regards,

    Justin

  • Hi Justin

    I am using 4.2.0. I didn't notice you updated the toolbox to 4.2.1. Let me try that to see if it solves the issue.

    Thanks!

    Kai

  • Hi Justin

    Yes, the 4.2.1 version solves this issue. So, the point cloud contains the detected points for the current frame, but the tracked targets and target indexes are calculated based on the point cloud from the previous frame, correct?

    Thanks!

    Kai 

  • Hi Kai,

    That is correct. From a visualization standpoint it doesn't really matter, as there is only a 50 ms difference, but keep this in mind when building out your application.

    Regards,

    Justin

  • Hi Justin

    Can I use 3D people counting for a stance detection application using the regular IWR6843 antenna board, or do I have to use the ODS board?

    Thanks!

    Kai

  • Hi Kai,

    You can use the ISK (regular antenna) or ODS board.

    Regards,

    Justin