Hello Team,
I have a project to work on. For the People Counting lab/demo, can you please provide details on how I can collect post-processed data from the radar?
Thank you.
Fahad
Hi Fahad,
Exactly what data are you trying to collect? The People Counting demo outputs point cloud and tracking data over UART. If this is the data you are interested in collecting, please see this thread on the same topic: https://e2e.ti.com/support/sensors-group/sensors/f/sensors-forum/1130339/awr1642boost-ods-how-to-capture-the-point-cloud-data-using-matlab-or-python/4193798
Best Regards,
Josh
Hi Josh,
Thank you for your response. I want to collect both raw radar data and post-processed data, but post-processed data is preferable. I have a DCA1000 and mmWave Studio and know them very well.
Thank you
Fahad
Hi Josh,
I went through the thread and I'm interested in the option below. Can you please explain it to me a bit more?
"
1. Capture the output UART data stream from the device, which can then be parsed to get the output data.
I typically recommend option 1, as it is easier and it ensures that you capture ALL output data, in case you change your mind on which pieces of the output that you want. If you want to do option 1, you can simply connect to the device via a serial terminal and enable logging on the data port (ensure it is logging using binary data format). I recommend using Teraterm for this, but i'm sure other serial terminal programs offer similar functionality.
"
Kind regards,
Fahad
Hi Fahad,
Which part is unclear? I can provide more detail if needed.
You'll first want to flash the device with the people counting demo binary. The instructions for this can be found in the Flash the EVM section of the 3D People Counting Demo User Guide.
If you are on a Windows operating system, Tera Term is a good choice for saving the UART output, and it can also be used to send the configuration file (.cfg) to the device. If you're not on Windows, other serial terminal programs can do the same. The steps are:
1. Open two instances of Tera Term: one for the config port (listed in Device Manager as the "Enhanced COM Port") and another for the data port ("Standard COM Port" in Device Manager).
2. Set the correct baud rate for each serial terminal: 115200 for the configuration terminal and 921600 for the data terminal.
3. Since you want to save the output UART data, make sure you enable logging on the data terminal (binary format).
4. From the configuration terminal, send the configuration file and the device will start running the demo. Use the configuration file (.cfg) located at: <INDUSTRIAL_TOOLBOX_INSTALL>/labs/People_Counting/3D_People_Counting/chirp_configs/
The resulting log of UART data can then be run through a parsing script. If you are looking for examples on how to parse the different TLVs output from the device, I would recommend looking at the source code for the Industrial Visualizer which is located at: <INDUSTRIAL_TOOLBOX_INSTALL>/tools/Visualizer
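If it helps, here is a rough, untested Python sketch of how the saved binary log could be split into frames by searching for the magic word. The magic word value below is the standard one used by the mmWave demo output, but please confirm it (and the rest of the frame format) in the People Counting user guide, and the log file name is just a placeholder:

# Rough sketch: split a binary UART log (saved from Tera Term) into frames.
# Assumption to verify in the demo user guide: the stream uses the standard
# mmWave magic word 0x0102 0x0304 0x0506 0x0708 (little-endian 16-bit words).
import struct

MAGIC_WORD = struct.pack('<4H', 0x0102, 0x0304, 0x0506, 0x0708)

def split_frames(log_path):
    """Return a list of byte strings, one per frame (magic word included)."""
    with open(log_path, 'rb') as f:
        data = f.read()
    starts = []
    idx = data.find(MAGIC_WORD)
    while idx != -1:
        starts.append(idx)
        idx = data.find(MAGIC_WORD, idx + len(MAGIC_WORD))
    # Each frame runs from one magic word to the next; the last runs to end of file.
    return [data[s:e] for s, e in zip(starts, starts[1:] + [len(data)])]

if __name__ == '__main__':
    frames = split_frames('teraterm.log')  # placeholder name for the Tera Term binary log
    print('found', len(frames), 'frames')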
I hope this helps!
Best Regards,
Josh
Hi Josh,
Thank you so much for helping me.
> First, I have never used the Tera Term software, so I would like to know more about it.
> How can I send the .cfg file to the device through Tera Term or through the People Counting GUI?
> What do you mean by two instances of Tera Term? Please explain.
> How can I send the .cfg file to Tera Term?
> Where is the parsing script? I'm sharing a figure of the folder you listed, but I can't find it there.
Thank you and Kind regards,
Fahad
Hi Fahad,
What do you mean by two instances of Tera Term? Please explain.
I mean one Tera Term window which is used to connect to the config port and a separate Tera Term window which is used to connect to the data port. The first is needed to send the .cfg file to the device. The second is needed if you wish to log the output data.
How can I send the .cfg file to the device through Tera Term or through the People Counting GUI?
Both methods require the EVM to be connected to the PC via USB and the device to be in functional mode (see the Industrial Toolbox for setting the device in functional mode).
For sending the .cfg file through Tera Term:
For sending the .cfg file via People Counting GUI:
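If you prefer scripting it instead of using Tera Term or the GUI, a rough pyserial sketch along these lines could also send the configuration. The COM port name and .cfg file name are placeholders, and 115200 baud matches the config-port setting mentioned above:

# Rough sketch: send a .cfg file line by line to the configuration (Enhanced) COM port.
# 'COM4' and 'people_counting.cfg' are placeholders; 115200 baud matches the config port.
import time
import serial  # pip install pyserial

def send_cfg(port, cfg_path):
    with serial.Serial(port, 115200, timeout=1) as cli, open(cfg_path) as cfg:
        for line in cfg:
            line = line.strip()
            if not line or line.startswith('%'):   # skip blank lines and comments
                continue
            cli.write((line + '\n').encode())
            time.sleep(0.05)                       # give the device time to respond
            print(cli.read(cli.in_waiting or 1).decode(errors='ignore'))

if __name__ == '__main__':
    send_cfg('COM4', 'people_counting.cfg')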
Where is the parsing script? I'm sharing a figure of the folder you listed, but I can't find it there.
The visualizer code was moved with the latest release of the Industrial Toolbox (4.12). It appears you are on the previous release (4.11). For that version, the visualizer source code can be found in <INDUSTRIAL_TOOLBOX_INSTALL>/labs/People_Counting/visualizer
Best Regards,
Josh
Hi Josh,
Thank you very much for helping me out. So this is the way I get the point cloud.
> Actually, my main task is to recognize human activity (walking, sitting, etc.). Will the People Counting lab you suggested fulfill my needs, or should I try another lab/demo? My main focus is on sensing and localization, so is there a specific lab/demo for that?
Kind regards,
Fahad
Hi Josh,
I got the answer to what I wanted to ask; just tell me one more thing.
> If I'm doing the Vital Signs with People Tracking lab/demo, can I get post-processed data from it using the same steps as mentioned?
> Please also explain the steps for how I can parse the data using the visualizer.
Kind regards,
Fahad
Hello Josh,
I'm still waiting for your reply. Please help with my problem as soon as you can.
Thank you
Fahad
Hi Fahad,
I apologize for the late response.
If I'm doing the Vital Signs with People Tracking lab/demo, can I get post-processed data from it using the same steps as mentioned?
Yes, the same steps can be used to collect data from the Vital Signs with People Tracking demo. However, that demo uses its own custom GUI, for which we do not provide the source code. The TLV structure is described in the Vital Signs with People Tracking User Guide.
Please also explain the steps for how I can parse the data using the visualizer.
The visualizer source code can be used as a reference to create your own parsing script which you can use to save the output data. Look at the file oob_parser.py. This file does the TLV parsing for various demos. The parsing for the 3D People Counting Demo is done in the function called Capon3DHeader().
Best Regards,
Josh
Hi Josh,
Thank you for your help.
The visualizer source code can be used as a reference to create your own parsing script which you can use to save the output data. Look at the file oob_parser.py. This file does the TLV parsing for various demos. The parsing for the 3D People Counting Demo is done in the function called Capon3DHeader().
> Can you please explain it in more detail for me? How can I do the parsing in that function? If possible, please share the relevant lines of the script with details so that I'm able to understand.
> And I'm sharing a screenshot; I couldn't find oob_parser.py there.
Thank you!
Fahad
Hi Josh,
The resulting log of UART data can then be run through a parsing script. If you are looking for examples on how to parse the different TLVs output from the device, I would recommend looking at the source code for the Industrial Visualizer which is located at: <INDUSTRIAL_TOOLBOX_INSTALL>/tools/Visualizer
One other point I want to add as we discuss getting the output data: you told me to follow the steps above for Tera Term to get the .bin file, but how can that .bin/log file then be run through a parsing script?
I hope you understand my point.
Thank you
Fahad
Hi Josh,
I'm still waiting for your response. Please reply as soon as possible; I'm stuck in my work.
Kind regards,
Fahad
Hi Josh,
I hope you are doing well. I'm still waiting for your response; please help me in this regard.
Kind regards,
Fahad
Hi Fahad,
And I'm sharing a screenshot; I couldn't find oob_parser.py there.
I was giving the path to the visualizer/GUI in Industrial Toolbox 4.11 as that is what you were using before. It appears you have now updated to Industrial Toolbox 4.12. In ITB 4.12 the visualizer source code is located at <TOOLBOX_INSTALL>\tools\Visualizer and the parsing is done in gui_parser.py.
Can you please explain it in more detail for me? How can I do the parsing in that function? If possible, please share the relevant lines of the script with details so that I'm able to understand.
The GUI connects to the serial ports, then it reads the UART data stream looking for a magic word and handles the data for that frame accordingly (this is done in the function readAndParseUart()). Since you only need to parse saved UART data, you must import the saved data and similarly read through it looking for the magic word. Once the data is parsed, you may save it in whichever format you wish (CSV, txt file, etc.), and there should be examples online of how to do this.
One other point I want to add as we discuss getting the output data: you told me to follow the steps above for Tera Term to get the .bin file, but how can that .bin/log file then be run through a parsing script?
Explained above.
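To make that a bit more concrete, here is a rough sketch of how you might step through the TLVs within one frame once you have found the magic word and skipped the frame header. HEADER_LEN is a placeholder; check the frame-header size, and whether the TLV length field includes its own 8-byte header, in the demo's user guide:

# Rough sketch: iterate over the TLVs inside one frame of saved UART data.
# HEADER_LEN is a placeholder for the frame-header size documented in the user guide.
# Also verify whether the TLV 'length' field includes the 8-byte TLV header itself.
import struct

HEADER_LEN = 48  # placeholder value; use the size from the demo documentation

def walk_tlvs(frame_bytes):
    """Yield (tlv_type, payload_bytes) for each TLV in one frame."""
    offset = HEADER_LEN
    while offset + 8 <= len(frame_bytes):
        tlv_type, tlv_len = struct.unpack_from('<II', frame_bytes, offset)
        yield tlv_type, frame_bytes[offset + 8:offset + 8 + tlv_len]
        offset += 8 + tlv_len  # adjust if tlv_len already counts the TLV header

# usage: pass one frame (e.g. sliced out of the log by searching for the magic word)
# for tlv_type, payload in walk_tlvs(frame_bytes):
#     print(tlv_type, len(payload))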
Regards,
Josh
Hi Josh,
The GUI connects to the serial ports, then it reads the UART data stream looking for a magic word and handles the data for that frame accordingly (this is done in the function readAndParseUart()). Since you only need to parse saved UART data, you must import the saved data and similarly read through it looking for the magic word. Once the data is parsed, you may save it in whichever format you wish (CSV, txt file, etc.), and there should be examples online of how to do this.
> So do I start the GUI directly by clicking the mmWave Industrial Visualizer, or should I run gui_main.py?
> Actually, I'm still struggling with the point "Since you only need to parse saved UART data..." and how I can import it. Please explain these lines in detail for me,
and also share the online examples with me.
> At the beginning you told me about Tera Term, but now there is no discussion of it.
The resulting log of UART data can then be run through a parsing script. If you are looking for examples on how to parse the different TLVs output from the device, I would recommend looking at the source code for the Industrial Visualizer which is located at: <INDUSTRIAL_TOOLBOX_INSTALL>/tools/Visualizer
> Please explain this point as well; I'm confused now because the two things are mixed together.
By the way, thank you so much for your help.
Kind regards,
Fahad
Hi Fahad,
So do I start the GUI directly by clicking the mmWave Industrial Visualizer, or should I run gui_main.py?
This is how you would start the GUI if you are using it to visualize data coming out of the device. Keep in mind that simply running the GUI as provided will not allow you to save post-processed data. I am only pointing you to the visualizer code as a reference, so you can see how we parse the TLVs and implement the same parsing in your own script.
Actually, I'm still struggling with the point "Since you only need to parse saved UART data..." and how I can import it. Please explain these lines in detail for me,
and also share the online examples with me.
It would be outside the scope of this forum for me to give you details on the specific lines in our code which must be modified in order to do the task you wish. To see the examples that I refer to you can simply search online for "save data to csv in python."
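That said, the saving part itself only needs Python's built-in csv module; a minimal example (the column names are just illustrative):

# Rough sketch: write parsed point-cloud rows to a CSV file (column names are illustrative).
import csv

def save_points_csv(path, points):
    """points: iterable of (frame_number, x, y, z, doppler) tuples."""
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['frame', 'x_m', 'y_m', 'z_m', 'doppler_mps'])
        writer.writerows(points)

save_points_csv('points.csv', [(1, 0.5, 1.2, 0.1, 0.0)])  # example row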
At the beginning you told me about Tera Term, but now there is no discussion of it.
Please explain this point as well; I'm confused now because the two things are mixed together.
Tera Term would only be used to save the stream of UART data. This is because you mentioned you would rather save the full stream of UART data and then subsequently run that data through a parsing script which parses and saves the post-processed data.
Alternatively, you could use the script that you write to connect to the radar device (similar to the visualizer) and parse the stream of UART data live, saving the parsed data however you wish.
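For the live option, the general idea looks something like this rough sketch: the COM port name is a placeholder, 921600 baud matches the data port, and the same caveat applies about confirming the magic word and frame format in the user guide:

# Rough sketch: read the data (Standard) COM port live and cut the byte stream into frames.
# 'COM5' is a placeholder port name; confirm the magic word in the demo user guide.
import struct
import serial  # pip install pyserial

MAGIC_WORD = struct.pack('<4H', 0x0102, 0x0304, 0x0506, 0x0708)

def stream_frames(port):
    """Yield raw frame bytes (magic word included) as they arrive on the data port."""
    buf = bytearray()
    with serial.Serial(port, 921600, timeout=0.1) as data_port:
        while True:
            buf += data_port.read(4096)
            while True:
                start = buf.find(MAGIC_WORD)
                nxt = buf.find(MAGIC_WORD, start + len(MAGIC_WORD)) if start != -1 else -1
                if nxt == -1:
                    break
                yield bytes(buf[start:nxt])
                del buf[:nxt]

# usage (e.g. parse each frame's TLVs and append the results to a CSV file):
# for frame in stream_frames('COM5'):
#     ...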
Regards,
Josh
Hi Josh,
OK, I understand fully now. Thank you very much, but I still have a little confusion.
Tera Term would only be used to save the stream of UART data. This is because you mentioned you would rather save the full stream of UART data and then subsequently run that data through a parsing script which parses and saves the post-processed data.
Alternatively, you could use the script that you write to connect to the radar device (similar to the visualizer) and parse the stream of UART data live, saving the parsed data however you wish.
> I understand now that Tera Term will be used to save the stream of UART data, right?
> Can you please help me with how to import this saved data into the parser script, as I didn't find any import code in gui_parser.py?
> The data from Tera Term will be the post-processed data, right?
> I just need to parse it in order to make it readable and understandable, right?
I hope you understand what I want to do.
Kind regards,
Fahad
Hi Fahad,
I understand now that Tera Term will be used to save the stream of UART data, right?
This is correct. Tera Term or other terminal emulator software (PuTTY, etc.) can be used for this.
Can you please help me with how to import this saved data into the parser script, as I didn't find any import code in gui_parser.py?
I apologize for the unnecessary confusion. I was previously unaware, but we do provide example scripts in the SDK which offer the functionality that you are looking for; however, they are only compatible with the Out-of-Box demo. You can modify these scripts so that one of the other demos is supported (People Counting, Vital Signs, etc.) by adding the parsing for the TLVs that are specific to those demos.
Please find these scripts at the following path: <SDK_INSTALL>/packages/ti/demo/parser_scripts/
The data from Tera Term will be the post-processed data, right?
Yes.
Best Regards,
Josh
Hi Josh,
Thank you very much for your support and help.
So it means that the log/.bin file from Tera Term will be parsed using the parser script from the SDK at the given location, right?
But then I have a question: what about the People Counting scripts and parser? Can I import data into gui_parser.py to get the TLVs out of it and make it readable?
Kind regards,
Fahad
Hi Fahad,
So it means that the log/.bin file from Tera Term will be parsed using the parser script from the SDK at the given location, right?
Yes, this is correct. However, that script is only compatible with the Out-of-Box demo, i.e. it will not recognize the TLVs that are output as part of our other demos (People Counting, Vital Signs). You will need to modify these scripts to handle the data output from the demo that you run.
For example, if you want to run the People Counting demo, you would want to add support to those scripts for parsing the Target List and Target Index TLVs, which are custom to the People Counting demo. General information about these TLVs can be found in the demo's user guide; however, this parsing is already implemented in the mmWave Industrial GUI, which is why I directed you to use that code as a reference.
Regards,
Josh
Hi Josh,
Thank you very much; it helps me a lot.
Actually, I tried the People Counting lab code but got confused because the script was too large, and I also looked at the ITB 4.12 visualizer but didn't find anything useful. That's why I asked you, since you know which lines of the script should be modified or used for saving the UART data or importing it into the parser; that would make it easy for me to modify the parser script in the SDK for the People Counting lab.
> Also, if possible, please point me to any line of the script where a print command is used to show the UART data in the Python output/terminal; maybe then we could save that data.
Kind regards,
Fahad
Hi Fahad,
The parsing code for the Target List and Target Index TLVs are located in <ITB_4_12_INSTALL>/tools/visualizer/parseTLVs.py, in the functions parseTrackTLV() and parseTargetIndexTLV(). You can add the same parsing code to the SDK example parsing script to add support for the People Counting demo.
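As a starting point, something like the rough sketch below could be added to the SDK parsing script for the Target List TLV. The exact field layout here is my assumption; please double-check it against parseTrackTLV() and the People Counting user guide, since the target structure has changed between releases:

# Rough sketch: unpack a Target List TLV payload into per-target dictionaries.
# The struct layout below is an assumption; verify it against parseTrackTLV()
# and the demo user guide (e.g. the size of the error-covariance block differs
# between releases).
import struct

TARGET_FMT = '<I9f16f2f'                   # tid, pos/vel/acc (9 floats), ec[16], g, confidence
TARGET_SIZE = struct.calcsize(TARGET_FMT)  # 112 bytes under this assumption

def parse_target_list(payload):
    targets = []
    for off in range(0, len(payload) - TARGET_SIZE + 1, TARGET_SIZE):
        vals = struct.unpack_from(TARGET_FMT, payload, off)
        targets.append({
            'tid': vals[0],
            'pos': vals[1:4],   # x, y, z (m)
            'vel': vals[4:7],   # vx, vy, vz (m/s)
            'acc': vals[7:10],  # ax, ay, az (m/s^2)
            # vals[10:26] = error covariance, vals[26] = gating gain, vals[27] = confidence
        })
    return targets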
Regards,
Josh
Hi Josh,
Thank you very much for your help; I will try that and check.
In the meantime, can you please provide me with any documentation explaining the point cloud details (x, y, z, and v)?
Also, please tell me what the difference is between 3D People Counting and the mmWave Demo Visualizer, as both of them can be used to obtain point cloud data for sensing?
Thanks and regards,
Fahad
Hi Fahad,
You can find documentation on the TLV output formats in the Industrial toolbox linked here. The difference between the 3D people counting visualizer and the mmwave demo visualizer is what labs they support. The 3D people counting visualizer supports the 3D people counting labs, small object detection, vital signs, out-of-box demo and a few more. The demo visualizer only supports the out-of-box demo.
Best,
Nate
Hi Nate,
The difference between the 3D people counting visualizer and the mmwave demo visualizer is what labs they support. The 3D people counting visualizer supports the 3D people counting labs, small object detection, vital signs, out-of-box demo and a few more. The demo visualizer only supports the out-of-box demo
No, I didn't mean that. My question is:
> Which one is better for obtaining point cloud data? I want to use it for sensing and localization of different human positions (standing, sitting, walking, etc.), so which is better for finding the angle, distance, range, and point cloud data: the mmWave Demo Visualizer lab or the 3D People Counting lab/demo?
Kind regards,
Fahad
Hi Fahad,
3D People Counting is better for this. It does additional computation for a denser point cloud and more precise beamforming.
Best,
Nate
Hi Nate,
Thank you so much for your help. But please also help me with collecting raw data and processed data, as I am still struggling: I collected data through Tera Term but didn't get any result from it with the parsing script.
Kind regards,
Fahad
Hi Fahad,
We will handle that in the other thread. Closing this one now.
Best,
Nate