BEAGLE-3P-BBONE-AI: Missing scheme_generated.h file in tidlModelImport

Part Number: BEAGLE-3P-BBONE-AI
Other Parts Discussed in Thread: TEST2

Hi!

I want to import a TFLite model on my BBAI.

I cloned the tidl-utils git repository and installed the protobuf compiler, but when compiling tidlModelImport I get a fatal error:

compiling tidl_tfLiteImport.cpp 
tidl_tfLiteImport.cpp:76:30: fatal error: schema_generated.h: No such file or directory 
 #include "schema_generated.h" 

I checked, and there is no schema_generated.h file in the repository.

Did I miss a step in the process? Is there any other library I should have downloaded first?

Thanks in advance!

  • I don't know if it is OK, but I copied the schema_generated.h file from the TensorFlow package and it seems to run (at least I get a different error).
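
    (Side note, in case it helps: schema_generated.h can also be regenerated from the TensorFlow Lite schema with the FlatBuffers compiler, assuming flatc is installed; the schema path below is the usual location in the TensorFlow source tree.)

    flatc --cpp tensorflow/lite/schema/schema.fbs   # produces schema_generated.h in the current directory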

    My problem now is that it crashes when compiling caffeImport/caffe.pb.cc:

    arm-linux-gnueabihf-g++: internal compiler error: Killed (program cc1plus)

     

    I think my problem now is a memory issue, because when I run "dmesg" in the terminal it shows the following message:

    Out of memory: Kill process 9799 (cc1plus) score 446 or sacrifice child

    Killed process 9799 total-vm:303592kB, anon-rss:278380kB, file-rss:476kB, shmem-rss:0kB

     

    How can I solve this?
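
    (For reference, a common workaround for out-of-memory failures while compiling large generated files such as caffe.pb.cc is to add swap space before rebuilding; this is a general suggestion, not something verified on the BBAI:)

    sudo fallocate -l 1G /swapfile
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile

    Limiting the build to a single job (make -j1) also reduces peak memory use.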

  • Hi Alvaro, a question: did you follow the readme steps from tidl-utils/src/importTool?

    Also, for your use case, do you need to rebuild the import tool, or can you use prebuilt binaries?

    If prebuilt binaries are OK, let me know. I haven't worked on a Beagle-AI, but for AM57x you can find them at <PSDK>/linux-devkit/sysroot/x86_64-arago-linux/usr/bin/tidl_model_import.out

    thank you,

    Paula

  • Hi Paula!

    Thank you so much for your comment!

    I followed the steps from the readme, yes:

    • I set up the environment variables
    • Generated the .cc and .h files in caffeImport and tfImport (roughly the commands shown below)
    • Ran the makefile in tidlModelImport
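
    For reference, the generation step was roughly the following (run from the corresponding import folders; the exact .proto file names under tfImport may differ):

    cd caffeImport && protoc --cpp_out=. caffe.proto     # generates caffe.pb.cc and caffe.pb.h
    cd ../tfImport && protoc --cpp_out=. *.proto         # same idea for the TensorFlow protos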

    But I keep having problems with protoc, even though I am using a prebuilt binary (I got errors when I tried to compile it).

    Answering your question, I don't mind using prebuilt binaries as long as they allow me to import my TFLite model.

    If that is the case, would you mind telling me what I should do? I am taking a look at the PSDK you mentioned.

    Thanks a lot for your help! I really appreciate it

    Regards,

    Álvaro

  • Hi Alvaro, I am not an expert on the BB-AI Debian image (which I suspect is what you are using); maybe the TIDL import tool is there. But if you don't mind, you can use the prebuilt PC TIDL import tool binary from the AM57x PLSDK:

    http://software-dl.ti.com/processor-sdk-linux/esd/AM57X/latest/index_FDS.html

    thank you,

    Paula

  • Hi Paula, 

    Thank you for your reply.

    I downloaded the AM57xx Linux SDK Essentials, but can I execute the binary on my BBAI, or do I need a Linux host computer (which I don't have)?

    I am using the BBAI as a standalone device, connected to my keyboard and my screen.

    Thanks for your time, 

    Álvaro

  • Hi Alvaro, I guess your BB-AI is running the Debian image. If you have an SD card to spare (>32GB), you can have the Linux AM57x PSDK running on your BB-AI (it is one of the supported EVMs).

    From the PLSDK download link you can get the prebuilt image (am57xx-evm-linux-06.03.00.106.img.zip). There are steps on how to create the SD card in Windows.

    So, you can run the BB-AI with TI's Linux PSDK and use the TIDL import tool from there. Actually, it is in the system path and you can call tidl_model_import.out directly. More details: Linux PSDK User Guide -> TIDL -> 3.15.1.4.7. Import Process

    Ex: tidl_model_import.out ./test/testvecs/config/import/tidl_import_jseg21.txt

    If you are planning to have a Linux machine for development (maybe a VM) the recommended system is Ubuntu. Just FYI, my current Ubuntu version is 18.04.

    Thank you,

    Paula

     

  • Dear Paula, 

    I followed your instructions and ran the BBAI with TI's Linux PSDK (I bought a 32GB SD card). It launches an interface with different buttons/applications.

    It is true that tidl_model_import.out is in the system path, but the 'test' folder does not exist. Do I need to install something else?

    I found out that there is a Machine Learning folder but it seems that I cannot run any of the examples.

    I am using the BBAI with a standalone setup, so no host PC is connected. I would like to keep it that way, because I do not have a Linux PC/VM.

    So, what am I doing wrong? 

    I am finding it extremely difficult to use a simple machine learning model on the BBAI, and it shouldn't be. I thought this board was meant for deep learning applications.

    Thanks in advance

    Alvaro

  • Hi Paula, 

    Sorry for the spam, but I think it's better if I try to simplify my questions:

    1) Can I run TI's PSDK on my BBAI, or do I need a host PC?

    2) Is there any restriction in the standalone layout in comparison with the host PC layout?

    3) Can I run tidl_model_import.out or do I need to configure something first?

    4) If you remember, before this SDK approach I was trying to compile tidl-utils on my BBAI (Debian); is there any guideline that can help me in case I want to continue down this path?

    5) Or, in other words, is TI's PSDK the only way to run a TFLite model on my BBAI?

    Thank you for the support.

    I am confident that answering these questions will help resolve my issue.

    Álvaro

     

  • Hi Alvaro, my comments below

    1) Can I run TI's PSDK on my BBAI, or do I need a host PC?

    PC-- You can run the PSDK on the target; no need for a host PC. A host PC is desirable for development, though.

    2) Is there any restriction in the standalone layout in comparison with the host PC layout?

    PC-- From the TIDL tools point of view, not really; probably not all folders and configuration files are copied to the target's filesystem, but the main functionality is there.

    3) Can I run tidl_model_import.out or do I need to configure something first?

    PC-- You need to take care of modifying the TIDL import tool configuration files, especially the paths to files; see the example below. You can take a look at those config files and the folder structure from the TIDL git you were using before. I could be wrong, but I believe the "test" folder path is under /usr/share/ti/tidl/. Let me know if you find it. If not, I will find an AM57x and give it a try; unfortunately, I don't have a BB-AI board handy.
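
    For example (the absolute path below is an assumption about where the files land on the target), a relative entry in the import config such as

    inputNetFile = "./test/testvecs/config/..."

    may need to become an absolute one like

    inputNetFile = "/usr/share/ti/tidl/utils/test/testvecs/config/..."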

    4) If you remember, before this SDK approach, I was trying to compile tidl-utils on my BBAI (Debian); is there any guideline that can help me in case I want to continue down this path?

    PC-- Honestly, I haven't tried BBAI Debian. Not sure how to help there.

    5) Or, in other words, is TI's PSDK the only way to run a TFLite model on my BBAI?

    PC-- Not necessarily; you can use TI's PSDK to get your model imported (TIDL binaries) and then use those files in the Debian demos. TI's PSDK is a proposed workaround for a Linux PC for the TIDL import tool. Of course, you can also use TI's PSDK for inference, but I guess you would like to stay with Debian. Either way is OK.

    Thank you,

    Paula

  • Hi Paula, 

    Thank you so much for your time and for the detailed explanations. I really appreciate your help.

    Just a couple of things:

    - I confirm that the "test" folder for the tidl_model_import.out example is: /usr/share/ti/tidl/utils/test/testvecs/config/import

    - However, when I try to run any of the .txt files (e.g. tidl_import_jseg21.txt) I get the same error: "Couldn't open inputNetFile file: ./test/testvecs/config/caffe-jacinto-models/trained/image_segmentation/cityscapes5_jsegnet21v2/sparse/deploy.prototxt"

    When I try to run a different .txt, I get the same error pointing at a different file.

    Any clue of what could be the cause?

    Thanks!

    Álvaro

  • Hi Paula, 

    I managed to solve my previous error.

    Not only were some files missing, but also most of the paths in 'tidl_import_jseg21.txt' were wrongly pointing to a different folder within the package (I think someone should check that).

    When I run it, it starts working, but after a few seconds, I get the following error:

    TIOCL FATAL: Failed to open EVE message queue

    Any idea?

    Thank you!

    Álvaro

  • Hi Alvaro, most of the folders and paths are set up by default to work on a PC, so some care has to be taken when running on the target, as you discovered.
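
    One way to keep the default relative paths ("./test/...") working, assuming the test vectors are under /usr/share/ti/tidl/utils as you found, is to run the tool from that folder:

    cd /usr/share/ti/tidl/utils
    tidl_model_import.out ./test/testvecs/config/import/tidl_import_jseg21.txt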

    With respect to the error, could you send me the complete output log?

    thank you,

    Paula

  • Hi Paula, 

    I don't know if there is a log file in the TIDL package; I can't find any.

    I've attached a capture of my terminal after executing tidl_model_import.out. I hope it helps.

    Let me know if you need anything else, or where to find the log file to show you the error.

    Thank you, 

    Álvaro

  • Hi Alvaro, I will ask a colleague. In the meantime, could you please run the command below and send me the results? Also, please send me the example you are running, the commands you used, and, if possible, the config file.

    # cat /proc/cmem

    thank you,

    Paula 

  • Hi Paula, 

    - I downloaded the tflite model and saved it in the proper path:

    /usr/share/ti/tidl/examples/test/testvecs/config/tflite_models/inception_v3.tflite

    (I was testing a pre-trained TFLite model because the application we're developing is also based on a TFLite model)

    - I ran the following line:

    root@am57xx-evm:/usr/share/ti/tidl/examples# tidl_model_import.out ./../utils/test/testvecs/config/import/tflite/tidl_import_inception_v3_jpg.txt

    - The output after running the command "cat /proc/cmem":

    Block 0: Pool 0: 1 bufs size 0x18000000 (0x18000000 requested)

    Pool 0 busy bufs:

    id 0: phys addr 0xa0000000 (cached)

    Pool 0 free bufs:

    I've attached the config file.

    If you need anything else, please don't hesitate to ask me.

    Thank you!

    Álvaro

    tidl_import_inception_v3_jpg.txt
    # Default - 0
    randParams         = 0
    
    # 0: Caffe, 1: TensorFlow, 2: ONNX, 3: TensorFlow Lite, Default - 0
    modelType          = 3
    
    # 0: Fixed quantization by training framework, 1: Dynamic quantization by TIDL, Default - 1
    quantizationStyle  = 1
    
    # quantRoundAdd/100 will be added while rounding to integer, Default - 50
    quantRoundAdd      = 50
    
    numParamBits       = 8
    
    inputNetFile       = "./test/testvecs/config/tflite_models/inception_v3.tflite"
    inputParamsFile    = "NA"
    outputNetFile      = "./test/testvecs/config/tidl_models/tflite/tidl_net_tflite_inception_v3.bin"
    outputParamsFile   = "./test/testvecs/config/tidl_models/tflite/tidl_param_tflite_inception_v3.bin"
    
    inWidth  = 299
    inHeight = 299
    inNumChannels = 3
    preProcType = 2
    sampleInData = "./test/testvecs/input/airshow.jpg"
    tidlStatsTool = "eve_test_dl_algo_ref.out"
    

  • Hi Alvaro, thank you! Let me check what you shared and come back to you. In the meantime, is it possible for you to share inception_v3.tflite with me? If so, I could try importing it on my PC and on my AM57x to see if I face any issues.

    thank you,

    Paula

  • Hi Paula, 

    The file is too big to attach. 

    I downloaded it from this link

    I hope it helps! 

    Álvaro

  • Hi Alvaro, I was able to reproduce the error (TIOCL FATAL: Failed to open EVE message queue) on Linux PSDK 6.3; however, I see the binaries are created.

    I will ask internally whether this message can be ignored, as I believe it is coming from the TIDL stats tool and not from the import tool per se. As a quick test, I commented out the line below inside the configuration, and I don't get the error. 

    root@am57xx-evm:/usr/share/ti/tidl/utils/test/testvecs/config/import/tflite# vi tidl_import_inception_v3_jpg.txt

    #tidlStatsTool = "eve_test_dl_algo_ref.out"

    I was wondering if you could give the output binaries a try and see if they work for your application?


    root@am57xx-evm:/usr/share/ti/tidl/utils/test/testvecs/config/tidl_models/tflite# ls -l
    -rw-r--r--    1 root     root        484384 May 26 14:17 tidl_net_tflite_inception_v3.bin
    -rw-r--r--    1 root     root      23837618 May 26 14:17 tidl_param_tflite_inception_v3.bin

    Thank you,

    Paula

  • Hi Paula, 

    I commented out the line you mentioned, but it still crashes:

    I will check the binaries, as you proposed, and see if it's enough to run the model.

    Thank you!

    Álvaro

  • Hi Alvaro, the message is OK; since we are commenting out tidlStatsTool, it is expected. Let me know how your test of the binaries goes =)

    Paula 

  • Hi Paula, 

    I am trying to run the binaries, but I think I need to run eve_test_dl_algo.out first. Otherwise, I don't know how to run an imported deep learning model.

    I'm following the steps described here. So, correct me if I'm wrong, but I need to create two .txt files:

    - tidl_config_networkName.txt: it includes the paths of the binaries obtained with tidl_model_import.out. There are some examples in a folder called "infer".

    - config_list.txt: it includes the path of the previous text file. 

    Since there is no config_list.txt file anywhere in the folder tree of the package, I guess that's why I got the "TIOCL FATAL: Failed to open EVE message queue" error.

    Does it make sense?
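
    Roughly, I would expect something like the following (the parameter names are my guess based on the examples in the "infer" folder, so they may be off):

    # tidl_config_inception_v3.txt (hypothetical)
    netBinFile    = "tidl_net_tflite_inception_v3.bin"
    paramsBinFile = "tidl_param_tflite_inception_v3.bin"

    # config_list.txt (hypothetical): one "1 <config file>" line per test, terminated by 0
    1 tidl_config_inception_v3.txt
    0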

    Thank you!

  • Hi Alvaro, I will connect my board later in the afternoon to give you better guidance. But, for now, a few comments.

    • You don't need to run eve_test_dl_algo.out for inference. This is a testing tool we use for a sanity check after importing, before running/testing the binaries inside an application. So, useful, but not required. 
    • As you pointed out, you need a configuration file for inference. There should be some examples for TFLite in the PLSDK under /usr/share/ti/tidl that you can use. I will check this later.
    • With respect to config_list.txt, I need to double-check; I guess it depends a bit on how you are going to test it. If you are going to swap this model in place of another one in an already-working application (for example, the tidl_classification demo), you won't need it, or if you have any other, simpler test application to use.

    Maybe it has changed a bit in the latest Linux PSDK version, but, as a reference, see the figure from an older TI design document (www.ti.com/.../tidueb6):

    [figure from the design document; not reproduced here]

    So, maybe we can try to run some of the OOB demos in the Linux PSDK (classification/segmentation/SSD), and then, when you get more familiar, modify the application to use your new inception_v3 binaries instead of JacintoNet11. Or, if you have another application that you can use as a baseline on the Debian image, that should also be OK, or a simpler test application. 

    Thank you,

    Paula

  • Hi Paula, 

    Thank you again for your detailed response.

    However, I tried running the demo exactly as you put it:

    ./tidl_classification -n 2 -t d -l ./imagenet.txt -s ./classlist.txt -i ./clips/test1.mp4 -c ./stream_config_j11_v2.txt

    and I keep getting the same TIOCL FATAL: Failed to open EVE message queue.

    Even when I run ./tidl_classification -h I get the same error.

    What could be the reason? 

    I tried running other demo examples and they all crash showing the same error.

    If you remember, when running eve_test_dl_algo.out I got the same error and I solved it by creating the config_list.txt I mentioned in a previous post. Is it possible that there is also a missing file somewhere?

    Thank you

    Álvaro

  • Hi Alvaro, you are absolutely right, there is an issue with running the TIDL demos and tools on PLSDK 6.3. I was able to reproduce it. I will contact our PSDK team to better understand this issue and how it wasn't caught during our automated testing.

    As I mentioned before, I had another SD card with Linux PSDK 5.x, which works OK. For your reference, what I did was:

    • Connect and Display via HDMI
    • cd /usr/share/ti/tidl/examples/classification
    • /etc/init.d/weston start
    • ./tidl_classification -g 1 -d 2 -e 0 -l ./imagenet.txt -s ./classlist.txt -i ./clips/test2.mp4 -c ./stream_config_j11_v2.txt

    More details:

    The oldest PLSDK version that supports the BB-AI as one of its platforms is 6.2. I will give it a quick try and let you know if the OOB demos work for me.

    Thank you,

    Paula

  • Hi Alvaro, after checking with our OpenCL expert, it seems the issue on Linux PSDK 6.3 and 6.2 is related to how we created the SD card. If we create the SD card on a Linux PC by running the "create-sdcard.sh" script, the script takes care of copying the correct IPU1 firmware onto the SD card.

    To fix this mismatch, could you please try the following?

    • cp /lib/firmware/dra7-ipu1-fw.xem4.opencl-monitor /run/media/mmcblk0p1/dra7-ipu1-fw.xem4
    • reboot
    • After rebooting and root login run: /etc/init.d/weston start
    • cd /usr/share/ti/tidl/examples/classification
    • ./tidl_classification -g 1 -d 2 -e 0 -l ./imagenet.txt -s ./classlist.txt -i ./clips/test2.mp4 -c ./stream_config_j11_v2.txt

    Please let me know if this works; this should also fix the TIDL tools issues you previously experienced.
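
    (As an optional sanity check, and just one way to do it, you can verify that the firmware on the boot partition now matches the OpenCL monitor binary:)

    md5sum /lib/firmware/dra7-ipu1-fw.xem4.opencl-monitor /run/media/mmcblk0p1/dra7-ipu1-fw.xem4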

    thank you,

    Paula

     

  • Hi Paula, 

    Ok. I'll give it a try and I'll tell you if this solves it (fingers crossed).

    Thank you so much!

    Álvaro

  • Hi Paula, 

    It worked!! Finally!

    You were right, now I can run every example.

    Next steps will be importing and running a customized model on the BBAI, but I think we can close this thread.

    Thank you so much!!

    Álvaro