Dear Sir
Linux SDK: ti-processor-sdk-linux-j7-evm-08_02_00_03
I am trying to encode captured camera data (YUV/RGB) into an H.264 stream using ffmpeg. To save CPU usage, I plan to use the TDA4VM hardware video encoder. Could you instruct me on how to run a simple test of it with the ffmpeg tool?
Thank you~
Jason
Hi Jason,
Do you have a preference for using ffmpeg? There is a vision app that is already written which accomplishes what you are asking. Furthermore, there are quite simple GStreamer pipelines that also accomplish this. I'd be happy to share those with you if you are interested.
I can also look into ffmpeg solutions if you are determined to choose this route.
Best,
Brandon
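As a starting point, here is a minimal GStreamer pipeline sketch. It assumes the SDK exposes the hardware H.264 encoder through the standard `v4l2h264enc` element and that the camera enumerates as `/dev/video2` — both are assumptions, so adjust the device node, resolution, and caps to match your setup:

```shell
# Sketch: capture NV12 frames from a V4L2 camera and encode them with the
# hardware H.264 encoder (v4l2h264enc), writing an elementary .h264 stream.
# /dev/video2 and the caps below are assumptions; adjust to your camera.
gst-launch-1.0 v4l2src device=/dev/video2 ! \
    video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! \
    v4l2h264enc ! h264parse ! filesink location=/tmp/out.h264
```

You can confirm whether the encoder element is available on the target with `gst-inspect-1.0 v4l2h264enc` before running the pipeline.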
Hi Brandon
Thanks for your attention to this issue.
ffmpeg is not the final choice; we are currently evaluating it.
If you can share the vision app and the GStreamer pipelines you mentioned above, along with their usage, that would be great. We will test them once you share them.
Best regards,
Jason
Hi Jason,
Here is the vision apps link that covers the application with the same functionality you are looking for: https://software-dl.ti.com/jacinto7/esd/processor-sdk-rtos-jacinto7/latest/exports/docs/vision_apps/docs/user_guide/group_apps_basic_demos_app_multi_cam_codec.html.
Best,
Brandon
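On the ffmpeg side, if the SDK exposes the hardware encoder through the V4L2 memory-to-memory interface, ffmpeg's generic `h264_v4l2m2m` encoder may be able to drive it. Whether the TDA4VM encoder driver works with this ffmpeg wrapper is an assumption to verify on the target; the device node and bitrate below are placeholders:

```shell
# Sketch, assuming the hardware encoder is visible to ffmpeg as a V4L2
# mem2mem device; check availability first with:
#   ffmpeg -encoders | grep v4l2m2m
# /dev/video2, the resolution, and the 4 Mbit/s bitrate are assumptions.
ffmpeg -f v4l2 -input_format nv12 -video_size 1920x1080 -framerate 30 \
    -i /dev/video2 -c:v h264_v4l2m2m -b:v 4M /tmp/out.h264
```

If `h264_v4l2m2m` does not appear in the encoder list of the SDK's ffmpeg build, the GStreamer route is likely the lower-effort path to the hardware encoder.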