Regarding the next EZSDK 5.03 release and the GStreamer lib

Greetings,

 

We need to build an HDMI video capture -> H.264 encoding -> video streaming via RTP chain on the EVM816x.

After reading the following two links and getting the TI GStreamer library for the DM816x, I have become more confused.

           Link 1: http://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/716/p/138035/514972.aspx#514972

           Link 2: http://processors.wiki.ti.com/index.php/DM81xx_Gstreamer_Plugin

I hope a TI employee can clarify some of my confusion. Any other input is highly appreciated and welcome.

--1. Where is omx_ctrl set in the gst-omx lib?

      All the OpenMAX components will be encapsulated in the new gst-omx lib, am I right?

      However, after going through the whole gst-omx lib, I cannot find anything similar to "omx_ctrl" from the OpenMAX lib, which is required for TVP video capture.

 

--2. Should I modify the old examples, like the video-capture example from EZSDK 5.02, to use the new gst-omx lib?

     Or, in the next EZSDK 5.03, will all the previous examples like video capture-encode and decode-display be rewritten with the new gst-omx lib?

      I understand GStreamer will need V4L2 to do the video capture, while OMX currently uses omx_ctrl for it.

      If I want to build the whole chain I mentioned above, what should I do right now?

 

--3. Will a C code example of video streaming over RTP be provided in EZSDK 5.03?

     And what is the release date of the next SDK 5.03? This Friday, or the middle of November?

 

BR,

 

Jun

 

 

 

 

 

  • We have not tested any capture component with GStreamer yet.

    There is a plan to support V4L2 in the near future, where camera and component inputs would be supported.

    How do you plan on using HDMI? Do you have your own board, or are you using the TI EVM?

    From what I know, the TI EVM has only component and DVI inputs.

  • Hi Prashant

     

    Thank you very much for your reply. For the HDMI, we are going to use our own board.

    I have been told that the V4L2 driver will not be released until December.

    However, in theory, if GStreamer encapsulates every OMX component, it could also do the video capture even without V4L2, right?

     

    Could you please give me some info on whether the old OMX examples like "video capture" will be rewritten with the new gstreamer-omx lib?

     

     

     

  • Hi Jun,

    Right now there is no plugin that uses the omx capture component.

    We plan to use V4L2 for capture, and this would be available at the end of December.

    However, we have learned that one of our 3Ps (third parties) has developed plugins for OMX capture. We will have a discussion with them, and by the end of this week we will know exactly what has been done and how it can be shared with the community. I shall keep you posted on this.

     

    And the old OMX examples will remain as they are; they demonstrate the usage of the OpenMAX APIs (in case someone is looking to use them directly).

     

    -Prashant.

     

     

  • Hi prashant,

    I guess you might have worked on V4L2 for capture.

    I am struggling a lot to capture video from a webcam (Logitech C170) and audio from a mic, and to save the output into an AVI container.

    If you were successful with your video recording, can you please help me out?

    Thanks in advance

    Baz

  • Hello Baz,

    Could you provide more details, please?

    What is the software release that you are using here?

    For capturing, are you using GStreamer?

    What is your video source, and how is it connected?

    BR

    Margarita

  • Hi Margarita,

    I am really glad for your help.

    Well, my aim is to capture the video and audio and save them into an AVI container.

    I am using Ubuntu 10.04 with Linux 2.6.32-38-generic on i686 GNU/Linux and a Logitech C170 webcam.

    My target is to write C code with two threads, one for video capture and the other for audio capture (synchronization may not be perfect). I was able to capture the audio coming from the mic using the GStreamer APIs. I don't know how to capture the video and place both audio and video into a single file. I have been restricted from using any command-line tools like MPlayer or ffmpeg.

    I am trying to use gst_element_factory_make in order to create the video capture elements. Unfortunately, I see an error when I run the program: "One element could not be created. Exiting".

       encoder = gst_element_factory_make ("ffenc_mpeg4","mpeg-decoder");

    This specific line gives me that error.

    The video recording code has been attached. Please help me!

  • Hello,

    basavaraju Gandla said:
       encoder = gst_element_factory_make ("ffenc_mpeg4","mpeg-decoder");

       source = gst_element_factory_make ("v4l2src", "file-source");
     
       vrate = gst_element_factory_make ("videorate", "video-rate");
     
       filter = gst_element_factory_make ("capsfilter", "filter");
     
       conv = gst_element_factory_make ("ffmpegcolorspace","converter");
     
       encoder = gst_element_factory_make ("ffenc_mpeg4","mpeg-decoder");
     
       sink = gst_element_factory_make ("udpsink","audio-output");

    You are probably seeing the error because one of the elements before it is not correct. Please check the caps that you are passing. Also, if you wish to be sure the pipeline itself is correct, just execute it with gst-launch, without your program:

    gst-launch v4l2src ..... ! .....
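
    For example, a quick way to see which element is actually missing (a minimal sketch, assuming GStreamer 0.10 and the element names from your code) is to check every gst_element_factory_make() result and print the factory name before giving up:

    #include <gst/gst.h>

    /* Create an element and abort with the factory name if it cannot be created. */
    static GstElement *make_or_die(const char *factory, const char *name)
    {
      GstElement *e = gst_element_factory_make(factory, name);
      if (e == NULL)
        g_error("Element '%s' could not be created - is its plugin installed?", factory);
      return e;
    }

    int main(int argc, char *argv[])
    {
      gst_init(&argc, &argv);

      GstElement *source  = make_or_die("v4l2src",          "file-source");
      GstElement *conv    = make_or_die("ffmpegcolorspace", "converter");
      GstElement *encoder = make_or_die("ffenc_mpeg4",      "mpeg-encoder");
      GstElement *sink    = make_or_die("udpsink",          "net-output");

      /* ... add the elements to a pipeline, link them and set it to PLAYING ... */
      return 0;
    }

    That way the error tells you which plugin set is missing (ffenc_mpeg4, for example, comes from the gst-ffmpeg plugins).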

    Best Regards,

    Margarita

  • Hello,

    Also, I would recommend you use omx_mpeg4enc, not ffenc_mpeg4. ffenc_mpeg4 is implemented in software in the gstreamer plugin, so it doesn't use hardware acceleration. The same goes for the ffmpegcolorspace element; using ffmpegcolorspace can increase the ARM load, since this element does the colour conversion on the ARM.

    gst-launch v4l2src ..... ! caps ! omx_mpeg4enc ! qtmux ! filesink/udpsink
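
    If you later want this inside your C application rather than on the command line, the same pipeline string can be handed to gst_parse_launch(). This is only a sketch: the caps, frame count and file name are assumptions, and omx_mpeg4enc exists only on the target, not on a desktop PC.

    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
      gst_init(&argc, &argv);

      /* Build the whole pipeline from a gst-launch style description. */
      GError *err = NULL;
      GstElement *pipe = gst_parse_launch(
          "v4l2src num-buffers=300 ! video/x-raw-yuv,width=640,height=480 ! "
          "omx_mpeg4enc ! qtmux ! filesink location=sample.mp4", &err);
      if (pipe == NULL)
        g_error("Pipeline could not be built: %s", err->message);

      gst_element_set_state(pipe, GST_STATE_PLAYING);

      /* Wait for EOS (sent after num-buffers frames) so the muxer can finalize the file. */
      GstBus *bus = gst_element_get_bus(pipe);
      GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                                   GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
      if (msg != NULL)
        gst_message_unref(msg);
      gst_object_unref(bus);

      gst_element_set_state(pipe, GST_STATE_NULL);
      gst_object_unref(pipe);
      return 0;
    }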


    BR

    Margarita

  • Margarita,

    As you recommended, I used this line for a test:

    gst-launch v4l2src ! 'video/x-raw-yuv,width=640,height=480'! ffmpegcolorspace ! ximagesink

    This gives me access to the webcam; I was able to open it.

    But when I tried to use this command, a warning appears:

    gst-launch v4l2src num-buffers=1000 ! video/x-raw-yuv,format=\(fourcc\)NV12,width=640,height=480,framerate=\(fraction\)60/1 ! omx_mpeg4enc ! video/mpeg,mpegversion=4,systemstream=false ! gstperf ! filesink location=sample.mpeg4


    WARNING: erroneous pipeline: no element "omx_mpeg4enc"

    Can you help me get rid of this warning?

    Thanks in advance

    Baz

  • Hello,

    basavaraju Gandla said:

    gst-launch v4l2src num-buffers=1000 ! video/x-raw-yuv,format=\(fourcc\)NV12,width=640,height=480,framerate=\(fraction\)60/1 ! omx_mpeg4enc ! video/mpeg,mpegversion=4,systemstream=false ! gstperf ! filesink location=sample.mpeg4

    Are you executing this on the board, or are you trying to execute it on the PC?

    BR

    Margarita

  • Baz,

    What is your use case?

    Is it:

    PC side:

    capture->encoding->streaming

    dm8168 side:

    decoding->display

    or

    reverse?

    BR

    Margarita

  • Margarita,

    It's the PC side:

    capture -> encoding -> streaming -> save to AVI container.

    BR,

    Baz

  • Hello,

    Then the previous pipeline won't work on the PC; omx_mpeg4enc is available only on the board. You should use ffenc_mpeg4 there.

    Will you decode on the DM8168?

    Also, what is the EZSDK version that you are using?

    BR

    Margarita

  • Hi,

    As of now, I am not working on the board.

    I have been asked to write a C application with which we can capture the audio and video.

    At a later stage, this application will be made to run on the DM8148 board.

    BR,

    Baz

  • Margarita,

    I tried to use ffenc_mpeg4, but it says "One element could not be created. Exiting".

    I don't know how to solve this.

    BR

    Baz

  • Hello,

    It would be nice to read up on what is supported in omx-gstreamer for the DM8148.

    I do not see anything in your code about the pads.

    Please refer to http://gstreamer.freedesktop.org/; there is documentation about GStreamer there, including an example of how to build your own application (http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-helloworld.html).
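
    Since your goal is one file with both audio and video, here is a rough sketch of how the two branches and the muxer's request pads fit together (the element names and the raw-PCM audio path are only assumptions for a desktop 0.10 setup):

    #include <gst/gst.h>

    /* Sketch: one video branch and one audio branch muxed into a single AVI.
     * avimux exposes request sink pads ("video_%d" / "audio_%d"), so those
     * pads have to be requested and linked explicitly. */
    int main(int argc, char *argv[])
    {
      gst_init(&argc, &argv);

      GstElement *pipeline = gst_pipeline_new("av-record");
      GstElement *vsrc  = gst_element_factory_make("v4l2src", NULL);
      GstElement *vconv = gst_element_factory_make("ffmpegcolorspace", NULL);
      GstElement *venc  = gst_element_factory_make("ffenc_mpeg4", NULL);
      GstElement *asrc  = gst_element_factory_make("alsasrc", NULL);
      GstElement *aconv = gst_element_factory_make("audioconvert", NULL);
      GstElement *mux   = gst_element_factory_make("avimux", NULL);
      GstElement *sink  = gst_element_factory_make("filesink", NULL);
      g_object_set(sink, "location", "record.avi", NULL);

      gst_bin_add_many(GST_BIN(pipeline), vsrc, vconv, venc,
                       asrc, aconv, mux, sink, NULL);

      gst_element_link_many(vsrc, vconv, venc, NULL);   /* video branch */
      gst_element_link(asrc, aconv);                    /* audio branch (raw PCM) */
      gst_element_link(mux, sink);

      /* Request one video and one audio pad on the muxer and link the branches. */
      GstPad *vmuxpad = gst_element_get_request_pad(mux, "video_%d");
      GstPad *vencpad = gst_element_get_static_pad(venc, "src");
      gst_pad_link(vencpad, vmuxpad);
      gst_object_unref(vencpad);

      GstPad *amuxpad = gst_element_get_request_pad(mux, "audio_%d");
      GstPad *aconvpad = gst_element_get_static_pad(aconv, "src");
      gst_pad_link(aconvpad, amuxpad);
      gst_object_unref(aconvpad);

      gst_element_set_state(pipeline, GST_STATE_PLAYING);
      /* ... run a main loop here; send EOS and go to NULL before exiting
       *     so that the AVI index gets written ... */
      return 0;
    }

    gst-inspect avimux lists the exact caps those request pads accept.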

    Let me know if this is working for you.

    BR

    Margarita

  • Hello,

    basavaraju Gandla said:
    I tried to use ffenc_mpeg4, but it says "One element could not be created. Exiting".

    encoder = gst_element_factory_make ("ffenc_mpeg4","mpeg-decoder");

    That is the encoder: ffenc_mpeg4 is an encoder (even though the element is named "mpeg-decoder" in your code), and it is the element that could not be created.

    BR

    Margarita

  • #include <unistd.h>
    
    #include <gst/gst.h>
    
    // Number of recordings made so far; used to build the output file name.
    int nbrec = 0 ;
    
    // Main (preview) pipeline: v4l2src -> capsfilter -> tee -> queue -> xvimagesink
    GstElement *pipeline ;
    GstElement *src, *tee, *queue, *sink ;
    
    // Recording branch: queue2 -> theoraenc -> oggmux -> filesink
    GstElement *videobin ;
    GstElement *queue2, *enc, *mux, *filesink ;
    GstPad *teepad ;
    GstPad *gpad ;
    
    // Callback for gst_pad_set_blocked_async(); currently unused.
    void pad_blocked(GstPad *pad, gboolean blocked, gpointer user_data)
    {
      g_printerr ( "pad blocked\n ");
    }
    
    void create_video_bin()
    {
      // Build the recording bin once; it is added to and removed from the
      // pipeline every time recording starts or stops.
      g_printerr ("create video bin\n" ) ;
      videobin = gst_bin_new("videobin") ;
      // Keep an extra reference so the bin survives gst_bin_remove().
      gst_object_ref(videobin) ;
      
      queue2 = gst_element_factory_make("queue", NULL) ;
      gst_bin_add(GST_BIN(videobin), queue2) ;
      
      enc = gst_element_factory_make("theoraenc", NULL) ;
      g_object_set(G_OBJECT(enc), "quality", 16, NULL) ;
      gst_bin_add(GST_BIN(videobin), enc) ;
      
      mux = gst_element_factory_make("oggmux", NULL) ;
      gst_bin_add(GST_BIN(videobin), mux) ;
      
      filesink = gst_element_factory_make("filesink", NULL) ;
      g_object_set(G_OBJECT(filesink), "async", FALSE, NULL) ;
      gst_bin_add(GST_BIN(videobin), filesink) ;
      
      // Expose the queue's sink pad as a ghost pad of the bin.
      GstPad *pad = gst_element_get_static_pad(queue2, "sink") ;
      gpad = gst_ghost_pad_new("sink", pad) ;
      gst_element_add_pad(videobin, gpad) ;
      gst_object_unref(pad) ;
      
      gst_element_link_many(queue2, enc, mux, filesink, NULL) ;
      
      gst_element_set_state(videobin, GST_STATE_READY) ;
    }
    
    void start_recording()
    {
      nbrec++ ;
      char fname[32] ;
      // NOTE: theoraenc + oggmux actually write an Ogg stream; for a real AVI
      // file use an encoder/muxer pair AVI supports (e.g. ffenc_mpeg4 + avimux),
      // or simply change the extension to .ogg.
      sprintf(fname, "Test_output%d.avi", nbrec) ;
      g_object_set(G_OBJECT(filesink), "location", fname, NULL) ;
      
      // Request a new src pad from the tee for the recording branch.
      g_printerr( "Start Recording : get tee request pad\n" ) ;
      teepad = gst_element_get_request_pad(tee, "src%d") ;
    
      // Add videobin to the running pipeline.
      g_printerr ("Start Recording : add videobin\n") ;
      gst_bin_add(GST_BIN(pipeline), videobin) ;
      
      // Link tee -> videobin
      g_printerr ("Start Recording : link tee -> videobin\n") ;
      gst_pad_link(teepad, gpad) ;
    
      // Go playing
      g_printerr("Start Recording : go to PLAYING\n" ) ;
      gst_element_set_state(videobin, GST_STATE_PLAYING) ;
    }
    
    void stop_recording()
    {
      // Push EOS into the encoder so the muxer can finalize the output file.
      g_printerr ("Stop Recording : send eos\n" ) ;
      gst_element_send_event(enc, gst_event_new_eos()) ;
    
      g_printerr ("Stop Recording : go to NULL\n") ;
      gst_element_set_state(videobin, GST_STATE_NULL) ;
    
      g_printerr ("Stop Recording : unlink tee -> videobin\n") ;
      gst_pad_unlink(teepad, gpad) ;
    
      g_printerr ("Stop Recording : remove videobin\n");
      gst_bin_remove(GST_BIN(pipeline), videobin) ;
    
      g_printerr ("Stop Recording : release request tee pad\n") ;
      gst_element_release_request_pad(tee, teepad) ;
    }
    
    int main(int argc, char *argv[])
    {
      gst_init(&argc, &argv) ;
    
      pipeline = gst_pipeline_new(NULL) ;
    
      // Print property changes of all elements; useful for debugging.
      g_signal_connect (pipeline, "deep_notify", G_CALLBACK (gst_object_default_deep_notify), NULL);
    
      src = gst_element_factory_make("v4l2src", NULL) ;
      // (v4l2src has no "pattern" property; that belongs to videotestsrc.)
      gst_bin_add(GST_BIN(pipeline), src) ;
    
      // Fix the capture format so both tee branches agree on the caps.
      GstCaps *caps = gst_caps_from_string("video/x-raw-yuv, format=(fourcc)I420,width=640,height=480") ;
      GstElement *filter = gst_element_factory_make("capsfilter", NULL) ;
      g_object_set(G_OBJECT(filter), "caps", caps, NULL) ;
      gst_caps_unref(caps) ;
      gst_bin_add(GST_BIN(pipeline), filter) ;
    
      tee = gst_element_factory_make("tee", NULL) ;
      gst_bin_add(GST_BIN(pipeline), tee) ;
    
      queue = gst_element_factory_make("queue", NULL) ;
      gst_bin_add(GST_BIN(pipeline), queue) ;
    
      sink = gst_element_factory_make("xvimagesink", NULL) ;
      gst_bin_add(GST_BIN(pipeline), sink) ;
    
      gst_element_link_many(src, filter, tee, queue, sink, NULL) ;
    
      create_video_bin() ;
    
      gst_element_set_state(pipeline, GST_STATE_PLAYING) ;
    
      sleep(3) ;
    
      start_recording() ;
    
      sleep(5) ;
    
      stop_recording() ;
    
      // Shut the preview pipeline down cleanly before exiting.
      gst_element_set_state(pipeline, GST_STATE_NULL) ;
    
      return 0;
    }
    

    Hi Margarita,

    For the last few days I have been studying the GStreamer APIs to capture video on the PC.

    I came up with C code that was able to record the video into an AVI container.

    I am not sure whether this code, when executed on the DM8148 board, will do the same recording.

    It has been suggested that I use OpenMAX for the codec changes, but I don't know how to use OpenMAX.

    Can you please help me with some example code for using OpenMAX?

    I have attached the video capture code. Please check it.

    BR,

    BAZ

  • Hello Baz,

    I would suggest you download the EZSDK. You could check the demos there.

    Here is the link, where you could download it:

    http://software-dl.ti.com/dsps/dsps_public_sw/ezsdk/latest/index_FDS.html

    You could also check the information for DVRRDK & IPNC and choose according to your use case.

    In the EZSDK there are OMX demos in:

    ezsdk/component-sources/omx_05_02_00_48/examples

    and the documentation about the demos and the OMX IL in:

    ezsdk/component-sources/omx_05_02_00_48

    OMX_05_02_00_48_UserGuide.pdf

    But it would be better to start with:

    /ti-ezsdk_dm814x-evm_5_05_02_00/docs

    DM814x_EZ_Software_Developers_Guide.pdf

    In the component-sources folder you can also find gst-openmax_GST_DM81XX_00_07_00_00.

    We are using the GStreamer 0.10 plugins with OMX elements.

    ezsdk/component-sources/gst-openmax_GST_DM81XX_00_07_00_00/omx

    I would recommend you use the gstomx encoders/decoders, not the ffmpeg ones. The ffmpeg elements are implemented in software in the gstreamer plugin, so they don't use hardware acceleration.

    You could find gstreamer example pipelines here:

    http://processors.wiki.ti.com/index.php/DM81xx_Gstreamer_Pipelines
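
    The wiki also covers streaming; as a rough sketch (the encoder element, caps, IP address and port below are assumptions that depend on your board and gst-openmax version), an RTP sender could be built from C like this:

    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
      gst_init(&argc, &argv);

      /* capture -> H.264 encode -> RTP payload -> UDP; adjust caps, host and port. */
      GError *err = NULL;
      GstElement *pipe = gst_parse_launch(
          "v4l2src ! video/x-raw-yuv,width=1280,height=720 ! "
          "omx_h264enc ! rtph264pay ! udpsink host=192.168.1.100 port=5000", &err);
      if (pipe == NULL)
        g_error("Pipeline could not be built: %s", err->message);

      gst_element_set_state(pipe, GST_STATE_PLAYING);

      /* Stream until the process is interrupted. */
      GMainLoop *loop = g_main_loop_new(NULL, FALSE);
      g_main_loop_run(loop);
      return 0;
    }

    On the receiving side a matching udpsrc ! rtph264depay ! decoder ! sink chain is needed.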

    Regarding video capture, you could use OMX or V4L2.

    There is an OMX demo for capture_encode.

    Regarding V4L2, you could check:

    http://processors.wiki.ti.com/index.php/TI81XX_PSP_VIDEO_CAPTURE_Driver_User_Guide

    There are V4L2 sample applications, which you can find in:

    example-applications/linux-driver-examples-psp04.04.00.01/video
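
    Those sample applications follow the standard V4L2 capture sequence; just to illustrate the very first steps (the device node, resolution and pixel format below are assumptions), the code boils down to:

    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void)
    {
      /* Open the capture device node exposed by the V4L2 driver. */
      int fd = open("/dev/video0", O_RDWR);
      if (fd < 0) { perror("open"); return 1; }

      /* Check what the driver can do. */
      struct v4l2_capability cap;
      if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) { perror("VIDIOC_QUERYCAP"); return 1; }
      printf("driver: %s, card: %s\n", (char *)cap.driver, (char *)cap.card);

      /* Ask for a capture format; the driver may adjust it. */
      struct v4l2_format fmt;
      memset(&fmt, 0, sizeof(fmt));
      fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
      fmt.fmt.pix.width       = 640;
      fmt.fmt.pix.height      = 480;
      fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;
      fmt.fmt.pix.field       = V4L2_FIELD_NONE;
      if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

      /* Buffer requesting, queueing and the streaming loop are shown in the
       * sample applications mentioned above. */
      close(fd);
      return 0;
    }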

    More information can be found in:

    ezsdk/board-support/docs

    Please let me know if you need further details.

    BR
    Margarita