
DM365 EVM LI-5M02 camera board...

All,

I just purchased/received a LI-5M02 HD camera board from Leopard Imaging.  

I have taken the following steps:
 1. I installed the LI-5M02 in my Spectrum Digital DM365 EVM and
     powered the board up; everything seems normal, i.e., my previously
     developed applications all run fine.
 2. Next I tried to run the TI "encodedecode" application using the
     following command line: CE_DEBUG=2 ./encodedecode -y 3 -I 4 -p
 3. Unfortunately, I receive the following output on the DM365 EVM terminal:
      @2,435,038us: [+2 T:0x41886490] ti.sdo.dmai - [Capture] Camera input selected
      @2,435,753us: [+7 T:0x41886490] ti.sdo.dmai - [Capture] Failed to set video input to 3 (Invalid argument)
      Error: Failed to create capture device
Of course, the encodedecode application stops shortly thereafter.  
My DM365 EVM configuration is:
 1. DVSDK 4.01
 2. My U-Boot environment (printenv) is:
       EVM :>printenv                                                                                                                                                                 
       bootdelay=4                                                                                                                                                                                                                           
       baudrate=115200                                                                                                                                                                                                                      
       ethaddr=00:0e:99:02:cc:7a                                                                                                                                                                                                             
       bootfile=uImage-dm365-evm.bin                                                                                                                                           
       root=/dev/nfs                                                                      
       bootcmd=setenv serverip 192.168.2.2;tftpboot;bootm                                                                                                                             
       bootargs=console=ttyS0,115200n8 rw mem=54M ip=192.168.2.127:192.168.2.2:192.168.2.1:255.255.255.0::::off root=/dev/nfs  \
       nfsroot=192.168.2.2:/home/cimarron/targetfs video=davincifb:vid0=OFF:vid1=OFF:osd0=720x576x16,4050K dm365_imp.0
       ipaddr=192.168.2.127                                                                                                                                                                                                                  
       serverip=192.168.2.2                                                                                                                                                                                                                 
       rootpath=192.168.2.2:/home/cimarron/targetfs                                                                                                         
       nfsroot=192.168.2.2:/home/cimarron/targetfs                                                                                                                
       nfshost=192.168.2.2:/home/cimarron/targetfs                                                                                                                             
       stdin=serial                                                                                                                                                                                                                         
       stdout=serial                                                                                                                                                                                                                        
       stderr=serial
What am I doing wrong?  Please help!
Thank you,
-Chuck
  • Chuck,

    Did you set vpfe_capture.interface to 1 so that the mt9p031 driver gets registered at boot?

    Have a look at this page:

    http://processors.wiki.ti.com/index.php/UG:_DaVinci_PSP_Installation_on_DM36x_EVM#Video_Input.2FOutput_Source_Configuration
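
    In other words, the bootargs line needs the module parameter below appended to it (the full working bootargs appear later in this thread):

        vpfe_capture.interface=1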

    -Randy

  • Randy,

    Thank you for your reply and suggestion.  

    Since there is little or no documentation - at least that I have been able to find - regarding how to interface the LI-5M02 camera board to the Spectrum Digital DM365 EVM, I naively thought that, since I was up-to-date using DVSDK 4.01, the encodedecode application would work from the command line.  Leopard Imaging support has since informed me that I will have to rebuild the Linux kernel using the latest version from the Arago Project Git site, following the instructions found at this link: http://processors.wiki.ti.com/index.php/UG:_DaVinci_PSP_Installation_on_DM36x_EVM#Building_and_using_the_MT9P031_CMOS_image_sensor_driver

    I also understand that I will have to modify my bootargs as you have suggested.  Do you happen to know if this is the proper approach to use in this case?  Do you have any other suggestions?

    Thanks,

    -Chuck

  • Hey Chuck and Others,

    have you found a solution to the problem?

    I'm facing the same issue, but I think my problem is the 'loadmodules.sh'.

    Can anyone share their current bootargs and the contents of their loadmodules.sh?

    Mine looks as follows:

    Bootargs:

    bootargs=console=ttyS0,115200n8 noinitrd rw ip=dhcp root=/dev/nfs nfsroot=$(nfshost):$(rootpath),nolock mem=60M video=davincifb:vid0=OFF:vid1=OFF:osd0=720x576x16,4050K davinci_display.cont2_bufsize=1843200 dm365_imp.oper_mode=0 vpfe_capture.interface=1 vpfe_capture.bufsize=1843200

    loadmodules.sh:

    rmmod cmemk 2>/dev/null
    rmmod irqk 2>/dev/null
    rmmod edmak 2>/dev/null
    rmmod dm365mmap 2>/dev/null

    insmod cmemk.ko phys_start=0x83C00000 phys_end=0x88000000 pools=1x384,2x5984,2x3133440,1x16384,1x48952,1x20480,1x60288,1x74,1x28,1x2048,1x6785280,1x146,1x896,1x65536,1x98,1x296,29x56,2x24,1x624,4x62,1x1456,1x18321120,1x65792,5x3523584,1x4194304,1x8355840,1x28672

    insmod irqk.ko
    insmod edmak.ko
    insmod dm365mmap.ko

    rm -f /dev/dm365mmap
    mknod /dev/dm365mmap c `awk "\\$2==\"dm365mmap\" {print \\$1}" /proc/devices` 0

    When I run CE_DEBUG=1 ./encodedecode -y 3 -I 4 -p, I get the following output:

    Encodedecode demo started.
    CMEMK Error: Failed to find a pool which fits 28672
    CMEM Error: getPCMEMK Error: get_phys: Unable to find phys addr for 0x00000000
    ool: Failed to gCMEMK Error: get_phys: get_user_pages() failed: -14
    et a pool fittinCMEMK Error: GETPHYS: Failed to convert virtual 0x0 to physical.
    g a size 28672
    CMEMK Error: get_phys: Unable to find phys addr for 0x00000000
    CMEM Error: getPhys: Failed to gCMEMK Error: get_phys: get_user_pages() failed: -14
    et physical addrCMEMK Error: FREE: Failed to convert virtual 0x0 to physical
    ess of 0
    CMEM Error: free: failed to free 0
    @0,712,459us: [+6 T:0x4001f040] CE - Engine_init> CE debugging on (CE_DEBUG=1; allowed CE_DEBUG levels: 1=
    min, 2=good, 3=max)
    CMEMK Error: Failed to find a pool which fits 1382400
    CMEM Error: getPool: Failed to gCMEMK Error: Failed to find a pool which fits 1382400
    et a pool fitting a size 1382400
    @0,840,407us: [+7 T:0x41335490] OM - Memory_contigAlloc> ERROR: CMEM alloc failed
    Failed to allocate memory.
    @0,840,741us: [+7 T:0x41335490] ti.sdo.dmai - [BufTab] Failed to allocate buffer 0 for BufTab
    Error: Failed to create buftab
    CMEM Error: getPool: Failed to get a pool fitting a size 1382400
    @0,847,956us: [+7 T:0x41b35490] OM - Memory_contigAlloc> ERROR: CMEM alloc failed
    Failed to allocate memory.
    @0,848,256us: [+7 T:0x41b35490] ti.sdo.dmai - [BufTab] Failed to allocate buffer 0 for BufTab
    Error: Failed to create BufTab for video encode

    Does anyone know where the problem is here?

    I tried adding the missing pools to loadmodules.sh, but unfortunately without success.

    I hope someone can help me.


    Edit:

    I got the encodedecode demo working with the following insmod of cmemk:

    insmod cmemk phys_start=0x83C00000 phys_end=0x88000000 pools=1x16539648,1x4841472,4x1843200,14x1646592,1x282624,1x176128,1x147456,1x69632,1x61440,1x32768,2x20480,1x16384,1x12288,4x8192,69x4096 allowOverlap=1 phys_start_1=0x00001000 phys_end_1=0x00008000 pools_1=1x28672


    Extending "pools" with an additional 1x1382400 entry even allows recording with the encode demo.
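
    (A hint for anyone hitting the same thing: the demo output already names the pool size it could not find, e.g. "Failed to find a pool which fits 1382400", so that size can be added directly to the pools list. If your cmemk build provides it, cat /proc/cmem should also show the configured pools and how they are being used, though I have not verified that on this exact build.)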

    Decode still doesn't work for me, but I hope to get it working soon.

    BR,

    Simon

  • Simon,

    Yes, I finally did get my LI-5M02 camera board working with my DM365 EVM, although neither Leopard Imaging nor TI ships documentation with this product - for US$300, I guess we aren't supposed to expect very much (I'll give bootargs details below).  Of course the camera is out of focus because, I assume, the default focal point of the camera is set at about 1/2 inch; however, objects at the focal point are crisp and the colors are correct (with proper lighting).  Interestingly, I notice that the CMOS sensor has about 5 defective pixels, i.e., pixels that are stuck pure white or pure black - I believe this is probably typical for a sensor of this size.

    I'm looking at af_example.c, aew_example.c, and the Aptina MT9P031 driver to understand how to manipulate the camera's focus and white balance: TI and Leopard Imaging documentation for these applications is non-existent, plus the build configuration is out-of-sync with the Arago Linux Git.

    I am able to run both the encodedecode and encode DMAI sample applications distributed with TI's DVSDK 4.01: for both applications I run /etc/init.d/loadmodule-rc restart first, then ./encodedecode -I 4 -p -y 3 or ./encode -I 4 -v test.264 (see encodedecode.txt, encode.txt, and decode.txt for details about setting application parameters from the command line).

    Here is the command I use to set my bootargs for operation with the LI-5M02 camera board: 

    setenv bootargs console=ttyS0,115200n8 rw mem=60M ip=192.168.2.127:192.168.2.2:192.168.2.1:255.255.255.0::::off root=/dev/nfs nfsroot=192.168.2.2:/home/cimarron/targetfs video=davincifb:vid0=OFF:vid1=OFF:osd0=720x576x16,4050K dm365_imp.oper_mode=0 davinci_capture.device_type=3 vpfe_capture.cont_bufsize=6291456 davinci_enc_mngr.ch0_output=COMPONENT davinci_enc_mngr.ch0_mode=480P-60 vpfe_capture.interface=1 

    The capture-related parameters (e.g. davinci_capture.device_type=3 and vpfe_capture.interface=1) are the key additions - the application should do the rest.
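
    (If these settings should survive a power cycle, the setenv can be followed by U-Boot's saveenv at the same EVM prompt, assuming your U-Boot has environment saving enabled:

        EVM :>saveenv

    otherwise the new bootargs only last for the current session.)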

    Also, ./decode plays fine for me: I run /etc/init.d/loadmodule-rc restart first, then ./decode -v test.264, and the H.264 decoded video looks normal.

    I hope this helps,

    -Chuck 

  • Hey Chuck,

    thank you for your quick response.

    As I mentioned above, encodedecode and encode are working for me now. Decode needed further pool extensions, but now works fine, too.

    My bootargs look a bit different: I haven't inserted the 'davinci_capture.device_type=3' that you highlighted, but it seems I don't actually need it. Additionally, my 'vpfe_capture.cont_bufsize' is much smaller than yours. The focus of my camera can be adjusted manually, so I get really good pictures; only the white balance still has to be adjusted.

    My next plan is to test GStreamer. I hope that with the correct bootargs and insertion of the cmemk module this won't be too hard.

    Furthermore, I plan to extend my EVM with an LCD connected to the digital video output, as the future custom hardware should display video on an LCD.

    Which output are you using? Do you know if it's possible to record 720p video from the sensor and simultaneously output it at a lower resolution on some output, e.g. digital video out or S-Video?

    BR,

    Simon

    PS: Did you get your EVM and your LI-5M02 for US$300? I envy you guys; I paid ~€600 for the EVM and ~€200 for the camera board. I should think about changing my location ;)

  • Simon,

    Thank you for the tip: due to the lack of documentation, I had no idea that the LI-5M02 camera lens was adjustable. Adjusting the lens manually, I now have crystal clear H.264 720p video on the component outputs.

    My current application is an IP camera with basic object recognition/tracking analytics that streams H.264 720p video using RTP/RTCP over both wired and wireless networks. We plan eventually to add DM8148/DM8168 based cameras that are capable of performing advanced 3D object recognition/tracking analytics.

    As to your question: I believe that the DM365 probably - if you are careful with your software design - has the capability (ARM/HW MIPS) to both encode and locally record H.264 720p video and then output a down-scaled (resized) version of the real-time camera input to a composite or S-Video output. The DM368 certainly has the MIPS to do what you want to do.

    Regards,

    -Chuck

    P.S. The cost of my DM365 EVM was US$605 from Avnet in mid 2010.

  • Simon,

    Were you able to set the camera's white balance? If so, how did you do that?

    BTW: when we tried to use GStreamer about a year ago, we found it to be very difficult to use, way behind in feature set implementation, and unreliable (full of bugs). As a result, we decided to do our own implementation of the H.264 720p encoder, RTP/RTCP streaming server, and RTP/RTCP client.

    Regards,

    -Chuck

  • Hey Chuck,

    I haven't tried to do the white balance yet, but there is a 'semi-auto white balance' function in the 'capture_prev_rsz_onthe_fly_bayer_mew' application, which is described here: http://processors.wiki.ti.com/index.php/UG:_DaVinci_PSP_Installation_on_DM36x_EVM#Building_and_using_the_MT9P031_CMOS_image_sensor_driver

    Maybe the source of this application helps. Unfortunately I haven't had time to look at it yet.

    BR,

    Simon

  • Simon / All,

    FYI: I have successfully implemented a 2nd order RGB gain loop for the LI-5M02 camera board.  I use the AEW Engine to generate statistics regarding the exposure / white balance of the camera in real time and then use this data to adjust the RGB gain setting either up or down.  I did not use the approach in capture_prev_rsz_onthe_fly_bayer_mew.c since that approach uses a single-shot gain adjustment mode and we need a gain adjust loop that works continuously.

    If you plan to implement an exposure / white balance function, either one-shot or continuous mode, here are a few tips:

     1. You will need to set up the AEW Engine to get exposure / white balance statistics, for example:

        /* Open the AEW Engine Driver */
        aewFD = open(DRIVER_NAME, O_RDWR | O_NONBLOCK);
        if (aewFD < 0) {
            printf("Error in opening device file\n");
            goto cleanup;
        }

        /* Allocate memory for Configuration Structures */
        configSet = (struct aew_configuration *) malloc(sizeof(struct aew_configuration));
        configGet = (struct aew_configuration *) malloc(sizeof(struct aew_configuration));

        /* Configure window parameters */
        configSet->window_config.width = 126;
        configSet->window_config.height = 126;
        configSet->window_config.hz_line_incr = 8;
        configSet->window_config.vt_line_incr = 8;
        configSet->window_config.vt_cnt = 2;
        configSet->window_config.hz_cnt = 2;
        configSet->window_config.vt_start = 300;
        configSet->window_config.hz_start = 500;

        /* Configure black window parameters */
        configSet->blackwindow_config.height = 4;
        configSet->blackwindow_config.vt_start = 11;

        /* Enable A-law and set Saturation Limit */
        configSet->alaw_enable = H3A_AEW_ENABLE;
        configSet->saturation_limit = 255;

        /* Set AEW statistics output format mode */
        configSet->out_format = AEW_OUT_SUM_ONLY;

        bufferSize = ioctl(aewFD, AEW_S_PARAM, configSet);
        if (bufferSize < 0) {
            printf("Error setting AEW Engine parameters: %d\n", bufferSize);
            goto cleanup;
        }

        /* Call get parameters to obtain AEW Engine parameters */
        aewResult = ioctl(aewFD, AEW_G_PARAM, configGet);
        if (aewResult < 0) {
            printf("Error in ioctl AEW_G_PARAM: %d\n", aewResult);
            goto cleanup;
        }

     2. Then you'll need to implement a new function - I implemented it in DMAI Capture.c - to adjust the RGB gain, for example:

    /******************************************************************************/
    /* Acapture_gain_set                                                          */
    /******************************************************************************/
    Int Acapture_gain_set(Capture_Handle hCapture, Int32 rgb_gain)
    {
        struct v4l2_control ctrl;

        assert(hCapture);

        if (hCapture == NULL) {
            Dmai_err0("NULL Capture handle passed to Acapture_gain_set\n");
            return Dmai_EFAIL;
        }

        /* Set the V4L2 Control ID to RGB Gain */
        ctrl.id = V4L2_CID_GAIN;

        /* Load the control value for RGB gain */
        ctrl.value = rgb_gain;

        /* Issue the call to the MT9P031 driver */
        if (-1 == ioctl(hCapture->fd, VIDIOC_S_CTRL, &ctrl)) {
            Dmai_err0("VIDIOC_S_CTRL (V4L2_CID_GAIN) failed\n");
            return Dmai_EFAIL;
        }

        return Dmai_EOK;
    }

     3. If you want to operate in real time, you'll need to implement an RGB gain adjust loop: I used the DMAI capture.c thread to do this, adjusting the RGB gain once every 120 frame captures (see the sketch after this list).

     4. Also, you will need to obtain - under NDA - the TI document entitled "VPFE Programmer's Guide" to get details about the AEW Engine.

     5. aew_example.c is also helpful.
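
    Regarding tip 3, here is a rough sketch of the shape of that loop (an outline only: aewStats / aewStatsSize stand for a malloc'd buffer and the size returned by AEW_S_PARAM, sumAccumulators() and lookUpGain() are stand-ins for my table-based mapping, and it assumes the statistics are read with read() on the AEW file descriptor):

        /* Sketch: adjust the sensor RGB gain once every N captured frames */
        #define GAIN_ADJUST_PERIOD 120

        static Int frameCount = 0;

        if (++frameCount >= GAIN_ADJUST_PERIOD) {
            frameCount = 0;

            /* Read the latest AEW statistics from the driver */
            if (read(aewFD, aewStats, aewStatsSize) > 0) {
                Int32 newGain = lookUpGain(sumAccumulators(aewStats));

                /* Push the new gain to the MT9P031 via the function from tip 2 */
                if (Acapture_gain_set(hCapture, newGain) != Dmai_EOK) {
                    Dmai_err0("Failed to update the sensor RGB gain\n");
                }
            }
        }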

    I hope this is helpful,

    -Chuck

  • Hi Chuck,

    Thank you very much for these tips; they will save me a lot of work. I hope it's as simple to implement as it sounds ;)

    BR,

    Simon 

    Simon,

    Glad to help - if you get stuck, let me know...

    -Chuck

  • Hi Chuck,

    I have written an auto exposure control loop that uses the AEW Engine to generate statistics and adjusts the camera gain and exposure accordingly.

    My configuration is very similar to the one you have posted, but I find that the configuration setting configSet->window_config.height has no effect.

    My test involves saturating the sensor with a very bright light and examining the AE packet returned when doing a read from the dm365_aew driver handle.  The value returned for the first window (Data: Sub Sample Accum[3] Sub Sample Accum[2] Sub Sample Accum[1] Sub Sample Accum[0]) is always the same regardless of the height setting.  I would expect the values to increase as the number of pixels increases, assuming they are all saturated with a value of 255.

    I'm assuming that my understanding of the AE packet as defined in the VPFE user guide (SPRUGU5B) is incorrect?  Is the Sub Sample Accum an average or just the summation of all the pixels in the window?

    Regards

    Scott

  • Scott,

    It has been quite a while since I wrote/visited my code for the AEW functionality, so let me look into this so that I can give you a meaningful answer.

    -Chuck

  • Scott,

    After taking a look at my code, I have a couple of observations and suggestions:

    Note #1: I use the AEW Engine with the following settings: "alaw_enable = H3A_AEW_ENABLE", "saturation_limit = 255", and "out_format = AEW_OUT_SUM_ONLY".

    Note #2: I sum the 8 received AEW Accumulator values (they vary depending on the sub-window location, scene, lighting conditions, etc.), apply a programmable gain factor, then adjust the sensor RGB Gain - using a non-linear gain look-up table - such that I'm always operating the sensor in the linear mode, i.e., 40 < RGB Gain < 80; in practice the RGB Gain typically stays between 50 and 70, depending on ambient lighting conditions.
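
    To make the look-up idea concrete, the mapping is roughly of this shape - the thresholds and gain values below are placeholders for illustration, not my calibrated numbers:

        /* Placeholder table: brighter scenes (larger accumulator sums) map to */
        /* lower gains, and everything stays inside the sensor's linear range  */
        /* (40 < RGB gain < 80).                                               */
        static const struct { unsigned int sumThreshold; Int32 gain; } gainTable[] = {
            {   5000, 75 },
            {  20000, 68 },
            {  60000, 60 },
            { 120000, 52 },
            { 0xFFFFFFFF, 45 }
        };

        Int32 lookUpGain(unsigned int accumulatorSum)
        {
            Int i;

            for (i = 0; i < (Int) (sizeof(gainTable) / sizeof(gainTable[0])); i++) {
                if (accumulatorSum <= gainTable[i].sumThreshold) {
                    return gainTable[i].gain;
                }
            }
            return 45;   /* brightest-scene fallback, still in the linear range */
        }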

    Suggestion: in order to better understand AEW Engine operation, try operating the sensor - and therefore the AEW Engine as well - in the linear mode and then make your configuration changes.  After you understand the functional characteristics of the AEW Engine, you can tune your AEW algorithm to give you the functionality and performance you want under the intense lighting conditions you describe.

    I hope this answers your question but, if it doesn't, please send more detail regarding your AEW Engine configuration and the numbers you are seeing from the AEW Accumulator reads.

    -Chuck

  • Hi Chuck,

    Many thanks for the info and your time, it's appreciated.

    Scott

  • Hi Chuck,

    Thanks for sharing. Your code enables and configures the AEW, but how do you obtain the statistics? Thanks again.

  • Hao,

    As described earlier in this thread, in order to understand how to configure the AEW Engine, you must obtain the DM36x VPFE User Guide document from TI under NDA - it will explain all...

    Good luck!

    -Chuck

  • Hi Chuck,

    Is the NDA document number SPRUGU5B? I do have an NDA with TI and I asked my local FAE, but they could not find it in the archive...

    Thanks

  • Hao,

    Yes, I believe that that is the document number.  You should ask your FAE to keep looking for the document within TI - IT IS available from the document archive.

    -Chuck

    Guess I have to ask the FAE again...

    Meanwhile, can anyone enlighten me as to what the black window is, or what the AEWINBLK register is about? I cannot find any description. Is it a mask, or does it draw a black line around an AEW window? What is it?

    Thanks a lot.

  • Hao,

    In order to understand the 'Black Window' and these registers, you need the NDA version of the DM36x VPFE User's Guide - you really cannot proceed without it, so get the document!

    -Chuck

    Hi Chuck, I appreciate your advice...

    Could you please check the NDA document you have and tell me the EXACT document number and document title/name?

    Last time I asked the FAE but he could not find it. Maybe I gave him the wrong title/name, but how could I know the correct one since I have never seen the real NDA document... hehehe

  • Hao,

    Yes, the title is: "TMS320DM36x Digital Media System-on-Chip (DMSoC) Video Processing Front End (VPFE) User's Guide".  The document number is: SPRUGU5B.

    Good luck!

    -Chuck

  • Thank you!

    Oh, I thought it was called the DM36x VPFE Programmer's Reference Guide (PRG)...

    So the NDA document name is the same as the non-NDA document name...

    The FAE told me SPRUFG8C replaced SPRUGU5B...

    But I don't see anything useful in SPRUFG8C.

  • Hao,

    The document title I sent you is correct - hopefully your FAE can find it quickly...

    -Chuck

    I finally got the document. It does explain things in more detail, but I have also noticed some strange behaviors in the H3A module - maybe IC bugs or limitations. It's definitely not something that is fully working.

  • Hi Chuck:

    I am about to deal with the 3A functionality in our system (based on the DM365), and I have some questions regarding the implementation of this module.  I got some advice from the TI guys about the way to implement it: they said that the statistics from the AEW driver are used to configure the IPIPE parameters (RGB gains), while you suggest issuing a VIDIOC_S_CTRL call, which asks the image sensor's driver to configure the sensor's RGB gain rather than the one in the IPIPE.  The PREV_S_PARAM ioctl should be called to configure the RGB gain in the IPIPE.  In addition, although I have looked into aew_example.c, I still have little idea of how to properly configure those parameters.

    1. Would you give me an example of what the statistics from the AEW driver look like?

    2. Which RGB gain should be adjusted at run time - the image sensor's or the IPIPE's?

    3. Does the NDA VPFE document show details regarding the way to configure the AEW parameters?

    Many thanks.

    Regards,

    Jerry

  • Jerry,

    In order to understand how to set up and use the AEW statistics, you need the NDA VPFE document (see my earlier post).

    -Chuck

  • Hello Chuck,

    I hope you can help me. I am trying to do the same thing, using the AEW module to gather statistics for auto exposure. I have taken care to ensure that the number of windows is a multiple of 8, and I have checked that all the parameters I give to aew_config are valid. I have the NDA doc as well and understand the output of an AEW read. However, in my application I need to start and stop video capture, and therefore I start and stop the thread which collects the statistics, and I keep getting a 'Bad page map' error in the thread used to capture the AEW stats. Did you change anything in the dm365_aew.c driver to take care of this, or do your CMEM settings take care of it? Any help/suggestions would be highly appreciated.

    Thanks.

  • Gan,

    You have an interesting problem.  My initial thought is that, since your application starts/stops video capture, the AEW Engine may not be fully initialized when your thread tries to gather AEW statistics.

    To answer your questions: 1) No - I did not change anything in the dm365_aew.c driver, and 2) No - I did not change any of my CMEM settings.  In my implementation, I gather AEW statistics inside the video capture thread, average the statistics over a number of video frames (>120), then use the average value to index into a look-up table containing RGB gain values that I then use to adjust the RGB gain (or leave its value the same if the difference is within a certain delta).

    My thinking is that, if you have a separate thread that gathers AEW statistics, you may have to run that thread a number of times until you get valid statistics, use them to set the RGB gain value, and then capture the video frame(s) - or some variation of this approach.

    I hope this helps,

    -Chuck

  • Thanks Chuck for your prompt response.

    I had to make a small change in the driver (dm365_aew.h - the output data size of each AEW stats window from 16 to 18 in SUM_ONLY mode), which I was pretty sure about and which was confirmed by this discussion: http://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/100/t/7640.aspx

    The fact that it works for a few start-stops and then crashes makes me think of a memory leak somewhere. I will try to put in some checks for init_complete as you suggest. In the meantime, is it possible for you to share just the CMEM settings you have used (start/end address, number of pools, sizes, etc.)?

    Again, thanks for your help.

    Gan

  • Gan,

    It's been more than a year since I implemented this, so I went back to look at my code - here are some thoughts.

    Just to be clear: my RTP Streaming Client/Server Applications run on the DM365 EVM (and an Ubuntu 10.04 host) - running DVSDK 4.01 - encoding and streaming both H.264 video and AAC audio.  I used the DMAI "encode" application as a starting point for this RTP Streaming Server Application.

    I initialize the EVM with different bootargs for the component vs. the camera inputs - the bootargs for the camera are:

    setenv bootargs console=ttyS0,115200n8 rw mem=60M ip=192.168.2.127:192.168.2.2:192.168.2.1:255.255.255.0::::off root=/dev/nfs nfsroot=192.168.2.2:/home/cimarron/targetfs video=davincifb:vid0=OFF:vid1=OFF:osd0=720x576x16,4050K dm365_imp.oper_mode=0 davinci_capture.device_type=3 vpfe_capture.cont_bufsize=6291456 davinci_enc_mngr.ch0_output=COMPONENT davinci_enc_mngr.ch0_mode=480P-60 vpfe_capture.interface=1

    As previously discussed, I implement the automatic exposure algorithm within the "capture thread" (with no modification to the AEW driver).  My application runs without errors - I've had it running continuously for weeks at a time.  Also, after reviewing my code, I see that I had to write a new "capture_gain_set" ioctl function for the V4L2_CID_GAIN control that resides in ../dmai_2_20_00_14/packages/ti/sdo/dmai/linux/dm365/Capture.c and ../dmai_2_20_00_14/packages/ti/sdo/dmai/Capture.h in order to properly set the RGB gain (as I recall, I had to do this in order to get the RGB Gain to adjust over the range I needed).
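
    (The change in Capture.h is essentially just the matching prototype for the Acapture_gain_set function posted earlier in this thread, something like:

        extern Int Acapture_gain_set(Capture_Handle hCapture, Int32 rgb_gain);

    declared alongside the other Capture_* functions.)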

    Also, I do a malloc() for my aewConfigSet and aewConfigGet structures when I initialize the capture thread but don't touch them again until thread cleanup.  Here is the code snippet that runs when capture.c initializes, i.e., at the top of the thread before the thread begins to loop:

    ...

    /* Open the AEW Driver */
    aewFD = open(AEW_DRIVER, O_RDWR | O_NONBLOCK);
    if (aewFD < 0) {
        printf("Error in opening device file\n");
        goto cleanup;
    }

    /* Allocate memory for Configuration Structures */
    aewConfigSet = (struct aew_configuration *) malloc(sizeof(struct aew_configuration));
    aewConfigGet = (struct aew_configuration *) malloc(sizeof(struct aew_configuration));

    /* Configure window parameters */
    aewConfigSet->window_config.width = 126;
    aewConfigSet->window_config.height = 126;
    aewConfigSet->window_config.hz_line_incr = 8;
    aewConfigSet->window_config.vt_line_incr = 8;
    aewConfigSet->window_config.vt_cnt = 2;
    aewConfigSet->window_config.hz_cnt = 2;
    aewConfigSet->window_config.vt_start = 300;
    aewConfigSet->window_config.hz_start = 500;

    /* Configure black window parameters */
    aewConfigSet->blackwindow_config.height = 4;
    aewConfigSet->blackwindow_config.vt_start = 11;

    /* Enable A-law and set Saturation Limit */
    aewConfigSet->alaw_enable = H3A_AEW_ENABLE;
    aewConfigSet->saturation_limit = 255;

    /* Set AEW statistics output format mode */
    aewConfigSet->out_format = AEW_OUT_SUM_ONLY;

    aewBufferSize = ioctl(aewFD, AEW_S_PARAM, aewConfigSet);
    if (aewBufferSize < 0) {
        printf("Error setting AEW Engine parameters: %d\n", aewBufferSize);
        goto cleanup;
    }

    /* Call get parameters to obtain AEW Engine parameters */
    aewResult = ioctl(aewFD, AEW_G_PARAM, aewConfigGet);
    if (aewResult < 0) {
        printf("Error in ioctl AEW_G_PARAM: %d\n", aewResult);
        goto cleanup;
    }

    aewStatistics = (void *) malloc(aewBufferSize);
    if (aewStatistics == NULL) {
        printf("Error allocating aewStatistics buffer\n");
        goto cleanup;
    }

    /* Enable the AEW Engine */
    aewResult = ioctl(aewFD, AEW_ENABLE);
    if (aewResult < 0) {
        printf("Error enabling AEW Engine: %d\n", aewResult);
        goto cleanup;
    }

    ....
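
    For completeness, the cleanup label referenced above only needs to release what was set up here - roughly the following, assuming the pointers are initialized to NULL and aewFD to -1 before this code runs:

        cleanup:
            if (aewStatistics != NULL) {
                free(aewStatistics);
            }
            if (aewConfigGet != NULL) {
                free(aewConfigGet);
            }
            if (aewConfigSet != NULL) {
                free(aewConfigSet);
            }
            if (aewFD >= 0) {
                close(aewFD);
            }

    (You may also want to disable the AEW Engine with the driver's disable ioctl, if your version provides one, before closing the fd.)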

    Finally, I didn't do anything with my CMEM, but here are my settings (default):

       allocated heap buffer 0xc8000000 of size 0x4400000

       heap fallback enabled - will try heap if pool buffer is not available

       CMEM Range Overlaps Kernel Physical - allowing overlap

       CMEM phys_start (0x1000) overlaps kernel (0x80000000 -> 0x83c00000)

       cmemk initialized

    I hope this helps, but if not, please post back with more details regarding your development environment, application flow, etc.

    -Chuck

  • Gan,

    Were you able to solve the problems you were experiencing?  If so, what was your solution?

    -Chuck

  • Hi Chuck,

    Yes, the system hasn't crashed since Monday.  I have left the system running for a long test and wanted to confirm before posting, although I am pretty confident it won't crash.

    There were two modifications needed in the driver - one, which I already mentioned, is in 'dm365_aew.h':

    /* Changed from 16 to 18, to account for the extra 2 bytes for
     * each window's unsaturated count.
     */
    #define AEW_WINDOW_SIZE_SUM_ONLY            18

    The second was in dm365_aew.c:

    if (aew_dev_configptr->config->out_format == AEW_OUT_SUM_ONLY)
            buff_size = (aew_dev_configptr->config->window_config.hz_cnt) *
                    (aew_dev_configptr->config->window_config.vt_cnt + 1) *
                    AEW_WINDOW_SIZE_SUM_ONLY;

    The plus one is to account for the 'black line', which contains the same number of horizontal windows as the selected area of interest, so it needs as much memory as one row of windows. I also had to make sure the blackwindow vt_start is exactly one row before or one row after the region of interest. The hardware and the driver make it mandatory to use the black line, so the extra memory has to be accounted for. I guess the reason this bug is not seen often is that the memory obtained is always page-aligned and thus larger than actually requested, so you end up with sufficient memory. I was using 32 horizontal windows, which maybe made it fall over more often.
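
    As a rough illustration of the sizes involved (using an arbitrary vertical count of 16 just for the arithmetic): with hz_cnt = 32 and vt_cnt = 16 in SUM_ONLY mode, the old formula gives 32 x 16 x 16 = 8192 bytes (exactly two 4 KB pages), while the corrected formula gives 32 x (16 + 1) x 18 = 9792 bytes, so the black-line row plus the 2 extra bytes per window overrun the allocation even after page rounding. With a small 2 x 2 window configuration like the one earlier in this thread, the old request of 2 x 2 x 16 = 64 bytes gets rounded up to a full page anyway (if the allocation really is page-aligned, as I guessed), which comfortably holds the real 2 x 3 x 18 = 108 bytes - which would explain why the bug rarely shows up.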

    In my case I am using SUM_ONLY, but this black-line change applies to the other modes as well.

    I would like to hear your thoughts, and hopefully this will help someone in the future.

    Gan

    And I used the AEW module description from the OMAP4460 TRM for a better understanding.

  • Gan,

    It's good to know that you found the AEW driver bug. I suggest that you submit this bug and your fixes to TI so that they can be verified and then propagated into the next DVSDKs, so that others will get the benefit.

    -Chuck