
Using a non-Micron CMOS sensor with the DM355 EVM


I'm trying to connect an OmniVision CMOS sensor based camera module to the DM355 EVM. Is there code in the VPFE driver, or possibly in the tvp5146 driver, to switch the mux from the TVP5146 to the imager interface? If not, how can I accomplish this?

  • The driver does appear to handle this, though not in the /lsp/ti-davinci/drivers/media/video/dm355_vpfe.c file but rather in /lsp/ti-davinci/drivers/media/video/mt9t001.c. Strangely, dm355_vpfe.c has a comment saying this code should be added, even though it already exists in mt9t001.c. If you look in mt9t001.c you should find a function called mt9t001_configpca9543a() that performs an I2C write, i2c_write_reg(&mt9t001_i2c_client, ECP_REGADDR, ECP_REGVAL, ECP_I2C_CONFIG), which looks like it sets the proper bit in the MSP430 on the EVM to activate the imager daughtercard interface. Details on the MSP430 code and what actually needs to happen over I2C to activate the imager interface are given in the Spectrum Digital document below.

    http://c6000.spectrumdigital.com/evmdm355/revd/files/EVMDM355_ECP_VA4.pdf

  • I noticed that function, and also the function enable_ccdc2tvp5146 in the tvp5146.c file; they appear to do exactly the same thing. I tried modifying enable_ccdc2tvp5146 by setting data[1] = 0x80, but this did not work and the encode demo still takes input from the TVP5146 decoder. After the edit I simply rebuilt the kernel and target software according to sections 4.5 and 4.6 of the getting started guide. Is there something else I need to do?

    /* This function writes the video input mux control register in the
     * EVM's MSP430 (I2C address 0x25, offset 0x08) for this i2c client. */
    static int enable_ccdc2tvp5146(struct i2c_client *client)
    {
            int err = 0;
            struct i2c_msg msg[1];
            unsigned char data[2];

            if (!client->adapter) {
                    err = -ENODEV;
            } else {
                    msg->addr = 0x25;       /* MSP430 ECP 7-bit I2C address */
                    msg->flags = 0;         /* write transfer */
                    msg->len = 2;
                    msg->buf = data;
                    data[0] = 0x8;          /* offset 0x08: video input mux control */
                    data[1] = 0x0;          /* 0x0 selects the TVP5146 path */
                    err = i2c_transfer(client->adapter, msg, 1);
            }

            dev_dbg(tvp5146_i2c_dev, "i2c data write\n");

            return err;
    }

  • I am not sure if the value is reset elsewhere, or if this particular function is even called; I was just interpreting the code as written rather than running it. You could put some printk statements in there to see what is actually running if you want to delve further into it. If you are just using a different imager, chances are you will want to start with the existing driver and modify it anyway.

     You are correct that to get the updated driver you just have to rebuild the kernel. However, note that in addition to building it you need to get it onto the board somehow, most commonly through a TFTP boot defined in U-Boot, as discussed in section 4.7 of the getting started guide.

  • Yes, that is what I thought as well (that the function may never be called). I traced it back to the vpfe_init() function in the davinci_vpfe.c file; as long as device_type = TVP5146 it should be called from there. I can't find where (or whether) vpfe_init() is called, or whether the value is reset elsewhere; the only other place I see it set is in the mt9t001.c file, but there it is set to the value I want anyway (to select the imager). I always boot from TFTP using NFS, so I know the drivers are being updated. My feeling is that the function must not be getting called. I'll try to adjust it to ensure it is called. Thanks.

  • FYI, device_type is passed in via U-Boot bootargs. This is detailed in the LSP User Guide (SPRUFG0, included in DVSDK 1.30.00.40); tvp5146 is the default device, but you can always make sure it is selected by passing in

     v4l2_video_capture=device:TVP5146

    via bootargs.  Also, I would suggest you add a printk statement in the function of interest just to make sure it is indeed being called.
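
    For example (an illustrative sketch; the message text is arbitrary), the relevant lines inside enable_ccdc2tvp5146() would become:

        data[0] = 0x8;          /* offset 0x08: video input mux control */
        data[1] = 0x0;
        /* illustrative debug print; KERN_INFO shows up in the boot log */
        printk(KERN_INFO "enable_ccdc2tvp5146: writing mux value 0x%02x\n", data[1]);
        err = i2c_transfer(client->adapter, msg, 1);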


  • I added some printk statements inside the vpfe_init function and my debug message did not print when I ran the encode demo. I'm assuming the encode demo uses the davinci_vpfe.c driver? I even set v4l2_video_capture:device=TVP5146 in the bootargs.

  • Please note that the debug printk messages will likely print during driver initialization (during the boot process) and not when you run the demo. Can you check the boot log to see if you find the printk statement you added?

  • OK, the printk statements are being printed, so I know the function is being called. I changed the value being written to address 0x25, offset 0x08, from 0x00 to 0x80, but this still doesn't work and my video is coming from the TVP5146. According to the docs at Spectrum Digital, the typical write cycle is [S]01001010[A]yyyyyyyy[A]wwwwwwww[A][P], where S is a start sequence, A is an ack, and P is a stop sequence. The y bits are the offset, and the w bits are the value to be written to the register. From this it looks like the address is 0x25 shifted left by one, so I tried address 0x4A, but this gives me an "i2c nack detected" message when booting, and the encode demo runs fine but still takes video from the TVP5146. I don't see anywhere else that these values are rewritten. Are these the correct values? I'm assuming they are, since they are also the ones in the mt9t001.c driver. What else could the problem be?

  • The last bit before the ACK is technically not an address bit by the I2C specification (it actually uses a 7-bit address); that is the R/W bit, where 0 means a write out to the slave device. So although the value in that bit sequence looks like 0x4A, the address actually used there is 0x25, which is what it should be per the SD document. This being said, I believe the values are correct, so I could not say for sure why it is not changing; one possible explanation is that code somewhere else we have not come across is changing it back.
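
    To illustrate the arithmetic (a standalone host-side sketch, not driver code): the 7-bit address 0x25 becomes 0x4A on the wire for a write once the R/W bit is appended, which is why the Linux i2c_msg.addr field still takes 0x25:

        #include <stdio.h>

        int main(void)
        {
                unsigned char addr7 = 0x25;                   /* 7-bit slave address */
                unsigned char wire_write = (addr7 << 1) | 0;  /* first byte on the wire for a write: 0x4A */
                unsigned char wire_read  = (addr7 << 1) | 1;  /* first byte on the wire for a read:  0x4B */

                printf("write byte: 0x%02X, read byte: 0x%02X\n", wire_write, wire_read);
                return 0;
        }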

    Unfortunately this is complicated to debug because it is an I2C access to another processor, not just a GPIO pin or CPLD register that could easily be seen over the EMIF. If you have the hardware around, you could take a scope and probe the DECODER_IMAGER signal shown on the schematic; ultimately, setting the I2C bit should toggle this signal, and you could prove to yourself that your write is working. Unfortunately there is not a great place to probe it; it comes out to a few different ICs and a resistor R4 on the bottom of the board, so any probing looks like it will be a tedious job with a hand-held pointy probe.

    http://c6000.spectrumdigital.com/evmdm355/revd/files/EVMDM355_Schematics_RevD2.pdf

    If you see the signal toggling during driver bring-up, you will at least know the code is working and that something somewhere else is disabling it again.

    You could also try what Juan mentioned, with device=MT9T001 in your U-Boot arguments; as that configuration is tested with an imager EVM, it must be switching the mux.

  • It's strange: when I tried that, setting device=MT9T001 in the bootargs, it does give me a different result, though I don't know if it changed the mux or not. It gives me this error: "Failed to set video input to 0." I still had my video input connected to the composite video in, and my camera was also connected. So either it changed the mux and just couldn't read my camera, or setting device=MT9T001 changed the configuration of the VPFE so it can no longer read the composite input from the decoder.

  • It likely does both: it changes the mux and reconfigures the VPFE so it no longer reads from the on-board decoder. It seems that it fails if it cannot successfully communicate with the MT9T001, so I suppose that test unfortunately proves little on this issue.

     Do you have a scope to probe the DECODER_IMAGER signal? That should tell you if and when the imager header is enabled.

  • I won't be able to get into the lab until Tuesday to put it on the scope, but I don't think setting the bootarg to MT9T001 changes the mux either. For some reason that error went away and the video is still coming from the TVP5146. This really doesn't seem to make any sense. I've looked through just about every driver file for the DaVinci and none seem to do anything to the MSP430, so it doesn't appear to be resetting. Even though the i2c_client is called tvp5146_i2c_client and the i2c_driver is called tvp5146_i2c_driver, I should still be able to communicate with any I2C-compliant device (such as the MSP430 or my OV7620 CMOS sensor) connected to the I2C bus using the functions in the tvp5146.c file, correct? It doesn't seem like this should be very hard to do, but I've yet to find anybody who knows how to do it, even though many people are trying.

  • I was told that there was a problem with the firmware for the MSP430 and that the only way to control the mux is externally, by switching the DECODER_IMAGER pin of the mux. Is this true? Can anybody confirm?
  • Tony,

    I am not aware of this issue but will look into it; have you tried writing to the MSP430 to control the LEDs, just to make sure I2C is working properly? This is something that can be tried quickly as a sanity check.
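
    A rough sketch of that sanity check, written to drop into tvp5146.c (which already includes the needed I2C headers), might look like the following; note the LED register offset is a placeholder assumption here, so check the Spectrum Digital ECP document linked earlier for the actual register map:

        /* Hypothetical sketch only: write a pattern to the MSP430 (ECP) LED
         * register over I2C. LED_REG_OFFSET is an assumed placeholder. */
        #define ECP_I2C_ADDR   0x25  /* MSP430 ECP 7-bit address, per the SD document */
        #define LED_REG_OFFSET 0x00  /* ASSUMPTION: verify against the ECP register map */

        static int ecp_set_leds(struct i2c_client *client, unsigned char pattern)
        {
                struct i2c_msg msg;
                unsigned char data[2] = { LED_REG_OFFSET, pattern };

                if (!client->adapter)
                        return -ENODEV;

                msg.addr = ECP_I2C_ADDR;
                msg.flags = 0;  /* write transfer */
                msg.len = 2;
                msg.buf = data;

                return i2c_transfer(client->adapter, &msg, 1);
        }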

  • Tony,

    I forgot to ask: where did you hear about the firmware issue? Perhaps knowing the origin of this would help us confirm its validity in a more timely manner.

  • This was told to me by somebody I've been talking to through the mailing list; we were both trying to connect external devices to the DM355. I don't know if he assumed this or where he got the info; I'll ask him. Meanwhile I'll try to test the I2C by working with the LEDs.

  • I could not find any record of a bug like this, but on the other hand I have never actually tried using the imager with the DM355, as we do not have one in our office. I will try to look into this a bit further, but there is at least code in the MSP430 firmware that should be handling this. You can find the firmware source at the URL below; the file that contains the code to change the imager mux is ECP_I2C_HW.s43, in the function ProcessI2CWrite. Unfortunately this is all written in assembly, so it is a bit hard to follow; as I do not work with MSP430 assembly much I cannot say for certain whether I am overlooking a bug in there, but it seems reasonable to me.

    http://c6000.spectrumdigital.com/evmdm355/revd/files/MSP430-VA5.zip 

     Below is a code snippet showing where it actually changes the GPIO pin state:

    ; Offset 0x08 - Video Input Mux Control
    WriteOffset0x08   and.b   #MUX_IMAG_5146z,R5          ; Mask other bits
                      mov.b   &P3IN,R7                    ; Get current port state
                      bic.b   #MUX_IMAG_5146z,R7          ; Clear current state
                      bis.b   R5,R7                       ; Set new state
                      mov.b   R7,&P3OUT                   ; Write it out
                      jmp     I2CWriteProcessed           ; We are done

  • I checked with the factory folks on this and they were able to test the EVM with an MT9T031 imager EVM successfully, with no issues changing from the TVP to the imager header. This being said, I do not believe there is a bug in the MSP430 firmware that would prevent this.

  • You wouldn't happen to know how they did it, would you? Modifying the I2C write functions that write to address 0x25, offset 0x8, in both the tvp5146.c and mt9t001.c driver files did not work for me, nor have I been able to find anybody who has done this without doing it manually (i.e., physically pulling the imager select line on the mux low or high).

  • They test with the drivers as-is on the DM355 EVM. For using the imager interface they would put davinci-vpfe.device_type=0 in the bootargs, which tells the driver that it should be looking for an MT9T001 sensor. If it is not properly switching to the imager when you run this, then I am guessing the driver is defaulting back to the TVP5146 somehow when it cannot initiate communication with the MT9T001; unfortunately I don't have one of the MT9Txxx sensor EVMs to actually try this out.
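
    For example, a hypothetical U-Boot session appending that setting might look like this (your existing bootargs will differ, and whether ${bootargs} expands in-place depends on your U-Boot shell):

        U-Boot > printenv bootargs
        U-Boot > setenv bootargs ${bootargs} davinci-vpfe.device_type=0
        U-Boot > saveenv
        U-Boot > boot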

  • All this functionality is built into the drivers in DVSDK 1.30. You just need to enable the drivers in the kernel (if they are not enabled by default already) and pass in the appropriate U-Boot settings via bootargs (see SPRUEP7, included in the DVSDK).

  • Bernie has a good point: if the appropriate hardware is not connected (e.g., a Micron MT9T001), then you should see errors. Also, were you able to test the I2C interface to the MSP430 by toggling LEDs?

  • As a move of desperation I installed the previous version of the LSP just to see what would happen, and I am now able to set different values on the LEDs. I tried my original approach (i.e., setting data[1] = 0x80 in enable_ccdc2tvp5146 in the tvp5146.c file) and this seems to be working: with that value, pin 1 (select) of each mux is set to 3 V, and when I put the original value data[1] = 0x0 back, pin 1 of each mux is pulled low. Now when I try to run the encode demo I get an error, "Error: Cannot open /dev/video0 (No such file or directory)", even if there is a valid source connected to the TVP5146, and I also get a bunch of NACK detected warnings when the kernel is booting, whereas before the demo would run fine from the TVP5146 regardless of the value of data[1]. When I change data[1] back to 0, the demo runs fine. I'm guessing this error is due to the fact that it no longer sees the TVP5146 and is looking at the imager interface. I want to use an OV7620 camera sensor with the same output as what the TVP5146 gives, CCIR656 (which I believe is the same as the BT.656 output by the TVP5146). Is there a way I can configure the OV7620 for use with the DM355 without writing my own driver? Could I just modify the tvp5146.c file by adding code to write the appropriate values to the OV7620 via I2C?  I just want to get something working quickly.
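
    For illustration, here is a rough sketch of what I mean, reusing the tvp5146.c I2C plumbing (the 7-bit SCCB address 0x21 and any register values are assumptions I would still need to verify against the OV7620 datasheet):

        /* Hypothetical helper for tvp5146.c: write one 8-bit register on the
         * OV7620 using the same i2c_transfer pattern as enable_ccdc2tvp5146().
         * 0x21 is an assumed 7-bit address (0x42 write byte on the wire). */
        static int ov7620_write_reg(struct i2c_client *client,
                                    unsigned char reg, unsigned char val)
        {
                struct i2c_msg msg;
                unsigned char data[2] = { reg, val };

                if (!client->adapter)
                        return -ENODEV;

                msg.addr = 0x21;        /* assumed OV7620 7-bit address */
                msg.flags = 0;          /* write transfer */
                msg.len = 2;
                msg.buf = data;

                return i2c_transfer(client->adapter, &msg, 1);
        }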


    Thanks

  • You could probably modify the TVP5146 driver to do that; since it looks like the imager supports BT.656, the rest of the VPFE settings will probably work properly. Usually when writing a new driver you will want to start from an existing driver anyway, so if the TVP5146 outputs data like your imager does, then it would probably be the best one to start with.

    EDIT: You say the previous version of the LSP worked but the one you were using would not; what versions are you working with?

  • It is my understanding that BT.656 applies only to standard TV modes (NTSC/PAL); I am not certain I am looking at the right OV7620 datasheet, but the one I quickly found on the web suggests CCIR656 VGA. Just something to keep in mind if things do not work.

    Also, chip vendors normally provide drivers for their parts, which makes a good starting point; did you get a driver for the OV7620 from the manufacturer? If not, I would at least ask if they have one available, as porting is often easier than writing the main body of the driver on your own. The tvp5146 driver will probably be useful in providing the framework (driver entry points, I2C communication, and so on), but the main body would likely be much different for the OV7620 device.

    Anyway, just some random thoughts to keep in mind.

  • I am currently using version 01.20.00.004.1; I was previously using the most current version available for download from TI's update site, version 01.20.00.014.

     I wonder if the only change I need to make for my sensor, since it's VGA, is to change the resolution from 720x480 to 640x480? Any ideas about the "cannot open /dev/video0" error? I've read on the mailing list about people doing similar things using non-Micron sensors, and most have simply changed tvp5146.c to write to their sensor over I2C, but I'm getting this error. Maybe I need to adjust the settings? There are some drivers in the LSP for this particular OV camera chip, but they seem to use V4L1 and aren't incorporated with the VPFE. They are located in drivers/media/video/ovcamchip, but I can't follow them; I've never written drivers or used V4L. Like I said, I don't want to have to write my own driver. Too bad they didn't provide support for OV devices.
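
     In case it matters, here is roughly how I would try requesting 640x480 from the capture driver via standard V4L2 calls (a user-space sketch, assuming the VPFE driver honors VIDIOC_S_FMT and a UYVY-style BT.656 pixel format):

        #include <fcntl.h>
        #include <string.h>
        #include <sys/ioctl.h>
        #include <unistd.h>
        #include <linux/videodev2.h>

        /* Try to switch the capture format to VGA; returns the open fd or -1. */
        int set_vga_capture(const char *dev)
        {
                struct v4l2_format fmt;
                int fd = open(dev, O_RDWR);

                if (fd < 0)
                        return -1;

                memset(&fmt, 0, sizeof(fmt));
                fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                fmt.fmt.pix.width = 640;                      /* VGA instead of 720x480 */
                fmt.fmt.pix.height = 480;
                fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_UYVY;  /* assumed BT.656-style packing */

                if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
                        close(fd);
                        return -1;
                }
                return fd;
        }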

  • Tony,

     I think you are referring to the LSP version; I believe the only DVSDK version available for the DM355 on our software update site is 1.30.00.40. I have that version installed and can run the demos without any issue.

     Is it possible you are not updating all software components correctly? Please see the following post:

    https://community.ti.com/forums/t/75.aspx

     Failing to open drivers (normally represented by /dev/xxx device nodes) usually means the drivers did not load correctly (hardware not present, drivers disabled in the kernel, software not upgraded correctly, and so on).

  • Yes, I was telling him the LSP version I'm using because that is what contains the tvp5146.c, davinci_vpfe.c, etc. drivers, and this version of the LSP allows me to switch the mux by modifying tvp5146.c, whereas before the same code change did nothing. The DVSDK I am using is 1.30.00.23; basically I am using only the software on the CDs and not any of the updates. The demos run fine, but when I switch the mux I get the error "cannot open /dev/video0"; when I switch back, no error.

     Also, I found these drivers, which appear to be for the 2.6 kernel with V4L2: http://ovcam.org/ov511/download.html. I don't know if I could simply use these for my sensor. Any idea?

  • Thank you for clearing that up.  I had a quick look at the latest source code from the link you provided.

    Normally, so long as drivers comply with Linux standards (the defined APIs for the various technologies such as I2C, V4L2, USB, and so on), things should work. This driver appears to be written for a USB webcam product and depends on the I2C, V4L2, and USB drivers; fortunately we have all three for our EVM. Therefore there is a good chance the driver will work, but since it was written for a USB webcam, you would have to use our EVM plus an OVxxx daughtercard, with the EVM connected to a PC via the USB port. Also, you will need an application to exercise all these drivers.

    FYI, since you mentioned you are somewhat new to drivers, may I suggest the Linux Device Drivers book (available for free online):  http://lwn.net/Kernel/LDD3/  

  • Oh, I was thinking I could just use the OV7620-specific driver files and that it would work. I'll have to figure something else out.

  • I am also trying to do something unusual: I want to know whether I can change the receiving window size dynamically. I want to receive frames of 1450 x 6 continuously, and after some finite number of frames the size will change to 1450 x 1450. How can I do that, given that we have to set the frame format, buffer size, and so on in advance?

    Also, what if my VSYNC ends abruptly? How will the Linux driver respond?

    Lastly, how can I perform arithmetic operations such as adding, averaging, or subtracting two pixels without saving the buffer to the file system? Can the DDR storage buffer be used for such things from user space, or do we have to do all of this from the driver?

  • Digant Desai said:

    I am also trying to do something unusual: I want to know whether I can change the receiving window size dynamically.

    Also, what if my VSYNC ends abruptly? How will the Linux driver respond?

    Generally speaking, you have some time (e.g. the vertical blanking period, perhaps more if using double or triple buffering) to do some work; it does not matter whether this work involves saving video data to a file, compressing a video frame, averaging pixels, changing video resolutions (stop the driver, reconfigure, and restart), or something else. In summary, if VSYNC is provided by an external device (the DM3x is the slave), then you just need to make sure you have done everything you need before the next VSYNC comes around; if the DM3x is the master (provides VSYNC to the external device), then I suppose you can take as much time as you need.

    Digant Desai said:

    Lastly, how can I perform arithmetic operations such as adding, averaging, or subtracting two pixels without saving the buffer to the file system?

    Can the DDR storage buffer be used for such things from user space, or do we have to do all of this from the driver?

     Most buffer handling happens in DDR (both in kernel and user space), and you can certainly do pixel averaging using buffers in DDR space. Normally, when you allocate a buffer on the stack or via 'malloc', you are creating a buffer in DDR.
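
     For instance, a minimal user-space sketch of pixel averaging over two DDR-resident buffers (e.g. two captured frames) might look like this:

        #include <stddef.h>

        /* Average two equal-sized 8-bit pixel buffers into a third buffer.
         * All three buffers are ordinary DDR-backed allocations (malloc,
         * or mmap'd driver buffers). */
        void average_frames(const unsigned char *a, const unsigned char *b,
                            unsigned char *out, size_t len)
        {
                size_t i;

                for (i = 0; i < len; i++)
                        out[i] = (unsigned char)(((unsigned int)a[i] + (unsigned int)b[i]) / 2);
        }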