Single-frame capture with ISS while streaming viewfinder video?

Hi,

I'm using the Appro IPNC v2.0 software compiled for the Mistral 8184 EVM. I have an Aptina sensor (connected to the ISS parallel input) streaming 1080p bayer which is currently being encoded to h.264 and streaming to a remote display (like a viewfinder). When the user triggers the remote shutter, I'd like to reconfigure the Aptina sensor to the full 10 megapixel output with global shutter, trigger the shutter, capture the single frame (preferably in RAW bayer mode), and then revert the configuration to the low-resolution viewfinder mode.

How can I accomplish this? Can it be done quickly, preferably without bringing down the entire Links architecture and without stopping/restarting the h.264 stream?

So:

  1. how do I configure the ISS to capture a single high resolution image and save it to RAM?
  2. how do I handle the switching within the Links architecture?

Thanks,
Chris

  • Chris,

    We have not attempted this in software, so you may need to make some changes in the mcfw/driver layers. I would first recommend the following:

    1. For a dynamic resolution change, I think the M2M mode of ISP driver operation will work better. It decouples the ISIF (sensor capture) from the Bayer-to-YUV processing, which helps maintain capture continuity and keeps the mcfw YUV chain intact.

    2. You should move to version 3.0 of the IPNC software release; it already implements the M2M driver mode, so your changes will be minimal.

    3. How much frame loss is acceptable when you move from 1080p -> 10M -> 1080p? It will also depend on how fast you are able to stream the 10M frame to the SoC.

    Regards

    Rajat

  • Rajat,

    Thanks for responding. A few follow ups:

    1) What exactly do you mean by M2M mode? Do you mean routing the raw Bayer image directly to memory and then later performing Bayer conversion as a second step, as an M2M operation, while the 1080p stream is back in operation? Or did you mean something else? Is it possible to perform the M2M Bayer conversion concurrently with the 1080p video capture? How would I go about decoupling the ISIF stage from the ISS -> memory pipe for the still-capture operation?

    2) I'm in the process of moving to IPNC 3.0 and have seen the new links.

    3) I don't have a number for the amount of acceptable frame loss -- I need to minimize the frame loss as much as possible because I absolutely must capture a megapixel still image.

    Thanks again,
    Chris

  • Rajat, please correct me if I'm wrong.

    Chris,

     We support both continuous mode and M2M mode here.

    FYI, in continuous mode there is a single stage:

    1. Sensor -> ISIF [raw Bayer] -> (IPIPEIF) IPIPE [raw -> YUV] -> RSZ -> DDR

    In M2M mode there are two stages:

    1. Sensor -> ISIF -> DDR [raw Bayer]

    2. DDR -> (IPIPEIF) IPIPE [raw -> YUV422] -> RSZ [YUV422 -> YUV420] -> DDR

    So you can stop after finishing the first stage, do the ISP/sensor configuration needed for the 1080p -> 10M resolution change, trigger stage two, and then do the same thing when switching back to 1080p after one frame, so there should be no frame loss.

    In our new RDK 3.0 release, we do a kind of digital WDR processing on the raw data in the GLBCE link. Selecting Dynamic Range Enhancement from the web GUI activates the GLBCE link, which works in M2M mode; the chain is camera link -> GLBCE link -> ISP link. The camera link does stage one and the ISP link does stage two, and in between the GLBCE link processes the raw data in DDR. If you don't need the GLBCE link, you may need to connect the camera link -> ISP link directly.

    Since we also support the 10M resolution for the Aptina MT9J003 sensor in our 3.0 release, you can find the 10M sensor register/ISP register configuration in the source.

     

  • Hi Mingda,

    I have created a chain like you advised, using Camera link -> ISP link, enabling M2M processing.

    The ISP link outputs two 1080p streams to a merge link, which leads to the encoder and RTSP stream.

    I have a few questions regarding this:

    1) It seems that the Bayer pattern is reversed when using this method; what is the reason for this? In addition, in the first stream the image is surrounded by a gray frame, with occasional green blinking on the sides (see attached image).

    2) I wish to create a use case where one stream is sent directly to memory and the other is processed in on-the-fly mode in IPIPE, how can this be implemented using Link API?

    3) Is there a way to perform RAW -> YUV conversion on the stream that is sent directly to memory, without using IPIPE?

  • Hi,

    The ISP cannot introduce these green pixels; they might be coming from the sensor itself. Can you check whether this is blanking data from the sensor? We could crop such blanking data in either IPIPEIF or IPIPE.

    The ISP can work either in M2M mode or in OTF mode, so it is not possible to have two streams, one on-the-fly and the other in M2M mode.

    It is always possible to bypass modules in the IPIPE, so you could get RAW to YUV. But please note that the IPIPE does most of the RAW-to-YUV conversion, so you will have to keep some minimal modules enabled.

    Regards,

    Brijesh

  • Hi Brijesh,
    Thanks for your reply.

    Regarding the image artifacts: when using a regular chain with OTF processing (Camera -> Encoder -> RTSP) I see no such effects (gray frame, green blinking). It only happens when I enable M2M mode, and even then the second stream does not show such artifacts. The green artifacts are similar to what can be seen when a resizer overflow happens. Could this be an issue with DDR bandwidth?

    Regarding the data path, I wish to use the following use case:

    Parallel sensor -> ISP on the fly -> camera link -> encoder -> rtsp stream
    MIPI sensor -> straight to memory -> (RAW to YUV?) -> encoder -> rtsp stream

    But the problem here is that encoder can only accept YUV data, and ISP is already used by the first path.
    Is it possible to do RAW -> YUV conversion on the second stream while the first stream is using the ISP already in OTF mode?
    If so I'd like to know how this can be done.

    Eli
  • It looks like cropping is enabled somewhere, but not in the M2M case. Could you compare the register settings of IPIPEIF, IPIPE and Resizer for both cases? Also, what do you mean by the second stream?
    No, it is not possible to use the ISP in both modes. I think in this case you should use the ISP in M2M mode only, and convert the raw data from both streams into YUV using the ISP in M2M mode.
    Regards,
    Brijesh
  • Right now, the second stream comes out of the camera link, as defined in the low-power tri-streaming use case, except that I have both streams at 1080p. They enter a merge link and then the encoder and RTSP stream.

    Regarding the ISP issue, can the ISP process two separate streams simultaneously using M2M mode?

    If so, does that apply to all modules or only RAW -> YUV ?

    To clarify, this is the scenario I am referring to:

    MIPI -> DDR -> ISP (RAW->YUV) ---------------------> Merge -> Encoder -> RTSP
    CPI  -> DDR -> ISP (IPIPE + H3A + RAW->YUV) -------> Merge

  • I think the second stream is a downscaled version of the first stream itself, so if the first stream does not have any artifacts, the second stream should not have any either. I think something else is introducing these artifacts.
    H3A output is not possible, since the H3A output comes from the ISIF, and the ISIF will be used in the CPI -> DDR path.
    Regards,
    Brijesh
  • Yes, right now the second stream is a clone of the first one, but the artifacts appear only on the first stream and only when using M2M mode. What can be causing this?

    If I understand you correctly, aside from H3A, the ISP can process streams from both sensors using M2M?
    Is this supported in MCFW?