
DM8148 ISS

Other Parts Discussed in Thread: TPS65911

1. ISS questions

Reference document: SPRUH30_ISS.pdf

1-1. Where the ISS (Imaging Subsystem) runs

The ISS receives output data from the sensor. We use the AR0331 in parallel output mode and have successfully configured the AR0331 registers for its default 1080p30 output; the AR0331 clock, vertical/horizontal sync signals, and video output all show valid signals.

Question 1: Where does the ISS driver run? On the A8, the VPSS M3, the Video M3, or the DSP core?

Question 2: For the capture-display path on the VPSS M3 core, is the RGB Bayer to YUV conversion done in the ISS ISP (ISIF, IPIPE) or in the VPSS? Which of the following is the correct data-processing flow?

Sensor --> ISS --> VPSS --> Video M3

Sensor --> ISS --> Video M3

Sensor --> VPSS --> Video M3

Question 3: How do the VPSS and Video M3 cores control the ISS (for example lens distortion correction, noise reduction, H3A)? Is there more related information available?

Question 4: How can we verify that the ISS module is working correctly? Which registers need to be configured, and what is the register configuration sequence?

  • Sun,

    Are you using EZSDK or DVRRDK? Here are answers to your questions, as far as I know.

    1. The ISS driver runs on the VPSS-M3 core, where the VPSS driver is also present.

    2. The right sequence is ISS --> Video M3. The ISS driver does not use the VPSS hardware for its operation.

    3. APIs are provided, and if you wish to control it from the A8, Linux driver support is required.

    4. It is a complex procedure; I don't think it can be explained easily over the forum.

  • Thanks.

    I am now using the DM8148 IPNC Appro reference design and debugging my own hardware development board.

    Sensor: AR0331, parallel output, 1080p at 30 fps

    Regarding your reply, taking capture --> display as an example:

        1) The sensor outputs RGB Bayer data to the ISS; the ISS converts RGB to YUV through the ISP (ISIF, IPIPE).

        2) The VPSS M3 HDVPSS driver, through the proxy server (FVID2 capture API, FVID2 display API), can also do capture --> display (RGB Bayer to YUV conversion).

    Question 1:

        1) What is the difference between the two paths above?

        2) Does the RGB Bayer to YUV conversion run in the ISS or in the HDVPSS? For the sensor's RGB Bayer output, which of the following data flows is correct?

            Case 1: sensor (RGB Bayer output) --> ISS (ISIF, IPIPE: RGB Bayer to YUV) --> CBUFF

            Case 2: sensor (RGB Bayer output) --> HDVPSS (FVID2 capture API: RGB Bayer to YUV) --> FVID2 display API

            Case 3: sensor (RGB Bayer output) --> ISS (ISIF, IPIPE: RGB Bayer to YUV) --> HDVPSS (FVID2 capture API gets YUV data) --> FVID2 display API

        Which of the three cases above is the correct flow?

  • Hi Thomas,

    The ISS overview shows an RSZ block containing two resizer (RSZ) accelerators.

    1) Can you describe the resizer's principle and use?

    2) Why are there two resizers, RSZ1 and RSZ2?


  • Jing,

    The first one refers to the hardware pipeline used to capture the data from the sensor in RGB, convert it, and write it to RAM in YUV. This is enabled by configuring the various blocks in the ISS subsystem.

    The second one describes the software architecture. There is a software task known as ProxyServer which implements remote procedure calls. Basically, it is used for calling FVID2 APIs running on the M3 core from the A8 core running Linux. In Linux, the capture driver makes an FVID2 call (FVID2_create, queue, dequeue, etc.) to perform various operations on the M3: Linux calls the ProxyServer APIs remotely from the A8 using the SysLink communication driver, and this triggers the corresponding call in ProxyServer and FVID2. Check the PPT below to understand what SysLink is.

    http://omappedia.org/images/7/7d/Syslink_Overview_Final_-_Public.ppt

    Regarding the second question, the right sequence is:

    ISS capture --> FVID2 API to pass buffers (shared between HDVPSS and ISS) --> A8: the Linux driver delivers the buffer to the application --> encode and stream, or send to the local display.
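    To make the call sequence concrete, here is a minimal sketch of the FVID2 capture pattern that ends up being executed on the M3 side. It assumes the HDVPSS/ISS FVID2 interface from the EZSDK/IPNC packages; the driver ID, create arguments, and header path are placeholders and will differ per SDK version, so treat this as an outline of the call order rather than a drop-in implementation:

        /*
         * Sketch of the FVID2 capture call sequence on the M3 side, assuming
         * the HDVPSS FVID2 interface (ti/psp/vps/fvid2.h). The driver ID,
         * create arguments and frame setup are placeholders; real code fills
         * in driver-specific structures and a completion callback.
         */
        #include <string.h>
        #include <ti/psp/vps/fvid2.h>

        #define CAPT_DRV_ID_PLACEHOLDER  (0u)  /* substitute the real FVID2 capture driver ID */

        void captureLoopSketch(void)
        {
            FVID2_Handle    hCapture;
            FVID2_FrameList frameList;
            Int32           status;

            memset(&frameList, 0, sizeof (frameList));

            /* 1. Create a capture driver instance (createArgs/callback omitted). */
            hCapture = FVID2_create(CAPT_DRV_ID_PLACEHOLDER, 0u,
                                    NULL, NULL, NULL);
            if (hCapture == NULL)
            {
                return;
            }

            /* 2. Queue empty frame buffers for the hardware to fill. Real code
             *    points frameList.frames[] at allocated FVID2_Frame buffers
             *    that are shared with the A8 side.                              */
            status = FVID2_queue(hCapture, &frameList, 0u);

            /* 3. Start streaming. */
            status = FVID2_start(hCapture, NULL);

            /* 4. Dequeue filled frames; the Linux application then encodes or
             *    displays the buffers it receives through the proxy path.       */
            status = FVID2_dequeue(hCapture, &frameList, 0u, FVID2_TIMEOUT_FOREVER);

            /* 5. Shut down. */
            status = FVID2_stop(hCapture, NULL);
            FVID2_delete(hCapture, NULL);
            (void) status;
        }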

  • Jing,

    The resizer module is mainly used for scaling captured images down or up. It also supports various input and output color formats, which means it can be used for color conversion along with scaling, or just for color conversion.

    There are two resizer modules, which can be used to generate two different output images from the same source. One practical use is to produce a low-resolution RGB video for preview and a higher-resolution YUV video for encoding and streaming. Many other use cases are possible. I hope this explains your queries properly.
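    As an illustration of the two-output idea, here is a small sketch in plain C. The struct and field names are hypothetical, not the actual ISS resizer driver API; they only show the kind of per-output settings (size and color format) that the two resizers let you choose independently:

        #include <stdint.h>

        /* Hypothetical output formats, for illustration only. */
        typedef enum
        {
            RSZ_FMT_YUV422,      /* e.g. encode/stream path      */
            RSZ_FMT_RGB565       /* e.g. low-resolution preview  */
        } RszOutFormat;

        /* Hypothetical per-output configuration. */
        typedef struct
        {
            uint32_t     enable;     /* enable this resizer output */
            uint32_t     outWidth;   /* scaled output width        */
            uint32_t     outHeight;  /* scaled output height       */
            RszOutFormat format;     /* output color format        */
        } RszOutCfg;

        /* Both outputs are fed from the same source image. */
        typedef struct
        {
            uint32_t  inWidth;
            uint32_t  inHeight;
            RszOutCfg outA;          /* resizer output A           */
            RszOutCfg outB;          /* resizer output B           */
        } RszCfg;

        /* Example: full-resolution YUV422 for encoding on output A and a
         * quarter-size RGB565 preview on output B, from one 1080p source. */
        static const RszCfg gRszCfg =
        {
            .inWidth  = 1920u,
            .inHeight = 1080u,
            .outA     = { 1u, 1920u, 1080u, RSZ_FMT_YUV422 },
            .outB     = { 1u,  480u,  270u, RSZ_FMT_RGB565 },
        };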

  • Hi Thomas,

    Thank you for your answer; I roughly understand now, and I can start tracing the code to solve my problems.

  • Hi all,

    Platform: DM8127

    Sensor: AR0331, parallel (1080p, 30 fps)

    DCC: ImageTuningTool, version V2.04

    Hardware: the TPS65911 has been removed and SPI flash is used instead

    Tests: 1) the system works OK; 2) the test script autorun_capturedisplay.sh runs; 3) the AR0331 sensor works OK

    Question:

    I use the DCC tool to capture raw Bayer and YUV data. In itt_capture.c, ittServer_run,

    status = MessageQ_get(hIttMsgQ, &msg, MessageQ_FOREVER); blocks,

    and in alg_itk_link_tsk.c, Alg_ITK_Link_tskMain,

    MessageQ_get(hDccMsgQ, (MessageQ_Msg *) &msg, MessageQ_FOREVER); also blocks, because the message queue never receives a message.

    What could cause the MessageQ inter-processor communication to fail, and what are the debugging methods?
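    One common first step (a sketch assuming the SysLink/IPC MessageQ API that the RDK links use; the timeout value and prints are placeholders) is to replace the MessageQ_FOREVER wait with a finite timeout, so the task can report that it is alive but no message is arriving. That usually points to the sender never opening the queue, a queue-name mismatch, or a failed MessageQ_put on the other core:

        #include <xdc/std.h>
        #include <xdc/runtime/System.h>
        #include <ti/ipc/MessageQ.h>

        #define ITT_POLL_TIMEOUT_TICKS  (1000u)   /* placeholder poll interval */

        Void ittWaitForMsgSketch(MessageQ_Handle hIttMsgQ)
        {
            MessageQ_Msg msg = NULL;
            Int          status;

            for (;;)
            {
                /* Finite timeout instead of MessageQ_FOREVER, so a stalled
                 * queue is visible instead of silently blocking the task.   */
                status = MessageQ_get(hIttMsgQ, &msg, ITT_POLL_TIMEOUT_TICKS);

                if (status == MessageQ_S_SUCCESS)
                {
                    /* ... handle the ITT/DCC message here ... */
                    MessageQ_free(msg);
                    break;
                }
                else if (status == MessageQ_E_TIMEOUT)
                {
                    /* Still no message: check that the sending core opened a
                     * queue with the same name and that its MessageQ_put()
                     * returns success.                                       */
                    System_printf("ITT msgq: still waiting for a message\n");
                }
                else
                {
                    System_printf("ITT msgq: MessageQ_get failed (%d)\n", status);
                    break;
                }
            }
        }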

  • Jing,

    I'm not sure about the answer, but I suggest you post a separate query with a relevant subject line so that more people will look into it.

  • Hello Jing Sun,

    I am debugging the sensor now and need to modify the ISS registers.

    1. I don't have SPRUH30_ISS.pdf and haven't been able to find it on the internet. Could you tell me where you found the file, or better yet send it to me? My email is

    chang-quan@126.com. Best regards!

    2. I use the DM8127 and changed the sensor to the MN34041. The display looks like this:

    but normally it should look like this:

    I think the color pattern is wrong, so I found the SRC_COL register and changed it, but the image is still incorrect, and I have no reference describing the register.

    So could you tell me how to change the color pattern?

    Hoping for your answer, thank you!
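    (To illustrate what the color-pattern setting controls, here is a plain-C sketch; it is not the DM8127 register interface, only the generic Bayer-phase idea behind registers such as SRC_COL. If the phase programmed into the ISP does not match the order in which the sensor actually sends R, G and B samples, every pixel is demosaicked with the wrong color and the image gets a strong color cast.)

        #include <stdio.h>

        /* The four possible Bayer start phases (colors at pixel (0,0) and its
         * neighbours).                                                        */
        typedef enum { BAYER_RGGB, BAYER_GRBG, BAYER_GBRG, BAYER_BGGR } BayerPhase;

        /* Return 'R', 'G' or 'B' for pixel (x, y) under a given start phase. */
        static char bayerColorAt(BayerPhase phase, unsigned x, unsigned y)
        {
            static const char pat[4][2][2] =
            {
                [BAYER_RGGB] = { { 'R', 'G' }, { 'G', 'B' } },
                [BAYER_GRBG] = { { 'G', 'R' }, { 'B', 'G' } },
                [BAYER_GBRG] = { { 'G', 'B' }, { 'R', 'G' } },
                [BAYER_BGGR] = { { 'B', 'G' }, { 'G', 'R' } },
            };
            return pat[phase][y & 1u][x & 1u];
        }

        int main(void)
        {
            /* If the sensor really outputs GRBG but the ISP is programmed for
             * RGGB, red and green samples are swapped on every line:          */
            printf("pixel (0,0): sensor sends %c, ISP assumes %c\n",
                   bayerColorAt(BAYER_GRBG, 0u, 0u),
                   bayerColorAt(BAYER_RGGB, 0u, 0u));
            return 0;
        }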


  • The incorrect picture was missing from the post above; here it is:

  • Hi Jing Sun, I have a question for you and hope you can reply.

    I use IPNC 3.0 for the DM8127 camera and I changed my sensor to the MT9P006. The problem is that the sensor output is now 2592x1944 pixels and I can't receive the picture. I see that the default sensor, the MT9J031, outputs 2560x1920 pixels. Where do I need to change the code? Can you help me? Thanks!

  • Renjith Thomas said:

    There is a software task known as ProxyServer which implements remote procedure calls. Basically, it is used for calling FVID2 APIs running on the M3 core from the A8 core running Linux. In Linux, the capture driver makes an FVID2 call (FVID2_create, queue, dequeue, etc.) to perform various operations on the M3: Linux calls the ProxyServer APIs remotely from the A8 using the SysLink communication driver, and this triggers the corresponding call in ProxyServer and FVID2. Check the PPT below to understand what SysLink is.

    http://omappedia.org/images/7/7d/Syslink_Overview_Final_-_Public.ppt

    Hi Renjith,

    This is exactly what I'm looking for. I want to control the ISS sensor driver from system_server, which runs on the A8. I went over the presentation you provided but couldn't find any specific information or examples on how to call FVID2 functions from the A8.

    Could you point me in the right direction?

  • Hi Eliba,

    To understand how FVID2 calls are made to the M3, you can check the VPSS driver in drivers/video/ti81xx/ and search for fvid2. That will give an approximate idea of the code flow. If you can be clearer about exactly what you are trying to do, it will be easier to respond.

  • Hi Renjith,

    Thanks for your reply.

    I have a more detailed version of my question in the following thread:

    http://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/716/t/277325.aspx

  • Eliba,

    I checked the query posted in the other thread. To answer it, I need to go through the stack before commenting; I need a bit more time to find the answer.