TDA4 questions about camera

Hi,

I have some questions about the camera in the TDA4 SDK 8.0:

1. How can we use a YUV camera sensor and the capture node in different processes at the same time, just as in a single process? Or should the camera sensor be controlled in only one process, while the capture node can be used in any process at the same time?

2. Does the capture node provide frame synchronization across the different channels of a YUV camera sensor?

3. What are the different kernels used for?

  • Hi xin chao,

    Please find answers to your questions below.

    1. How can we use a YUV camera sensor and the capture node in different processes at the same time, just as in a single process? Or should the camera sensor be controlled in only one process, while the capture node can be used in any process at the same time?

    Why do you want to control the sensor and the capture node from different processes? There is a sequence that must be followed to get capture working, so if you have two different processes managing the sensor and the capture node, you would have to create some inter-process communication to control the flow of execution.

    2. Does the capture node provide frame synchronization across the different channels of a YUV camera sensor?

    Do you mean that the YUV data being captured come from different sensors? Yes, as long as they all run at the same frame rate, the capture node should be able to synchronize them and capture them.

    3. What are the different kernels used for?

    If you want a specific camera/channel to be handled by a separate task, you can assign it to a different target.
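
    As a rough illustration, here is a minimal sketch of assigning capture nodes to separate targets using the standard OpenVX call vxSetNodeTarget(). The target name strings "CAPTURE1" and "CAPTURE2" are assumptions for illustration only; please use the target names defined by the TIOVX version in your SDK.

    ```c
    /*
     * Minimal sketch: pin two capture nodes to different targets so that each
     * camera/channel is serviced by a separate task. The target strings
     * "CAPTURE1"/"CAPTURE2" are assumptions; use the target names defined by
     * the TIOVX version in your SDK.
     */
    #include <VX/vx.h>

    static vx_status assign_capture_targets(vx_node capture_node_0,
                                            vx_node capture_node_1)
    {
        /* First capture node runs on one capture target (task). */
        vx_status status = vxSetNodeTarget(capture_node_0, VX_TARGET_STRING, "CAPTURE1");

        if (status == (vx_status)VX_SUCCESS)
        {
            /* Second capture node runs on a different target, i.e. a separate task. */
            status = vxSetNodeTarget(capture_node_1, VX_TARGET_STRING, "CAPTURE2");
        }

        return status;
    }
    ```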

    Regards,

    Brijesh

  • We have different processes doing camera data processing, e.g., one process does surround view and the other does object detection.

    In this case, what should we do?

  • But in this case, couldn't you run two graphs in two threads, which can be part of the same process, and share the data between the two graphs?

    One graph can handle SRV and the other can handle object detection.
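
    For reference, below is a minimal sketch of that layout, assuming pthreads and a single OpenVX context. build_srv_graph() and build_od_graph() are hypothetical placeholders standing in for your real graph construction (capture feeding SRV and object detection respectively); the loop count is arbitrary.

    ```c
    /*
     * Minimal sketch: run the SRV graph and the object-detection graph in two
     * threads of the same process. build_srv_graph()/build_od_graph() are
     * hypothetical placeholders for the application's real graph construction.
     */
    #include <pthread.h>
    #include <stdio.h>
    #include <VX/vx.h>

    /* Hypothetical placeholders: replace with your actual graph creation. */
    static vx_graph build_srv_graph(vx_context context) { (void)context; return NULL; }
    static vx_graph build_od_graph (vx_context context) { (void)context; return NULL; }

    /* Each thread repeatedly executes its own graph. */
    static void *graph_thread(void *arg)
    {
        vx_graph graph = (vx_graph)arg;
        for (int i = 0; i < 100; i++)
        {
            if (vxProcessGraph(graph) != (vx_status)VX_SUCCESS)
            {
                printf("graph execution failed\n");
                break;
            }
        }
        return NULL;
    }

    int main(void)
    {
        vx_context context  = vxCreateContext();
        vx_graph  srv_graph = build_srv_graph(context); /* capture -> SRV       */
        vx_graph  od_graph  = build_od_graph(context);  /* capture -> detection */

        pthread_t srv_tid, od_tid;
        pthread_create(&srv_tid, NULL, graph_thread, srv_graph);
        pthread_create(&od_tid,  NULL, graph_thread, od_graph);

        pthread_join(srv_tid, NULL);
        pthread_join(od_tid,  NULL);

        vxReleaseContext(&context);
        return 0;
    }
    ```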

    Regards,

    Brijesh

  • Yes, we can do this, but for architectural reasons we use different processes.

    In this case, do we just need to handle the camera sensor (the app_iss.c API) in process A? E.g., process A opens the camera, and both process A and process B use the capture node directly; process B can tell process A to close the camera with an IPC message when it is done. Am I right?

  • Hi xin chao,

    But in this case, are both processes using the same camera input? Or are these cameras connected to the same port?

    How about this: the capture node and the sensor framework are handled by only one of the processes; once a frame is available, it can be shared with the other process using the methods described in the link below.

    https://stackoverflow.com/questions/2358684/can-i-share-a-file-descriptor-to-another-process-on-linux-or-are-they-local-to-t
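
    For reference, below is a minimal sketch of the technique described in that link: passing a file descriptor (for example a dma-buf fd backing the captured frame, if your buffers can be exported that way) to the other process with SCM_RIGHTS over a Unix domain socket. The socket setup and the function name send_fd() are assumptions for illustration.

    ```c
    /*
     * Minimal sketch: send a buffer's file descriptor (for example a dma-buf fd
     * backing a captured frame) to another process over an already connected
     * Unix domain socket using SCM_RIGHTS.
     */
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/uio.h>

    static int send_fd(int sock, int fd_to_share)
    {
        /* One dummy payload byte is needed so the control message is delivered. */
        char dummy = 'F';
        struct iovec iov = { .iov_base = &dummy, .iov_len = 1 };

        char ctrl_buf[CMSG_SPACE(sizeof(int))];
        memset(ctrl_buf, 0, sizeof(ctrl_buf));

        struct msghdr msg = {
            .msg_iov        = &iov,
            .msg_iovlen     = 1,
            .msg_control    = ctrl_buf,
            .msg_controllen = sizeof(ctrl_buf),
        };

        /* Attach the file descriptor as ancillary data. */
        struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
        cmsg->cmsg_level = SOL_SOCKET;
        cmsg->cmsg_type  = SCM_RIGHTS;
        cmsg->cmsg_len   = CMSG_LEN(sizeof(int));
        memcpy(CMSG_DATA(cmsg), &fd_to_share, sizeof(int));

        return (sendmsg(sock, &msg, 0) < 0) ? -1 : 0;
    }
    ```

    The descriptor received on the other side refers to the same underlying buffer, so the frame does not have to be copied between the processes.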

    Regards,

    Brijesh

  • Yes, both processes are using the same camera input.

    Do you mean the capture node and the sensor framework must be handled by only one of the processes?

  • Yes, handle the sensor and the capture node in a single process and then share the captured buffer (or whatever output buffer the second process requires) with the other process.
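
    A minimal sketch of the receiving side is below, matching the sender sketch shown earlier in this thread: the second process receives the buffer's file descriptor over an already connected Unix domain socket and can then map or import the buffer. The function name recv_fd() is illustrative only.

    ```c
    /*
     * Minimal sketch: the second process receives the frame's file descriptor
     * sent with SCM_RIGHTS (see the sender sketch earlier in this thread) over
     * an already connected Unix domain socket.
     */
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/uio.h>

    static int recv_fd(int sock)
    {
        char dummy;
        struct iovec iov = { .iov_base = &dummy, .iov_len = 1 };

        char ctrl_buf[CMSG_SPACE(sizeof(int))];
        memset(ctrl_buf, 0, sizeof(ctrl_buf));

        struct msghdr msg = {
            .msg_iov        = &iov,
            .msg_iovlen     = 1,
            .msg_control    = ctrl_buf,
            .msg_controllen = sizeof(ctrl_buf),
        };

        if (recvmsg(sock, &msg, 0) < 0)
        {
            return -1;
        }

        struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
        if ((cmsg == NULL) || (cmsg->cmsg_level != SOL_SOCKET) || (cmsg->cmsg_type != SCM_RIGHTS))
        {
            return -1;
        }

        int shared_fd;
        memcpy(&shared_fd, CMSG_DATA(cmsg), sizeof(int));
        return shared_fd; /* refers to the same buffer in this process */
    }
    ```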

    Regards,

    Brijesh