Question:
I have written an application that performs 81 reinitializations using the code below.
It fails in roughly 7 out of 100 runs with "Segmentation fault (core dumped)".
How can I reinitialize without a segmentation fault?
The reinitialization code used:
// DeInit
eglDestroyContext(g.display, g.context2);
eglReleaseThread();
// Init
g.context2 = eglCreateContext(g.display, g.cfg, g.context, contextAttribs);
eglMakeCurrent(g.display, g.surface, g.surface, g.context2);
g.glTexBindStreamIMG = (PFNGLTEXBINDSTREAMIMGPROC) eglGetProcAddress("glTexBindStreamIMG");
glTexParameteri(GL_TEXTURE_STREAM_IMG, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_STREAM_IMG, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
Full code: https://bitbucket.org/lolikandr/bccat_addr_test
History of the question:
Previous question: http://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/717/t/304638.aspx
According to http://processors.wiki.ti.com/index.php/OpenGLES_Texture_Streaming_-_bc-cat_User_Guide :
"BCIOSET_BUFFERPHYADDR - Register the external buffer as a texture buffer to a given index. This ioctl should be called before initialize the IMG extensions."
The DM8168 DVR's video decoder reallocates its frame buffers whenever the stream resolution changes. So before starting to decode a new stream, we must deinitialize the IMG extension, set the new physical addresses, and then initialize the IMG extension again.
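The deinit / set-addresses / reinit sequence described above might be sketched as follows. This is only a sketch, not the original code: the `reinit_for_new_stream` function, the `bc_buf_ptr_t` field names (`index`, `pa`, `size`), the bc-cat fd `bcfd`, and the explicit unbind before `eglDestroyContext` are assumptions to be checked against your bc-cat driver headers; `g` and `contextAttribs` are the globals from the snippet above.

```c
#include <stdio.h>
#include <sys/ioctl.h>
#include <EGL/egl.h>
#include <GLES/gl.h>
#include <GLES/glext.h>
#include "bc_cat.h"   /* bc-cat driver header; exact path is an assumption */

/* Re-registers decoder buffers and reinitializes the IMG extension after a
 * resolution change. Hedged sketch: bc_buf_ptr_t field names and the unbind
 * step are assumptions, not taken from the original post. */
static int reinit_for_new_stream(int bcfd, int num_buffers,
                                 const unsigned long *phys_addr, int frame_size)
{
    /* 1. Tear down the old context so the IMG extension state is released. */
    eglMakeCurrent(g.display, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
    eglDestroyContext(g.display, g.context2);
    eglReleaseThread();

    /* 2. Register the decoder's new frame buffers; per the bc-cat guide,
     *    BCIOSET_BUFFERPHYADDR must be issued before the IMG extension is
     *    initialized again. */
    for (int i = 0; i < num_buffers; i++) {
        bc_buf_ptr_t buf = {0};       /* field names are hypothetical */
        buf.index = i;
        buf.pa    = phys_addr[i];     /* physical address from the decoder */
        buf.size  = frame_size;
        if (ioctl(bcfd, BCIOSET_BUFFERPHYADDR, &buf) < 0) {
            perror("BCIOSET_BUFFERPHYADDR");
            return -1;
        }
    }

    /* 3. Bring the IMG extension back up, as in the original init code. */
    g.context2 = eglCreateContext(g.display, g.cfg, g.context, contextAttribs);
    if (g.context2 == EGL_NO_CONTEXT)
        return -1;
    eglMakeCurrent(g.display, g.surface, g.surface, g.context2);
    g.glTexBindStreamIMG =
        (PFNGLTEXBINDSTREAMIMGPROC) eglGetProcAddress("glTexBindStreamIMG");
    glTexParameteri(GL_TEXTURE_STREAM_IMG, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_STREAM_IMG, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return 0;
}
```

The unbind in step 1 is worth noting: destroying a context that is still current on the calling thread leaves its actual deletion to the driver, which may be related to intermittent crashes.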
Test run in bash:
# for i in `seq 1 50`; do ./bccat_addr_test; echo $? >> result.log; done
Count the number of failures:
# grep -cv '^0$' result.log
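The counting step can be sketched with sample data. An anchored pattern (`'^0$'`) is safer than a plain `grep -v 0`, which would also drop multi-digit exit codes that merely contain a zero (e.g. 10); the sample exit codes below are hypothetical.

```shell
# Count non-zero exit codes in result.log.
# 139 = 128 + SIGSEGV, i.e. a segfaulted run; 10 is a dummy failure code.
printf '%s\n' 0 0 139 0 10 0 > result.log
grep -cv '^0$' result.log   # prints 2
```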