I'm trying to use OpenGL for the first time under iOS in an attempt to render a stream of YUV images by converting them to RGB. I've looked around and found some examples here and there, and I'm planning on using the vertex/fragment shaders provided in Apple's GLCameraRipple sample code. I understand the math required to perform the conversion. I am having trouble understanding the OpenGL API and how I can simply draw a 2D texture on the screen.
Can someone illustrate the high-level steps required to draw a texture using OpenGL? If I understand what needs to be done conceptually, I'm hoping I'll be able to figure out the details.
From what I understand so far, the vertex shader runs before the fragment shader; the vertex shader operates on vertices, and the fragment shader operates on everything in between. The varying keyword is used to send data, interpolated between vertices, to the fragment shader.
This yields some questions:
- Do I need to pass a set of vertices that represent the 2D frame for the texture?
- Do I need to pass the YUV data to the fragment shader before passing the vertices?
- How can the fragment shader use the interpolated coordinates to access the appropriate YUV sample?
- How can I access the end result?
- How do I trigger the operation after I've passed the appropriate data to the shaders?
Thanks for your time.
I was able to get the texture drawn. I am still having some issues with the UV channels displaying colors properly, as well as orientation/mirroring issues, but those are outside the scope of this question.
Do I need to pass a set of vertices that represent the 2D frame for the texture?
Yes, a texture must be drawn on top of a polygon, and a 2D area can be represented as two triangles. Vertex Buffer Objects (VBOs) can be used to pass an array of vertices as well as a complementary array containing indices into the array of vertices. This data can be used by OpenGL to draw the polygons. The functions glGenBuffers, glBindBuffer, and glBufferData can be used to pass this information to OpenGL.
This page explains it nicely: http://www.raywenderlich.com/3664/opengl-es-2-0-for-iphone-tutorial
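For illustration, here is a minimal sketch of that buffer setup in C, assuming a full-screen quad built from two triangles; the variable names are mine, not from any particular sample:

```c
// Interleaved position (x, y) and texture coordinate (s, t) for a
// full-screen quad made of two triangles.
static const GLfloat vertices[] = {
    // x,     y,    s,    t
    -1.0f, -1.0f, 0.0f, 0.0f,   // bottom left
     1.0f, -1.0f, 1.0f, 0.0f,   // bottom right
    -1.0f,  1.0f, 0.0f, 1.0f,   // top left
     1.0f,  1.0f, 1.0f, 1.0f,   // top right
};

// Two triangles that cover the quad, as indices into the array above.
static const GLushort indices[] = { 0, 1, 2, 2, 1, 3 };

GLuint vertexBuffer, indexBuffer;

glGenBuffers(1, &vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

glGenBuffers(1, &indexBuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
```

Incidentally, whether the output appears flipped or mirrored (as mentioned above) typically comes down to how these texture coordinates are assigned to the corners.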
Do I need to pass the YUV data to the fragment shader before passing the vertices?
No, the vertices can be passed into OpenGL before the textures are loaded. The textures, however, must be bound and uploaded to OpenGL before the polygons are drawn.
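As a sketch of the texture side, assuming bi-planar YUV with the Y plane uploaded as a single-channel GL_LUMINANCE texture (width, height, yPlaneData, and ySamplerUniform are placeholder names, and the shader program is assumed to be in use already):

```c
GLuint yTexture;

glGenTextures(1, &yTexture);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, yTexture);

// CLAMP_TO_EDGE wrapping and a non-mipmapped filter are required for
// non-power-of-two textures in OpenGL ES 2.0, which video frames usually are.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// Upload the Y plane; one byte per pixel.
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, yPlaneData);

// Tell the shader's sampler uniform which texture unit to read from.
glUniform1i(ySamplerUniform, 0);
```

The interleaved UV plane can be uploaded the same way as a GL_LUMINANCE_ALPHA texture on a second texture unit.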
How can the fragment shader use the interpolated coordinates to access the appropriate YUV sample?
In my implementation, each vertex is passed together with a corresponding texture coordinate, which creates a mapping from vertex positions to texture coordinates. The vertex shader outputs this coordinate as a varying parameter. Using the varying keyword allows the fragment shader to receive an interpolated coordinate value for each pixel. The fragment shader then uses this coordinate to retrieve the corresponding texture sample; in my case, a YUV sample.
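Roughly, the shader pair looks like this, written as C string literals the way they would be handed to glShaderSource (the attribute, varying, and sampler names are my own, not GLCameraRipple's):

```c
// Vertex shader: passes the per-vertex texture coordinate through as a
// varying, which the GPU interpolates across the triangle.
static const char *vertexShaderSrc =
    "attribute vec4 position;            \n"
    "attribute vec2 texCoord;            \n"
    "varying vec2 v_texCoord;            \n"
    "void main() {                       \n"
    "    gl_Position = position;         \n"
    "    v_texCoord = texCoord;          \n"
    "}                                   \n";

// Fragment shader: receives the interpolated coordinate and uses it to
// sample the Y and UV textures for this pixel. A LUMINANCE sample comes
// back in .r, and a LUMINANCE_ALPHA sample in .r and .a.
static const char *fragmentShaderSrc =
    "precision mediump float;                          \n"
    "varying vec2 v_texCoord;                          \n"
    "uniform sampler2D samplerY;                       \n"
    "uniform sampler2D samplerUV;                      \n"
    "void main() {                                     \n"
    "    float y = texture2D(samplerY, v_texCoord).r;  \n"
    "    vec2 uv = texture2D(samplerUV, v_texCoord).ra;\n"
    "    // YUV-to-RGB conversion goes here; see the   \n"
    "    // sketch in the last answer below.           \n"
    "    gl_FragColor = vec4(vec3(y), 1.0);            \n"
    "}                                                 \n";
```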
How can I access the end result?
In general, the end result of the rendering is written into a framebuffer (I'm not familiar with the details). In iOS 5+ using a GLKView, I send the presentRenderbuffer: message to the EAGLContext instance to display the result.
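I'm not certain of the exact setup details, but the CAEAGLLayer-style flow looks roughly like the sketch below, where framebuffer and colorRenderbuffer are placeholders assumed to have been created against the view's layer earlier. The presentation call itself is Objective-C, so it appears only as a comment:

```c
// Rendering goes to whichever framebuffer is currently bound; on iOS
// there is no default framebuffer 0, so one backed by the layer's
// renderbuffer must exist.
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

// ... issue draw calls here ...

// Bind the color renderbuffer and hand it to Core Animation.
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
// In Objective-C: [context presentRenderbuffer:GL_RENDERBUFFER];
```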
How do I trigger the operation after I've passed the appropriate data to the shaders?
The drawing is triggered by one of the OpenGL drawing functions. After the vertex and texture-coordinate data, as well as the texture data itself, have been uploaded to OpenGL and associated with the parameters in the shaders, I used the glDrawElements function to draw the polygons and render the textures on top.
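A sketch of that draw call, assuming the buffers from the earlier sketch are bound and the attribute locations (positionAttrib, texCoordAttrib) were queried beforehand with glGetAttribLocation:

```c
glUseProgram(program);

// Describe the interleaved vertex layout: 2 floats of position followed
// by 2 floats of texture coordinate, 16 bytes per vertex.
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glEnableVertexAttribArray(positionAttrib);
glVertexAttribPointer(positionAttrib, 2, GL_FLOAT, GL_FALSE,
                      4 * sizeof(GLfloat), (const GLvoid *)0);
glEnableVertexAttribArray(texCoordAttrib);
glVertexAttribPointer(texCoordAttrib, 2, GL_FLOAT, GL_FALSE,
                      4 * sizeof(GLfloat), (const GLvoid *)(2 * sizeof(GLfloat)));

// Draw the two triangles using the bound index buffer.
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, (const GLvoid *)0);
```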
I'd suggest just about any basic OpenGL tutorial, for instance http://www.cs.washington.edu/education/courses/cse557/00wi/projects/impressionist/help.html. Also look at the Khronos Group's opengl.org site.
The standard way to paint an image onto the screen in OpenGL is to draw two textured triangles (one rectangle), with your image encoded as the texture. The vertex shaders tend to be very standard and rarely change (in 99% of cases). Your image processing, like rippling or color-space conversion, will all happen in the pixel shader.
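To make the color-space conversion concrete, the pixel (fragment) shader's job might look like this: a sketch assuming BT.601 full-range coefficients, U/V stored biased by 0.5 in a LUMINANCE_ALPHA texture, and placeholder sampler names:

```c
static const char *yuvToRgbFragmentSrc =
    "precision mediump float;                                 \n"
    "varying vec2 v_texCoord;                                 \n"
    "uniform sampler2D samplerY;                              \n"
    "uniform sampler2D samplerUV;                             \n"
    "void main() {                                            \n"
    "    float y = texture2D(samplerY, v_texCoord).r;         \n"
    "    // Remove the 0.5 bias from the chroma channels.     \n"
    "    vec2 uv = texture2D(samplerUV, v_texCoord).ra - 0.5; \n"
    "    // BT.601 YUV-to-RGB conversion.                     \n"
    "    float r = y + 1.402 * uv.y;                          \n"
    "    float g = y - 0.344 * uv.x - 0.714 * uv.y;           \n"
    "    float b = y + 1.772 * uv.x;                          \n"
    "    gl_FragColor = vec4(r, g, b, 1.0);                   \n"
    "}                                                        \n";
```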