How to display a raw YUV frame in a Cocoa OpenGL program?

This answer is not correct; see the other answers and comments. The original answer is left below for posterity.

You can't display it directly.

You'll need to convert it to an RGB texture. As you may have gathered from Wikipedia, there are a bunch of variations on the YUV color space. Make sure you're using the right one.

For each pixel, the conversion from YUV to RGB is a straightforward linear transformation. You just do the same thing to each pixel independently. Once you've converted the image to RGB, you can display it by creating a texture.
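As a concrete example, one common variant is video-range BT.601 YCbCr with 8-bit components; a rough per-pixel sketch follows, with the coefficients to be swapped for whichever flavor your data actually uses:

    // Sketch: video-range BT.601 YCbCr to 8-bit RGB for a single pixel.
    // The coefficients are for this one variant only.
    static void ycbcr601_to_rgb(unsigned char Y, unsigned char Cb, unsigned char Cr,
                                unsigned char *r, unsigned char *g, unsigned char *b)
    {
        float y = 1.164f * (Y - 16);
        float u = Cb - 128.0f;
        float v = Cr - 128.0f;

        float rf = y + 1.596f * v;
        float gf = y - 0.392f * u - 0.813f * v;
        float bf = y + 2.017f * u;

        // clamp to the displayable 0..255 range
        *r = (unsigned char)(rf < 0.0f ? 0.0f : (rf > 255.0f ? 255.0f : rf));
        *g = (unsigned char)(gf < 0.0f ? 0.0f : (gf > 255.0f ? 255.0f : gf));
        *b = (unsigned char)(bf < 0.0f ? 0.0f : (bf > 255.0f ? 255.0f : bf));
    }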

You need to call glGenTextures() to allocate a texture handle, glBindTexture() to bind the texture to the render context, and glTexImage2D() to upload the texture data to the GPU. To render it, you again call glBindTexture(), followed by the rendering of a quad with texture coordinates set up properly.

    // parameters: image:  pointer to raw YUV input data
    //             width:  image width (must be a power of 2)
    //             height: image height (must be a power of 2)
    // returns: a handle to the resulting RGB texture
    GLuint makeTextureFromYUV(const float *image, int width, int height)
    {
        float *rgbImage = (float *)malloc(width * height * 3 * sizeof(float));  // check for NULL
        float *rgbImagePtr = rgbImage;

        // convert from YUV to RGB (floats used here for simplicity; it's a little
        // trickier with 8-bit ints)
        // NOTE: assumes packed Y, U, V floats per pixel with U and V centered on
        // zero; the coefficients below are for one common (analog YUV) variant,
        // so adjust them for the flavor your data actually uses
        int y, x;
        for(y = 0; y < height; y++)
        {
            for(x = 0; x < width; x++)
            {
                float yy = *image++;
                float uu = *image++;
                float vv = *image++;
                *rgbImagePtr++ = yy                  + 1.13983f * vv;  // R
                *rgbImagePtr++ = yy - 0.39465f * uu - 0.58060f * vv;   // G
                *rgbImagePtr++ = yy + 2.03211f * uu;                   // B
            }
        }

        // create the texture handle, bind it, and upload the RGB data to the GPU
        GLuint texture;
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_FLOAT, rgbImage);

        // Use linear filtering for minification and magnification.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        // free data (it's now been copied onto the GPU) and return texture handle
        free(rgbImage);
        return texture;
    }

To render:

    glBindTexture(GL_TEXTURE_2D, texture);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex3f( 0.0f,  0.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f(64.0f,  0.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex3f(64.0f, 64.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f( 0.0f, 64.0f, 0.0f);
    glEnd();

And don't forget to call glEnable(GL_TEXTURE_2D) at some point during initialization, and call glDeleteTextures(1, &texture) during shutdown.
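For context, a minimal sketch of where those two calls might live in an NSOpenGLView subclass; texture is assumed to be a GLuint instance variable, and yuvFrame, frameWidth, and frameHeight stand in for wherever your data comes from:

    // Sketch only: texture is a GLuint ivar; yuvFrame, frameWidth, and
    // frameHeight are assumed to be supplied elsewhere in your code.
    - (void)prepareOpenGL
    {
        [super prepareOpenGL];
        glEnable(GL_TEXTURE_2D);   // texturing on, once at initialization
        texture = makeTextureFromYUV(yuvFrame, frameWidth, frameHeight);
    }

    - (void)dealloc
    {
        [[self openGLContext] makeCurrentContext];  // GL calls need a current context
        glDeleteTextures(1, &texture);              // release the texture at shutdown
        [super dealloc];
    }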

Thanks a lot for the information, Adam; I really appreciate it. Now, can you tell me how to read the raw data file from disk? I'm familiar with NSOpenPanel, so I can get the file path, but how do I use that path to load the YUV file into the application? – ReachConnection Jul 3 '09 at 20:28

Assuming that the file contains nothing but pixels, you'd simply use NSData or NSFileHandle and read everything. If the file contains metadata such as size information, you're going to have to interpret that using the standard C pointer and/or structure operators. – Peter Hosey Jul 4 '09 at 1:20
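A minimal sketch of what Peter Hosey describes, assuming the file holds nothing but raw pixel data (the expectedLength parameter is an assumption you would compute from your frame's width, height, and bytes per pixel):

    #import <Foundation/Foundation.h>

    // Sketch: read a raw YUV file (pixels only, no header) into memory.
    static NSData *loadRawYUVFile(NSString *path, NSUInteger expectedLength)
    {
        NSData *yuvData = [NSData dataWithContentsOfFile:path];   // path from NSOpenPanel
        if (yuvData == nil || [yuvData length] < expectedLength) {
            NSLog(@"Couldn't read %@, or the file is too small for one frame", path);
            return nil;
        }
        return yuvData;   // [yuvData bytes] gives the raw pixel pointer
    }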

This answer is incorrect. Starting with Mac OS X 10.2, every Apple system supports an OpenGL extension that can load and display YUV data directly. Search for "GL_YCBCR_422_APPLE" for details. Whether the data is converted somewhere (by the driver in software, on the GPU in hardware, etc.) is none of your business when using this extension (and when the GPU is doing the conversion, trust me, it can beat the code sample above by at least 1000%). – Mecki May 25 '10 at 10:48

Adam Rosenfield’s comment is incorrect. On Macs, you can display YCbCr (the digital equivalent to YUV) textures using the GL_YCBCR_422_APPLE texture format, as specified in the APPLE_ycbcr_422 extension.
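A rough sketch of how such a texture might be allocated, assuming a packed 4:2:2 frame already in memory (frameData, frameWidth, and frameHeight are placeholders):

    // Sketch: allocate a rectangle texture that accepts packed YCbCr 4:2:2
    // data directly; the driver/GPU handles the conversion to RGB.
    glEnable(GL_TEXTURE_RECTANGLE_EXT);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 1);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA,
                 frameWidth, frameHeight, 0,
                 GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_REV_APPLE,
                 frameData);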

I've done this with YUV frames captured from a CCD camera. Unfortunately, there are a number of different YUV formats. I believe the one that Apple uses for the GL_YCBCR_422_APPLE texture format is technically 2VUY422. To convert an image from a YUV422 frame generated by an IIDC FireWire camera to 2VUY422, I've used the following:

    void yuv422_2vuy422(const unsigned char *theYUVFrame, unsigned char *the422Frame,
                        const unsigned int width, const unsigned int height)
    {
        int i = 0, j = 0;
        unsigned int numPixels = width * height;
        unsigned int totalNumberOfPasses = numPixels * 2;
        register unsigned int y0, y1, y2, y3, u0, u2, v0, v2;

        while (i < totalNumberOfPasses)
        {
            // (loop body reconstructed: reads two full-range UYVY macropixels from
            // the camera and remaps them to the video-range 2VUY layout; your
            // camera's exact byte order and range may differ)
            u0 = theYUVFrame[i++];
            y0 = theYUVFrame[i++];
            v0 = theYUVFrame[i++];
            y1 = theYUVFrame[i++];
            u2 = theYUVFrame[i++];
            y2 = theYUVFrame[i++];
            v2 = theYUVFrame[i++];
            y3 = theYUVFrame[i++];

            the422Frame[j++] = (u0 * 224) / 255 + 16;   // Cb
            the422Frame[j++] = (y0 * 219) / 255 + 16;   // Y0
            the422Frame[j++] = (v0 * 224) / 255 + 16;   // Cr
            the422Frame[j++] = (y1 * 219) / 255 + 16;   // Y1
            the422Frame[j++] = (u2 * 224) / 255 + 16;   // Cb
            the422Frame[j++] = (y2 * 219) / 255 + 16;   // Y2
            the422Frame[j++] = (v2 * 224) / 255 + 16;   // Cr
            the422Frame[j++] = (y3 * 219) / 255 + 16;   // Y3
        }
    }

The converted frame is then uploaded into a GL_TEXTURE_RECTANGLE_EXT texture (allocated once during setup with the GL_YCBCR_422_APPLE format) and drawn each frame with something like this:

    glViewport(0, 0, self.frame.size.width, self.frame.size.height);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    NSRect bounds = NSRectFromCGRect(self.bounds);
    glOrtho((GLfloat)NSMinX(bounds), (GLfloat)NSMaxX(bounds),
            (GLfloat)NSMinY(bounds), (GLfloat)NSMaxY(bounds), -1.0, 1.0);

    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, 1);
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, 0, 0, videoImageWidth, videoImageHeight,
                    GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_REV_APPLE, videoTexture);

    glMatrixMode(GL_TEXTURE);
    glLoadIdentity();

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f);                        glVertex2f(0.0f, videoImageHeight);
        glTexCoord2f(0.0f, videoImageHeight);            glVertex2f(0.0f, 0.0f);
        glTexCoord2f(videoImageWidth, videoImageHeight); glVertex2f(videoImageWidth, 0.0f);
        glTexCoord2f(videoImageWidth, 0.0f);             glVertex2f(videoImageWidth, videoImageHeight);
    glEnd();

Thanks a lot, Brad. – ReachConnection Jul 6 '09 at 19:49

