Screenshot of OpenGL ES content for Paint app?


I'm working on a paint app for iPhone. In my code I'm using a UIImageView which contains the outline image, and on top of it I'm putting a CAEAGLLayer for filling colors into the outline image. I take a screenshot of the OpenGL ES (CAEAGLLayer) rendered content using this function:

    - (UIImage *)snapshot:(UIView *)eaglview
    {
        GLint backingWidth1, backingHeight1;

        // Bind the color renderbuffer used to render the OpenGL ES view.
        // If your application only creates a single color renderbuffer which is
        // already bound at this point, this call is redundant, but it is needed
        // if you're dealing with multiple renderbuffers.
        // Note: replace "_colorRenderbuffer" with the actual name of the
        // renderbuffer object defined in your class.
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);

        // Get the size of the backing CAEAGLLayer.
        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth1);
        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight1);

        NSInteger x = 0, y = 0, width = backingWidth1, height = backingHeight1;
        NSInteger dataLength = width * height * 4;
        GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

        // Read pixel data from the framebuffer.
        glPixelStorei(GL_PACK_ALIGNMENT, 4);
        glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

        // Create a CGImage with the pixel data.
        // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to
        // ignore the alpha channel; otherwise, use kCGImageAlphaPremultipliedLast.
        CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
        CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
        CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace,
                                        kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                        ref, NULL, true, kCGRenderingIntentDefault);

        // OpenGL ES measures data in PIXELS; create a graphics context with the
        // target size measured in POINTS.
        NSInteger widthInPoints, heightInPoints;
        if (NULL != UIGraphicsBeginImageContextWithOptions) {
            // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to
            // take the scale into consideration. Set the scale parameter to your
            // OpenGL ES view's contentScaleFactor so that you get a
            // high-resolution snapshot when its value is greater than 1.0.
            CGFloat scale = eaglview.contentScaleFactor;
            widthInPoints = width / scale;
            heightInPoints = height / scale;
            UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
        } else {
            // On iOS prior to 4, fall back to UIGraphicsBeginImageContext.
            widthInPoints = width;
            heightInPoints = height;
            UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
        }

        CGContextRef cgcontext = UIGraphicsGetCurrentContext();

        // The UIKit coordinate system is upside down relative to the GL/Quartz
        // coordinate system; flip the CGImage by rendering it into the flipped
        // bitmap context. The size of the destination area is measured in POINTS.
        CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
        CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

        // Retrieve the UIImage from the current context.
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // Clean up.
        free(data);
        CFRelease(ref);
        CFRelease(colorspace);
        CGImageRelease(iref);

        return image;
    }

I combine this screenshot with the outline image using this function:

    - (void)Combine:(UIImage *)Back
    {
        UIImage *Front = backgroundImageView.image;

        //UIGraphicsBeginImageContext(Back.size);
        UIGraphicsBeginImageContext(CGSizeMake(640, 960));

        // Draw image 1 (the OpenGL ES snapshot).
        [Back drawInRect:CGRectMake(0, 0, Back.size.width * 2, Back.size.height * 2)];

        // Draw image 2 (the outline image).
        [Front drawInRect:CGRectMake(0, 0, Front.size.width * 2, Front.size.height * 2)];

        UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
        UIImageWriteToSavedPhotosAlbum(resultingImage, nil, nil, nil);

        UIGraphicsEndImageContext();
    }

and save the result to the photo album with:

    - (void)captureToPhotoAlbum
    {
        [self Combine:[self snapshot:self]];

        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Success"
                                                        message:@"Image saved to Photo Album"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
        [alert release];
    }

The code above works, but the image quality of the screenshot is poor: along the edges of the brush strokes there is a grayish outline. I have uploaded a screenshot of my app, which is a combination of the OpenGL ES content and the UIImage.
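For what it's worth, here is a scale-aware sketch of Combine: that I'm considering instead of the hard-coded 640x960 context and the fixed 2x draw scale. This is only a sketch: combineScaled is a placeholder name, and it assumes Back and Front cover the same area in points.

    // Sketch: build the combined image at the device's screen scale instead of
    // hard-coding a 640x960 pixel context. combineScaled is a placeholder name.
    - (void)combineScaled:(UIImage *)Back
    {
        UIImage *Front = backgroundImageView.image;
        CGSize sizeInPoints = Front.size;

        // A scale of 0.0 uses the main screen's scale (2.0 on Retina displays),
        // so the backing bitmap is 2x the point size on Retina hardware.
        UIGraphicsBeginImageContextWithOptions(sizeInPoints, NO, 0.0);

        // Draw the OpenGL ES snapshot first, then the outline image on top,
        // both in points; Quartz maps them to pixels using the context scale.
        [Back drawInRect:CGRectMake(0, 0, sizeInPoints.width, sizeInPoints.height)];
        [Front drawInRect:CGRectMake(0, 0, sizeInPoints.width, sizeInPoints.height)];

        UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        UIImageWriteToSavedPhotosAlbum(resultingImage, nil, nil, nil);
    }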

Is there any way to get a Retina-quality screenshot of the OpenGL ES CAEAGLLayer content? Thank you in advance!

I don't believe that resolution is your issue here. If you aren't seeing the grayish outlines on your drawing when it appears on the screen, odds are that you're observing a compression artifact in the saving process. Your image is probably being saved as a lower-quality JPEG image, where artifacts will appear on sharp edges, like the ones in your drawing.

To work around this, Ben Weiss's answer here provides the following code for forcing your image to be saved to the photo library as a PNG:

    UIImage *im = [UIImage imageWithCGImage:myCGRef];   // make image from CGRef
    NSData *imdata = UIImagePNGRepresentation(im);      // get PNG representation
    UIImage *im2 = [UIImage imageWithData:imdata];      // wrap UIImage around PNG representation
    UIImageWriteToSavedPhotosAlbum(im2, nil, nil, nil); // save to photo album

While this is probably the easiest way to address your problem here, you could also try employing multisample antialiasing (MSAA), as Apple describes in the "Using Multisampling to Improve Image Quality" section of the OpenGL ES Programming Guide for iOS. Depending on how fill-rate limited you are, MSAA might lead to a little bit of slowdown in your application.
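For reference, a minimal sketch of that multisampling setup under OpenGL ES 1.1, using the GL_APPLE_framebuffer_multisample extension; the sampleFramebuffer, sampleColorRenderbuffer, viewFramebuffer, backingWidth, and backingHeight names here are assumptions, not code from the question:

    // Sketch: create a 4-sample offscreen framebuffer to render into.
    GLuint sampleFramebuffer, sampleColorRenderbuffer;
    glGenFramebuffersOES(1, &sampleFramebuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);

    glGenRenderbuffersOES(1, &sampleColorRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES,
                                          backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                 GL_RENDERBUFFER_OES, sampleColorRenderbuffer);

    // ... draw the scene into sampleFramebuffer as usual ...

    // Resolve the multisampled renderbuffer into the plain framebuffer that
    // backs the CAEAGLLayer before presenting it or reading its pixels back.
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
    glResolveMultisampleFramebufferAPPLE();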

Thanks Brad, this is what I was looking for! – user392406 Sep 22 '11 at 6:06.

You're using kCGImageAlphaPremultipliedLast when you create the CG bitmap context. Although I can't see your OpenGL code, it seems unlikely to me that your OpenGL context is rendering premultiplied alpha. Unfortunately, IIRC, it's not possible to create a non-premultiplied CG bitmap context on iOS (it would be using kCGImageAlphaLast, but I think that'll just make the creation call fail), so you may need to premultiply the data by hand between getting it from OpenGL and making the CG context.

On the other hand, is there a reason your OpenGL context has an alpha channel? Could you just make it opaque white then use kCGImageAlphaNoneSkipLast?
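If you do end up premultiplying by hand, here is a minimal sketch over the buffer that glReadPixels fills in the snapshot: method above, assuming the same RGBA byte layout and the data/dataLength names from that method:

    // Premultiply each pixel's RGB components by its alpha, in place,
    // before handing the buffer to CGDataProviderCreateWithData.
    for (NSInteger i = 0; i < dataLength; i += 4) {
        GLubyte alpha = data[i + 3];
        data[i]     = (GLubyte)((data[i]     * alpha) / 255);
        data[i + 1] = (GLubyte)((data[i + 1] * alpha) / 255);
        data[i + 2] = (GLubyte)((data[i + 2] * alpha) / 255);
    }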

