2012-02-27 52 views
10

Capture Isgl3d output as an image

I'm having a hard time getting a UIImage snapshot of an Isgl3d-controlled view. No matter what I do, I just end up with a black square.

I have a working camera view and a 3D model in my view, and I've tried both the buffer approach and a regular screen capture to grab the image, but neither produced anything usable.

Does anyone have source code with which they've successfully taken a picture of an Isgl3d view?

Answers

5

Here are Apple's instructions & official code for snapshotting a GL view into a UIImage (taking Retina displays, flipped coordinates, etc. into account), which I've been using successfully. It isn't iSGL3D-specific, of course, but as long as you can get the right context and framebuffer bound, it should do the right thing. (As the page notes, the snapshot must be taken before -presentRenderbuffer: is called for the rendered data to be valid.)

https://developer.apple.com/library/ios/#qa/qa1704/_index.html

I have only a passing familiarity with the iSGL3D library, and it doesn't look as though there are obvious hooks that let you render the scene without presenting it (or render it to an offscreen buffer first). The place you'd probably need to step in is the -finalizeRender method of whichever Isgl3dGLContext subclass you're using, just before the -presentRenderbuffer call. That context is an internal framework class, so you'd probably have to change the library a bit to wire up (say) a delegate from the context back out through the view and the director, ultimately asking your application to take whatever action it wants just before the present call. At that point you can run your screenshot code if you want to, or do nothing if you don't.

3

Is this what you want?

This takes a screenshot from the current context and framebuffer and saves it to the photo album.

If you don't want to save it to the album, just grab the resulting UIImage instead.

Remember to call it after you've finished drawing, but before swapping the buffers.

Also, if you're using MSAA, it must be called after glResolveMultisampleFramebufferAPPLE and after the new buffer has been bound.

#ifdef AUTOSCREENSHOT 

// callback for CGDataProviderCreateWithData 
void releaseData(void *info, const void *data, size_t dataSize) { 
    free((void*)data); 
} 

// callback for UIImageWriteToSavedPhotosAlbum 
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo { 
    NSLog(@"Save finished"); 
    [image release]; 
} 

-(void)saveCurrentScreenToPhotoAlbum { 
    // screenSize and retina are ivars of this class: the view size in
    // points and the content scale factor, respectively.
    int height = (int)screenSize.y * retina; 
    int width = (int)screenSize.x * retina; 

    NSInteger myDataLength = width * height * 4; 
    GLubyte *buffer = (GLubyte *) malloc(myDataLength); 
    GLubyte *buffer2 = (GLubyte *) malloc(myDataLength); 
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer); 

    // glReadPixels returns rows bottom-up; flip them for UIImage. 
    for (int y = 0; y < height; y++) { 
        for (int x = 0; x < width * 4; x++) { 
            buffer2[(height - 1 - y) * width * 4 + x] = buffer[y * 4 * width + x]; 
        } 
    } 
    free(buffer); 
    free(buffer); 

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, releaseData); 
    int bitsPerComponent = 8; 
    int bitsPerPixel = 32; 
    int bytesPerRow = 4 * width; 
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB(); 
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault; 
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault; 
    CGImageRef imageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent); 

    CGColorSpaceRelease(colorSpaceRef); 
    CGDataProviderRelease(provider); 

    UIImage *image = [[UIImage alloc] initWithCGImage:imageRef]; 
    CGImageRelease(imageRef); 

    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil); 
} 

#endif 
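One detail worth understanding in the snippet above is why buffer2 is never freed directly: ownership of the bytes passes to the data provider, which calls releaseData when it is done. A minimal plain-C sketch of that callback-ownership pattern (the struct and function names are mine, not CoreGraphics):

```c
#include <assert.h>
#include <stdlib.h>

/* Sketch of the CGDataProviderCreateWithData ownership pattern: the
   consumer owns the bytes and frees them via the callback when done. */
typedef void (*release_fn)(void *info, const void *data, size_t size);

typedef struct {
    const void *data;
    size_t      size;
    release_fn  release;
} provider_t;

static int g_released = 0;

/* Mirrors the releaseData callback in the answer above. */
static void release_data(void *info, const void *data, size_t size) {
    (void)info; (void)size;
    free((void *)data);
    g_released = 1;
}

static provider_t provider_create(const void *data, size_t size, release_fn r) {
    provider_t p = { data, size, r };
    return p;
}

/* Analogous to CGDataProviderRelease: hand the bytes back to the callback. */
static void provider_release(provider_t *p) {
    if (p->release) p->release(NULL, p->data, p->size);
    p->data = NULL;
}

static int demo(void) {
    unsigned char *pixels = malloc(16);   /* stands in for buffer2 */
    provider_t p = provider_create(pixels, 16, release_data);
    provider_release(&p);                 /* frees pixels via the callback */
    return g_released;
}
```

This is why freeing buffer2 yourself after CGDataProviderCreateWithData would be a double free: the release callback is the single point of cleanup.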

I used this code to save timed screenshots while playing my game, so I'd have good material to put on the App Store.

3

I use this snippet successfully in one of my apps to take OpenGL screenshots.

enum { 
    red, 
    green, 
    blue, 
    alpha 
}; 

- (UIImage *)glToUIImage { 
    CGSize glSize = self.glView.bounds.size; 
    NSInteger bufDataLen = glSize.width * glSize.height * 4; 

    // Allocate array and read pixels into it. 
    GLubyte *buffer = (GLubyte *)malloc(bufDataLen); 
    glReadPixels(0, 0, glSize.width, glSize.height, GL_RGBA, GL_UNSIGNED_BYTE, buffer); 

    // We need to flip the image 
    NSUInteger maxRow = (NSInteger)glSize.height - 1; 
    NSUInteger bytesPerRow = (NSInteger)glSize.width * 4; 

    GLubyte *buffer2 = (GLubyte *)malloc(bufDataLen); 
    for (int y = maxRow; y >= 0; y--) { 
        for (int x = 0; x < bytesPerRow; x += 4) { 
            NSUInteger c0 = y * bytesPerRow + x; 
            NSUInteger c1 = (maxRow - y) * bytesPerRow + x; 
            buffer2[c0+red] = buffer[c1+red]; 
            buffer2[c0+green] = buffer[c1+green]; 
            buffer2[c0+blue] = buffer[c1+blue]; 
            buffer2[c0+alpha] = buffer[c1+alpha]; 
        } 
    } 
    free(buffer); 

    // Make data provider with data 
    CFDataRef imageData = CFDataCreate(NULL, buffer2, bufDataLen); 
    free(buffer2); 

    CGDataProviderRef provider = CGDataProviderCreateWithCFData(imageData); 
    CFRelease(imageData); 

    // Bitmap format 
    int bitsPerComponent = 8; 
    int bitsPerPixel = 32; 
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB(); 
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast; 
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault; 

    // Create the CGImage 
    CGImageRef imageRef = CGImageCreate(glSize.width, 
             glSize.height, 
             bitsPerComponent, 
             bitsPerPixel, 
             bytesPerRow, 
             colorSpaceRef, 
             bitmapInfo, 
             provider, 
             NULL, 
             NO, 
             renderingIntent); 

    // Clean up 
    CGColorSpaceRelease(colorSpaceRef); 
    CGDataProviderRelease(provider); 

    // Convert to UIImage 
    UIImage *image = [[UIImage alloc] initWithCGImage:imageRef]; 
    CGImageRelease(imageRef); 

    return [image autorelease]; 
} 
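The flip loop in -glToUIImage can be exercised in isolation. Here is a plain-C version of the same bottom-up-to-top-down copy (the function name is mine; the enum matches the answer):

```c
#include <stdlib.h>

enum { red, green, blue, alpha };  /* channel byte offsets within a pixel */

/* Plain-C version of the flip loop in -glToUIImage: glReadPixels returns
   rows bottom-up, so source row (height-1-y) becomes destination row y.
   The caller frees the returned buffer. */
static unsigned char *flip_rgba(const unsigned char *src, int width, int height) {
    size_t bytesPerRow = (size_t)width * 4;
    unsigned char *dst = malloc(bytesPerRow * (size_t)height);
    for (int y = 0; y < height; y++) {
        const unsigned char *s = src + (size_t)(height - 1 - y) * bytesPerRow;
        unsigned char *d = dst + (size_t)y * bytesPerRow;
        for (size_t x = 0; x < bytesPerRow; x += 4) {
            d[x + red]   = s[x + red];
            d[x + green] = s[x + green];
            d[x + blue]  = s[x + blue];
            d[x + alpha] = s[x + alpha];
        }
    }
    return dst;
}
```

For a 1x2 image whose pixels are {1,2,3,4} and {5,6,7,8}, the flipped buffer starts with {5,6,7,8}: the rows have simply traded places, channel by channel.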

Make sure the right framebuffer is bound before doing this, like so:

glBindFramebufferOES(GL_FRAMEBUFFER_OES, myFrameBuffer); 
glViewport(0, 0, myBackingWidth, myBackingHeight); 

And call -glToUIImage before presenting the framebuffer!

For more information, Apple provides sample code for taking screenshots from OpenGL.

+0

Looks useful, but where do you get your red, green, blue values from? – 2012-03-06 15:19:09

+0

It's just an enum naming the offsets (0, 1, 2, 3); I forgot to add it here. Updated the snippet. – 2012-03-06 16:28:06

2

I came up with this possible solution. You have to modify the isgl3d library a little.

The steps are:

1.

Create a delegate for Isgl3dGLContext1:

In Isgl3dGLContext1.h:

@protocol ScreenShooterDelegate; 

#import <OpenGLES/ES1/gl.h> 
#import <OpenGLES/ES1/glext.h> 
#import "Isgl3dGLContext.h" 

@interface Isgl3dGLContext1 : Isgl3dGLContext { 

    NSObject<ScreenShooterDelegate>* __unsafe_unretained delegate; 

    GLuint _colorRenderBuffer; 
@private 
    EAGLContext * _context; 

    // The OpenGL names for the framebuffer and renderbuffer used to render to this view 
    GLuint _defaultFrameBuffer; 


    GLuint _depthAndStencilRenderBuffer; 
    GLuint _depthRenderBuffer; 
    GLuint _stencilRenderBuffer; 

    // OpenGL MSAA buffers 
    GLuint _msaaFrameBuffer; 
    GLuint _msaaColorRenderBuffer; 

    GLuint _msaaDepthAndStencilRenderBuffer; 
    GLuint _msaaDepthRenderBuffer; 
    GLuint _msaaStencilRenderBuffer; 
} 

- (id) initWithLayer:(CAEAGLLayer *) layer; 
@property (assign) NSObject<ScreenShooterDelegate>* delegate; 
@property BOOL takePicture; 
@property GLuint colorRenderBuffer; 

@end 

@protocol ScreenShooterDelegate 


@optional 

- (void)takePicture; 

@end 

2.

Add this code to Isgl3dGLContext1.m:

@synthesize takePicture; 
@synthesize colorRenderBuffer = _colorRenderBuffer; 

Before the line [_context presentRenderbuffer:GL_RENDERBUFFER_OES] in -(void)finalizeRender, add:

if (takePicture) { 
    takePicture = NO; 
    if ([delegate respondsToSelector:@selector(takePicture)]) { 
        [delegate takePicture]; 
    } 
} 
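The one-shot flag and delegate check above can be sketched language-agnostically in C, with a function pointer standing in for the delegate (all names here are illustrative, not from the isgl3d sources):

```c
#include <stddef.h>

typedef void (*snapshot_fn)(void);

typedef struct {
    int         takePicture;   /* mirrors the takePicture property */
    snapshot_fn delegate;      /* mirrors the ScreenShooterDelegate */
} gl_context_t;

static int g_snapshots = 0;
static void take_snapshot(void) { g_snapshots++; }

/* Stands in for -finalizeRender just before presentRenderbuffer:. */
static void finalize_render(gl_context_t *ctx) {
    if (ctx->takePicture) {
        ctx->takePicture = 0;      /* one-shot: clear the flag when it fires */
        if (ctx->delegate)         /* like the respondsToSelector: check */
            ctx->delegate();
    }
    /* ... presentRenderbuffer would happen here ... */
}
```

Clearing the flag before invoking the delegate guarantees the screenshot fires exactly once per request, even if finalizeRender runs every frame.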

3.

Put this code in the class that is going to take the screenshot:

In Class.h add <ScreenShooterDelegate>

In a method in Class.m:

[Isgl3dDirector sharedInstance].antiAliasingEnabled = NO; 

Photos3DAppDelegate *appDelegate = (Photos3DAppDelegate *)[[UIApplication sharedApplication] delegate]; 
[appDelegate.inOutSceneView showSphere]; 

Isgl3dEAGLView* eaglview=(Isgl3dEAGLView*)[[Isgl3dDirector sharedInstance] openGLView]; 
Isgl3dGLContext1 * _glContext=(Isgl3dGLContext1*)[eaglview glContext]; 
_glContext.delegate=self; 
_glContext.takePicture=YES; 

In the method -(void)takePicture {} put Apple's code, and at the end of the method add [Isgl3dDirector sharedInstance].antiAliasingEnabled = YES; (if you use anti-aliasing).

//https://developer.apple.com/library/ios/#qa/qa1704/_index.html 

-(void)takePicture{ 


NSLog(@"Creating Foto"); 

GLint backingWidth, backingHeight; 

Isgl3dEAGLView* eaglview=(Isgl3dEAGLView*)[[Isgl3dDirector sharedInstance] openGLView]; 
//Isgl3dGLContext1 * _glContext=(Isgl3dGLContext1*)[eaglview glContext]; 
//glBindRenderbufferOES(GL_RENDERBUFFER_OES, _glContext.colorRenderBuffer); 

glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth); 
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight); 

NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight; 
NSInteger dataLength = width * height * 4; 
GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte)); 

// Read pixel data from the framebuffer 
glPixelStorei(GL_PACK_ALIGNMENT, 4); 
glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data); 

// Create a CGImage with the pixel data 
// If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel 
// otherwise, use kCGImageAlphaPremultipliedLast 
CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL); 
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB(); 
CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast, 
           ref, NULL, true, kCGRenderingIntentDefault); 

// OpenGL ES measures data in PIXELS 
// Create a graphics context with the target size measured in POINTS 
NSInteger widthInPoints, heightInPoints; 
if (NULL != UIGraphicsBeginImageContextWithOptions) { 
    // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration 
    // Set the scale parameter to your OpenGL ES view's contentScaleFactor 
    // so that you get a high-resolution snapshot when its value is greater than 1.0 
    CGFloat scale = eaglview.contentScaleFactor; 
    widthInPoints = width/scale; 
    heightInPoints = height/scale; 
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale); 
} 
else { 
    // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext 
    widthInPoints = width; 
    heightInPoints = height; 
    UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints)); 
} 

CGContextRef cgcontext = UIGraphicsGetCurrentContext(); 

// UIKit coordinate system is upside down to GL/Quartz coordinate system 
// Flip the CGImage by rendering it to the flipped bitmap context 
// The size of the destination area is measured in POINTS 
CGContextSetBlendMode(cgcontext, kCGBlendModeCopy); 
CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref); 

// Retrieve the UIImage from the current context 
UIImage *image = UIGraphicsGetImageFromCurrentImageContext(); 

UIGraphicsEndImageContext(); 

// Clean up 
free(data); 
CFRelease(ref); 
CFRelease(colorspace); 
CGImageRelease(iref); 

UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); 

[Isgl3dDirector sharedInstance].antiAliasingEnabled = YES; 
} 
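The point/pixel bookkeeping in the Apple snippet above boils down to simple arithmetic: glReadPixels works in pixels, the CG context is sized in points, and the two differ by contentScaleFactor. A minimal sketch of that conversion (names and the Retina scale of 2.0 are illustrative):

```c
typedef struct {
    long widthInPoints, heightInPoints;
    long dataLength;   /* bytes needed for the glReadPixels RGBA buffer */
} snapshot_dims_t;

/* Backing-store size is in pixels; the image context is sized in points,
   i.e. pixels divided by the view's contentScaleFactor. */
static snapshot_dims_t snapshot_dims(long widthPx, long heightPx, double scale) {
    snapshot_dims_t d;
    d.widthInPoints  = (long)(widthPx / scale);
    d.heightInPoints = (long)(heightPx / scale);
    d.dataLength     = widthPx * heightPx * 4;   /* 4 bytes per RGBA pixel */
    return d;
}
```

So a 640x960-pixel Retina backing store at scale 2.0 yields a 320x480-point context, while the pixel buffer passed to glReadPixels still needs the full 640*960*4 bytes.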

Note: for me it worked just by leaving glBindRenderbufferOES(GL_RENDERBUFFER_OES, _colorRenderbuffer); commented out, and in your case you may have to do these steps with Isgl3dGLContext2 instead of Isgl3dGLContext1.