
Applying a CIFilter to an OpenGL render-to-texture result: I want to apply a Core Image filter to my full-screen render output, but it looks like I'm missing something, because all I get is a black screen.

First I draw the whole scene into a texture. Then I create a CIImage from that texture, which I finally draw and present. But all I get is a black screen. For rendering into a texture and integrating Core Image with OpenGL ES I followed Apple's guidelines: WWDC 2012 session 511 and https://developer.apple.com/library/ios/documentation/3ddrawing/conceptual/opengles_programmingguide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html

Here is the relevant code.

The renderer:

@interface Renderer() { 
    EAGLContext* _context; 
    GLuint _defaultFramebuffer, _drawFramebuffer, _depthRenderbuffer, _colorRenderbuffer, _drawTexture; 
    GLint _backingWidth, _backingHeight; 
    CIImage *_coreImage; 
    CIFilter *_coreFilter; 
    CIContext *_coreContext; 
} 

The initialization method:

- (BOOL)initOpenGL 
{ 
    _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]; 
    if (!_context) return NO; 

    [EAGLContext setCurrentContext:_context]; 

    // On-screen framebuffer backed by the CAEAGLLayer's renderbuffer 
    glGenFramebuffers(1, &_defaultFramebuffer); 
    glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer); 

    glGenRenderbuffers(1, &_colorRenderbuffer); 
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer); 
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _colorRenderbuffer); 

    // Off-screen framebuffer that renders the scene into _drawTexture 
    glGenFramebuffers(1, &_drawFramebuffer); 
    glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer); 

    glGenTextures(1, &_drawTexture); 
    glBindTexture(GL_TEXTURE_2D, _drawTexture); 
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _drawTexture, 0); 

    glGenRenderbuffers(1, &_depthRenderbuffer); 
    glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer); 
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, _depthRenderbuffer); 

    _coreFilter = [CIFilter filterWithName:@"CIColorInvert"]; 
    [_coreFilter setDefaults]; 

    NSDictionary *opts = @{ kCIContextWorkingColorSpace : [NSNull null] }; 
    _coreContext = [CIContext contextWithEAGLContext:_context options:opts]; 

    return YES; 
} 

Storage is allocated whenever the layer size changes (on init and on orientation change):

- (void)resizeFromLayer:(CAEAGLLayer *)layer 
{ 
    layer.contentsScale = 1; 

    glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer); 

    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer); 
    [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:layer]; 

    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &_backingWidth); 
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &_backingHeight); 

    // glCheckFramebufferStatus ... SUCCESS 

    glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer); 

    glBindTexture(GL_TEXTURE_2D, _drawTexture); 
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, _backingWidth, _backingHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL); 

    glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer); 
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, _backingWidth, _backingHeight); 

    // glCheckFramebufferStatus ... SUCCESS 
} 

The draw method:

- (void)render:(Scene *)scene 
{ 
    [EAGLContext setCurrentContext:_context]; 

    glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer); 

    // Draw using GLKit, custom shaders, drawArrays, drawElements 
    // Now rendered scene is in _drawTexture 

    glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer); 
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer); 

    // Create CIImage with our render-to-texture texture 
    _coreImage = [CIImage imageWithTexture:_drawTexture size:CGSizeMake(_backingWidth, _backingHeight) flipped:NO colorSpace:nil]; 

    // Ignore filtering for now; Draw CIImage to current render buffer 
    [_coreContext drawImage:_coreImage inRect:CGRectMake(0, 0, _backingWidth, _backingHeight) fromRect:CGRectMake(0, 0, _backingWidth, _backingHeight)]; 

    // Present 
    [_context presentRenderbuffer:GL_RENDERBUFFER]; 
} 
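
For completeness, once the unfiltered draw works, applying the filter itself should only need something like the following sketch (it uses the standard kCIInputImageKey and is not part of the failing code above):

[_coreFilter setValue:_coreImage forKey:kCIInputImageKey]; 
CIImage *filtered = _coreFilter.outputImage;   // colors inverted by CIColorInvert 
[_coreContext drawImage:filtered inRect:CGRectMake(0, 0, _backingWidth, _backingHeight) fromRect:CGRectMake(0, 0, _backingWidth, _backingHeight)]; 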

Note that after drawing the scene, _drawTexture contains the rendered scene. I verified this with the Xcode debugging tools (Capture OpenGL ES Frame).

EDIT: If I create the CIImage from some texture other than _drawTexture, it displays correctly. My suspicion is that _drawTexture may not be ready, or is somehow locked, when the CIContext tries to render it through the CIImage.
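
Such a check could look roughly like this (the loader call and the file name are only illustrative, not the exact code used):

NSError *error = nil; 
// Load any known-good texture, e.g. from a bundled image 
GLKTextureInfo *info = [GLKTextureLoader textureWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"png"] options:nil error:&error]; 
// Building the CIImage from this texture and drawing it works as expected 
_coreImage = [CIImage imageWithTexture:info.name size:CGSizeMake(info.width, info.height) flipped:NO colorSpace:nil]; 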

EDIT 2: I also tried replacing all of the drawing code with just a viewport and a clear:

glViewport(0, 0, _backingWidth, _backingHeight); 
glClearColor(0, 0.8, 0, 1); 
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); 

The result is still black. This suggests the problem is related to the draw texture or the framebuffer.


Why on earth would this be downvoted? –


That's interesting. Normally I'd say it's because the Core Image context wasn't created with the same OpenGL ES context used for rendering, but that looks correctly set up here. Can you verify that the scene is rendered to the texture correctly by drawing a quad to the screen with a passthrough shader and your rendered texture? Finally, if you're not married to Core Image, I have a little project here: https://github.com/BradLarson/GPUImage that can also do this kind of GPU-side post-processing. See the CubeExample sample application there, which does what you want. –


OK, I rendered a quad using _drawTexture and it's black. So it looks like something is wrong with that texture or with rendering into it. Maybe I'm missing something in the render-to-texture setup. The only difference is that I attach a texture as GL_COLOR_ATTACHMENT0 instead of a renderbuffer. –
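
For reference, a minimal version of that quad check can be done with GLKBaseEffect. This is only a sketch; the actual test code is not shown in the question:

// Draw _drawTexture on a full-screen quad to inspect its contents 
GLKBaseEffect *effect = [[GLKBaseEffect alloc] init]; 
effect.texture2d0.name = _drawTexture; 
effect.texture2d0.enabled = GL_TRUE; 
[effect prepareToDraw]; 

// Clip-space quad with texture coordinates: x, y, u, v 
static const GLfloat quad[] = { 
    -1, -1, 0, 0, 
     1, -1, 1, 0, 
    -1,  1, 0, 1, 
     1,  1, 1, 1, 
}; 
glEnableVertexAttribArray(GLKVertexAttribPosition); 
glEnableVertexAttribArray(GLKVertexAttribTexCoord0); 
glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad); 
glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad + 2); 
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4); 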

Answer


I finally found what was wrong. Non-power-of-two textures on iOS must use linear filtering and clamp-to-edge wrapping:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 

My texture is the same size as the screen, but I wasn't setting these four parameters.

For posterity: the code above is a fully working example of interconnecting OpenGL ES and Core Image. Just make sure you initialize your textures correctly!
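
In context, the texture setup in initOpenGL above would then look something like this (a sketch; putting the parameter calls right after glBindTexture is the natural spot):

glGenTextures(1, &_drawTexture); 
glBindTexture(GL_TEXTURE_2D, _drawTexture); 
// Without these, sampling a non-power-of-two texture on OpenGL ES 2.0 returns black 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _drawTexture, 0); 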