I am rendering YUV frames from ffmpeg with OpenGL ES, using the iOS 5.0 method CVOpenGLESTextureCacheCreateTextureFromImage. The rendering uses a CVPixelBufferRef and shaders.

I am basing my code on Apple's GLCameraRipple example.

The result on the iPhone screen looks like this: [iPhone Screen screenshot]

I need to know what I am doing wrong.

Here are the parts of my code where the error might be found.

The ffmpeg frame configuration:

ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                ctx->p_video_ctx->pix_fmt,
                                ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);


// Buffer for the converted YUV420P frame data
ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_YUV420P,
                                                ctx->p_video_ctx->width,
                                                ctx->p_video_ctx->height));

avpicture_fill((AVPicture*)ctx->p_picture_rgb, ctx->p_frame_buffer, PIX_FMT_YUV420P,
               ctx->p_video_ctx->width,
               ctx->p_video_ctx->height);

My render method:

if (NULL == videoTextureCache) { 
    NSLog(@"displayPixelBuffer error"); 
    return; 
}  


CVPixelBufferRef pixelBuffer;
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, mTexW, mTexH, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, buffer, mFrameW * 3, NULL, 0, NULL, &pixelBuffer);

CVReturn err;  
// Y-plane 
glActiveTexture(GL_TEXTURE0); 
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, 
                videoTextureCache, 
                pixelBuffer, 
                NULL, 
                GL_TEXTURE_2D, 
                GL_RED_EXT, 
                mTexW, 
                mTexH, 
                GL_RED_EXT, 
                GL_UNSIGNED_BYTE, 
                0, 
                &_lumaTexture); 
if (err) 
{ 
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err); 
} 

glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture)); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);  

// UV-plane 
glActiveTexture(GL_TEXTURE1); 
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, 
                videoTextureCache, 
                pixelBuffer, 
                NULL, 
                GL_TEXTURE_2D, 
                GL_RG_EXT, 
                mTexW/2, 
                mTexH/2, 
                GL_RG_EXT, 
                GL_UNSIGNED_BYTE, 
                1, 
                &_chromaTexture); 
if (err) 
{ 
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err); 
} 

glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture)); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);  

glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer); 

// Set the viewport to the entire view
glViewport(0, 0, backingWidth, backingHeight); 

static const GLfloat squareVertices[] = { 
    1.0f, 1.0f, 
    -1.0f, 1.0f, 
    1.0f, -1.0f, 
    -1.0f, -1.0f, 
}; 

GLfloat textureVertices[] = { 
    1, 1, 
    1, 0, 
    0, 1, 
    0, 0, 
}; 

// Draw the texture on the screen with OpenGL ES 2 
[self renderWithSquareVertices:squareVertices textureVertices:textureVertices]; 


// Flush the CVOpenGLESTexture cache and release the texture 
CVOpenGLESTextureCacheFlush(videoTextureCache, 0);  
CVPixelBufferRelease(pixelBuffer);  

[moviePlayerDelegate bufferDone]; 

The renderWithSquareVertices method:

- (void)renderWithSquareVertices:(const GLfloat*)squareVertices textureVertices:(const GLfloat*)textureVertices
{
    // Use shader program.
    glUseProgram(shader.program);

    // Update attribute values.
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, 0, 0, textureVertices);
    glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Present the renderbuffer
    glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER];
}

My fragment shader:

uniform sampler2D SamplerY;
uniform sampler2D SamplerUV;

varying highp vec2 _texcoord;

void main()
{
    mediump vec3 yuv;
    lowp vec3 rgb;

    yuv.x = texture2D(SamplerY, _texcoord).r;
    yuv.yz = texture2D(SamplerUV, _texcoord).rg - vec2(0.5, 0.5);

    // BT.601, which is the standard for SDTV, is provided as a reference
    /* rgb = mat3(     1,       1,     1,
                       0, -.34413, 1.772,
                   1.402, -.71414,     0) * yuv; */

    // Using BT.709, which is the standard for HDTV
    rgb = mat3(      1,       1,      1,
                     0, -.18732, 1.8556,
               1.57481, -.46813,      0) * yuv;

    gl_FragColor = vec4(rgb, 1);
}

Many thanks,

What kind of video are you decoding? Are you performing the video decoding with FFmpeg's libavcodec, or with iOS's decoding facilities? – 2012-03-06 19:26:10

So what is the problem with this application? – karlphillip 2012-03-14 18:50:13

Hi, I am trying to do the same thing and I am also getting a green screen. Did you ever find a solution to the problem? Thanks! – cpprulez 2012-08-01 11:55:02

Answer

I imagine the problem is that YUV420 (or I420) is a tri-planar image format: I420 is an 8-bit Y plane followed by 8-bit 2x2 subsampled U and V planes. The code from GLCameraRipple instead expects the NV12 format: an 8-bit Y plane followed by an interleaved U/V plane with 2x2 subsampling. Given that, I expect you will need three textures: luma_tex, u_chroma_tex, v_chroma_tex.
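
An alternative to juggling three textures is to make the swscale output match the bi-planar layout that the texture-cache path already assumes. Here is a rough sketch of that idea, reusing the ctx fields and the avpicture calls from the question, but asking swscale for NV12 (ffmpeg's bi-planar Y plane + interleaved U/V plane format) instead of YUV420P:

// Sketch of one possible fix: convert to NV12 so the frame layout matches
// kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange. Field names (p_sws_ctx,
// p_video_ctx, p_frame_buffer, p_picture_rgb) are taken from the question.
ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                ctx->p_video_ctx->pix_fmt,
                                ctx->p_video_ctx->width,
                                ctx->p_video_ctx->height,
                                PIX_FMT_NV12,   // Y plane followed by interleaved U/V
                                SWS_FAST_BILINEAR, NULL, NULL, NULL);

ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_NV12,
                                                ctx->p_video_ctx->width,
                                                ctx->p_video_ctx->height));

avpicture_fill((AVPicture*)ctx->p_picture_rgb, ctx->p_frame_buffer, PIX_FMT_NV12,
               ctx->p_video_ctx->width,
               ctx->p_video_ctx->height);

Note also that CVPixelBufferCreateWithBytes describes a single packed plane; for a bi-planar buffer the planar-aware CVPixelBufferCreateWithPlanarBytes is likely the right call, and the luma bytes-per-row would be mFrameW rather than the mFrameW * 3 in the question.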

Also note that GLCameraRipple may also expect "video range". In other words, the values in the planar format are luma = [16, 235], chroma = [16, 240].
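
If the decoded frames do turn out to be video range while the shader treats them as full range, one way to compensate is to expand the range in the fragment shader before the matrix multiply. A minimal sketch against the question's shader, with the 16/255 offset and the 255/219 and 255/224 scales derived from the [16, 235] and [16, 240] ranges above:

// Sketch: expand video-range YUV to full range before the RGB conversion.
// Sampler and varying names match the question's fragment shader.
yuv.x = (texture2D(SamplerY, _texcoord).r - 16.0/255.0) * (255.0/219.0);
yuv.yz = (texture2D(SamplerUV, _texcoord).rg - vec2(0.5, 0.5)) * (255.0/224.0);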

Do you mean that kCVPixelFormatType_420YpCbCr8BiPlanarFullRange is NV12? – onmyway133 2013-06-17 09:19:46
