
iOS - Video decoding with OpenGL ES 2.0

I've been trying to decode an .h264 video into OpenGL, but the frames I display come out black. I don't see any errors, and if I export the frames from the CMSampleBufferRef to the camera roll they look fine.

Maybe it's something on the OpenGL side? But when I display an image instead of a video it works fine, so I don't know where to look.

Here is the code that initializes the video decoder:

- (void)initVideo {

    glActiveTexture(GL_TEXTURE0);
    glGenTextures(1, &_glTextureHook);

    NSURL *url = [NSURL URLWithString:[self.videoMedia getFilePath]];
    self.videoAsset = [[AVURLAsset alloc] initWithURL:url options:NULL];

    dispatch_semaphore_t sema = dispatch_semaphore_create(0);

    [self.videoAsset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{

        self.videoTrack = [self.videoAsset tracksWithMediaType:AVMediaTypeVideo][0];

        // Ask the track output for 32-bit BGRA pixel buffers.
        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = @(kCVPixelFormatType_32BGRA);
        NSDictionary *settings = @{key : value};

        self.outputVideoTrackOuput = [[AVAssetReaderTrackOutput alloc]
                initWithTrack:self.videoTrack outputSettings:settings];

        self.assetReader = [[AVAssetReader alloc] initWithAsset:self.videoAsset error:nil];
        [self.assetReader addOutput:self.outputVideoTrackOuput];
        [self.assetReader startReading];

        dispatch_semaphore_signal(sema);
    }];

    dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
}

And here is the code that pulls each frame of the video into the OpenGL texture:

- (void)play {

    if (self.assetReader.status == AVAssetReaderStatusReading) {
        CMSampleBufferRef sampleBuffer = [self.outputVideoTrackOuput copyNextSampleBuffer];
        if (sampleBuffer) {
            CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

            if (pixelBuffer) {
                CVPixelBufferLockBaseAddress(pixelBuffer, 0);

                // Upload the BGRA frame into the texture (dimensions hardcoded for 1080p).
                glBindTexture(GL_TEXTURE_2D, _glTextureHook);
                glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1920, 1080, 0,
                             GL_BGRA_EXT, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixelBuffer));
                CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
            }

            CFRelease(sampleBuffer);
        }
    }
}
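
As an aside, the upload does not have to hardcode 1920×1080; the dimensions can be read from the pixel buffer itself. A minimal sketch of that upload step, assuming everything else in play stays the same (CVPixelBufferGetWidth, CVPixelBufferGetHeight and CVPixelBufferGetBytesPerRow are the standard Core Video accessors):

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Derive the upload size from the buffer instead of hardcoding it.
size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
// Note: if CVPixelBufferGetBytesPerRow(pixelBuffer) != width * 4 the rows are
// padded, which a plain glTexImage2D upload cannot express in ES 2.0.

glBindTexture(GL_TEXTURE_2D, _glTextureHook);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixelBuffer));

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);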

The vertex and fragment shaders are the same ones I use for images (where they work fine). The only difference I see is in the glTexImage2D call: for images, both the internal format and the source format are GL_RGBA.
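
For reference, the two upload calls being compared would look roughly like this; imageWidth, imageHeight and imageData are placeholder names for whatever the image path already uses:

// Image path (works): internal format and source format are both GL_RGBA.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, imageWidth, imageHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, imageData);

// Video path (black frames): the pixel buffer was requested as kCVPixelFormatType_32BGRA,
// so the source format is GL_BGRA_EXT while the internal format stays GL_RGBA.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1920, 1080, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixelBuffer));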

I'm confident that the video decoder's _glTextureHook is correctly handed to the shader manager, made active on the context's thread, and so on.
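
In a typical setup that wiring amounts to something like the sketch below; program and textureUniform are hypothetical names, but the "Texture" uniform matches the fragment shader shown further down:

// Per-draw wiring the question describes (variable names are assumptions).
GLint textureUniform = glGetUniformLocation(program, "Texture");

glActiveTexture(GL_TEXTURE0);                  // texture unit 0...
glBindTexture(GL_TEXTURE_2D, _glTextureHook);  // ...holds the video frame...
glUniform1i(textureUniform, 0);                // ...and feeds the "Texture" sampler

// The draw call for the video quad (glDrawArrays / glDrawElements) follows here.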

Here is the fragment shader code (it's very basic, and the vertex shader is just as simple):

precision lowp float; 

uniform sampler2D Texture; 

varying vec4 DestinationColor; 
varying vec2 TexCoordOut; 

void main() { 
    gl_FragColor = DestinationColor * texture2D(Texture, TexCoordOut); 
} 
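
The vertex shader is not shown; a basic one that feeds the DestinationColor and TexCoordOut varyings used above would look roughly like this, written here as an Objective-C string constant with attribute names that are assumptions:

// Hypothetical vertex shader matching the fragment shader's varyings.
static NSString *const kVertexShaderSource =
    @"attribute vec4 Position;\n"
    @"attribute vec4 SourceColor;\n"
    @"attribute vec2 TexCoordIn;\n"
    @"varying vec4 DestinationColor;\n"
    @"varying vec2 TexCoordOut;\n"
    @"void main() {\n"
    @"    DestinationColor = SourceColor;\n"
    @"    TexCoordOut = TexCoordIn;\n"
    @"    gl_Position = Position;\n"
    @"}";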

Did you ever actually succeed in playing video under OpenGL on iOS? If so, would you mind sharing the code? Thanks in advance! – loki


Thanks for the code, I've been looking for a concrete example like this for days. Like @loki asked, did you have any success? And did you handle audio at any point? –

Answer


I was simply missing these lines after creating the texture:

glBindTexture(GL_TEXTURE_2D, _glTextureHook); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
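
For context on why this fixes the black frames: with no texture parameters set, GL_TEXTURE_MIN_FILTER defaults to a mipmapping mode, and a non-power-of-two 1920×1080 texture with no mipmap chain is incomplete in OpenGL ES 2.0, so sampling it returns black. A sketch of how the start of initVideo looks with the fix folded in:

- (void)initVideo {

    glActiveTexture(GL_TEXTURE0);
    glGenTextures(1, &_glTextureHook);

    // Required for a non-power-of-two texture with no mipmaps.
    glBindTexture(GL_TEXTURE_2D, _glTextureHook);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // ... asset loading as in the question, unchanged.
}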