2016-07-04

I am trying to take a snapshot from a running pipeline in iOS, using a button to trigger the capture.

I have the following pipeline:

udpsrc auto-multicast=true address=224.1.1.1 port=5004
 ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z0JAH6aAQAYZAA\,aM4wpIAA", payload=(int)96, ssrc=(uint)19088743, timestamp-offset=(uint)0, seqnum-offset=(uint)0
 ! rtpjitterbuffer latency=400
 ! rtph264depay ! avdec_h264 ! videoconvert
 ! tee name=snapshot
 snapshot. ! queue ! valve name=snap drop=true ! jpegenc ! filesink name=filepath location=screenshot.jpg async=false
 snapshot. ! queue ! autovideosink

So I use the following code in my button handler to control the valve:

GstElement *element = gst_bin_get_by_name (GST_BIN (pipeline), "snap");
if (strcmp ("drop", "drop") == 0)
{
    gboolean prop_val = FALSE;

    /* if the property value is "true", send an EOS first;
       note that both strcmp() calls compare string literals,
       so this always takes the else branch and sets drop=FALSE */
    if (strcmp ("false", "true") == 0)
    {
        gst_element_send_event (element, gst_event_new_eos ());
        prop_val = TRUE;
    } else {
        prop_val = FALSE;
    }

    g_object_set (element, "drop", prop_val, NULL);
}
gst_object_unref (element);

But with this code I can only take a single screenshot, and I cannot set the filename of the image.

How can I save a screenshot without blocking the stream, and save the image to the Documents folder with a custom name each time the button is clicked?

Answer

After a long search and struggle I found the solution, so I will answer my own question.

I removed the tee from the pipeline and now use the video sink's last sample instead.

My pipeline:

udpsrc auto-multicast=true address=224.1.1.1 port=5004
 ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)"Z0JAH6aAQAYZAA\,aM4wpIAA", payload=(int)96, ssrc=(uint)19088743, timestamp-offset=(uint)0, seqnum-offset=(uint)0
 ! rtpjitterbuffer latency=400
 ! rtph264depay ! avdec_h264 ! videoconvert
 ! autovideosink
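The Objective-C code further down reads the `last-sample` property from a `video_sink` variable, but the pipeline above creates its sink as an anonymous `autovideosink`. One way to obtain that handle is to name the sink and look it up; this is a sketch under the assumption that the pipeline is built with `gst_parse_launch`, and the `vsink` name is mine, not from the original answer:

```c
/* Sketch (assumption: pipeline built with gst_parse_launch, sink named
   "vsink" so it can be retrieved; the original answer does not show this) */
GError *error = NULL;
GstElement *pipeline = gst_parse_launch (
    "udpsrc auto-multicast=true address=224.1.1.1 port=5004"
    " ! application/x-rtp, media=(string)video, clock-rate=(int)90000,"
    " encoding-name=(string)H264, payload=(int)96"
    " ! rtpjitterbuffer latency=400"
    " ! rtph264depay ! avdec_h264 ! videoconvert"
    " ! autovideosink name=vsink", &error);

/* handle whose "last-sample" property takeSnapshot will read */
GstElement *video_sink = gst_bin_get_by_name (GST_BIN (pipeline), "vsink");
```

`last-sample` is kept by `GstBaseSink` only while its `enable-last-sample` property (TRUE by default) is enabled.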

Here is my Objective-C code:

-(UIImage*) takeSnapshot
{
    GstSample *videobuffer = NULL;
    GstCaps *caps;
    gint width, height;
    GstMapInfo map;

    /* this returns a new reference to the sample, which must be unreffed */
    g_object_get (G_OBJECT (video_sink), "last-sample", &videobuffer, NULL);

    if (!videobuffer)
        return NULL;

    caps = gst_sample_get_caps (videobuffer);
    if (!caps) {
        gst_sample_unref (videobuffer);
        return NULL;
    }

    /* we need the final caps on the buffer to get the size */
    GstStructure *s = gst_caps_get_structure (caps, 0);
    gboolean res = gst_structure_get_int (s, "width", &width);
    res &= gst_structure_get_int (s, "height", &height);
    if (!res) {
        gst_sample_unref (videobuffer);
        return NULL;
    }

    GstBuffer *snapbuffer = gst_sample_get_buffer (videobuffer);
    if (snapbuffer && gst_buffer_map (snapbuffer, &map, GST_MAP_READ))
    {
        /* copy the pixels so the buffer can be unmapped before the
           UIImage is drawn */
        CFDataRef pixelData = CFDataCreate (NULL, map.data, height * width * 4);
        CGDataProviderRef provider = CGDataProviderCreateWithCFData (pixelData);
        CFRelease (pixelData);

        CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB ();
        /* assumes the sink negotiated a 32-bit RGBx pixel layout */
        CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaNoneSkipLast;
        CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

        CGImageRef imageRef = CGImageCreate (width, height,
                                             8,         /* bits per component */
                                             4 * 8,     /* bits per pixel */
                                             width * 4, /* bytes per row */
                                             colorSpaceRef,
                                             bitmapInfo,
                                             provider,
                                             NULL, NO, renderingIntent);

        UIImage *uiImage = [UIImage imageWithCGImage:imageRef];

        CGDataProviderRelease (provider);
        CGColorSpaceRelease (colorSpaceRef);
        CGImageRelease (imageRef);
        gst_buffer_unmap (snapbuffer, &map);
        gst_sample_unref (videobuffer);

        return uiImage;
    }

    gst_sample_unref (videobuffer);
    return NULL;
}