
Background: I am running a Swift 2 app with the following two options, and the CIContext initialization crashes.

Option A: The user can log in by entering a number. In this case, his/her picture is shown in a UIImageView.

Option B: The user can log in with an NFC tag. In this case, the UIImageView is replaced with a camera layer view that shows the live camera stream and uses a CIContext to capture an image when a button is pressed.

Problem: The issue I am facing is that sometimes, when I choose option A (which does not use the camera layer), the app crashes. Since I cannot reproduce the crash deterministically, I have hit a dead end trying to understand why the app is crashing.

Edit: The camera layer is used in both options, but it is hidden in option A.

Crashlytics generated the following crash log:

0 libswiftCore.dylib specialized _fatalErrorMessage(StaticString, StaticString, StaticString, UInt) ->() + 44 
1 CameraLayerView.swift line 20 CameraLayerView.init(coder : NSCoder) -> CameraLayerView? 
2 CameraLayerView.swift line 0 @objc CameraLayerView.init(coder : NSCoder) -> CameraLayerView? 
3 UIKit -[UIClassSwapper initWithCoder:] + 248 
32 UIKit UIApplicationMain + 208 
33 AppDelegate.swift line 17 main 
34 libdispatch.dylib (Missing) 

I checked line #20 of CameraLayerView, but it is just an initialization statement:

private let ciContext = CIContext(EAGLContext: EAGLContext(API: .OpenGLES2)) 

The CameraLayerView file is shown below. Any help would be appreciated.

import UIKit
import AVFoundation
import CoreImage

class CameraLayerView: UIView, AVCaptureVideoDataOutputSampleBufferDelegate {

    var captureSession = AVCaptureSession()
    var sessionOutput = AVCaptureVideoDataOutput()
    var previewLayer = AVCaptureVideoPreviewLayer()

    private var pixelBuffer : CVImageBuffer!
    private var attachments : CFDictionary!
    private var ciImage : CIImage!
    // This stored-property initializer runs inside init(coder:); it is the statement the crash log points at
    private let ciContext = CIContext(EAGLContext: EAGLContext(API: .OpenGLES2))
    private var imageOptions : [String : AnyObject]!

    var faceFound = false
    var image : UIImage!

    override func layoutSubviews() {
        previewLayer.position = CGPoint(x: self.frame.width/2, y: self.frame.height/2)
        previewLayer.bounds = self.frame
        self.layer.borderWidth = 2.0
        self.layer.borderColor = UIColor.redColor().CGColor
    }

    func loadCamera() {
        // Find the front-facing camera among the available video devices
        let camera = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
        for device in camera {
            if device.position == .Front {
                do {
                    // Remove any existing inputs and outputs before reconfiguring the session
                    for input in captureSession.inputs {
                        captureSession.removeInput(input as! AVCaptureInput)
                    }
                    for output in captureSession.outputs {
                        captureSession.removeOutput(output as! AVCaptureOutput)
                    }
                    previewLayer.removeFromSuperlayer()
                    previewLayer.session = nil
                    let input = try AVCaptureDeviceInput(device: device as! AVCaptureDevice)
                    if captureSession.canAddInput(input) {
                        captureSession.addInput(input)
                        sessionOutput.videoSettings = [String(kCVPixelBufferPixelFormatTypeKey) : Int(kCVPixelFormatType_32BGRA)]
                        sessionOutput.setSampleBufferDelegate(self, queue: dispatch_get_global_queue(Int(QOS_CLASS_BACKGROUND.rawValue), 0))
                        sessionOutput.alwaysDiscardsLateVideoFrames = true

                        if captureSession.canAddOutput(sessionOutput) {
                            captureSession.addOutput(sessionOutput)
                            captureSession.sessionPreset = AVCaptureSessionPresetPhoto
                            captureSession.startRunning()

                            // Attach a preview layer oriented to match the current device orientation
                            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                            previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                            switch UIDevice.currentDevice().orientation.rawValue {
                            case 1:
                                previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
                            case 2:
                                previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.PortraitUpsideDown
                            case 3:
                                previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.LandscapeRight
                            case 4:
                                previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.LandscapeLeft
                            default:
                                break
                            }
                            self.layer.addSublayer(previewLayer)
                        }
                    }
                } catch {
                    print("Error")
                }
            }
        }
    }

    func takePicture() -> UIImage {
        // Return the most recent frame captured in captureOutput and tear down the preview
        self.previewLayer.removeFromSuperlayer()
        self.captureSession.stopRunning()
        return image
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate)
        ciImage = CIImage(CVPixelBuffer: pixelBuffer!, options: attachments as? [String : AnyObject])
        // Map the device orientation to a CIDetector image orientation
        if UIDevice.currentDevice().orientation == .PortraitUpsideDown {
            imageOptions = [CIDetectorImageOrientation : 8]
        } else if UIDevice.currentDevice().orientation == .LandscapeLeft {
            imageOptions = [CIDetectorImageOrientation : 3]
        } else if UIDevice.currentDevice().orientation == .LandscapeRight {
            imageOptions = [CIDetectorImageOrientation : 1]
        } else {
            imageOptions = [CIDetectorImageOrientation : 6]
        }
        let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: ciContext, options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
        let features = faceDetector.featuresInImage(ciImage, options: imageOptions)
        if features.count == 0 {
            // No face in this frame: reset the indicator border to red
            if faceFound == true {
                faceFound = false
                dispatch_async(dispatch_get_main_queue()) {
                    self.layer.borderColor = UIColor.redColor().CGColor
                }
            }
        } else {
            // A face was detected: keep the latest frame as a UIImage, rotated for the current orientation
            if UIDevice.currentDevice().orientation == .PortraitUpsideDown {
                image = UIImage(CGImage: ciContext.createCGImage(ciImage, fromRect: ciImage.extent), scale: 1.0, orientation: UIImageOrientation.Left)
            } else if UIDevice.currentDevice().orientation == .LandscapeLeft {
                image = UIImage(CGImage: ciContext.createCGImage(ciImage, fromRect: ciImage.extent), scale: 1.0, orientation: UIImageOrientation.Down)
            } else if UIDevice.currentDevice().orientation == .LandscapeRight {
                image = UIImage(CGImage: ciContext.createCGImage(ciImage, fromRect: ciImage.extent), scale: 1.0, orientation: UIImageOrientation.Up)
            } else {
                image = UIImage(CGImage: ciContext.createCGImage(ciImage, fromRect: ciImage.extent), scale: 1.0, orientation: UIImageOrientation.Right)
            }
            if faceFound == false {
                faceFound = true
                for feature in features {
                    if feature.isKindOfClass(CIFaceFeature) {
                        dispatch_async(dispatch_get_main_queue()) {
                            self.layer.borderColor = UIColor.greenColor().CGColor
                        }
                    }
                }
            }
        }
    }
}

Can you reproduce the crash locally (in Xcode), or only on users' devices? If so, what does the console log say? –

I have not been able to reproduce the error myself, and I have not found any pattern in the sequence of events that leads to the crash. It does not happen often, but when it does, it forces the app to restart (which is itself strange behavior, since a crash usually just kills the app rather than relaunching it). – Malik

I have never heard of an iOS app that "crashes and relaunches automatically" (except for the system "app" SpringBoard). –

Answer

I tested a theory and it worked. Since ciContext was being initialized as part of the view's initialization, it looked like the app was crashing because of a race condition. I moved the initialization of ciContext into my loadCamera method, and it has not crashed since.
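
For reference, a minimal sketch of what that change might look like, assuming the same property and method names as in the code above (the nil check and its exact placement are illustrative, not the project's exact code):

private var ciContext : CIContext!

func loadCamera() {
    // Create the context here, after the view has finished initializing,
    // instead of in a stored-property initializer that runs inside init(coder:)
    if ciContext == nil {
        ciContext = CIContext(EAGLContext: EAGLContext(API: .OpenGLES2))
    }
    // ... the existing session and preview-layer setup follows ...
}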

UPDATE

Another thing I noticed is that in various tutorials and blog posts on the internet, this declaration is split into two separate statements, like this:

let eaglContext = EAGLContext(API: .OpenGLES2) 
let ciContext = CIContext(EAGLContext: eaglContext) 
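
Splitting it up also makes it possible to check the intermediate EAGLContext before handing it to CIContext, instead of trapping on a nil value. A hedged sketch of that variant, written in the same Swift 2 style and assumed to live inside a setup method such as loadCamera (the early return is only an example of how a failure could be handled, and the exact optionality of the initializer depends on the SDK version):

let eaglContext = EAGLContext(API: .OpenGLES2)
if eaglContext == nil {
    // EAGLContext creation can fail, for example under memory pressure
    print("Could not create an EAGLContext")
    return
}
let ciContext = CIContext(EAGLContext: eaglContext)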

I am still not sure what caused the app to crash in the first place, but these two changes seem to have fixed the problem.

CORRECT ANSWER

Finally found the culprit. In the view controller where I was using ciContext, I had a timer that was never invalidated and therefore kept a strong reference to the view controller. On every subsequent visit a new view controller was created while the previous one was never released from memory, so memory usage kept growing over time. Once it crossed a certain threshold, the CIContext initializer returned nil because of the low-memory condition, which crashed the app.
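
For anyone hitting the same leak, a minimal sketch of the kind of fix this implies, assuming a hypothetical `timer` property on the view controller (the property name and the choice of viewWillDisappear are illustrative):

var timer : NSTimer?

override func viewWillDisappear(animated: Bool) {
    super.viewWillDisappear(animated)
    // Invalidate the timer so the run loop stops retaining its target (this view controller);
    // waiting for deinit is not enough, because deinit never runs while the timer still holds the controller
    timer?.invalidate()
    timer = nil
}

With the timer invalidated, each dismissed view controller can be deallocated, so memory no longer accumulates to the point where CIContext fails to initialize.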
