I'm taking pictures on the iPhone with UIImagePickerController. I'd like to adjust the picture on the fly; it looks like I can use UIImagePickerController to adjust the shape of the photo on the fly, but I can't find a way to change the colors on the fly — for example, converting all colors to black and white. How can I change pixel colors on the fly in the iPhone camera preview window?

Thanks.
The best way to do this is to use an AVCaptureSession object. That's exactly what I'm doing in my free app "Live Effects Cam".

There are several code samples online that can help you implement this. Here is a sample chunk of code that may help:
- (void) activateCameraFeed
{
    videoSettings = nil;

#if USE_32BGRA
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA];
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil];
#endif

    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL);

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES];
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    [captureVideoOutput setVideoSettings:videoSettings];
    [captureVideoOutput setMinFrameDuration:kCMTimeZero];

    dispatch_release(videoDataOutputQueue); // AVCaptureVideoDataOutput uses dispatch_retain() & dispatch_release(), so we can dispatch_release() our reference now

    if (useFrontCamera)
    {
        currentCameraDeviceIndex = frontCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationLeftMirrored;
    }
    else
    {
        currentCameraDeviceIndex = backCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationRight;
    }

    selectedCamera = [[AVCaptureDevice devices] objectAtIndex:(NSUInteger)currentCameraDeviceIndex];
    captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:selectedCamera error:nil];

    captureSession = [[AVCaptureSession alloc] init];
    [captureSession beginConfiguration];
    [self setCaptureConfiguration];
    [captureSession addInput:captureVideoInput];
    [captureSession addOutput:captureVideoOutput];
    [captureSession commitConfiguration];
    [captureSession startRunning];
}
// AVCaptureVideoDataOutputSampleBufferDelegate
// AVCaptureAudioDataOutputSampleBufferDelegate
//
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    if (captureOutput == captureVideoOutput)
    {
        [self performImageCaptureFrom:sampleBuffer];
    }

    [pool drain];
}
- (void) performImageCaptureFrom:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer;

    if (CMSampleBufferGetNumSamples(sampleBuffer) != 1)
        return;
    if (!CMSampleBufferIsValid(sampleBuffer))
        return;
    if (!CMSampleBufferDataIsReady(sampleBuffer))
        return;

    imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (CVPixelBufferGetPixelFormatType(imageBuffer) != kCVPixelFormatType_32BGRA)
        return;

    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t bufferSize = bytesPerRow * height;

    // Work on a copy so the pixel buffer can be unlocked as soon as possible
    uint8_t *tempAddress = malloc(bufferSize);
    memcpy(tempAddress, baseAddress, bufferSize);
    baseAddress = tempAddress;

    //
    // Apply effects to the pixels stored in (uint32_t *)baseAddress
    //
    // example: grayScale((uint32_t *)baseAddress, width, height);
    // example: sepia((uint32_t *)baseAddress, width, height);
    //

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = NULL;
    if (cameraDeviceSetting != CameraDeviceSetting640x480) // not an iPhone 4 or iPod touch 5th gen
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    else
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGColorSpaceRelease(colorSpace);
    CGContextRelease(newContext);

    free(tempAddress);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    if (newImage == nil)
    {
        return;
    }

    // To display the CGImageRef newImage in your UI you have to hand it off like this,
    // because this delegate callback runs on a background queue…
    //
    [self performSelectorOnMainThread:@selector(newCameraImageNotification:) withObject:(id)newImage waitUntilDone:YES];
}
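The grayScale() call shown in the comments above is not a framework function; you write it yourself. Here is a minimal sketch in plain C of what such an in-place filter could look like for a kCVPixelFormatType_32BGRA buffer (the function name and the luma weights are my choices, not part of the original code):

```c
#include <stddef.h>
#include <stdint.h>

// Hypothetical in-place grayscale filter for a 32BGRA pixel buffer.
// On a little-endian device each uint32_t reads as 0xAARRGGBB.
static void grayScale(uint32_t *pixels, size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++)
    {
        uint32_t p = pixels[i];
        uint8_t b = (uint8_t)( p        & 0xFF);
        uint8_t g = (uint8_t)((p >>  8) & 0xFF);
        uint8_t r = (uint8_t)((p >> 16) & 0xFF);

        // ITU-R BT.601 luma weights
        uint8_t y = (uint8_t)(0.299 * r + 0.587 * g + 0.114 * b);

        // Keep alpha, replace each color channel with the luma value
        pixels[i] = (p & 0xFF000000u) | ((uint32_t)y << 16) | ((uint32_t)y << 8) | y;
    }
}
```

You would call it between the memcpy and CGBitmapContextCreate, exactly as the commented-out example shows. Note that iterating width * height pixels assumes bytesPerRow == width * 4; if the pixel buffer has row padding, iterate row by row using bytesPerRow instead.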
You could overlay a view on top of the image and change its blend mode to get a black-and-white effect.

Also check out the QuartzDemo sample from Apple, particularly the Blending Modes example in that demo.
Another way to do this would be to use AVFoundation and convert every frame yourself. I don't have much experience with this, but the "Session 409 - Using the Camera with AVFoundation" video from WWDC 2010 and its sample projects should help you with your problem.

That is, of course, if you're OK with using iOS 4 classes.
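If you do go the per-frame route, the color transform itself is just arithmetic on the pixel bytes. As an illustration (my own sketch, not taken from the WWDC sample), a sepia transform on a 32BGRA frame could look like this in plain C, using the commonly quoted sepia matrix coefficients:

```c
#include <stddef.h>
#include <stdint.h>

// Hypothetical in-place sepia filter for a 32BGRA pixel buffer
// (each little-endian uint32_t reads as 0xAARRGGBB).
static void sepia(uint32_t *pixels, size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++)
    {
        uint32_t p = pixels[i];
        double b = (double)( p        & 0xFF);
        double g = (double)((p >>  8) & 0xFF);
        double r = (double)((p >> 16) & 0xFF);

        // Widely used sepia mixing weights
        double sr = 0.393 * r + 0.769 * g + 0.189 * b;
        double sg = 0.349 * r + 0.686 * g + 0.168 * b;
        double sb = 0.272 * r + 0.534 * g + 0.131 * b;

        // Clamp to 255 before packing the channels back
        uint32_t cr = sr > 255.0 ? 255 : (uint32_t)sr;
        uint32_t cg = sg > 255.0 ? 255 : (uint32_t)sg;
        uint32_t cb = sb > 255.0 ? 255 : (uint32_t)sb;

        pixels[i] = (p & 0xFF000000u) | (cr << 16) | (cg << 8) | cb;
    }
}
```

The same caveat as any flat pixel loop applies: this assumes no row padding (bytesPerRow == width * 4); otherwise walk the buffer row by row.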
Tried your Live Effects Cam; it looks great, and it has many more of the features I was trying to implement. Well done! I'm just surprised it's free. – BlueDolphin 2010-12-30 03:36:41

Thanks. I was getting about 50 downloads a day at 99¢, and I average more than 1,500 downloads a day now that it's free. I'm releasing an update that offers the most requested new features as in-app purchases. I'd suggest that anyone developing a new app today go with a free app plus in-app purchases. – 2011-01-02 02:26:39