2012-03-04 88 views
4

"Unrecognized selector" when trying to access a CIFilter's outputImage

This is my first attempt at Core Image (on OS X 10.7.3), and I'm running into a brick wall. I'm sure it's something silly I'm doing, and it just takes someone more familiar with the framework to point it out.

Consider the following code (let's stipulate that imageURL is a valid file URL pointing to a JPG on disk):

CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL]; 
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues: 
                kCIInputImageKey, inputImage, 
                kCIInputExtentKey, [inputImage valueForKey:@"extent"], 
                nil]; 
CIImage *outputImage = (CIImage *)[filter valueForKey:@"outputImage"]; 

Running this code, the last line throws an exception with this backtrace:

0 CoreFoundation      0x00007fff96c2efc6 __exceptionPreprocess + 198 
1 libobjc.A.dylib      0x00007fff9153cd5e objc_exception_throw + 43 
2 CoreFoundation      0x00007fff96cbb2ae -[NSObject doesNotRecognizeSelector:] + 190 
3 CoreFoundation      0x00007fff96c1be73 ___forwarding___ + 371 
4 CoreFoundation      0x00007fff96c1bc88 _CF_forwarding_prep_0 + 232 
5 CoreImage       0x00007fff8f03c38d -[CIAreaAverage outputImage] + 52 
6 Foundation       0x00007fff991d8384 _NSGetUsingKeyValueGetter + 62 
7 Foundation       0x00007fff991d8339 -[NSObject(NSKeyValueCoding) valueForKey:] + 392 

Now, the Core Image Filter Reference clearly states that CIAreaAverage "returns a single-pixel image that contains the average color for the region of interest." In fact, even more puzzling, when I inspect the filter's attributes in the debugger (before attempting the valueForKey: call):

(lldb) po [filter attributes] 
(id) $3 = 0x00007fb3e3ef0e00 { 
    CIAttributeDescription = "Calculates the average color for the specified area in an image, returning the result in a pixel."; 
    CIAttributeFilterCategories =  (
     CICategoryReduction, 
     CICategoryVideo, 
     CICategoryStillImage, 
     CICategoryBuiltIn 
    ); 
    CIAttributeFilterDisplayName = "Area Average"; 
    CIAttributeFilterName = CIAreaAverage; 
    CIAttributeReferenceDocumentation = "http://developer.apple.com/cgi-bin/apple_ref.cgi?apple_ref=//apple_ref/doc/filter/ci/CIAreaAverage"; 
    inputExtent =  { 
     CIAttributeClass = CIVector; 
     CIAttributeDefault = "[0 0 640 80]"; 
     CIAttributeDescription = "A rectangle that specifies the subregion of the image that you want to process."; 
     CIAttributeDisplayName = Extent; 
     CIAttributeType = CIAttributeTypeRectangle; 
     CIUIParameterSet = CIUISetBasic; 
    }; 
    inputImage =  { 
     CIAttributeClass = CIImage; 
     CIAttributeDescription = "The image to process."; 
     CIAttributeDisplayName = Image; 
     CIUIParameterSet = CIUISetBasic; 
    }; 
    outputImage =  { 
     CIAttributeClass = CIImage; 
    }; 
} 

And there outputImage is, listed with class CIImage!

So what am I doing wrong? All the documentation and tutorials I've seen say that -valueForKey: is the correct way to access a filter's attributes, including outputImage.

Answers

7

I believe your extent is the culprit (it's odd, anyway). When I change the extent to a CIVector *, it works:

NSURL *imageURL = [NSURL fileURLWithPath:@"/Users/david/Desktop/video.png"]; 
CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL]; 
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"]; 
[filter setValue:inputImage forKey:kCIInputImageKey]; 
CGRect inputExtent = [inputImage extent]; 
CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x 
             Y:inputExtent.origin.y 
             Z:inputExtent.size.width 
             W:inputExtent.size.height]; 
[filter setValue:extent forKey:kCIInputExtentKey]; 
CIImage *outputImage = [filter valueForKey:@"outputImage"]; 

[inputImage extent] returns a CGRect, but kCIInputExtentKey expects a CIVector. Fetching the extent through valueForKey: hands the filter that CGRect boxed in an NSValue, which it cannot use.
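To see the mismatch in isolation, here is a small sketch (the solid-color test image is a hypothetical stand-in; its extent is infinite, but the class of the KVC result is the point here) showing what KVC actually hands back for "extent" versus what the filter wants:

```objc
#import <QuartzCore/CoreImage.h>   // <CoreImage/CoreImage.h> on iOS

CIImage *img = [CIImage imageWithColor:[CIColor colorWithRed:0.5 green:0.5 blue:0.5]];

// KVC boxes the CGRect return value of -extent in an NSValue, not a CIVector:
id boxed = [img valueForKey:@"extent"];
NSLog(@"%@", NSStringFromClass([boxed class]));

// What kCIInputExtentKey actually expects:
CGRect r = [img extent];
CIVector *extentVector = [CIVector vectorWithX:r.origin.x Y:r.origin.y
                                             Z:r.size.width W:r.size.height];
```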

Hmm... not what I would have expected. I'll check it tonight and get back to you. Thanks! – 2012-03-15 23:26:12

I'm curious to know whether this worked for you. – devguydavid 2012-03-27 04:57:29

Sorry, I haven't forgotten; I've just been swamped with other things (this is a "personal curiosity" project). I promise I'll get back to you... – 2012-03-27 07:15:55

0

Here's how I got CIAreaAverage working in an iOS app:

CGRect inputExtent = [self.inputImage extent]; 
CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x 
             Y:inputExtent.origin.y 
             Z:inputExtent.size.width 
             W:inputExtent.size.height]; 
CIImage* inputAverage = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues:@"inputImage", self.inputImage, @"inputExtent", extent, nil].outputImage; 

//CIImage* inputAverage = [self.inputImage imageByApplyingFilter:@"CIAreaMinimum" withInputParameters:@{@"inputImage" : inputImage, @"inputExtent" : extent}]; 
EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]; 
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] }; 
CIContext *myContext = [CIContext contextWithEAGLContext:myEAGLContext options:options]; 

size_t rowBytes = 32; // RGBA8 needs only 4 bytes for the single pixel; 32 leaves headroom 
uint8_t byteBuffer[rowBytes]; // buffer to render the averaged pixel into 

[myContext render:inputAverage toBitmap:byteBuffer rowBytes:rowBytes bounds:[inputAverage extent] format:kCIFormatRGBA8 colorSpace:nil]; 

const uint8_t* pixel = &byteBuffer[0]; 
float red = pixel[0]/255.0; 
float green = pixel[1]/255.0; 
float blue = pixel[2]/255.0; 
NSLog(@"%f, %f, %f\n", red, green, blue); 
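If you want the result as a color object rather than logged components, a minimal follow-up sketch (assuming UIKit, and using sample values standing in for the red/green/blue floats computed from byteBuffer above):

```objc
#import <UIKit/UIKit.h>

// Sample components, standing in for the floats read out of the rendered pixel.
float red = 0.752941, green = 0.858824, blue = 0.890196;
UIColor *averageColor = [UIColor colorWithRed:red green:green blue:blue alpha:1.0];
```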



The output will look something like this:

2015-05-23 15:58:20.935 CIFunHouse[2400:489913] 0.752941, 0.858824, 0.890196 
2015-05-23 15:58:20.981 CIFunHouse[2400:489913] 0.752941, 0.858824, 0.890196 