How to get pixel data from an image taken on the iPhone

Posted on 2024-12-28 14:01:35


The problem isn't getting the pixel data; I was able to find some source for that:

-(NSArray *)getRGBAtLocationOnImage:(UIImage *)theImage X:(int)x Y:(int)y
{
    // First get the image into your data buffer
    CGImageRef image = [theImage CGImage];
    NSUInteger width = CGImageGetWidth(image);
    NSUInteger height = CGImageGetHeight(image);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = malloc(height * width * 4);
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);

    // Now your rawData contains the image data in the RGBA8888 pixel format.
    // Each row is bytesPerRow bytes and each pixel within it is bytesPerPixel bytes wide.
    int byteIndex = (bytesPerRow * y) + x * bytesPerPixel;
    int red = rawData[byteIndex];
    int green = rawData[byteIndex + 1];
    int blue = rawData[byteIndex + 2];
    //int alpha = rawData[byteIndex + 3];

    NSLog(@"Red: %d   Green: %d    Blue: %d",red,green,blue);

    NSArray *i = [[NSArray alloc] initWithObjects:[NSNumber numberWithInt:red], [NSNumber numberWithInt:green], [NSNumber numberWithInt:blue], nil];

    free(rawData);
    return i;
}

The problem is the location of the pixels I want to get. I have no idea how to figure out where the pixels I want are located. What is a way of figuring that out?
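One common case, though this is an assumption about the use case rather than something stated above, is wanting the pixel under a finger tap on a UIImageView. A minimal sketch of that idea, assuming the view stretches the image over its full bounds (UIViewContentModeScaleToFill) and using a hypothetical imageView property, scales the touch point from view coordinates to bitmap pixel coordinates before calling the method above:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint viewPoint = [touch locationInView:self.imageView];   // touch location in view points

    UIImage *theImage = self.imageView.image;
    CGFloat pixelWidth  = CGImageGetWidth(theImage.CGImage);     // bitmap size in pixels,
    CGFloat pixelHeight = CGImageGetHeight(theImage.CGImage);    // not UIImage points

    // Proportionally map the view coordinates to bitmap coordinates.
    int x = (int)(viewPoint.x * pixelWidth  / self.imageView.bounds.size.width);
    int y = (int)(viewPoint.y * pixelHeight / self.imageView.bounds.size.height);

    NSArray *rgb = [self getRGBAtLocationOnImage:theImage X:x Y:y];
    NSLog(@"Pixel at (%d, %d): %@", x, y, rgb);
}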


Comments (1)

不知在何时 2025-01-04 14:01:35


Not sure I understand your issue, but...

Take a look at your method:

-(NSArray *)getRGBAtLocationOnImage:(UIImage *)theImage X:(int)x Y:(int)y {
    // Your method
}

It takes x and y and returns i, an array containing the RGB data of the point (x, y) you passed.
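For a single point the call looks something like this (theImage is assumed to be the UIImage you are inspecting, and (10, 20) is just an arbitrary coordinate):

NSArray *rgb = [self getRGBAtLocationOnImage:theImage X:10 Y:20];
NSLog(@"R: %@  G: %@  B: %@", [rgb objectAtIndex:0], [rgb objectAtIndex:1], [rgb objectAtIndex:2]);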

Suppose you have a 100x100-pixel image: you have to call your method 10,000 times (once per pixel) if you want to check all the pixels in it.

In that case, you can try something like this:

// IMAGE_WIDTH and IMAGE_HEIGHT stand for the pixel dimensions of theImage.
NSMutableArray *RGBImage = [[NSMutableArray alloc] init];
for (int k = 0; k < IMAGE_WIDTH; k++) {
    for (int j = 0; j < IMAGE_HEIGHT; j++) {
        NSArray *RGBPixel = [self getRGBAtLocationOnImage:theImage X:k Y:j];
        [RGBImage addObject:RGBPixel];
    }
}
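A note on the design choice: calling getRGBAtLocationOnImage: once per pixel re-creates the bitmap context and redraws the whole image on every call, which for the 100x100 example means 10,000 redraws. If you need a full scan, it is usually cheaper to draw the image into the buffer once and walk it directly. A rough sketch of that variation (getAllRGBFromImage: is a made-up name; the buffer setup is copied from the question's method):

- (NSArray *)getAllRGBFromImage:(UIImage *)theImage
{
    CGImageRef image = [theImage CGImage];
    NSUInteger width = CGImageGetWidth(image);
    NSUInteger height = CGImageGetHeight(image);
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;

    // Draw the image into an RGBA8888 buffer once, the same way the question's method does.
    unsigned char *rawData = malloc(height * bytesPerRow);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(rawData, width, height, 8, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);

    // Walk the buffer directly: the bytes for pixel (x, y) start at y * bytesPerRow + x * bytesPerPixel.
    NSMutableArray *RGBImage = [[NSMutableArray alloc] init];
    for (NSUInteger y = 0; y < height; y++) {
        for (NSUInteger x = 0; x < width; x++) {
            NSUInteger byteIndex = y * bytesPerRow + x * bytesPerPixel;
            NSArray *RGBPixel = [NSArray arrayWithObjects:
                                    [NSNumber numberWithInt:rawData[byteIndex]],
                                    [NSNumber numberWithInt:rawData[byteIndex + 1]],
                                    [NSNumber numberWithInt:rawData[byteIndex + 2]], nil];
            [RGBImage addObject:RGBPixel];
        }
    }

    free(rawData);
    return RGBImage;
}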