How to send an IplImage from a server to an iPod client UIImage over TCP

Posted on 2024-11-18 04:34:33


I have a server in Linux using Berkeley sockets, and I create a TCP connection with an iPod client. I have an IplImage* img; to send from the server to the iPod. I use the write(socket, /*DATA*/, 43200); call, and the data I tried to send is: reinterpret_cast<char*>(img), img and img->imageData. All of these choices do send some kind of data.

On the iPod side I receive the data this way (as I've seen here on SO; don't mind the complicated part, it's just there to receive all the data of a single image):

bytesRead = [iStream read: (char*)[buffer mutableBytes] + totalBytesRead maxLength: 43200 - totalBytesRead];
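
For reference, here is a minimal sketch of what the accumulation loop around that call might look like (assuming iStream is the already-opened NSInputStream; buffer is created here just for illustration, and the loop is written as a plain blocking loop for clarity, whereas in practice the reads would usually happen inside NSStreamEventHasBytesAvailable callbacks):

NSMutableData *buffer = [NSMutableData dataWithLength:43200];
NSUInteger totalBytesRead = 0;

// Keep reading until the whole 240 x 180 grayscale frame (43200 bytes) has arrived;
// TCP is a byte stream, so a single read can return only part of the image.
while (totalBytesRead < 43200) {
    NSInteger bytesRead = [iStream read:(uint8_t *)[buffer mutableBytes] + totalBytesRead
                                maxLength:43200 - totalBytesRead];
    if (bytesRead <= 0) {
        break;   // 0 = stream ended, negative = read error
    }
    totalBytesRead += (NSUInteger)bytesRead;
}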

After receiving the whole image, I have this:

[buffer setLength: 43200];
NSData *imagem = [NSData dataWithBytes:buffer length:43200];
UIImage *final= [self UIImageFromIplImage:imagem];

Now, I know I could get OpenCV working on the iPod, but I can't find a simple explanation of how to set it up, so I used the second piece of code from this webpage and adapted it, since I know all the specifications of my image (for instance, I set all the parameters of the CGImageCreate() call):

- (UIImage *)UIImageFromIplImage:(NSData *)image {

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();

// Allocating the buffer for CGImage
NSData *data = [NSData dataWithBytes:image length:43200];

CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data);

// Creating CGImage from chunk of IplImage    
size_t width = 240;
size_t height = 180;
size_t depth = 8;             //bitsPerComponent
size_t depthXnChannels = 8;   //bitsPerPixel
size_t widthStep = 240;       //bytesPerRow

CGImageRef imageRef = CGImageCreate(width, height, depth, depthXnChannels, widthStep, colorSpace, kCGImageAlphaNone|kCGBitmapByteOrderDefault,provider, NULL, false, kCGRenderingIntentDefault);

// Getting UIImage from CGImage
UIImage *ret = [UIImage imageWithCGImage:imageRef];
lolView.image = ret;
CGImageRelease(imageRef);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpace);
return ret;

}

THE PROBLEM: When I display the image, it comes out completely weird and 'random', even though the image sent is always the same. I really have no idea what's wrong.

PS: The TCP connection works fine with other data, like numbers or words, and the image is grayscale (240 × 180 at one byte per pixel, hence the 43200 bytes).

Thanks for all the help.


Comments (1)

喜爱皱眉﹌ 2024-11-25 04:34:33


I got it working like this.
On the server side (Code::Blocks in Linux with openFrameworks (& ofxOpenCv)):

img.allocate(240, 180, OF_IMAGE_COLOR);                    //ofImage
img2.allocate(240, 180);                                   //ofxCvColorImage
frame = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 3);   //IplImage
bw = cvCreateImage(cvSize(240,180), IPL_DEPTH_8U, 1);      //IplImage
gray.allocate(240, 180);                                   //ofxCvGrayscaleImage


///ofImage
img.loadImage("lol.jpg");

///ofImage -> ofxCvColor
img2.setFromPixels(img.getPixels(), 240, 180);

///ofxCvColor -> IplImage
frame = img2.getCvImage();

///IplImage in GRAY
cvCvtColor(frame,bw,CV_RGB2GRAY);
cvThreshold(bw,bw,200,255,CV_THRESH_BINARY);  //It is actually a binary image
gray = bw;
pix = gray.getPixels();

n = write(newsockfd, pix, 43200);   // 240*180 = 43200 bytes; write() may send fewer bytes than requested, so check n (and loop if needed)

On the client side (iPod 4.3):

-(UIImage *) dataFromIplImageToUIImage:(unsigned char *) rawData
{
size_t width = 240;
size_t height = 180;
size_t depth = 8;                   //bitsPerComponent
size_t depthXnChannels = 8;         //bitsPerPixel (kept for reference, unused below)
size_t widthStep = 240;             //bytesPerRow

// Wrap the raw grayscale bytes in a bitmap context backed directly by rawData.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();
CGContextRef ctx = CGBitmapContextCreate(rawData, width, height, depth, widthStep, colorSpace, kCGImageAlphaNone);

CGImageRef imageRef = CGBitmapContextCreateImage(ctx);
UIImage* rawImage = [UIImage imageWithCGImage:imageRef];

// Release the Core Graphics objects created here; rawData itself is owned by the caller.
CGImageRelease(imageRef);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);

myImageView.image = rawImage;
return rawImage;
}
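
For completeness, a small sketch of a possible call site (frameData here is a hypothetical NSMutableData holding the 43200 received bytes, as in the question's receive loop; since the bitmap context above is created directly on the buffer, it is safest to keep frameData alive while the returned image is in use):

// frameData: hypothetical NSMutableData filled by the receive loop.
if ([frameData length] == 43200) {
    UIImage *frame = [self dataFromIplImageToUIImage:(unsigned char *)[frameData mutableBytes]];
    NSLog(@"decoded frame: %@", frame);   // the helper also assigns it to myImageView.image
}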

Probably there's an easier way to do this, but hey, it gets the job done. Hope this helps someone.
