Image size reduced in NSScrollView

Posted 2024-08-22 06:33:01


I'm trying to load an image into my NSImageView/NSScrollView and display it at actual size, but the image mysteriously ends up being displayed at about half size. At first I thought it might be getting reduced to fit some constraint of the frame etc., but I soon realised this couldn't be right, because if I physically enlarge the image (in an image editing program) and then load it again, I find I can apparently load/display images as big as I want.

The actual size of the image in question is only 2505 x 930, which I guess isn't the problem, since I can double or quadruple this without any apparent issues (except, of course, that they're all reduced by about 50% when displayed). The relevant part of my very straightforward code is:

- (IBAction)openSourceImage:(NSString *)aFilepath
{
    // obtain image filepath passed from 'chooseFile'...
    NSImage *theImage = [[NSImage alloc] initWithContentsOfFile:aFilepath];
    if (theImage)
    {
        [theImageView setImage:theImage];
        // resize imageView to fit image; causes the surrounding NSScrollView to adjust its scrollbars appropriately...
        [theImageView setFrame:
            NSMakeRect([theImageView frame].origin.x, [theImageView frame].origin.y,
                       [theImage size].width, [theImage size].height)];
        [theImageView scrollRectToVisible:
            NSMakeRect([theImageView frame].origin.x,
                       [theImageView frame].origin.y + [theImageView frame].size.height, 1, 1)];
        [theImage release]; // we're done with 'theImage' we allocated, so release it

        // display the window title from the filepath...
        NSString *aFilename = [aFilepath lastPathComponent];
        [[theImageView window] setTitle:aFilename];
    }
}

Can anyone please tell me where I'm going wrong here, and how to display images at actual size?

Solution: Okay, so calling 'size' results in a displayed image that is too small to work with and calling 'pixelsHigh/pixelsWide' results in a magnified image of indeterminate scale...

My test app uses a couple of slider-driven 'crosshairs' to plot the coordinates of features on an image (like a photo or a map, for instance). By pure chance, I noticed that while the loaded image only displayed at a fraction of its actual size, the x,y coordinates DID correspond to real life (indicating about 70 pixels/inch). Go figure that one...

Using:

[theImage setSize:NSMakeSize(imageSize.width * 4, imageSize.height * 4)];

I'm now able to load the image at a KNOWN magnification and reduce my plotted x,y measurements by the same factor. I also threw in an NSAffineTransform method to allow me to zoom in/out for the best viewing size.
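For the zooming part, a minimal sketch of what an NSAffineTransform-based zoom might look like (this assumes a custom NSView subclass that draws the image itself, with hypothetical theImage and zoomFactor instance variables; it is not the actual code from the app above):

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor whiteColor] set];
    NSRectFill(dirtyRect);

    // Scale the current graphics context before drawing, so the image
    // is rendered at zoomFactor times its nominal (point) size.
    NSAffineTransform *zoom = [NSAffineTransform transform];
    [zoom scaleBy:zoomFactor];   // e.g. 0.5 to zoom out, 2.0 to zoom in
    [zoom concat];

    [theImage drawAtPoint:NSZeroPoint
                 fromRect:NSZeroRect   // NSZeroRect means draw the whole image
                operation:NSCompositeSourceOver
                 fraction:1.0];
}

The view's frame would also need to be multiplied by the same factor so the enclosing NSScrollView exposes the right scrollable area.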

Phew! That was challenging for someone at my fairly novice level, and I still don't understand the underlying cause of the original display problem, but I guess the end result is all that counts. Thanks again to both of you :-)


Comments (2)

隔纱相望 2024-08-29 06:33:02


I'm not really sure why the image wouldn't be the right size, but you could try "correcting" the size to match one of its representations' sizes:

NSBitmapImageRep *aRep = [[theImage representations] objectAtIndex:0];
NSSize correctedImageSize = NSMakeSize([aRep pixelsWide], [aRep pixelsHigh]);
[theImage setSize:correctedImageSize];

(Of course, maybe the rep is also the wrong size!)
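A hedged sketch of how that correction might slot into the openSourceImage: method from the question, before the view is resized (this arrangement is illustrative, reusing the question's theImageView outlet; note that pixelsWide/pixelsHigh are declared on NSImageRep, so no cast is needed):

NSImage *theImage = [[NSImage alloc] initWithContentsOfFile:aFilepath];
if (theImage)
{
    // Make the image's point size match its bitmap pixel dimensions, so that
    // [theImage size] (and therefore the view frame below) reflects real pixels.
    NSImageRep *rep = [[theImage representations] objectAtIndex:0];
    [theImage setSize:NSMakeSize([rep pixelsWide], [rep pixelsHigh])];

    [theImageView setImage:theImage];
    [theImageView setFrame:
        NSMakeRect([theImageView frame].origin.x, [theImageView frame].origin.y,
                   [theImage size].width, [theImage size].height)];
    [theImage release];
}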

我不会写诗 2024-08-29 06:33:02

The size method returns the size in points, not pixels. The pixelsHigh and pixelsWide methods are what you're after.
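To make the point/pixel distinction concrete, here is a hedged sketch (it assumes the image's first representation is the bitmap of interest): [theImage size] is measured in points (72 per inch), so a file tagged with DPI metadata above 72 reports a point size smaller than its pixel dimensions, which would explain a roughly half-size display for a file tagged at about 144 DPI.

NSImageRep *rep = [[theImage representations] objectAtIndex:0];

NSSize pointSize = [theImage size];    // points, as used by NSImageView and setFrame:
NSInteger pxWide = [rep pixelsWide];   // actual bitmap pixels, e.g. 2505
NSInteger pxHigh = [rep pixelsHigh];   // actual bitmap pixels, e.g. 930

// Effective resolution implied by the file's metadata (72 points per inch):
CGFloat dpi = 72.0 * pxWide / pointSize.width;
NSLog(@"point size %@, pixel size %ldx%ld, about %.0f DPI",
      NSStringFromSize(pointSize), (long)pxWide, (long)pxHigh, dpi);

Setting the image's size back to its pixel dimensions, as in the other answer, then gives a 1:1 pixel display at the standard 72 points per inch.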
