Programmatically access iSight?

Posted 2024-07-04 04:09:47

Is it possible to access the iSight camera on a MacBook programmatically? By this I mean I would like to be able to just grab still frames from the iSight camera on command and then do something with them. If so, is it only accessible using Objective-C, or could other languages be used as well?

Comments (7)

李白 2024-07-11 04:09:47

There's a command line utility called isightcapture that does more or less what you want to do. You could probably get the code from the developer (his e-mail address is in the readme you get when you download the utility).
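Since isightcapture is a command-line tool, it is easy to drive from another language. A minimal Python sketch, assuming the third-party isightcapture binary is installed and on your PATH (it is not part of macOS), might shell out to it and hand back the output path:

```python
import shutil
import subprocess

def build_capture_cmd(path="frame.jpg"):
    # isightcapture takes the output filename as its argument
    return ["isightcapture", path]

def grab_frame(path="frame.jpg"):
    """Capture one still via isightcapture.

    Returns the output path on success, or None when the
    isightcapture binary is not installed on this machine.
    """
    cmd = build_capture_cmd(path)
    if shutil.which(cmd[0]) is None:
        return None  # tool not available; nothing to capture
    subprocess.run(cmd, check=True)
    return path
```

The `shutil.which` guard keeps the sketch from crashing on machines without the tool; with it installed, `grab_frame("still.jpg")` should leave a JPEG at that path.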

溇涏 2024-07-11 04:09:47

One thing that hasn't been mentioned so far is IKPictureTaker, which is part of Image Kit. This brings up the standard, OS-provided picture-taking panel, though, with all the possible filter functionality etc. included. I'm not sure if that's what you want.

I suppose you can use it from other languages as well, considering there are things like Cocoa bridges, but I have no experience with them.

Googling also turned up another question on Stack Overflow that seems to address this issue.

や莫失莫忘 2024-07-11 04:09:47

Aside from ObjC, you can use the PyObjC or RubyCocoa bindings to access it as well. If you're not picky about the language, I'd say use Ruby, as PyObjC is horribly documented (even the official Apple page on it refers to the old version, not the one that shipped with OS X Leopard).

Quartz Composer is probably the easiest way to access it, and .quartz files can be embedded in applications pretty easily (with the data piped out to ObjC or such).

Also, there should be an example or two of this in /Developer/Examples/.

你又不是我 2024-07-11 04:09:47

From a related question that specifically asked for a Pythonic solution: give motmot's camiface library from Andrew Straw a try. It works with FireWire cameras, but also with the iSight, which is what you are looking for.

From the tutorial:

import motmot.cam_iface.cam_iface_ctypes as cam_iface
import numpy as np

mode_num = 0
device_num = 0
num_buffers = 32

cam = cam_iface.Camera(device_num,num_buffers,mode_num)
cam.start_camera()
frame = np.asarray(cam.grab_next_frame_blocking())
print('grabbed frame with shape %s' % (frame.shape,))
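Once a frame is in hand, "doing something with it" can be as simple as writing it to disk. A sketch of that step, using a synthetic array as a stand-in since no camera is assumed here (with a real capture, `frame` would come from `cam.grab_next_frame_blocking()` as above):

```python
import numpy as np

def save_pgm(frame, path):
    """Write an 8-bit grayscale frame (a 2-D numpy array) as a binary PGM file."""
    h, w = frame.shape
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (w, h))  # PGM header: magic, size, maxval
        f.write(frame.astype(np.uint8).tobytes())

# Synthetic stand-in for a grabbed frame (a real one would come from cam_iface)
frame = np.zeros((480, 640), dtype=np.uint8)
save_pgm(frame, "frame.pgm")
```

PGM is chosen only because it needs no imaging library; any viewer that understands Netpbm formats can open the result.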

看春风乍起 2024-07-11 04:09:47

You should check out the QTKit Capture documentation.

On Leopard, you can get at all of it over the RubyCocoa bridge:

require 'osx/cocoa'
OSX.require_framework("/System/Library/Frameworks/QTKit.framework")

OSX::QTCaptureDevice.inputDevices.each do |device|
    puts device.localizedDisplayName
end

从﹋此江山别 2024-07-11 04:09:47

I don't have a Mac here, but there is some documentation up here:

http://developer.apple.com/documentation/Hardware/Conceptual/iSightProgGuide/01introduction/chapter_1_section_1.html

It looks like you have to go through the QuickTime API. There is supposed to be a sample project called "MungGrab" that, according to this thread, could be worth a look.

那请放手 2024-07-11 04:09:47

If you poke around Apple's mailing lists you can find some code to do it in Java as well. Here's a simple example suitable for capturing individual frames, and here's a more complicated one that's fast enough to display live video.
