Using UIAlertView while the camera is running on iPhone
I'm trying to trigger a UIAlertView when my camera detects a face using OpenCV. I managed to do the face detection and can output an NSLog successfully. But when I try to trigger the alert view with
NSLog(@"Face Detected");
UIAlertView *alert = [[[UIAlertView alloc] initWithTitle:@"Face Detected" message:@"Do you really want to try again?" delegate:self cancelButtonTitle:@"Cancel" otherButtonTitles:nil] autorelease];
[alert addButtonWithTitle:@"Yes"];
[alert show];
[alert release];
I can see the alert view is sort of being triggered, since the screen dims, but the alert view itself never actually appears...
Thanks for helping!
2 Answers
Remove [alert release]. You already called autorelease on it.
Also, you can integrate [alert addButtonWithTitle:@"Yes"]; in the initializer:
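For example, a sketch reusing the values from the question's snippet, where @"Yes" is passed through otherButtonTitles:
UIAlertView *alert = [[[UIAlertView alloc] initWithTitle:@"Face Detected"
                                                 message:@"Do you really want to try again?"
                                                delegate:self
                                       cancelButtonTitle:@"Cancel"
                                       otherButtonTitles:@"Yes", nil] autorelease];
// No separate addButtonWithTitle: call is needed now.
[alert show];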
Where are you calling this from? The main thread or a secondary thread?
Because UIKit stuff should always be done on the main thread.
Code example:
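A minimal sketch, assuming a hypothetical helper method -showFaceDetectedAlert on the view controller that runs the face detection:
- (void)showFaceDetectedAlert
{
    // Build and show the alert; this must run on the main thread.
    UIAlertView *alert = [[[UIAlertView alloc] initWithTitle:@"Face Detected"
                                                     message:@"Do you really want to try again?"
                                                    delegate:self
                                           cancelButtonTitle:@"Cancel"
                                           otherButtonTitles:@"Yes", nil] autorelease];
    [alert show];
}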
and then
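call it from wherever the face is detected, so the UI work hops over to the main thread; for example (the surrounding detection code is assumed, only the dispatch call matters):
// Inside the OpenCV face-detection callback, which may run on a background thread:
NSLog(@"Face Detected");
[self performSelectorOnMainThread:@selector(showFaceDetectedAlert)
                       withObject:nil
                    waitUntilDone:NO];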