Can variations of a gesture in the gesture library improve recognition?
I'm working on implementing gesture recognition in my app, using the Gestures Builder to create a library of gestures. I'm wondering if having multiple variations of a gesture will help or hinder recognition (or performance). For example, I want to recognize a circular gesture. I'm going to have at least two variations - one for a clockwise circle, and one for counterclockwise, with the same semantic meaning so that the user doesn't have to think about it. However, I'm wondering if it would be desirable to save several gestures for each direction, for example, of various radii, or with different shapes that are "close enough" - like egg shapes, ellipses, etc., including different angular rotations of each. Anybody have experience with this?
OK, after some experimentation and reading of the Android source, I've learned a little... First, it appears that I don't necessarily have to worry about creating different gestures in my gesture library to cover different angular rotations or directions (clockwise/counterclockwise) of my circular gesture. By default, a GestureStore uses a sequence type of SEQUENCE_SENSITIVE (meaning that the starting and ending points matter) and an orientation style of ORIENTATION_SENSITIVE (meaning that the rotational angle matters). However, these defaults can be overridden with setOrientationStyle(ORIENTATION_INVARIANT) and setSequenceType(SEQUENCE_INVARIANT).
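For what it's worth, here's a minimal sketch of that setup (R.raw.gestures is just a placeholder for wherever your Gestures Builder output ends up, and loadInvariantLibrary is my own helper name):

    import android.app.Activity;
    import android.gesture.GestureLibraries;
    import android.gesture.GestureLibrary;
    import android.gesture.GestureStore;

    public class GestureSetup {
        // Loads a gesture library and relaxes the default matching behavior.
        static GestureLibrary loadInvariantLibrary(Activity activity) {
            GestureLibrary library =
                    GestureLibraries.fromRawResource(activity, R.raw.gestures);
            // Ignore the rotational angle when matching.
            library.setOrientationStyle(GestureStore.ORIENTATION_INVARIANT);
            // Ignore where the stroke starts and ends.
            library.setSequenceType(GestureStore.SEQUENCE_INVARIANT);
            if (!library.load()) {
                throw new IllegalStateException("Could not load gesture library");
            }
            return library;
        }
    }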
Furthermore, to quote from the comments in the source... "when SEQUENCE_SENSITIVE is used, only single stroke gestures are currently allowed" and "ORIENTATION_SENSITIVE and ORIENTATION_INVARIANT are only for SEQUENCE_SENSITIVE gestures".
Interestingly, ORIENTATION_SENSITIVE appears to mean more than just "orientation matters". Its value is 2, and the comments associated with it and some related (undocumented) constants imply that you can request different levels of sensitivity.
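For context, the orientation-style constants in GestureStore look like this in the source I was reading (quoted from my reading of AOSP, so check the source tree yourself; note that the 4- and 8-direction variants are package-private and undocumented):

    // From android.gesture.GestureStore (AOSP):
    public static final int ORIENTATION_INVARIANT = 1;
    // at most 2 directions can be recognized
    public static final int ORIENTATION_SENSITIVE = 2;
    // at most 4 directions can be recognized
    static final int ORIENTATION_SENSITIVE_4 = 4;
    // at most 8 directions can be recognized
    static final int ORIENTATION_SENSITIVE_8 = 8;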
During a call to GestureLibrary.recognize(), the orientation type value (1, 2, 4, or 8) is passed through to GestureUtils.minimumCosineDistance() as the parameter numOrientations, whereupon some calculations are performed that are above my pay grade (see below). If someone can explain this, I'm interested. I get that it is calculating the angular difference between two gestures, but I don't understand the way it's using the numOrientations parameter. My expectation is that if I specify a value of 2, it finds the minimum distance between gesture A and two variations of gesture B -- one variation being "normal B", and the other being B spun around 180 degrees. Thus, I would expect a value of 8 would consider 8 variations of B, spaced 45 degrees apart. However, even though I don't fully understand the math below, it doesn't look to me like a numOrientations value of 4 or 8 is used directly in any calculations, although values greater than 2 do result in a distinct code path. Maybe that's why those other values are undocumented.
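Here is the method in question, reproduced from my reading of the AOSP source (the explanatory comments are mine, and my interpretation could be off):

    // From android.gesture.GestureUtils (AOSP); comments added by me.
    static float minimumCosineDistance(float[] vector1, float[] vector2,
            int numOrientations) {
        final int len = vector1.length;
        float a = 0;
        float b = 0;
        // The inputs are normalized (x0, y0, x1, y1, ...) feature vectors:
        // 'a' accumulates their dot product, 'b' a cross-product-like term.
        for (int i = 0; i < len; i += 2) {
            a += vector1[i] * vector2[i] + vector1[i + 1] * vector2[i + 1];
            b += vector1[i] * vector2[i + 1] - vector1[i + 1] * vector2[i];
        }
        if (a != 0) {
            final float tan = b / a;
            // 'angle' is the rotation of vector2 that would best align it
            // with vector1.
            final double angle = Math.atan(tan);
            if (numOrientations > 2 && Math.abs(angle) >= Math.PI / numOrientations) {
                // The optimal rotation exceeds the allowed range, so score
                // without rotating at all. Note that numOrientations only
                // bounds the permitted rotation here; it does not enumerate
                // rotated variants of the gesture.
                return (float) Math.acos(a);
            } else {
                // Score after applying the optimal rotation.
                final double cosine = Math.cos(angle);
                final double sine = cosine * tan;
                return (float) Math.acos(a * cosine + b * sine);
            }
        } else {
            return (float) Math.PI / 2;
        }
    }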
Based on my reading, I theorized that the simplest and best approach would be to have one stored circular gesture, setting the sequence type and orientation to invariant. That way, anything circular should match pretty well, regardless of direction or orientation. So I tried that, and it did return high scores (in the range of about 25 to 70) for pretty much anything remotely resembling a circle. However, it also returned scores of 20 or so for gestures that were not even close to circular (horizontal lines, V shapes, etc.). So, I didn't feel good about the separation between what should be matches and what should not. What seems to be working best is to have two stored gestures, one in each direction, and using SEQUENCE_SENSITIVE in conjunction with ORIENTATION_INVARIANT. That's giving me scores of 2.5 or higher for anything vaguely circular, but scores below 1 (or no matches at all) for gestures that are not circular.
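In case it helps anyone, here's roughly how I'm consuming the results (the 2.5 cutoff is just what worked for my data, and the gesture names and handleCircle helper are placeholders; tune the threshold against your own library):

    import android.gesture.Gesture;
    import android.gesture.GestureLibrary;
    import android.gesture.GestureOverlayView;
    import android.gesture.Prediction;

    import java.util.ArrayList;

    public class CircleListener implements GestureOverlayView.OnGesturePerformedListener {
        // Empirical cutoff: vaguely circular input scored 2.5+ for me,
        // non-circular input scored below 1 (or didn't match at all).
        private static final double MIN_SCORE = 2.5;

        private final GestureLibrary library;

        public CircleListener(GestureLibrary library) {
            this.library = library;
        }

        @Override
        public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
            ArrayList<Prediction> predictions = library.recognize(gesture);
            // Predictions come back sorted by score, best first.
            if (!predictions.isEmpty() && predictions.get(0).score >= MIN_SCORE) {
                // "circle_cw" / "circle_ccw" are whatever names you saved;
                // both map to the same action, so direction doesn't matter.
                handleCircle(predictions.get(0).name);
            }
        }

        private void handleCircle(String gestureName) {
            // Application-specific handling goes here.
        }
    }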