
Gesture recognition

LINK: http://wiki.nuigroup.com/Gesture_recognition


Background

Touchlib does a fine job of picking out contacts within the input surface. At the moment, there is no formal way of defining how those blob contacts are translated into intended user actions. Some of this requires application assistance to provide context, but some of it is down to pattern matching the appearance, movement and loss of individual blobs or combined blobs.

What I'm attempting to describe here, and for others to contribute to, is:


  1. A standard library of gestures suitable for the majority of applications
  2. A library of code that supports the defined gestures and generates events to the application layer


Using an XML dialect to describe gestures means that individual applications can declare their range of supported gestures to the Gesture Engine, and custom gestures can be supported.

By loosely coupling gesture recognition to the application, we can allow people to build different types of input device and plug them all into the same applications where appropriate.

In the early stages of development, we are all doing our own thing with minimal overlap. Over time we will realise the benefits of various approaches, and by using some standardised interfaces, we can mix and match to take advantage of the tools that work best for our applications. Hard coded interfaces or internal gesture recognition will tie you down and potentially make your application obsolete as things move on.

I'd really appreciate some feedback on this - this is just my take on how to move this forward a little at this stage.

Gesture Definition Markup Language

GDML is a proposed XML dialect that describes how events on the input surface are built up to create distinct gestures.
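Since GDML is only a proposal, there is no official loader for it yet; the following is a minimal sketch of how such definitions might be read using Python's standard-library ElementTree, collecting the constraint elements (`range`, `size`, `duration`) used by the tap gesture shown below. The function name and dict layout are illustrative, not part of any spec.

```python
import xml.etree.ElementTree as ET

GDML = """
<gdml>
  <gesture name="tap">
    <sequence>
      <acquisition type="blob"/>
      <update>
        <range max="10"/>
        <size maxDiameter="10"/>
        <duration max="250"/>
      </update>
    </sequence>
  </gesture>
</gdml>
"""

def load_gestures(xml_text):
    """Parse a GDML document into {gesture name: {constraint: attributes}}."""
    root = ET.fromstring(xml_text)
    gestures = {}
    for gesture in root.iter("gesture"):
        constraints = {}
        for elem in gesture.iter():
            if elem.tag in ("range", "size", "duration"):
                # numeric attributes such as max / maxDiameter
                constraints[elem.tag] = {k: float(v) for k, v in elem.attrib.items()}
        gestures[gesture.get("name")] = constraints
    return gestures

gestures = load_gestures(GDML)
```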

Tap Gesture

<gdml>
  <gesture name="tap">
    <comment>
      A 'tap' is considered to be equivalent to the single click event in a
      normal windows/mouse environment.
    </comment>
    <sequence>
      <acquisition type="blob" />
      <update>
        <range max="10" />
        <size maxDiameter="10" />
        <duration max="250" />
      </update>
      <loss>
        <event name="tap">
          <var name="x" />
          <var name="y" />
          <var name="size" />
        </event>
      </loss>
    </sequence>
  </gesture>
</gdml>

The gesture element defines the start of a gesture, and in this case gives it the name 'tap'.

The sequence element defines the start of a sequence of events that will be tracked. This gesture is considered valid whilst the events sequence remains valid.

The acquisition element defines that an acquisition event should be seen (fingerDown in Touchlib). This tag is designed to be extensible to input events other than blobs, such as fiducial markers, real-world objects or perhaps facial recognition for input systems that are able to distinguish such features.

The update element defines the allowed parameters for the object once acquired. If the defined parameters become invalid during tracking of the gesture, the gesture is no longer valid.

The range element validates that the current X and Y coordinates of the object are within the specified distance of the original X and Y coordinates. range should ultimately support other validations, such as 'min'.

The size element validates that the object diameter is within the specified range. Again, min and other validations of size could be defined. Size allows you to distinguish between finger and palm sized touch events for example.

The duration element defines that the object should only exist for the specified time period (milliseconds). If the touch remains longer than this period, it's not a 'tap', but perhaps a 'move' or 'hold' gesture.

The loss element defines what should occur when the object is lost from the input device.

The event element defines that the gesture library should generate a 'tap' event to the application layer, providing the x, y, and size variables.
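The tap sequence above maps naturally onto a small state machine: acquire, validate on each update, emit on loss. Here is a minimal sketch of such a recognizer; the class and method names are assumptions, not part of any existing library, and the thresholds default to the values in the GDML example.

```python
import math

class TapRecognizer:
    """Sketch of the 'tap' gesture: a blob must stay within max_range px of
    its origin, stay under max_diameter px, and be lost within max_duration
    ms, otherwise the gesture is invalidated."""

    def __init__(self, max_range=10, max_diameter=10, max_duration=250):
        self.max_range = max_range
        self.max_diameter = max_diameter
        self.max_duration = max_duration
        self.origin = None
        self.start_ms = None
        self.valid = False

    def acquire(self, x, y, t_ms):           # <acquisition type="blob"/>
        self.origin, self.start_ms, self.valid = (x, y), t_ms, True

    def update(self, x, y, diameter, t_ms):  # <update> constraints
        if not self.valid:
            return
        dx, dy = x - self.origin[0], y - self.origin[1]
        if (math.hypot(dx, dy) > self.max_range
                or diameter > self.max_diameter
                or t_ms - self.start_ms > self.max_duration):
            self.valid = False

    def loss(self, x, y, size, t_ms):        # <loss>: raise the 'tap' event
        if self.valid and t_ms - self.start_ms <= self.max_duration:
            return {"name": "tap", "x": x, "y": y, "size": size}
        return None

r = TapRecognizer()
r.acquire(100, 100, 0)
r.update(102, 101, 8, 120)       # small drift, small blob, quick: still valid
event = r.loss(102, 101, 8, 150)
```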

Double Tap Gesture

<gesture name="doubleTap">
  <comment>
    A 'doubleTap' gesture is equivalent to the double click event in a normal
    windows/mouse environment.
  </comment>
  <sequence>
    <gestureRef id="A" name="tap" />
    <duration max="250" />
    <gestureRef id="B" name="tap">
      <onEvent name="acquisition">
        <range objects="A,B" max="10" />
      </onEvent>
      <onEvent name="tap">
        <range objects="A,B" max="10" />
        <event name="doubleTap">
          <var name="x" />
          <var name="y" />
          <var name="size" />
        </event>
      </onEvent>
    </gestureRef>
  </sequence>
</gesture>

This example shows how more complex gestures can be built from simple gestures. A double tap gesture is in effect, two single taps with a short space between. The taps should be within a defined range of each other, so that they are not confused with taps in different regions of the display.

Note that the gesture is not considered invalid if a tap is generated in another area of the display. GestureLib will discard it and another tap within the permitted range will complete the sequence.

In the case of double tap, an initial tap gesture is captured. A timer is then evaluated, such that the gesture is no longer valid if the specified duration expires. However, if a second tap is initiated, it is checked to make sure that it is within range of the first. range is provided with references to the objects that need comparing (allowing for other more complex gestures to validate subcomponents of the gesture). This is done at the point of acquisition of the second object.

Once the second tap is complete and the event raised, range is again validated, and an event generated to inform the application of the gesture.
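The core check — second tap within the timeout and within range of the first — can be sketched as a small function. Tap events are assumed to carry a `t_ms` completion timestamp alongside x, y and size; that field and the function name are illustrative.

```python
import math

def is_double_tap(tap_a, tap_b, max_gap_ms=250, max_range=10):
    """Sketch of the doubleTap rule: tap B must complete within max_gap_ms
    of tap A, and land within max_range px of it. Returns the doubleTap
    event dict, or None if the pair does not qualify."""
    in_time = tap_b["t_ms"] - tap_a["t_ms"] <= max_gap_ms
    in_range = math.hypot(tap_b["x"] - tap_a["x"],
                          tap_b["y"] - tap_a["y"]) <= max_range
    if in_time and in_range:
        return {"name": "doubleTap", "x": tap_b["x"], "y": tap_b["y"],
                "size": tap_b["size"]}
    return None

a = {"x": 50, "y": 50, "size": 7, "t_ms": 0}
b = {"x": 53, "y": 52, "size": 7, "t_ms": 200}   # close in space and time
evt = is_double_tap(a, b)
```

A tap in a different region of the display simply returns None here; as described above, GestureLib would discard it rather than invalidate the pending sequence.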

Move Gesture

<gesture name="move">
  <comment>
    A 'move' is considered to be a sustained finger down incorporating
    movement away from the point of origin (with potential return during
    the transition).
  </comment>
  <sequence>
    <acquisition type="blob" />
    <update>
      <range min="5" />
      <event name="move">
        <var name="x" />
        <var name="y" />
        <var name="size" />
      </event>
    </update>
    <loss>
      <event name="moveComplete">
        <var name="x" />
        <var name="y" />
        <var name="size" />
      </event>
    </loss>
  </sequence>
</gesture>

Zoom Gesture

<gesture name="zoom">
  <comment>
    A 'zoom' is considered to be two objects that move towards or away from
    each other in the same plane.
  </comment>
  <sequence>
    <compound>
      <gestureRef id="A" name="move" />
      <gestureRef id="B" name="move" />
    </compound>
    <onEvent name="move">
      <plane objects="A,B" maxVariance="5" />
      <event name="zoom">
        <var name="plane.distance" />
        <var name="plane.centre" />
      </event>
    </onEvent>
    <onEvent name="moveComplete">
      <plane objects="A,B" maxVariance="5" />
      <event name="zoomComplete">
        <var name="plane.distance" />
        <var name="plane.centre" />
      </event>
    </onEvent>
  </sequence>
</gesture>

A zoom gesture is a compound of two move gestures.

The compound element defines that the events occur in parallel rather than series.

The plane element calculates the line between the two objects and checks that the variance of its angle from the initial angle stays within the specified maximum (so you can distinguish between a zoom and a rotate, for example).

'move' events from either object are translated into zoom events to the application.
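The plane check amounts to comparing the angle of the line between the two blobs against its initial value, while reporting the line's length (`plane.distance`) and midpoint (`plane.centre`). A sketch of that geometry, with illustrative function names:

```python
import math

def plane_metrics(a, b):
    """Angle (0-180, since the line is undirected), length and centre of
    the line between two tracked points (a, b are (x, y) tuples)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    return (math.degrees(math.atan2(dy, dx)) % 180,   # line angle
            math.hypot(dx, dy),                        # plane.distance
            ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))    # plane.centre

def check_zoom(initial_a, initial_b, a, b, max_variance=5):
    """True while the line between the blobs keeps (roughly) its initial
    angle, i.e. the fingers move apart/together rather than rotate.
    Returns (is_zoom, plane.distance, plane.centre)."""
    angle0, _, _ = plane_metrics(initial_a, initial_b)
    angle, dist, centre = plane_metrics(a, b)
    diff = abs(angle - angle0)
    variance = min(diff, 180 - diff)  # wrap-around at 0/180 degrees
    return variance <= max_variance, dist, centre

# Two fingers spreading horizontally: angle unchanged, distance grows.
ok, dist, centre = check_zoom((0, 0), (100, 0), (-20, 0), (120, 0))
```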

Rotate Gesture

<gesture name="rotate">
  <comment>
    A 'rotate' is considered to be two objects moving around a central axis.
  </comment>
  <sequence>
    <compound>
      <gestureRef id="A" name="move" />
      <gestureRef id="B" name="move" />
    </compound>
    <onEvent name="move">
      <axis objects="A,B" range="5" />
      <event name="rotate">
        <var name="axis.avgX" />
        <var name="axis.avgY" />
        <var name="axis.angleMax" />
      </event>
    </onEvent>
    <onEvent name="moveComplete">
      <axis objects="A,B" range="5" />
      <event name="rotateComplete">
        <var name="axis.avgX" />
        <var name="axis.avgY" />
        <var name="axis.angleMax" />
      </event>
    </onEvent>
  </sequence>
</gesture>

The axis element calculates the midpoint between the two objects and compares the current position against the initial one.
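In other words: the midpoint must stay put (within the axis range) while the line between the blobs changes angle. A sketch of that check, with the variable names (`axis.avgX`, `axis.avgY`, `axis.angleMax`) taken from the GDML above and the function names assumed:

```python
import math

def axis_metrics(a, b):
    """Midpoint of two tracked points and the angle of the line between
    them (a, b are (x, y) tuples)."""
    avg_x, avg_y = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
    angle = math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
    return avg_x, avg_y, angle

def rotation_event(initial_a, initial_b, a, b, midpoint_range=5):
    """Raise a 'rotate' event if the midpoint stayed within range of its
    initial position while the line between the blobs changed angle."""
    x0, y0, angle0 = axis_metrics(initial_a, initial_b)
    x1, y1, angle1 = axis_metrics(a, b)
    if math.hypot(x1 - x0, y1 - y0) > midpoint_range:
        return None  # axis drifted: more likely a move/zoom than a rotate
    return {"name": "rotate", "axis.avgX": x1, "axis.avgY": y1,
            "axis.angleMax": angle1 - angle0}

# Two fingers spun a quarter turn around the fixed point (50, 0).
evt = rotation_event((0, 0), (100, 0), (50, -50), (50, 50))
```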

GestureLib - A Gesture Recognition Engine

GestureLib does not currently exist!

The purpose of GestureLib is to provide an interface between Touchlib (or any other blob/object tracking software), and the application layer. GestureLib analyses object events generated by Touchlib, and creates Gesture related events to the application for processing.

GestureLib reads gesture definitions written in GDML, then applies pattern matching against those definitions to determine which gestures are in progress.

Why GestureLib?

My feeling is that this functionality should be separated from Touchlib, a) for the sake of clarity, and b) because it's quite likely that working solutions for a high performance multi-touch environment will require distributed processing: one system doing blob tracking, another doing gesture recognition, and a further system for the application. If you can get all of your components within the same machine, then excellent, but modularity gives a great deal of flexibility and scalability.

Proposed Processing

When an object is acquired, GestureLib sends an event to the application layer providing the basic details of the acquired object, such as coordinates and size. The application can then provide context to GestureLib about the gestures that are allowed in this context.

For example, take a photo light table type application. This will have a background canvas (which might support zoom and pan/move gestures), and image objects arranged on the canvas. When the user touches a single photo, the application can inform GestureLib that the applicable gestures for this object are 'tap', 'move' and 'zoom'.

GestureLib now starts tracking further incoming events knowing that for this particular object, only three gestures are possible. Based on the allowable parameters for the defined gestures, GestureLib is then able to determine over time which unique gesture is valid. For example if a finger appears, it could be a tap, move or potentially a zoom if another finger appears. If the finger is quickly released, only a tap gesture is possible (assuming that a move must contain a minimum time or distance parameter). If the finger moves outside the permitted range for a tap, tap can be excluded, and matching continues with only move or zoom. Zoom is invalid until another finger appears, but would have an internal timeout that means the introduction of another finger later in the sequence can be treated as a separate gesture (perhaps another user, or the same user interacting with another part of the application).

Again, the application can be continually advised of touch events so that it can continue to provide context, without needing to do the math to figure out the exact gesture.
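The elimination process described above can be sketched as a pruning function: start from the candidate set the application supplied, then drop any gesture whose constraints the observed events have already violated. Everything here (function name, thresholds, the flat parameter list) is illustrative, not a proposed API.

```python
def prune_candidates(candidates, elapsed_ms, distance, finger_count):
    """Sketch of GestureLib's proposed matching loop for the light-table
    example: given the gestures the application allows for the touched
    object, remove those the observed events have already ruled out."""
    remaining = set(candidates)
    if elapsed_ms > 250 or distance > 10:
        remaining.discard("tap")     # held too long or moved too far for a tap
    if finger_count < 2:
        remaining.discard("zoom")    # zoom needs a second blob
    return remaining

# The application says a photo supports tap, move and zoom; one finger has
# been down 400 ms and drifted 30 px, so only 'move' is still possible.
left = prune_candidates({"tap", "move", "zoom"}, 400, 30, 1)
```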

posted on 2009-05-12 14:41 by zmj · views: 1447 · comments: 0



