
Gesture recognition

LINK: http://wiki.nuigroup.com/Gesture_recognition


Background

Touchlib does a fine job of picking out contacts within the input surface. At the moment, there is no formal way of defining how those blob contacts are translated into intended user actions. Some of this requires application assistance to provide context, but some of it is down to pattern matching the appearance, movement and loss of individual blobs or combined blobs.

What I'm attempting to describe here, and for others to contribute to, is a way of:


  1. Describing a standard library of gestures suitable for the majority of applications
  2. Providing a library of code that supports the defined gestures and generates events to the application layer


Using an XML dialect to describe gestures means that individual applications can specify their range of supported gestures to the Gesture Engine. Custom gestures can be supported.

By loosely coupling gesture recognition to the application, we can allow people to build different types of input device and plug them all into the same applications where appropriate.

In the early stages of development, we are all doing our own thing with minimal overlap. Over time we will realise the benefits of various approaches, and by using some standardised interfaces, we can mix and match to take advantage of the tools that work best for our applications. Hard coded interfaces or internal gesture recognition will tie you down and potentially make your application obsolete as things move on.

I'd really appreciate some feedback on this - this is just my take on how to move this forward a little at this stage.

Gesture Definition Markup Language

GDML is a proposed XML dialect that describes how events on the input surface are built up to create distinct gestures.

Tap Gesture

<gdml>
  <gesture name="tap">
    <comment>
      A 'tap' is considered to be equivalent to the single click event in a
      normal windows/mouse environment.
    </comment>
    <sequence>
      <acquisition type="blob" />
      <update>
        <range max="10" />
        <size maxDiameter="10" />
        <duration max="250" />
      </update>
      <loss>
        <event name="tap">
          <var name="x" />
          <var name="y" />
          <var name="size" />
        </event>
      </loss>
    </sequence>
  </gesture>
</gdml>

The gesture element defines the start of a gesture, and in this case gives it the name 'tap'.

The sequence element defines the start of a sequence of events that will be tracked. The gesture is considered valid whilst the event sequence remains valid.

The acquisition element defines that an acquisition event should be seen (fingerDown in Touchlib). This tag is designed to be extensible to input events other than blobs, such as fiducial markers, real-world objects, or perhaps facial recognition for input systems that are able to distinguish such features.

The update element defines the allowed parameters for the object once acquired. If the defined parameters become invalid during tracking of the gesture, the gesture is no longer valid.

The range element validates that the current X and Y coordinates of the object are within the specified distance of the original X and Y coordinates. range should ultimately support other validations, such as 'min'.

The size element validates that the object diameter is within the specified range. Again, min and other validations of size could be defined. Size allows you to distinguish between finger and palm sized touch events for example.

The duration element defines that the object should only exist for the specified time period (milliseconds). If the touch remains longer than this period, it's not a 'tap', but perhaps a 'move' or 'hold' gesture.

The loss element defines what should occur when the object is lost from the input device.

The event element defines that the gesture library should generate a 'tap' event to the application layer, providing the x, y, and size variables.
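To make the sequence above concrete, here is a minimal sketch of how a recognizer might evaluate the tap constraints. The `TapTracker` class, its method names, and the event tuples are assumptions for illustration; GestureLib does not yet exist, so this is not an existing API.

```python
import math

class TapTracker:
    """Tracks one blob and decides, on loss, whether it was a valid 'tap'."""

    def __init__(self, max_range=10, max_diameter=10, max_duration=250):
        self.max_range = max_range        # <range max="10" />
        self.max_diameter = max_diameter  # <size maxDiameter="10" />
        self.max_duration = max_duration  # <duration max="250" /> in ms
        self.valid = True

    def acquire(self, x, y, size, t):
        # <acquisition type="blob" />: remember the origin and start time.
        self.x0, self.y0, self.t0 = x, y, t
        self.x, self.y, self.size = x, y, size

    def update(self, x, y, size, t):
        # <update>: any violated constraint invalidates the gesture for good.
        if math.hypot(x - self.x0, y - self.y0) > self.max_range:
            self.valid = False
        if size > self.max_diameter:
            self.valid = False
        if t - self.t0 > self.max_duration:
            self.valid = False
        self.x, self.y, self.size = x, y, size

    def loss(self, t):
        # <loss>: emit the 'tap' event only if every constraint held throughout.
        if self.valid and t - self.t0 <= self.max_duration:
            return ("tap", self.x, self.y, self.size)
        return None
```

A quick press-and-release within range produces a tap event; a blob that drifts too far or lingers too long produces nothing.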

Double Tap Gesture

<gesture name="doubleTap">
  <comment>
    A 'doubleTap' gesture is equivalent to the double click event in a normal
    windows/mouse environment.
  </comment>
  <sequence>
    <gestureRef id="A" name="tap" />
    <duration max="250" />
    <gestureRef id="B" name="tap">
      <onEvent name="acquisition">
        <range objects="A,B" max="10" />
      </onEvent>
      <onEvent name="tap">
        <range objects="A,B" max="10" />
        <event name="doubleTap">
          <var name="x" />
          <var name="y" />
          <var name="size" />
        </event>
      </onEvent>
    </gestureRef>
  </sequence>
</gesture>

This example shows how more complex gestures can be built from simple gestures. A double tap gesture is, in effect, two single taps with a short space between them. The taps should be within a defined range of each other, so that they are not confused with taps in different regions of the display.

Note that the gesture is not considered invalid if a tap is generated in another area of the display. GestureLib will discard it and another tap within the permitted range will complete the sequence.

In the case of double tap, an initial tap gesture is captured. A timer is then evaluated, such that the gesture is no longer valid if the specified duration expires. However, if a second tap is initiated, it is checked to make sure that it is within range of the first. range is provided with references to the objects that need comparing (allowing for other more complex gestures to validate subcomponents of the gesture). This is done at the point of acquisition of the second object.

Once the second tap is complete and the event raised, range is again validated, and an event generated to inform the application of the gesture.
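The composition above can be sketched as a function that takes two completed tap events and applies the duration and range checks. The function name and event tuples are assumptions for this example, not part of any existing library.

```python
import math

def match_double_tap(tap_a, tap_b, max_gap_ms=250, max_range=10):
    """tap_a / tap_b: (x, y, size, completion_time_ms) of two tap gestures."""
    ax, ay, _, at = tap_a
    bx, by, _, bt = tap_b
    if bt - at > max_gap_ms:                      # <duration max="250" />
        return None
    if math.hypot(bx - ax, by - ay) > max_range:  # <range objects="A,B" max="10" />
        return None
    return ("doubleTap", bx, by)
```

A tap in another area of the display simply fails the range check and, as described above, would be discarded rather than invalidating the sequence.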

Move Gesture

<gesture name="move">
  <comment>
    A 'move' is considered to be a sustained finger down incorporating
    movement away from the point of origin (with potential return during
    the transition).
  </comment>
  <sequence>
    <acquisition type="blob" />
    <update>
      <range min="5" />
      <event name="move">
        <var name="x" />
        <var name="y" />
        <var name="size" />
      </event>
    </update>
    <loss>
      <event name="moveComplete">
        <var name="x" />
        <var name="y" />
        <var name="size" />
      </event>
    </loss>
  </sequence>
</gesture>
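A rough sketch of this sequence: updates inside the minimum range emit nothing, but once the blob travels at least 5 units from its origin each update yields a 'move' event, and loss of the blob yields 'moveComplete'. The class and its names are illustrative assumptions, not an existing API.

```python
import math

class MoveTracker:
    def __init__(self, min_range=5):
        self.min_range = min_range   # <range min="5" />
        self.started = False

    def acquire(self, x, y, size):
        self.x0, self.y0 = x, y
        self.x, self.y, self.size = x, y, size

    def update(self, x, y, size):
        self.x, self.y, self.size = x, y, size
        # Only movement beyond the minimum range counts as a 'move'.
        if math.hypot(x - self.x0, y - self.y0) >= self.min_range:
            self.started = True
            return ("move", x, y, size)
        return None

    def loss(self):
        # <loss>: report the final position once the finger lifts.
        if self.started:
            return ("moveComplete", self.x, self.y, self.size)
        return None
```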

Zoom Gesture

<gesture name="zoom">
  <comment>
    A 'zoom' is considered to be two objects that move towards or away from
    each other in the same plane.
  </comment>
  <sequence>
    <compound>
      <gestureRef id="A" name="move" />
      <gestureRef id="B" name="move" />
    </compound>
    <onEvent name="move">
      <plane objects="A,B" maxVariance="5" />
      <event name="zoom">
        <var name="plane.distance" />
        <var name="plane.centre" />
      </event>
    </onEvent>
    <onEvent name="moveComplete">
      <plane objects="A,B" maxVariance="5" />
      <event name="zoomComplete">
        <var name="plane.distance" />
        <var name="plane.centre" />
      </event>
    </onEvent>
  </sequence>
</gesture>

A zoom gesture is a compound of two move gestures.

The compound element defines that the events occur in parallel rather than series.

The plane element calculates the line between the two objects, and checks that the variance in its angle from the initial angle stays within the specified maximum (so you can distinguish between a zoom and a rotate, for example).

'move' events from either object are translated into zoom events to the application.

Rotate Gesture

<gesture name="rotate">
  <comment>
    A 'rotate' is considered to be two objects moving around a central axis.
  </comment>
  <sequence>
    <compound>
      <gestureRef id="A" name="move" />
      <gestureRef id="B" name="move" />
    </compound>
    <onEvent name="move">
      <axis objects="A,B" range="5" />
      <event name="rotate">
        <var name="axis.avgX" />
        <var name="axis.avgY" />
        <var name="axis.angleMax" />
      </event>
    </onEvent>
    <onEvent name="moveComplete">
      <axis objects="A,B" range="5" />
      <event name="rotateComplete">
        <var name="axis.avgX" />
        <var name="axis.avgY" />
        <var name="axis.angleMax" />
      </event>
    </onEvent>
  </sequence>
</gesture>

The axis element calculates the midpoint between the two objects and compares its current position against the initial position.
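A sketch of that axis computation, under the same caveats as the earlier examples (the names are illustrative, not an existing API): the midpoint must stay within the allowed drift of its initial position, and the change in the line's angle gives the rotation amount.

```python
import math

def axis_state(a, b):
    """Midpoint and line angle (degrees) for two points a=(x,y), b=(x,y)."""
    mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    angle = math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
    return mid, angle

def rotate_event(initial_a, initial_b, a, b, max_drift=5):
    mid0, angle0 = axis_state(initial_a, initial_b)
    mid, angle = axis_state(a, b)
    # <axis objects="A,B" range="5" />: the axis itself must stay put.
    if math.hypot(mid[0] - mid0[0], mid[1] - mid0[1]) > max_drift:
        return None   # axis drifted: not a pure rotate
    return ("rotate", mid[0], mid[1], angle - angle0)
```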

GestureLib - A Gesture Recognition Engine

GestureLib does not currently exist!

The purpose of GestureLib is to provide an interface between Touchlib (or any other blob/object tracking software), and the application layer. GestureLib analyses object events generated by Touchlib, and creates Gesture related events to the application for processing.

GestureLib reads gesture definitions written in GDML, then applies pattern matching against those definitions to determine which gestures are in progress.
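As a minimal sketch of the loading step, the GDML examples above parse cleanly with a standard XML parser. The attribute names follow the 'tap' example; the loader itself is hypothetical, since GestureLib does not yet exist.

```python
import xml.etree.ElementTree as ET

# A cut-down copy of the 'tap' definition from earlier in this page.
GDML = """
<gdml>
  <gesture name="tap">
    <sequence>
      <acquisition type="blob"/>
      <update>
        <range max="10"/>
        <size maxDiameter="10"/>
        <duration max="250"/>
      </update>
    </sequence>
  </gesture>
</gdml>
"""

def load_gestures(gdml_text):
    """Extract per-gesture update constraints from a GDML document."""
    root = ET.fromstring(gdml_text)
    gestures = {}
    for g in root.iter("gesture"):
        upd = g.find("./sequence/update")
        gestures[g.get("name")] = {
            "max_range": float(upd.find("range").get("max")),
            "max_diameter": float(upd.find("size").get("maxDiameter")),
            "max_duration": float(upd.find("duration").get("max")),
        }
    return gestures
```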

Why GestureLib?

My feeling is that this functionality should be separated from Touchlib, a) for the sake of clarity, and b) because it's quite likely that working solutions for a high-performance multi-touch environment will require distributed processing, i.e. one system doing blob tracking, another doing gesture recognition, and a further system running the application. If you can fit all of your components on the same machine, then excellent, but modularity gives a great deal of flexibility and scalability.

Proposed Processing

When an object is acquired, GestureLib sends an event to the application layer providing the basic details of the acquired object, such as coordinates and size. The application can then provide context to GestureLib about the gestures that are allowed in this context.

For example, take a photo light table type application. This will have a background canvas (which might support zoom and pan/move gestures), and image objects arranged on the canvas. When the user touches a single photo, the application can inform GestureLib that the applicable gestures for this object are 'tap', 'move' and 'zoom'.

GestureLib now starts tracking further incoming events knowing that for this particular object, only three gestures are possible. Based on the allowable parameters for the defined gestures, GestureLib is then able to determine over time which unique gesture is valid. For example if a finger appears, it could be a tap, move or potentially a zoom if another finger appears. If the finger is quickly released, only a tap gesture is possible (assuming that a move must contain a minimum time or distance parameter). If the finger moves outside the permitted range for a tap, tap can be excluded, and matching continues with only move or zoom. Zoom is invalid until another finger appears, but would have an internal timeout that means the introduction of another finger later in the sequence can be treated as a separate gesture (perhaps another user, or the same user interacting with another part of the application).

Again, the application can be continually advised of touch events so that it can continue to provide context, without needing to do the math to figure out the exact gesture.
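The elimination process just described can be sketched as pruning a candidate set as events arrive. The event names, thresholds, and function are assumptions chosen to mirror the photo light-table example, not a specification.

```python
def prune_candidates(candidates, event, elapsed_ms, distance):
    """candidates: set of gesture names for this object.
    event: 'update', 'loss', or 'secondFinger'.
    elapsed_ms: time since acquisition; distance: travel from origin."""
    remaining = set(candidates)
    if distance > 10:
        remaining.discard("tap")      # moved outside the permitted tap range
    if event == "loss" and elapsed_ms <= 250 and distance <= 10:
        remaining &= {"tap"}          # quick release: only a tap still fits
    if event == "secondFinger":
        remaining.discard("tap")      # two fingers: a tap is no longer possible
    return remaining
```

Each incoming event narrows the set until a single gesture remains, at which point GestureLib can commit to it and start raising that gesture's events.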

posted on 2009-05-12 14:41 by zmj

