
            Touchlib Homepage

            http://www.whitenoiseaudio.com/touchlib/


            What is Touchlib?

            Touchlib is our library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light for you and sends your programs multi-touch events such as 'finger down', 'finger moved', and 'finger released'. It includes a configuration app and a few demos to get you started. It interfaces with most major types of webcams and video capture devices. It currently works only under Windows, but efforts are being made to port it to other platforms.

            Who Should Use Touchlib?

            Touchlib only comes with simple demo applications; if you want to use touchlib, you must be prepared to write your own apps. There are a few ways to do this. You can build applications in C++ and take advantage of touchlib's simple programming interface. Touchlib does not provide you with any graphical or front-end abilities - it simply passes you touch events. The graphics are up to you. If you like, take a look at the example apps, which use OpenGL and GLUT.

            If you don't want to have to compile touchlib, binaries are available.

            As of the current version, touchlib can broadcast events using the TUIO protocol (which is built on OSC). This makes touchlib compatible with several other applications that support the protocol, such as vvvv, Processing, PureData, etc. It also makes it possible to use touchlib for blob detection / tracking while something like vvvv or Processing is used to write the application. Of course, the other option is to do all your blob detection and processing in vvvv or Processing; it's up to you. Supporting the TUIO protocol also enables a distributed architecture where one machine can be devoted to detection and tracking and another machine can handle the application.
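To make the TUIO mention concrete, the sketch below models the fields carried by a TUIO 1.0 "/tuio/2Dcur set" message - the per-finger cursor update that a tracker broadcasts over OSC. The struct and helper are illustrative stand-ins, not touchlib or oscpack code.

```cpp
#include <cassert>

// Fields of a TUIO 1.0 "/tuio/2Dcur set" message (one per tracked finger).
// Positions are normalized to [0,1]; the session id stays stable for as
// long as a finger remains on the surface, which is how clients follow it.
struct TuioCursor {
    int   sessionId; // s: unique id for the lifetime of this touch
    float x, y;      // x, y: normalized position, 0..1
    float vx, vy;    // X, Y: velocity components
    float accel;     // m: motion acceleration
};

// Map a normalized TUIO position onto a display of w x h pixels - the
// kind of conversion a client app does before drawing or hit-testing.
void toPixels(const TuioCursor& c, int w, int h, int& px, int& py) {
    px = static_cast<int>(c.x * w);
    py = static_cast<int>(c.y * h);
}
```

Because positions are normalized, the same tracker output works unchanged whether the client renders at 800x600 or 1920x1080, which is part of what makes the distributed tracker/application split practical.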

            If you don't like touchlib and want to program your own system, the latest version of OpenCV (1.0) now has support for blob detection and tracking. This might be a good starting point.
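If you do roll your own system, the core of blob detection is connected-component labeling of the thresholded camera image. The toy example below finds 4-connected bright regions in a small binary image; it is a self-contained sketch of that one step, not OpenCV's (or touchlib's) actual implementation, which adds thresholding, filtering, and frame-to-frame tracking on top.

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// One detected blob: its bounding box and pixel count.
struct Blob { int minX, minY, maxX, maxY, area; };

// Label 4-connected regions of nonzero pixels with an explicit
// flood-fill stack; returns one Blob per region, in scan order.
std::vector<Blob> findBlobs(const std::vector<std::vector<int>>& img) {
    int h = static_cast<int>(img.size());
    int w = h ? static_cast<int>(img[0].size()) : 0;
    std::vector<std::vector<int>> seen(h, std::vector<int>(w, 0));
    std::vector<Blob> blobs;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            if (!img[y][x] || seen[y][x]) continue;
            Blob b{x, y, x, y, 0};
            std::vector<std::pair<int, int>> stack{{x, y}};
            seen[y][x] = 1;
            while (!stack.empty()) {
                auto [cx, cy] = stack.back();
                stack.pop_back();
                b.area++;
                b.minX = std::min(b.minX, cx); b.maxX = std::max(b.maxX, cx);
                b.minY = std::min(b.minY, cy); b.maxY = std::max(b.maxY, cy);
                const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
                for (int d = 0; d < 4; ++d) {
                    int nx = cx + dx[d], ny = cy + dy[d];
                    if (nx >= 0 && nx < w && ny >= 0 && ny < h &&
                        img[ny][nx] && !seen[ny][nx]) {
                        seen[ny][nx] = 1;
                        stack.push_back({nx, ny});
                    }
                }
            }
            blobs.push_back(b);
        }
    return blobs;
}
```

A tracker then matches each frame's blobs against the previous frame's (typically by nearest centroid) to decide which are 'finger moved' events and which are new or released fingers.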

            My Mindmap

            My mindmap for the touchscreen is available here. It contains info on which parts you'll need to construct the screen, where to find them, and some very basic instructions for how to build one. It also includes some more links. I hope it's useful for some of the people reading this who are interested in building their own screens. You'll need Freemind (which is, coincidentally, free) in order to view it. I'm a big fan of Freemind for planning out projects and getting ideas down: its hierarchical nature lets you organize and hide the parts you are not interested in, and it can also link to images, other mindmaps, and web pages.

            FAQ

            Frequently asked questions about the construction of the screen can be found here.

            Where to get the source to Touchlib, our multitouch table library:

            All our source code is available on our Google Code site at http://code.google.com/p/touchlib/ . You can access the repository using Subversion. If you are using Windows, get TortoiseSVN. Use Tortoise to access the repository and download all the files (much easier than going through the web interface). If you are interested in porting touchlib to Linux or the Mac, please email me. The system was written in such a way that it should be easy to port and does not depend heavily on any Windows-specific APIs.

            Binaries are available here.

            Touchlib is written in C++ (the blob tracking / analysis is all written by yours truly) and has a Visual Studio 2005 solution ready to compile. No docs are available right now and it's Windows-only (though it should be possible to make everything compile under other OSes with a little work). It currently depends on OpenCV, DirectShow (you'll need the Microsoft Platform SDK), VideoWrapper, and DSVideoLib. The source code includes our main library, which you can link into your application to start capturing touch events. It has support for most major camera/webcam types. It also includes a basic config app, which needs to be run in order to calibrate your camera, plus a couple of example apps. Alternatively, I've heard other people have used things like vvvv, EyesWeb, Processing, and Max/MSP to do blob tracking / processing and build applications. You can check out some of the demo apps if you want to see how it works. Pong or the config app should be fairly easy to follow. Setting up a bare-minimum multitouch app should only take a dozen lines of code or less.
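A bare-minimum client could look like the sketch below. Every identifier here (ITouchListener, TouchData, the three callbacks) is a hypothetical stand-in paraphrased from the event names described above - check the actual touchlib headers on Google Code for the real interface before writing against it.

```cpp
#include <string>
#include <vector>

// Hypothetical stand-ins for touchlib's types; the real library defines
// its own touch-data struct and listener interface with possibly
// different names and signatures.
struct TouchData {
    int   id;   // finger id, stable while the finger stays down
    float x, y; // normalized position on the surface
};

class ITouchListener {
public:
    virtual ~ITouchListener() = default;
    virtual void fingerDown(const TouchData& d)   = 0;
    virtual void fingerUpdate(const TouchData& d) = 0;
    virtual void fingerUp(const TouchData& d)     = 0;
};

// Application side: implement the three callbacks touchlib delivers.
// Here we just record each event; a real app would draw or hit-test.
class LoggingListener : public ITouchListener {
public:
    std::vector<std::string> log;
    void fingerDown(const TouchData& d) override {
        log.push_back("down " + std::to_string(d.id));
    }
    void fingerUpdate(const TouchData& d) override {
        log.push_back("move " + std::to_string(d.id));
    }
    void fingerUp(const TouchData& d) override {
        log.push_back("up " + std::to_string(d.id));
    }
};
```

In the real library you would register the listener with the touch screen device and pump its update loop each frame; the graphics layer (OpenGL, GLUT, or anything else) remains entirely your choice, as noted above.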

            DL Links for dependencies:

            You'll need to configure a few environment variables to get everything compiled. They are:

            • DSVL_HOME - dsvideolib root directory
            • VIDEOWRAPPER_HOME - root directory of the video wrapper library
            • OPENCV_HOME - root directory of OpenCV
            • OSCPACK_HOME - root directory of oscpack
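On Windows, the four variables can be set from a command prompt (or under System Properties > Environment Variables) before opening the Visual Studio solution. The paths below are examples only - point each one at wherever you actually unpacked the dependency.

```shell
rem Example values; substitute your own install locations.
set DSVL_HOME=C:\libs\dsvideolib
set VIDEOWRAPPER_HOME=C:\libs\videowrapper
set OPENCV_HOME=C:\Program Files\OpenCV
set OSCPACK_HOME=C:\libs\oscpack
```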

            The config app

            In order to calibrate the touchlib library for your camera and projector, you'll need to run the config app. Here's how it works:

            • Set up your computer so that the main monitor is the video projector, so that the app comes up on that screen, then run the config app.
            • Press 'b' at any time to recapture the background.
            • Tweak the sliders until you get the desired results. The last step (rectify) should show only light coming from your fingertips (no background noise, etc.).
            • When you are satisfied, press 'enter'. This launches the app in full-screen mode and you'll see a grid of points (green pluses).
            • Press 'c' to start calibrating. The current point should turn red. Press on your FTIR screen where the point is. Hopefully a press is detected (you can check by looking in the debug window). Note that the screen may not indicate where you are pressing.
            • Press 'space' to calibrate the next point, and continue until all points are calibrated.
            • When you are all done, press 'ESC' to quit.

            All your changes (slider tweaks and calibration points) are saved to the config.xml file, so any touchlib app you run afterwards will be calibrated. Note that any change to where the projector is pointing, or to your webcam, will require a re-calibration.

            Testing

            Alternate config files are available if you want to test the library using an .AVI for input (instead of the webcam). Replace the config.xml with 5point_avi.xml or 2point_avi.xml. You can edit those files to use a different AVI if you like (you can record a new one using your camera - but you may need to tweak some of the other settings in the config too).

            Links

            NEW: We now have an official community site for building FTIR tables. Access the site here. It includes forums, a wiki, news, and more.

            Other tables and info.

            Other

            IRC: #ftir on irc.freenode.net

            posted on 2009-05-07 17:00 by zmj
