It’s been a while since my last update on AIRServer, an ActionScript library that lets you set up a socket server in Adobe AIR and listen to multiple kinds of socket input (native AMF sockets, WebSockets, and P2P input).
It took quite some work, but I finally got it working: I was able to port some of the code from the Bauglir Internet Library (http://www.webnt.eu/index.php) to ActionScript. AIRServer now supports both the hybi-00 and hybi-10 drafts of the WebSocket protocol.
Update: This sample only supports the beta1 SDK on Windows. Please check out http://www.as3nui.com, where further development is happening (newer SDK versions, support for OSX).
I’ve been playing with native extensions for AIR for a couple of weeks now. One of the things I wanted to do was to get the Kinect working through a native extension. I’ve posted some sample libraries before, where the Kinect data was sent to Adobe AIR through UDP sockets. However, the bandwidth is quite limited, and there are noticeable delays.
Using a native extension, we don’t have those limitations anymore. Another advantage is that we don’t need to run a separate program to send the data to our Flash application.
I’ve got multiple-skeleton tracking working, together with the video and the depth video.
You can download the Flash Builder project and the Visual Studio project for the native extension. I didn’t have any C programming experience before, so there’s probably room for improvement on the C side. Currently, the extension is only available for Windows, using the Kinect SDK. Make sure you’ve installed the Microsoft Kinect SDK, as well as Visual Studio. You’ll also need to launch the Flash Builder project using my Ant build script, which is included in the Flash Builder project. You’ll want to update the path to your AIR SDK in ant-debug.xml.
If you want to see it live in action, come check out our session “interREACT with the Flash platform” at the FITC unconference at MAX (Tuesday).
Finally had a chance to play around with the official Kinect SDK. The first thing I wanted to try was linking the Kinect SDK to Adobe AIR (over UDP) and adjusting my previous Kinect demos (built with OSCeleton and the open source drivers) so I could do multitouch with the Kinect (a bit like Minority Report). The official SDK is a lot more stable than the open source drivers, and another plus is that you don’t need to stand in a calibration pose for the skeleton to be tracked. The downside of the official SDK is that it’s Windows-only…
So, I booted up my Windows PC and threw together a UDP bridge in C#, which sends the skeleton information as a JSON-encoded string to a target IP and port. I’ve coded a little adapter in ActionScript 3, which translates the JSON object into TUIO cursors. After that, it was a piece of cake to get multitouch working in Adobe AIR, using the skeleton information from the Kinect.
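The bridge’s actual message schema isn’t shown in this post, but the core idea of the adapter is simple: each tracked hand joint becomes a normalized (0..1) TUIO-style cursor. Here is a hypothetical Python sketch of that translation; the field names (`id`, `hands`, `x`, `y`) and the assumed [-1, 1] skeleton-space range are my own, not the real bridge’s format:

```python
import json

def to_cursors(message: str) -> list[dict]:
    """Translate a JSON skeleton message into TUIO-style normalized cursors.

    Assumes (hypothetically) that hand coordinates arrive in [-1, 1]
    skeleton space; TUIO 2D cursors use [0, 1] with the origin top-left.
    """
    skeleton = json.loads(message)
    cursors = []
    for i, hand in enumerate(skeleton["hands"]):
        cursors.append({
            "session_id": skeleton["id"] * 2 + i,  # one cursor per hand
            "x": (hand["x"] + 1) / 2,              # map [-1, 1] -> [0, 1]
            "y": 1 - (hand["y"] + 1) / 2,          # flip y for top-left origin
        })
    return cursors

msg = '{"id": 0, "hands": [{"x": 0.0, "y": 0.0}, {"x": 1.0, "y": -1.0}]}'
print(to_cursors(msg))
```

Once skeleton joints look like ordinary TUIO cursors, any existing multitouch code path in AIR can consume them without knowing a Kinect is involved.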
When the game is running, you can control it through WebSockets as well, using the HTML5 client in the repository. Deploy the client on a web server, and access it with an iPhone 4 / iPod Touch 2 or iPad to control the game.
I wrote this game on top of my AIRServer library, as part of some multi-user explorations with the Flash platform. The graphics were created by my HOWEST colleague Jynse Cremers :-)
Users can join a 30-second space game. Each user controls a spaceship with their smartphone. The game itself was built with Adobe AIR and the AIRServer ActionScript library. To join the game, users can choose between a native application built with Adobe AIR or an HTML5/JS WebSocket client in their browser.
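The wire protocol between the phones and the game server isn’t documented in this post. As a purely hypothetical sketch of how such a game might frame its messages so that both the native client and the HTML5/JS client can speak the same language, here is a tiny JSON envelope with `join` and `steer` message types (the names are mine, not AIRServer’s):

```python
import json

def encode(msg_type: str, payload: dict) -> str:
    """Frame a game message as a one-line JSON envelope."""
    return json.dumps({"type": msg_type, "data": payload})

def decode(raw: str) -> tuple:
    """Unpack an envelope back into (type, payload)."""
    envelope = json.loads(raw)
    return envelope["type"], envelope["data"]

# A phone joining the game, then steering its ship:
print(decode(encode("join", {"name": "player1"})))
print(decode(encode("steer", {"angle": 90})))
```

Using JSON for the envelope keeps the protocol trivially portable: the browser client gets it for free via `JSON.parse`, and AIR can decode the same strings on the server side.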
I’ll release the source files after some cleaning up, so you can play around with them as well!
Since Adobe AIR 2.0, you can create your own socket servers using the flash.net.ServerSocket class. This makes it easy to hook external applications into your AIR application, for example to create a game (server) with multiple devices as game controllers (socket clients).
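The flow in AIR is: bind a port, start listening, then handle each incoming client when `ServerSocket` dispatches a connect event carrying the client socket. The same shape, sketched in Python as a rough analogy (this is not the AIR API, just the equivalent bind/listen/accept sequence):

```python
import socket
import threading

def start_server() -> tuple:
    """Bind and listen, like ServerSocket.bind() + ServerSocket.listen() in AIR."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
    server.listen()
    return server, server.getsockname()[1]

def handle_one_client(server: socket.socket) -> str:
    """Accept one connection (the connect event in AIR) and read its input."""
    client, _ = server.accept()
    data = client.recv(1024)  # e.g. a controller command from a device
    client.close()
    return data.decode("utf-8")

# Usage: a "game controller" client connects and sends a command.
server, port = start_server()
result = []
worker = threading.Thread(target=lambda: result.append(handle_one_client(server)))
worker.start()
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"move-left")
client.close()
worker.join()
server.close()
print(result[0])  # -> move-left
```

In AIR you would instead keep each client socket from the connect event in a list, so the game can broadcast state back to every connected controller.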