Wouter Verweirder

My Personal Blog

AIR Native Extension for Simple Speech Recognition on OSX

Today I played around a bit with speech recognition on OSX. One of the things I did was create a simple bridge between Adobe AIR and the NSSpeechRecognizer API on OSX. This API allows you to set a predefined list of commands and listen for those spoken commands.

The API is quite simple, and so is the native extension API. After including the ANE file, you’ll create an instance of the bridge:

var nsSpeechRecognizerBridge:NSSpeechRecognizerBridge = new NSSpeechRecognizerBridge();

After that, you add a list of valid commands:

nsSpeechRecognizerBridge.setCommands(Vector.<String>([
    "square",
    "circle",
]));

You add an event handler, which is triggered when a command is recognized:

nsSpeechRecognizerBridge.addEventListener(CommandRecognizedEvent.COMMAND_RECOGNIZED, commandRecognizedHandler);

And you start the recognizer:

nsSpeechRecognizerBridge.startListening();

This will open the OSX speech recognition widget with your AIR application. The event handler is triggered when the bridge recognizes one of the commands. The CommandRecognizedEvent object will contain the command that was recognized:

protected function commandRecognizedHandler(event:CommandRecognizedEvent):void
{
    trace("command recognized: " + event.command);
}

Note that this built-in speech recognition engine is quite sensitive to background noise, and only recognizes US-English spoken words.

Everything (as3 source, native source, demo & ane) is on github for your coding pleasure. Enjoy!

OpenCV AIR Native Extension

In preparation for my FITC session last month, I wrote a native extension to integrate OpenCV in your AIR applications. It uses OpenCV in a multithreaded extension to execute Haar cascades against a bitmap you provide from ActionScript. This bitmap could be a snapshot of your webcam image, but could also be data from other sources (e.g. a Kinect camera snapshot).

I had quite a hard time getting OpenCV to compile and integrating it on my machine, and I don’t know if it’ll work on other people’s computers. It only works on OSX; my system was OSX 10.7 (Lion). After a lot of trying, I managed to compile OpenCV as static libraries and could integrate them in a native extension package.

The extension ID is be.aboutme.nativeExtensions.opencv.OpenCV. It’s event based, and will execute the heavy Haar detection code in a separate thread, so your ActionScript project doesn’t lock up. First of all, you’ll create an instance of the extension, add a detection listener and load a Haar cascade xml file:

openCV = new OpenCV();
openCV.addEventListener(DetectionEvent.DETECTION_UPDATE, detectionUpdateHandler);
openCV.loadCascade("/Users/wouter/haarcascades/haarcascade_frontalface_alt2.xml");

You will also need to send bitmap data to the extension. In this example, I’m sending a bitmap snapshot of the webcam image to the extension, using the updateImage method. You can also set minimum & maximum sizes for the detection areas, so areas smaller than the minimum size or larger than the maximum size are ignored. I recommend supplying a minimum size for the detection (for example: a face must be at least 40×40 pixels), as this will improve the performance of the application:

bmpData.draw(video);
openCV.updateImage(bmpData, minSize, maxSize);
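To make the size filtering concrete, here is a small illustrative sketch (Python, purely to picture the rule; it is not the extension’s actual code): rectangles outside the configured bounds are dropped before they ever reach your event handler.

```python
# Illustrative sketch (not the extension's code): drop detection
# rectangles that fall outside the configured size bounds.
def filter_detections(rects, min_size, max_size):
    """rects: list of (x, y, w, h); min_size / max_size: (w, h) tuples."""
    return [
        (x, y, w, h)
        for (x, y, w, h) in rects
        if min_size[0] <= w <= max_size[0] and min_size[1] <= h <= max_size[1]
    ]

detections = [(10, 10, 20, 20), (50, 50, 60, 60), (0, 0, 300, 300)]
# with a 40x40 minimum and a 200x200 maximum, only the middle face survives
print(filter_detections(detections, (40, 40), (200, 200)))
```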

In the event handler, you’ll get the detected areas via the event object that is provided:

protected function detectionUpdateHandler(event:DetectionEvent):void
{
    for each(var r:Rectangle in event.rectangles)
    {
        //draw rectangles here
    }
}

I’ve committed everything (native, as3 code & demo) to a github repo: https://github.com/wouterverweirder/AIR-OpenCV-Extension

Again, it’s a mac-only extension, and it will need more testing to see whether my OpenCV build runs on other macs :–)

FITC Amsterdam – Slides & Demo Videos of Opencv & Finger Tracking Air Native Extensions

Just got back from the FITC Amsterdam conference, and it’s been awesome! I gave a Kinect workshop on Sunday, together with my Happy Banana & HOWEST colleague Koen De Weggheleire. Apart from a longer-than-expected installation process of the Kinect drivers on mac, we had a great time showing the attendees how they can use the AIRKinect native extension to use the Kinect inside their AIR applications.

You can read some of the attendees’ impressions on their blogs:
http://www.aloft.nl/2012/02/hack-slash-play-with-kinect-at-fitc-amsterdam-2012/
http://thomasantonbinder.com/fitc-2012-day-1-workshop-kinect

On Tuesday, I gave a presentation on AIR native extensions. I had a 90-minute timeslot, and was a bit unsure how I would balance the talk. In the first half of the talk, I covered programming & packaging native extensions, and showed some extensions I created.

I was happy I could get OpenCV working as an AIR native extension, and showed a quick demo of executing Haar cascades through OpenCV in an AIR native extension:

The second half of the talk was about the new version of AIRKinect we’ll be releasing shortly. I did some quick live-code demos, showing how we created the API especially for ActionScript 3.0. Next to the features we’re bringing in AIRKinect 2, I could show how you can combine the Kinect drivers & OpenCV in an AIR native extension to do finger tracking in your AIR application :–)

I had a great time at FITC, and would like to thank Shawn & the FITC crew for giving me the opportunity to present at the conference! We were asked to create a “thank you” video, which was shown at the end of the conference. Check out the making-of-blooper-version Koen and I made :–)

Access the Mac Sudden Motion Sensor With an AIR Native Extension

AIR on mobile enables you to access the accelerometer of the mobile device. But what about the motion sensor in your MacBook / MacBook Pro on the desktop? All MacBooks come with a “sudden motion sensor”, which parks the hard disk heads when the laptop moves too much.

I wrote an AIR native extension to access this sensor information in AIR on OSX. I tried to mimic the Accelerometer API as much as possible, and to make it easy to use in your AIR application.

You can check if the SuddenMotionSensor is supported on your mac, set the update interval and listen for accelerometer events:

if(SuddenMotionSensor.isSupported)
{
    suddenMotionSensor = new SuddenMotionSensor();
    suddenMotionSensor.setRequestedUpdateInterval(50);
    suddenMotionSensor.addEventListener(AccelerometerEvent.UPDATE, accelerometerUpdateHandler, false, 0, true);
}

You can then access the accelerometer info in the event handler:

private function accelerometerUpdateHandler(event:AccelerometerEvent):void
{
    trace(event.accelerationX, event.accelerationY, event.accelerationZ);
}

You can download the sources & demos on github: https://github.com/wouterverweirder/AIR-Sudden-Motion-Sensor-Extension. Happy coding!

Accessing the Kinect in Javascript Through Websockets

Good morning all! (or evening, night, … depending on when you read this post of course). As you might know, I’ve been working on AIRKinect (as3nui.com) and I’ve got a side project AIRServer as well (which allows you to set up AIR as a socket server, including websocket support).

Wouldn’t it be fun to combine these two projects in a demo, so you can access the Kinect information through a websocket? That’s exactly what I did. You run a desktop application on your computer, which is responsible for accessing the Kinect and exposing the skeleton information over a websocket. Using a JavaScript client that supports websockets, you can connect to that server and use the skeleton information in the browser :–)

JavaScript displaying Kinect skeleton information

In this demo, I’m just rendering the skeleton points in a canvas element, using three.js.

I’ve uploaded the sources and included binary installers for the desktop application (Windows 7, OSX Lion). What you’ll need to do is install & launch the desktop application, and click the “start server” button to listen for websocket connections on the given port. Make sure you’ve got the Kinect SDK installed on your computer (Windows) or OpenNI on OSX.

Using the JavaScript client, you connect to your IP (if you’re testing on the same machine, 127.0.0.1 should be fine), and you can start dancing in the canvas element :–)
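On the client side, each websocket message has to be parsed into joint positions before it can be rendered. A minimal sketch in Python (the JSON layout used here, a “joints” list with x/y/z per joint, is an assumption for illustration; check the repo for the actual message format):

```python
import json

def parse_skeleton(raw):
    """Turn one websocket message into a dict of joint name -> (x, y, z).
    The message layout is hypothetical, for illustration only."""
    data = json.loads(raw)
    return {j["name"]: (j["x"], j["y"], j["z"]) for j in data["joints"]}

# One hypothetical skeleton frame as it might arrive over the websocket:
message = '{"joints": [{"name": "head", "x": 0.1, "y": 0.5, "z": 2.0}]}'
print(parse_skeleton(message)["head"])  # (0.1, 0.5, 2.0)
```

The JavaScript client does the equivalent with JSON.parse before handing the joint positions to three.js.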

AIRServer 0.5 – Socket Byte Concatenation

I’ve just finished work on a little update of my AIRServer library (version 0.5, hooray!). Apparently, when you send large chunks of data over the socket (like sending an image to the server), the data can be split over multiple packets. This caused errors on the server side.
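The underlying fix is the classic TCP framing pattern: buffer the incoming bytes and only hand a message to the application once it has arrived completely. A sketch of the idea in Python, assuming a simple 4-byte length prefix (the actual AIRServer wire format may differ):

```python
import struct

class MessageBuffer:
    """Accumulates raw socket bytes; returns only complete,
    length-prefixed (4-byte big-endian) messages."""
    def __init__(self):
        self.buffer = b""

    def feed(self, chunk):
        self.buffer += chunk
        messages = []
        while len(self.buffer) >= 4:
            (length,) = struct.unpack(">I", self.buffer[:4])
            if len(self.buffer) < 4 + length:
                break  # message not complete yet, wait for more bytes
            messages.append(self.buffer[4:4 + length])
            self.buffer = self.buffer[4 + length:]
        return messages

buf = MessageBuffer()
payload = struct.pack(">I", 5) + b"hello"
print(buf.feed(payload[:3]))  # first packet is incomplete -> []
print(buf.feed(payload[3:]))  # second packet completes it -> [b'hello']
```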

I’ve fixed that issue, and uploaded an updated version together with an image-sending demo. You can find the latest version on github. Enjoy!

AIRServer 0.4 – UDP Handling, Chrome 16 Websockets & Bugfixes

I’ve just finished some updates on my AIRServer library, which enables you to create an AIR app that listens for different inputs such as sockets, websockets and P2P traffic. This gives you the option to create a multi-user game with different input controllers.

AIRServer handles multiple inputs

I’ve added a UDP endpoint, so you can handle UDP traffic as well now (check out the UDP native extension for AIR mobile, to use UDP on mobile devices). UDP is connectionless, so you can specify a timeout after which a “UDP client” is marked as disconnected.

server.addEndPoint(new UDPEndPoint(1236, new NativeObjectSerializer(), 60000));
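Because UDP has no connection state, “disconnected” can only mean “silent for longer than the timeout”. A sketch of that bookkeeping in Python (illustrative only, not the AIRServer implementation):

```python
import time

class UdpClientTable:
    """Tracks when each UDP client was last heard from; clients silent
    for longer than the timeout are considered disconnected."""
    def __init__(self, timeout_ms):
        self.timeout = timeout_ms / 1000.0
        self.last_seen = {}

    def touch(self, address, now=None):
        """Call whenever a datagram arrives from this address."""
        self.last_seen[address] = time.time() if now is None else now

    def expired(self, now=None):
        now = time.time() if now is None else now
        return [a for a, t in self.last_seen.items() if now - t > self.timeout]

table = UdpClientTable(60000)  # 60 second timeout, as in the endpoint above
table.touch(("10.0.0.5", 9876), now=0.0)
print(table.expired(now=61.0))  # silent for over a minute -> disconnected
```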

When you want to send data back over UDP, you’ll need to know the client’s UDP listening port; therefore, the client can send a “PORT” command, with its listening port as the data argument:

protected function connect():void
{
    listeningSocket = new DatagramSocket();
    listeningSocket.addEventListener(DatagramSocketDataEvent.DATA, socketDataHandler);
    listeningSocket.bind(9876);
    listeningSocket.receive();

    sendingSocket = new DatagramSocket();
    sendingSocket.connect("127.0.0.1", int(port.text));
    currentState = "connected";
}

protected function sendInput():void
{
    sendObject({command: "PORT", data: listeningSocket.localPort});
    sendObject(inputField.text);
    inputField.text = "";
}

protected function sendObject(o:Object):void
{
    var bytes:ByteArray = new ByteArray();
    bytes.writeObject(o);
    sendingSocket.send(bytes);
}

I’ve also made some arguments optional (such as the message serializers). By default, an AMF endpoint will use a NativeObjectSerializer, and websockets will use the JSONSerializer.

server.addEndPoint(new SocketEndPoint(1234, new AMFSocketClientHandlerFactory()));
server.addEndPoint(new SocketEndPoint(1235, new WebSocketClientHandlerFactory()));
server.addEndPoint(new UDPEndPoint(1236, new NativeObjectSerializer(), 60000));
server.addEndPoint(new CocoonP2PEndPoint("be.aboutme.airserver.demos.Messages"));

I’ve fixed some issues with multiple messages in one packet. This was especially a problem with the websocket listener. The object serializer is now responsible for splitting the input into multiple messages (when necessary). By default, the JSONSerializer used for the websockets will split messages on the newline (\n) character. Make sure you terminate each message you send from the client with this character, and you should be good to go.
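With newline framing, the read buffer can be split safely on “\n”, keeping any trailing partial message for the next read. A small Python sketch of that splitting logic (illustrative; the JSONSerializer itself is ActionScript):

```python
import json

def split_messages(buffer):
    """Split a newline-delimited JSON read buffer into complete messages.
    Returns (messages, remainder); the remainder is an incomplete tail
    that should be kept and prepended to the next read."""
    parts = buffer.split("\n")
    remainder = parts.pop()  # everything after the last "\n" is incomplete
    return [json.loads(p) for p in parts if p], remainder

messages, rest = split_messages('{"a": 1}\n{"b": 2}\n{"c"')
print(messages)  # [{'a': 1}, {'b': 2}]
print(rest)      # '{"c"' stays buffered until its terminating "\n" arrives
```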

As always, you can download the sources & updated demos to play with. Happy coding!

You can find the latest version on github

UDP Native Extension for AIR Mobile – Now With Android Support!

I’ve continued my work on my UDP AIR native extension, to add support for Android. I’m happy to release version 0.2, which adds Android support :–)

UDP extension on Android

This means, from now on, you can send / receive UDP packets in your AIR mobile projects on both iOS & Android. If you find any bugs or have suggestions, please let me know.

You can find the updated ane, sources and demo on github: https://github.com/wouterverweirder/AIR-Mobile-UDP-Extension

Enjoy!

UDP in AIR for iOS Using a Native Extension

update: added Android support to the extension.

When you’re using a mobile device as a controller for an application or a game, you’ll want fast data transfers. Classic TCP/IP traffic over sockets is a bit slow, due to the nature of TCP/IP (packets are delivered in the correct order, and the receiver sends a confirmation of reception to the sender for each received packet). The alternative is UDP: you’re not sure if a packet arrives, or in what order your packets will arrive at the destination, but because of that, there is less delay between the sender and the receiver of the packet.
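That fire-and-forget behaviour is easy to see outside AIR too; here is a minimal UDP round-trip over loopback in Python (sendto returns immediately, with no connection handshake and no delivery guarantee):

```python
import socket

# Receiver: bind to a port and wait for datagrams.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
port = receiver.getsockname()[1]

# Sender: no connect/handshake needed -- just fire the datagram.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"Hello World", ("127.0.0.1", port))

data, addr = receiver.recvfrom(1024)
print(data)
sender.close()
receiver.close()
```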

AIR has a built-in class to handle UDP: flash.net.DatagramSocket. However, for some reason this is not available in AIR for mobile devices. I decided to write a native extension (only for iOS for now) to offer UDP functionality in AIR for mobile devices. I tried to use the same API as the DatagramSocket for AIR for Desktop, so the principles are the same.

Example of UDPSocket extension on AIR for iOS

To send packets over UDP, you’ll create an instance of the UDPSocket class (be.aboutme.nativeExtensions.udp.UDPSocket), and use the send method with a ByteArray:

var udpSocket:UDPSocket = new UDPSocket();
var bytes:ByteArray = new ByteArray();
bytes.writeUTFBytes("Hello World");
udpSocket.send(bytes, "192.168.9.1", 1234);

To receive packets, you’ll use the bind(portnr) and receive() methods of the same class, and listen for a DatagramSocketDataEvent.DATA event:

var udpSocket:UDPSocket = new UDPSocket();
udpSocket.addEventListener(DatagramSocketDataEvent.DATA, udpDataHandler);
udpSocket.bind(1234);
udpSocket.receive();

protected function udpDataHandler(event:DatagramSocketDataEvent):void
{
    trace(event.data);
}

It will transfer whatever you put in the ByteArray, so you can send native ActionScript objects as well if you want:

var bytes:ByteArray = new ByteArray();
var o:Object = {};
o.command = "MESSAGE";
o.content = "Hello World!";
bytes.writeObject(o);
udpSocket.send(bytes, "192.168.9.1", 1234);

You can find the ane, together with the sources & demo, on github: https://github.com/wouterverweirder/AIR-Mobile-UDP-Extension. The native extension id is “be.aboutme.nativeExtensions.udp.UDPSocket”. The extension has been updated, and supports Android as well now!

Including a Dylib in an AIR Native Extension for OSX

I’ve been struggling a bit to create a framework file to use in an AIR native extension on OSX. The issue was that I was using third-party dylib libraries (Intel Threading Building Blocks), which aren’t normally installed on somebody’s computer. The framework compiled fine, but for some reason the application crashed when using the native extension.

When AIR loaded the native extension, it could no longer find the necessary dylib files. The solution was:

  • Add a Copy Files phase to your build phases, which should copy your dylib file to the Resources directory of your framework.
  • After compilation, open up a terminal window and navigate to your build folder. Use the otool -L command to get a listing of the linked libraries:
otool -L HelloThreadsAndUSB.framework/HelloThreadsAndUSB

The output is something like this:

HelloThreadsAndUSB:
    /Library/Frameworks/HelloThreadsAndUSB.framework/Versions/A/HelloThreadsAndUSB (compatibility version 1.0.0, current version 1.0.0)
    @rpath/Adobe AIR.framework/Versions/1.0/Adobe AIR (compatibility version 1.0.0, current version 1.0.0)
    /System/Library/Frameworks/Cocoa.framework/Versions/A/Cocoa (compatibility version 1.0.0, current version 15.0.0)
    libtbb.dylib (compatibility version 0.0.0, current version 0.0.0)
    /usr/lib/libstdc++.6.dylib (compatibility version 7.0.0, current version 7.9.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 125.2.11)

In my case, the libtbb.dylib file could not be found when the application launched. You need to change the lookup path with the install_name_tool command:

install_name_tool -change libtbb.dylib @loader_path/Resources/libtbb.dylib HelloThreadsAndUSB.framework/HelloThreadsAndUSB

When you type in the otool command again, you should see the adjusted path:

HelloThreadsAndUSB:
    /Library/Frameworks/HelloThreadsAndUSB.framework/Versions/A/HelloThreadsAndUSB (compatibility version 1.0.0, current version 1.0.0)
    @rpath/Adobe AIR.framework/Versions/1.0/Adobe AIR (compatibility version 1.0.0, current version 1.0.0)
    /System/Library/Frameworks/Cocoa.framework/Versions/A/Cocoa (compatibility version 1.0.0, current version 15.0.0)
    @loader_path/Resources/libtbb.dylib (compatibility version 0.0.0, current version 0.0.0)
    /usr/lib/libstdc++.6.dylib (compatibility version 7.0.0, current version 7.9.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 125.2.11)

Now you should be able to use this framework in your AIR native extension :–)

(Note: I tried the other options @executable_path and @rpath, but those didn’t do the trick)