Exciting update on two pieces of software from the Open Virtual Worlds research group.
The first is NuiLib. It puts an abstraction layer over the top of the NUI device to
hide the gory details of the underlying API and lets the developer
focus on what they are trying to use the device for. It aims to ease
cross-platform support, support for different devices, development and
experimentation with new NUI input parsing algorithms, integration of
new algorithms, and code clarity.
The second is Armadillo.
This is a Virtual World client modified to support Kinect input. Users
can perform gestures to move their avatar through the world without having to physically interact with the computer itself, which makes it well suited to museum or school installations.
A video of Armadillo in action is available on the Open Virtual Worlds’ Facebook timeline.
Kinect integration in Armadillo was achieved solely using NuiLib.
Talks are underway to include Armadillo in an
educational pilot program across 38 schools in Ireland and as part of a
Virtual World performance art project.
Both projects were developed by John McCaffery. You can find him in Room 0.09 (Jack Cole Building).
If you are starting on a Kinect project and want
to look at NuiLib or would like to superman your way through the Open
Virtual Worlds group’s reconstruction
of St Andrews Cathedral, send him an email or pop in for a chat.