We all know Microsoft's motion- and voice-controlled Xbox accessory can enhance your games, but its greatest achievement so far has perhaps been the way it's breathed fresh life into the DIY hacking community. Kinect's array of sensors has been put to all sorts of weird and wonderful non-gaming uses, and you can follow all the past, current, and future developments in that field right here.
Apr 9, 2012
Kinect used to control 83-year-old, four-story-high organ in Australia
Australian composer Chris Vik wanted to play the historic four-story-tall Town Hall organ in Melbourne, and he decided that Microsoft's Kinect would be the best way to go about it. While the organ was built back in 1929, it was upgraded to support MIDI in the '90s. Meanwhile, Vik had already created his own software called Kinectar, which turns Microsoft's motion-sensing device into a MIDI controller and was previously used to create dance-controlled electronic music. When it came to playing the historic organ, Vik decided to compose an original song and team up with singer Elise Richards, and the duo put on a performance at the town hall last November (a snippet of which you can see below). Vik hasn't revealed what his next project is, but the composer / developer has only been playing around with Kinect since last April — so we can't wait to see what he comes up with next.
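Kinectar's exact mapping logic isn't documented here, but the basic idea of turning skeleton data into MIDI is straightforward: quantize a hand's height into the notes of a scale and emit the corresponding note numbers. A minimal sketch of that idea, using made-up joint coordinates rather than Kinectar's actual code:

```python
# Illustrative sketch only: maps a (hypothetical) normalized hand height from a
# Kinect skeleton frame onto MIDI note numbers in a C-minor pentatonic scale.
# Kinectar's real mapping is not public here; this just shows the general idea.

C_MINOR_PENTATONIC = [0, 3, 5, 7, 10]  # semitone offsets within an octave
BASE_NOTE = 48                          # C3 in MIDI note numbers

def hand_height_to_midi_note(hand_y, floor_y=0.0, head_y=1.8, octaves=2):
    """Quantize a hand's vertical position (meters) into a scale note."""
    # Normalize the height into 0..1 between the floor and the head.
    t = max(0.0, min(1.0, (hand_y - floor_y) / (head_y - floor_y)))
    steps = len(C_MINOR_PENTATONIC) * octaves
    step = min(int(t * steps), steps - 1)
    octave, degree = divmod(step, len(C_MINOR_PENTATONIC))
    return BASE_NOTE + 12 * octave + C_MINOR_PENTATONIC[degree]

if __name__ == "__main__":
    # Fake skeleton samples standing in for real Kinect joint data.
    for y in (0.4, 0.9, 1.3, 1.7):
        note = hand_height_to_midi_note(y)
        # A real controller would send a note-on message over a MIDI port here.
        print(f"hand at {y:.1f} m -> MIDI note {note}")
```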
Apr 4, 2012
Microsoft selects 11 Kinect Accelerator startups for future app development
Last year Microsoft announced its Kinect Accelerator program, pledging to help ten startups bring their ideas for the motion-sensing video game controller to fruition. Well, after receiving more than 500 applications, the company has made its decision and ended up choosing 11 projects in all. Each startup will spend April to June in Seattle with office space and $20,000 in investment from Microsoft. While some of the projects are still under wraps, most are listed on Microsoft's Kinect Accelerator page, and a lot of the startups have some pretty innovative ideas.
For example, GestSure Technologies is working on motion-controlled interfaces that let surgeons operate medical equipment, Styku's project involves full-body scanning to help people virtually try on clothes both online and off (reminiscent of the Retail Experience Platform demonstrated by an ex-Microsoft marketing agency), and Freak'n Genius is developing Kinect-driven animation software. We've already seen a lot of alternative uses for Kinect, but Microsoft's direct backing of projects like these should at least result in some more interesting products than the company's own Kinect Fun Labs.
Apr 3, 2012
Tongue-controlled Kinect interface under development in Japan
Yesterday we brought you news of a Japanese research effort to produce interactive kissable posters, but some of you expressed concerns over its hygiene in the comments. Well, a separate team in Japan may have inadvertently hit on a solution by devising a tongue-controlled interface for Kinect. Adding to the long list of innovative uses people have found for Microsoft's motion-sensing device, the research group at Tokyo's University of Electro-Communications is actually working towards a much more useful goal: helping people with oral motor function disorders. In the future, the system could be used to train people how to speak or swallow, though for now the demonstration software is a simple shooting game where you aim bullets with your tongue.
If you've ever played a Kinect game, you're probably wondering how it could be used to detect something as precise as a flick of the tongue. After all, it tends to work best with exaggerated full-body motion like dance moves and volleyball serves, and doesn't usually pick up finer details such as individual finger movements. The team got it to work by extrapolating positional data from standard face recognition: once the eyes are detected, movements of the nose and mouth can be used to roughly estimate the motion of the tongue. The researchers admit that the system isn't very accurate so far, but the goal is a worthy one; while more precise systems are being developed, they require physical hardware to be affixed to the mouth.
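The researchers' actual algorithm isn't spelled out in the article, but the general approach it describes — using the eyes as a stable reference and reading how far the mouth region has shifted from a calibrated neutral pose — can be sketched roughly like this, with hypothetical landmark names and thresholds standing in for the real face-tracking output:

```python
# Rough illustration of the idea described above: treat the eyes as a stable
# reference frame, then read how far the mouth has shifted from a calibrated
# neutral pose to guess which way the tongue is pushing. Landmark names and
# thresholds are hypothetical, not the researchers' code.
import math

def tongue_direction(landmarks, neutral_mouth, threshold=0.08):
    """landmarks: dict of (x, y) points from a face tracker, in pixels."""
    left_eye, right_eye = landmarks["left_eye"], landmarks["right_eye"]
    eye_dist = math.dist(left_eye, right_eye)  # scale reference

    mouth = landmarks["mouth_center"]
    dx = (mouth[0] - neutral_mouth[0]) / eye_dist
    dy = (mouth[1] - neutral_mouth[1]) / eye_dist

    if max(abs(dx), abs(dy)) < threshold:
        return "neutral"
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Example with made-up coordinates:
neutral = (320, 260)
frame = {"left_eye": (290, 200), "right_eye": (350, 200), "mouth_center": (331, 262)}
print(tongue_direction(frame, neutral))  # -> "right"
```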
Mar 18, 2012
'Making Things See' can teach you how to hack the Kinect
With the Kinect, Microsoft is providing today's hackers with a powerful off-the-shelf system for motion control and 3D imaging, letting people build things that would have been out of reach for anyone but experts and researchers just a few years ago. Unfortunately, as is often the case in the hobbyist world, the quantity and decentralized nature of online documentation can be daunting for first-timers, discouraging people who might otherwise be interested in joining the scene. Luckily, that may no longer be the case with the release of Greg Borenstein's new book Making Things See: 3D Vision with Kinect, Processing, Arduino and MakerBot.
With Making Things See — the newest in the Make: Books series from O'Reilly — Borenstein tries to bring the skills he learned as a grad student at NYU's Interactive Telecommunications Program to the hacker and builder communities at large. At 440 pages, this is a substantial publication, with instructions for 12 projects including "'Stayin' Alive' dance move triggers MP3" (featured in the video below), and a section on capturing meshes and 3D printing using the Kinect. The book isn't going to teach you how to code from scratch (it's aimed at "beginning creative coders," and all of the examples are in Processing), but if you think you have the skills you need to get started, the time has never been better to pull that Kinect off the shelf and fire up your text editor.
Mar 8, 2012
Microsoft releases Robotics Developer Studio 4 with Kinect support
Microsoft's Robotics Developer Studio 4, a freely downloadable framework for robot programming and management, has been released today. The new software adds support for Microsoft's .NET Framework 4.0, XNA 4.0, and Silverlight 4.0. Most excitingly, it also includes support for up to four Kinect sensors using the Kinect for Windows SDK. To help users integrate the new features, there's a new reference hardware design that includes Kinect-based navigation like that in the video seen below. The Parallax Eddie robot kit is also built from this design.
A beta of the software was released in December, and Microsoft's own developers have been using it for some time — it's what powers this dog-sitting telepresence robot. This latest version should have improved stability over the beta. MRDS 4 isn't necessarily a prerequisite for designing Kinect-based robots, but it's good to see Microsoft supporting the many alternative uses people have found for the hardware in its development suite.
Mar 2, 2012
Kinect and Canon 5D combine for candid 3D imagery
In our Lytro review, we mentioned seeing a preview of an upcoming mode that lets you manipulate the perspective of a photo after the fact, using 3D data captured by the camera's unique sensor. While the Lytro won't be widely available for a while, you may already have technology with the potential for similar functionality under your TV stand, in the shape of Microsoft's Kinect. James George and Alexander Porter have exploited this in a series of CCTV-inspired images, using custom software to combine the Kinect sensor's depth information with images from a Canon 5D. The results are striking — George and Porter took candid photographs of passengers on the New York City subway, a system once intended to have a high-tech surveillance network of its own, and the images impart the eerie idea that we never quite know who (or what) is watching us.
Feb 28, 2012
Microsoft's transparent 3D desktop puts a virtual computing environment at your fingertips
In yet another TechForum reveal, Microsoft's Applied Sciences Group has demonstrated a new interactive 3D desktop prototype. The system uses a transparent OLED screen made by Samsung along with Microsoft Kinect sensors to create a virtual desktop environment that users can manipulate with their hands in real time. A keyboard beneath the transparent screen allows a user to type as they would with a normal computer, and when they need to interact with the desktop — to, say, flip through a stack of files — they simply reach up and move the virtual objects with their fingers.
The Kinect sensors not only track the motion of the user's hands but also head and eye position, allowing the screen to provide a 3D image with perspective and depth customized to the user's exact viewpoint. If you're thinking this sounds a little similar to Norman Jayden's Added Reality Interface from Heavy Rain, you're not the only one, but to get a real sense of the system in action, check out the video below.
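Microsoft hasn't published the prototype's rendering code, but the core trick of head-coupled perspective is simple geometry: cast a ray from the tracked eye position through each virtual point and draw it where that ray crosses the screen plane, so the image shifts as your head moves. A toy sketch under those assumptions (coordinates and eye positions are illustrative):

```python
# Toy head-coupled perspective: the screen is the plane z = 0, the viewer's eye
# sits at z > 0 (tracked by Kinect), and virtual objects live at z < 0 "behind"
# the glass. Each object is drawn where the eye-to-object ray pierces the
# screen, so moving your head shifts the projection and creates parallax.
# The numbers below are illustrative, not from Microsoft's prototype.

def project_to_screen(eye, point):
    ex, ey, ez = eye
    px, py, pz = point
    # Parameter t where the segment eye -> point crosses z = 0.
    t = ez / (ez - pz)
    return (ex + t * (px - ex), ey + t * (py - ey))

eye_positions = [(0.0, 0.0, 0.6), (0.15, 0.0, 0.6)]   # viewer moves 15 cm right
virtual_point = (0.0, 0.05, -0.3)                      # object 30 cm behind screen

for eye in eye_positions:
    x, y = project_to_screen(eye, virtual_point)
    print(f"eye at {eye} -> draw object at screen ({x:.3f}, {y:.3f}) m")
```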
Feb 10, 2012
Kinect-based DarwinBot lets you remotely play with and talk to your pet
Unwilling to leave his dog Darwin home alone while he was at work, Microsoft Robotics Team developer Jordan Correa came up with an unusual solution: building a remote-controlled robot doppelganger that could dispense treats, throw and retrieve a ball, and video "chat" with the dog on Skype. DarwinBot was created using the Parallax EDDIE hardware platform, with its features implemented in Microsoft's Robotics Developer Studio 4. As you can see in the video below, it's essentially a mobile table with a Kinect for navigation, a tablet and camera for the video connection, and some special hardware like a ball-launching robotic arm.
Correa isn't the only person to come up with ways to connect with a pet through telepresence. Earlier this year, we saw a robot that would let you remotely groom your cat. Besides the possibilities for virtual interaction with the real world that these robots demonstrate, what we're really impressed with is the extent to which the pets accept them. Sure, Darwin seems a little nonplussed, but he's a lot more sanguine than we would be about being approached by a claw-equipped robotic creature with a loved one's face and voice.
Jan 26, 2012
South Korea's Live Park uses Kinect sensors and RFID to create an interactive 3D fantasy world
We've seen Microsoft's Kinect show up in a number of unexpected places, and now it's being used to power a 3D theme park in South Korea. Created by interactive company D'strict, Live Park consists of 65 attractions spread out over seven stages; RFID-enabled bracelets track visitors as they move from exhibit to exhibit, while Kinect sensors allow their motion and expressions to control virtual avatars within the park's wall installations, holograms, and panoramic 3D projection screens. Even better, at night the entire park turns into a dance club.
Built at a cost of $13 million — and able to house 3,000 visitors at any given time — Live Park is more of a free-roaming exhibit than a theme park in the traditional sense, but it's proven so successful since its December debut that D'strict is extending its run at South Korea's Kintex exhibition center. This isn't a one-off, either: D'strict will be licensing Live Park to additional partners, with permanent facilities in China and Singapore already in the planning stages. As for those of us in the US, the company has partnered with a "Hollywood entertainment powerhouse" and is expected to announce plans in March to bring a version of Live Park to Los Angeles and Las Vegas.
Jan 14, 2012
Chaotic Moon Labs' Board of Awesomeness: your hand is the throttle on this Kinect-controlled skateboard
Chaotic Moon's Board of Awesomeness is one of the craziest things we've seen here at CES 2012. And by crazy, we mean awesome. The Frankenstein creation was built in just two weeks and is composed of a longboard with a set of gigantic rugged wheels, an electric motor, batteries, a Kinect, and a Windows 8 tablet.
I've been skateboarding on and off for around eight years, and I'll admit — knowing how to skateboard doesn't necessarily mean you'll be able to handle the Board of Awesomeness. Riding it requires a ton of balance and concentration, and you need to be focused on a few things at all times: leaning forward, keeping your foot down on the kill switch, and making sure your hand remains very steady when it's in the hitbox. All three things work in concert. For example, when I wasn't leaning forward, the torque threw me off balance, causing my foot to come off the kill switch and my hand to move, which stopped the board from moving at all. However, I started to get the hang of it after a few runs on the slowest setting, and was on the highest speed setting in no time.
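Chaotic Moon hasn't released the board's control code, but the scheme described above — a hand held steady inside a hitbox acting as the throttle, gated by a dead-man kill switch under your foot — maps naturally onto a few lines of logic. A speculative sketch with invented joint coordinates, hitbox dimensions, and speed limits (this is not Chaotic Moon's firmware):

```python
# Speculative throttle logic for a hand-in-hitbox control scheme like the one
# described above. The hitbox, thresholds, and speed cap are all made up.

HITBOX_X = (-0.25, 0.25)   # meters, left/right bounds in front of the rider
HITBOX_Y = (0.9, 1.5)      # meters, vertical bounds
MAX_POWER = 0.35           # fraction of full motor power

def throttle(hand, foot_on_killswitch):
    """hand: (x, y, z) of the tracked hand; z is forward distance in meters."""
    if not foot_on_killswitch:
        return 0.0                                 # dead-man switch released: stop
    x, y, z = hand
    in_box = (HITBOX_X[0] <= x <= HITBOX_X[1]
              and HITBOX_Y[0] <= y <= HITBOX_Y[1])
    if not in_box:
        return 0.0
    # Push the hand further forward (0.3 m .. 0.7 m) for more speed.
    t = max(0.0, min(1.0, (z - 0.3) / 0.4))
    return round(t * MAX_POWER, 3)

print(throttle((0.0, 1.2, 0.55), True))   # hand forward, foot down -> partial power
print(throttle((0.0, 1.2, 0.55), False))  # foot off the kill switch -> 0.0
```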
Jan 10, 2012
Geomagic demonstrates Kinect-To-Print for 3D printers (video)
We're still waiting to check out 3D Systems Corporation's 3D printing service at CES. In the meantime, Geomagic, which powers the Kinect-To-Print app for the company, has released a video demonstrating the process by which a Kinect image gets turned into a printable template. Essentially, Geomagic uses the Kinect to capture a series of points, then turns them into a 3D model, which can be sent to the printer or, presumably, uploaded to the "create-and-make" environment Cubify.
The process is supposed to take less than two minutes and, from what we can see, picks up enough detail to produce a recognizable bust of a Geomagic engineer. We're not going to pretend we don't find staring into his disembodied face a little creepy... but, then, such is the price of progress.
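Geomagic hasn't detailed its pipeline beyond the video, but the first step it describes — capturing a series of points from the Kinect — is the standard back-projection of a depth image into a 3D point cloud using the camera intrinsics, which a meshing stage then turns into printable geometry. A generic sketch of that first step, with ballpark (not calibrated) intrinsics and not Geomagic's code:

```python
# Generic depth-image-to-point-cloud back-projection, the usual first step
# before meshing a Kinect scan. The intrinsics below are approximate values
# for a 640x480 Kinect depth image, not calibrated figures.

FX, FY = 580.0, 580.0        # focal lengths in pixels (approximate)
CX, CY = 320.0, 240.0        # principal point

def depth_to_points(depth_mm, width=640, height=480):
    """depth_mm: flat list of depth values in millimeters, row-major order."""
    points = []
    for v in range(height):
        for u in range(width):
            z = depth_mm[v * width + u] / 1000.0   # meters
            if z <= 0:                              # 0 means "no reading"
                continue
            x = (u - CX) * z / FX
            y = (v - CY) * z / FY
            points.append((x, y, z))
    return points

# Tiny fake frame: a 640x480 image that is empty except for one pixel.
frame = [0] * (640 * 480)
frame[240 * 640 + 400] = 1500                       # 1.5 m away, right of center
print(depth_to_points(frame))                       # roughly [(0.207, 0.0, 1.5)]
```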
Jan 8, 2012
FIRST robotics competition to feature Kinect-powered robots
We've seen examples of Kinect-controlled robots before; last year's FIRST competition included some entries built on the unofficial drivers, but this time around students receive a Kinect sensor and the Kinect for Windows software development kit (SDK). Microsoft's commercial SDK is also on the horizon, enabling businesses to deliver Kinect-powered apps, and we'll likely see the effects of Microsoft's new platform throughout 2012.
Dec 25, 2011
Researchers create Kinect-powered system to weigh astronauts on sight
Researchers are using the Microsoft Kinect to calculate a person's weight just by looking at them — perhaps even in outer space. Computer scientist Carmelo Velardo and a team at the Italian Institute of Technology's Center for Human Space Robotics created the system, which uses the Kinect's body-tracking camera to generate a 3D model of a given individual. A database of 28,000 people is then used to calculate weight based on the subject's physical measurements, generating results that the team says are 97 percent accurate. Velardo sees the system as ideally suited for space travel: without gravity, normal scales are useless, and astronauts must climb atop an oscillating spring-mounted stool to measure their body mass. The Kinect system is both smaller and more energy efficient than the current solution, and Velardo hypothesizes that it could even be built into the walls of a space station.
NASA scientist John Charles gave the concept a cautious thumbs-up when speaking with New Scientist, noting that water shifts in the body caused by zero gravity could throw off the calculations. While the system hasn't been trialed in space yet, Velardo hopes to soon take it aboard a reduced-gravity aircraft, which simulates weightlessness without leaving Earth's atmosphere. The Kinect system will be presented at the Emerging Signal Processing Applications conference next month in Las Vegas.
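Velardo's team hasn't published its model here, but the idea of matching a subject's extracted measurements against a large reference database can be illustrated with a simple nearest-neighbor estimate over invented anthropometric data (this is not the IIT team's actual method or dataset):

```python
# Illustrative nearest-neighbor weight estimate: compare body measurements
# extracted from a Kinect 3D model against a reference database and average
# the weights of the closest matches. Data and metric are made up.
import math

# (height m, arm span m, waist circumference m) -> weight kg
DATABASE = [
    ((1.62, 1.60, 0.78), 58.0),
    ((1.75, 1.76, 0.88), 72.5),
    ((1.81, 1.83, 0.95), 84.0),
    ((1.90, 1.93, 1.02), 95.5),
]

def estimate_weight(measurements, k=2):
    def dist(entry):
        return math.dist(measurements, entry[0])
    nearest = sorted(DATABASE, key=dist)[:k]
    return sum(weight for _, weight in nearest) / k

subject = (1.78, 1.80, 0.90)      # measurements pulled from a Kinect body scan
print(f"estimated weight: {estimate_weight(subject):.1f} kg")
```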
Dec 20, 2011
Kinect aids in improving mobility and rehabilitating stroke patients
Kinect is finding its way into more and more areas of research as a comparatively cheap way to create 3D images. The camera is particularly gaining ground among health practitioners, and now we've heard of Kinect being used for gait analysis and rehab.
The uses in gait analysis — the study of how people walk — are being developed separately by the University of Missouri and by students from Oak Ridge High School in Tennessee. Equipment to monitor people's walking already exists, but it's expensive and usually requires a specialized testing environment. Missouri is working in collaboration with Americare's Tiger Place senior housing, where the cameras are being installed in residents' apartments to collect data and analyze their movement in everyday situations. The data this generates can act as an early warning system for forthcoming health issues among the residents, including fall risks, illness, or mobility impairment. Advance warnings like these can be crucial in preventing injury and maintaining quality of life. Oak Ridge's system, developed by students Ziyuan Liu and Cassee Cain, has similar applications, providing advanced movement analysis for physiotherapy. The pair's work has been recognized by the national Siemens Competition for high school science, winning them a shared $100,000 college scholarship.
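Neither project's code is public, but a basic gait metric such as walking speed can be pulled from a Kinect skeleton stream with very little math: track an ankle joint across frames and measure how far it travels per unit time. A simplified sketch with invented sample data (real gait analysis uses many joints and far richer features):

```python
# Simplified gait metric: average walking speed from an ankle joint's
# floor-plane position over time. The frames below are invented sample data,
# not output from either project's system.

def walking_speed(ankle_frames, fps=30):
    """ankle_frames: list of (x, z) floor-plane positions in meters, one per frame."""
    total = 0.0
    for (x0, z0), (x1, z1) in zip(ankle_frames, ankle_frames[1:]):
        total += ((x1 - x0) ** 2 + (z1 - z0) ** 2) ** 0.5
    duration = (len(ankle_frames) - 1) / fps
    return total / duration if duration else 0.0

# 31 frames (one second) of an ankle moving steadily forward at ~1.2 m/s.
frames = [(0.0, 1.5 + 0.04 * i) for i in range(31)]
print(f"average speed: {walking_speed(frames):.2f} m/s")  # ~1.20 m/s
```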
Dec 6, 2011
Kinect mod for Skyrim adds voice and motion control to unleash your inner Dovahkiin
Microsoft's Kinect still isn't the reliably robust motion-sensing device we wish it was, but every time we see a breathtaking mod come out we can't help but fall in love with its promise. This time, YouTube user KinectFAAST (who doesn't seem to be explicitly affiliated with the Kinect FAAST USC group) has integrated the sensor with Skyrim on PC to glorious effect in a video you can see below. The mod lets players speak to switch weapons, access their favorites, open their quest journal, start conversations with NPCs by saying "hello," and, of course, shout in the dragon tongue. A program called Voice Activated Commands (VAC) was used to map recordings of dragon shouts and other phrases to in-game actions.
The motion control side of things is equally impressive, and the mod allows players to slash, cast spells, walk, sprint, and perform other movements with gestures that appear quite intuitive. Still, we can't help but wonder what this would look like with official support. Do you hear us, Bethesda? Get on it.
Nov 10, 2011
Kinect Beatwheel remixes music samples on the fly
Microsoft's Kinect has inspired plenty of useful hacks, but the Beatwheel seems designed to party — it lets you remix audio loops in real time with a simple wave of the hand. The UI splits a sound sample into segments wrapped around you like a clock, and plays whichever one you point to. A green bar shows what's currently playing, a red dot follows your hand around to select the next segment, and a blue tempo indicator shows you how fast the music is playing. You can also adjust the playback speed and number of segments, or play them in reverse by swinging your arm counterclockwise. Ryan Challinor put together the Beatwheel for Music Hack Day in Boston last weekend, but if you missed that, you can watch a demo in the video below.
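Challinor's source isn't included here, but the core interaction — mapping the angle of your hand around your body to one of N loop segments arranged like a clock face — is just polar math. A minimal sketch under that assumption, with illustrative coordinates and segment count:

```python
# Minimal version of the Beatwheel's core mapping: take the hand's position
# relative to the body center, convert it to an angle, and pick which of the
# N loop segments that angle falls in. This isn't Ryan Challinor's code.
import math

def segment_for_hand(hand, center, num_segments=8):
    dx = hand[0] - center[0]
    dy = hand[1] - center[1]
    angle = math.atan2(dx, dy) % (2 * math.pi)   # 0 rad = straight up (12 o'clock)
    return int(angle / (2 * math.pi) * num_segments)

center = (0.0, 0.0)
for hand in [(0.0, 0.5), (0.5, 0.0), (0.0, -0.5), (-0.5, 0.0)]:
    print(hand, "-> segment", segment_for_hand(hand, center))
# Pointing up, right, down, and left selects segments 0, 2, 4, and 6 respectively.
```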
Nov 1, 2011
Microsoft's Kinect-infused Augmented Projectors make your entire room a touchscreen
Thanks, Pradeep
Nov 1, 2011
Kinect Effect ad inspires, recognizes hacking community
It's clear that Microsoft has come to view Kinect as a much broader platform. Earlier this year, Microsoft released an official SDK for enthusiasts and academic institutions, and recently announced that it will offer a commercial SDK early next year. Of course, there's no guarantee that Kinect will evolve meaningfully beyond small hacking projects, but we're hopeful that corporate budgets will be able to make Microsoft's aspirational vision a reality.
Nov 1, 2011
GLaDOS for Kinect is your contemptuous personal assistant
Siri may have a lot of brains behind its technology, but it lacks the hopelessly sardonic demeanor that people like Corey Thomas enjoy — he's created GLaDOS, a personal assistant inspired by the menacing AI from Portal that uses the Kinect for voice recognition. GLaDOS can recognize its own name, open and close programs, and offer plenty of biting insults. The program responds pretty quickly to commands, which you can see for yourself in the video below.
Oct 18, 2011
Microsoft and Carnegie Mellon researchers put a touchscreen on the palm of your hand
As part of the proof of concept, the team ran the system through a number of trials, from simply tracking touches on the inside of a user's arm to a more elaborate drawing app mock-up, where the subject would "paint" on a nearby wall with their finger, using their free hand as a virtual palette to choose colors. The system is still clearly a research project, with crude graphics and an unwieldy rig, and it's not even the first projection-based touchscreen system out there: Light Blue Optics showed one at CES last year. OmniTouch does excite, however, due to its responsiveness and sheer flexibility. The research project is set for its public debut this Wednesday at the UIST conference in Santa Barbara. To see it in action, check out the video below.
Oct 5, 2011
Self-Defense Training Camp for Kinect tries to turn you into a lethal weapon
Given the way Ubisoft is selling it — a promotional video features a female black belt discussing reaction times when being strangled, and the company's press release promises gamers will "discover ways of protecting themselves" — it does raise the question of whether players should really expect to develop real-world fighting skills from a video game. But if dropping a knee to the head of a virtual attacker is your kind of workout, you won't have long to wait: the title lands November 8th. Check out video of the game in action below.
Jun 6, 2011
Kinect Fun Labs offers you some tech demos while you wait
We still haven't seen Kinect's "killer app," and Microsoft can't seem to get gameplay out of on-rails hell, but while we twiddle our motion-captured thumbs waiting for a compelling gaming experience, we can finally start playing around with all that Kinect power from the 360 itself. Microsoft is launching Kinect Fun Labs, an app store of sorts for tech demos: a prettified version of the wild west of PC hacks currently available for the hardware. Each downloadable "gadget" is meant to show off a discrete bit of Kinect functionality, like body scanning ("Kinect Me"), object scanning ("Build a Buddy"), and finger tracking ("Kinect Sparklers"). Everything so far seems to be first-party and free, and it's not clear how Microsoft is going to make this available to third-party developers, not to mention the unwashed hacker masses (more of a fourth party, really). Still, it's nice to see Microsoft attempting to take cues from the community, although the larger symptom here seems to be a dearth of "real" games that use the obvious power of Kinect in truly immersive ways. Kinect Fun Labs is supposed to be available immediately, but some apps won't arrive until July, which is when Avatar Kinect (that Avatar chatting service) is also supposed to launch.
None of this sounds very interesting to you? Follow after the break anyway for a completely bizarre "tour" from the one and only Christopher Lloyd, the maddest scientist of them all.