
Kinect for Windows means gestures made at your PC now mean something

Welcome to the era of gestures via Microsoft Kinect for Windows. Talking with your hands is about to take on a whole new meaning.


On February 1, 2012, Microsoft released the official Kinect SDK and Runtime for PCs. Prepare yourself for a flood of new ways to interact with your PC using gestures, gyrations, and other nontraditional mannerisms. Whether these new interactive methods will become generally accepted is certainly up for debate, but there is little doubt that there will be many creative designs coming soon.

Now, don't get me wrong -- I think the Kinect for Windows will be a great addition to PC gaming. It will open up genres and inspire innovation, and there is no telling where it will all lead -- that part is fun and exciting. However, the cynic in me is cringing at the prospect of waving at my PC to highlight a set of cells in Excel -- or whatever "interface improvements" some well-meaning but misguided developer may try to foist upon us.

That bit of curmudgeonly skepticism aside, if you would like to develop some Kinect for Windows applications you can download the SDK directly from Microsoft. You can also buy a Kinect sensor for $250.

In her blog post on ZDNet, Mary Jo Foley points out several improvements Microsoft has made to the Kinect SDK since the beta release, including:

  • Support for up to four Kinect sensors plugged in to the same computer,
  • Improved skeletal tracking, including the ability for developers to control which user is being tracked by the sensor, and
  • Near Mode for the new Kinect for Windows hardware, which enables the depth camera to see objects as close as 40 centimeters in front of the device
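The skeletal-tracking feature above is what makes gesture input possible: each frame, the sensor reports 3D positions for a set of body joints, and the application decides what a pose means. As a minimal sketch of that idea (this is not the actual Kinect SDK API; the joint names, coordinates, and `hand_raised` helper are hypothetical stand-ins for the per-frame skeleton data an SDK would supply), a "hand raised above head" check might look like:

```python
# Illustrative gesture check over skeletal-tracking data.
# Joint names and coordinates are hypothetical; a real sensor SDK
# would deliver equivalent per-frame skeleton data via its own API.

def hand_raised(joints, hand="hand_right", head="head", margin=0.05):
    """Return True when the hand joint is above the head joint.

    `joints` maps joint names to (x, y, z) tuples in meters, with y
    increasing upward. `margin` ignores jitter near the threshold.
    """
    _, hand_y, _ = joints[hand]
    _, head_y, _ = joints[head]
    return hand_y > head_y + margin

# One hypothetical skeleton frame.
frame = {
    "head": (0.0, 1.60, 2.0),
    "hand_right": (0.2, 1.75, 1.9),  # raised above the head
    "hand_left": (0.2, 1.10, 1.9),   # hanging at waist level
}

print(hand_raised(frame))                    # right hand is raised
print(hand_raised(frame, hand="hand_left"))  # left hand is not
```

Real applications layer smoothing and per-user tracking on top of checks like this, but the core pattern (compare joint positions frame by frame) is the same.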

Are you looking forward to making gestures at your PC that mean more than the fact that you are frustrated by something it is, or is not, doing? Do you think interacting with your PC via Kinect is going to become the predominant way you will control your PC?

About

Mark Kaelin is a CBS Interactive Senior Editor for TechRepublic. He is the host for the Microsoft Windows and Office blog, the Google in the Enterprise blog, the Five Apps blog and the Big Data Analytics blog.

41 comments
carl

I have just released a powerful gesture recording and recognition API for KFW called GesturePak (http://gesturepak.com). You can create a gesture using any of your joints, tracked across the X, Y, and/or Z axes, in 2 minutes, and with 7 lines of code you can recognize those gestures in your own Windows apps. All this for only 99 bucks. There's an interactive demo available on the website. Carl Franklin

Shawenb1

Anyone ever think MS is using people for programming and conceptual ideas? It just occurred to me that they own the software and hardware. Bet the EULA is ironclad; if you have a civil lawsuit against them, there's a specific city you have to go to. In EVERY nation. Judges there LOVE Microsoft, I bet!

Shawenb1

The simple fact is you will not be able to use it for extended periods. The effects on the human body from repeated movements will cause big problems. The developers will have to figure out natural movements to reduce fatigue. But hey, it can't be worse than a keyboard or mouse over the long haul.

bk1

When it doesn't quite understand the gesture, does it recognise the two finger salute?

twistedg

The regular Kinect can read two people, although poorly and nowhere near what the PS Move can follow with wands. Can the PC Kinect only follow one person? That's $1,000 for unexplainable support.

it

Pair it up with a voice command and a 3-D interface system without glasses and you have Tony Stark's garage in the recent Iron Man movies.

five.cent.family

No, I don't think this kind of interaction will become the "dominant" interface technique. This is primarily going to be for gaming (where a HUGE deal of innovation will occur), and in other areas where touching a device is problematic and gestures will work better. Imagine virtual manipulation of 3D objects in design, remote manipulation of robotic machines in manufacturing and scientific arenas, virtual avatar interaction for fashion and makeup industries. The list goes on and on. But for the regular old user, I think touch screen (or a mouse for those that prefer one) and keyboard will still be the norm.

Curtis_Wayne

People don't even want to reach for their mouse. Can you imagine not having to reach for the keyboard? Ever heard of NUI? Natural User Interface centers around voice and 3D motion for interaction w/ a PC. Both are getting more & more accurate, just too darn expensive. But we all know how that goes. I am SHOCKED by the results of your poll - absolutely the opposite of what I expected (70% believe Kinect will not be the dominant interface). I guess the wording of the question could leave open a lot of other alternatives as well. I'm curious to see more discussion & REAL eager to see how the market actually responds. Enjoy! Curtis

us-english

Only a shallow mind could fail to see the advantage of such a system over the traditional mouse/pad, or even the touch screen. This was years in coming, and MS has a winner in it. The concept and technology were developed on a hint from Apple's success with the touch screen, which was indicative of the dire need for change in how humans interact with computers. It's not just some gimmick; MS has acquired several 3D camera software companies and spent years testing it, in gaming first. Of course, there will be resistance, and it will likely come from the same group of people who argued against the touch screen on tablets and smartphones. You know, the dinosaurs.

piratesmvp04

Just what exactly makes Kinect better than a traditional tablet PC with a touchscreen? I'd rather scroll or make input on a touchscreen than with inaccurate gestures in midair.

8string

To be clear, almost anything is better than the average touchpad on a Windows laptop, I've found. So maybe scrolling with a flick toward the screen will be worthwhile! But joking aside, having the machine recognize my face, and perhaps a gesture, and unlock itself to the last known configuration would be a very useful thing. In addition to the usual game apps, there are likely medical uses for this, and it might serve as front-end processing for some kind of body-scanning software. An array of four of these surrounding you might do a better job of helping with remote medical services, allowing a doctor or PA to see the act of raising an arm, for instance, to tell what kind of muscular problems a person has. Additionally, there are still huge numbers of people who cannot adequately use a mouse and keyboard. Computer setups that accept pointing, plus better voice recognition a la Apple's wonderful beta of Siri, could lead to breakthroughs that we can't yet fully understand. I think about safe car systems that don't need you to take your hands or eyes off the wheel to work properly.

mirossmac2

If it will let me interrupt some interminable self-generated process that is immune to Ctrl-Alt-Del and park it while I do things I bought the computer in order to achieve, I'll buy it, too. Otherwise it's just another door into more-of-the-same.

techrepublic

That way I won't be "flipping the bird" nearly as much.

AnsuGisalas

I just wonder... will it take it to mean "chastise thyself, dumbass machine," or will it rather take it personally, maybe throw me a BSOD?

Htalk

I can think of lots of great software that could use this interface, very little of it for office uses though.

BALTHOR

SDK is Software Development Kit. All the ones that I've seen were code. Runtime is anybody's guess. I most recently browsed the Internet with the search question "What is BIOS flashing?". Not one site explained BIOS flashing. Most said that it was dangerous and should only be done rarely. I flashed my laptop repeatedly for hours with no ill effects.

Jim Johnson

Kinect might make the Windows 8 Metro UI more tolerable on non-touch-screen devices. Voice recognition is getting better too. But these are, in my opinion, both better navigation tools as opposed to composition tools. In short, they aren't quite ready to replace the keyboard, but might just maybe replace most of what the mouse does.

dennis.m.jackson

The "touch" UI on smartphones and tablets has become a big player in the commercial marketplace, because the display/input devices for this class of computer are touch-enabled by necessity (space is limited - physical mice and keyboards are too large for effective use). The "non-touch" (i.e., mouse and keyboard) UI is firmly entrenched in the desktop world, where most of the real work using computers is done - and it is likely to be that way for quite a while to come.

Windows 8, when released, will offer support for both interfaces, but its "touch" UI will not likely be used much on a desktop without finding a way to bridge the gap that non-touch devices like screens, mice and keyboards present. Is a Kinect-like device the bridge that might promote use of a "touch" UI in an inherently "non-touch" computing environment? Will that kind of hardware bridge plus Windows 8 provide the capability to bring the two environments together?

It seems like you could easily emulate a mouse with a Kinect-like device, as well as emulate the gestures used by the "touch" UI on mobile devices. If you could seamlessly integrate the two UI models in Windows 8 with a Kinect-like device, wouldn't that preserve investment in desktop hardware, software and methods of operation, while opening up the possibility of an orderly migration to a more touch-oriented UI? Is this perhaps part of Microsoft's grand strategy being introduced with the advent of Windows 8 and Kinect - even though neither has been directly linked to the other from a marketing perspective?
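The mouse-emulation idea raised above is simple to sketch. Assuming the sensor reports a hand position in meters, one plausible approach is to define a physical "interaction box" in front of the user and map it linearly onto the screen. Everything here (the box dimensions, screen size, and the `hand_to_cursor` helper) is a hypothetical illustration, not part of any shipped SDK:

```python
def hand_to_cursor(hand_x, hand_y, screen_w=1920, screen_h=1080,
                   box=(-0.3, 0.3, 1.0, 1.5)):
    """Map a tracked hand position (meters) to screen pixel coordinates.

    `box` = (x_min, x_max, y_min, y_max) defines a physical interaction
    box in front of the user. Positions are clamped to the box, then
    scaled linearly onto the screen. Screen y grows downward, so the
    vertical axis is flipped.
    """
    x_min, x_max, y_min, y_max = box
    nx = (min(max(hand_x, x_min), x_max) - x_min) / (x_max - x_min)
    ny = (min(max(hand_y, y_min), y_max) - y_min) / (y_max - y_min)
    return int(nx * (screen_w - 1)), int((1.0 - ny) * (screen_h - 1))

# A hand at the center of the box lands near the center of the screen.
print(hand_to_cursor(0.0, 1.25))
```

A real cursor driver would add smoothing and a click gesture on top, but the clamp-and-scale mapping is the core of turning skeletal data into pointer input.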

vitec

I do not think it will become mainstream, but it has great potential for handicapped people, law enforcement, and military uses, not to mention the engineering aspect. Its only drawback will be what we cannot think of.

BillGates_z

Now when something goes south, I can not just swear at it, I can also flip it the bird.

Mark W. Kaelin

Do you think interacting with your PC via Kinect is going to become the predominant way you will control your PC? How will the Kinect fit in with other traditional input devices?

twistedg

Pretending to be Tom Cruise in Minority Report seems cool, but try doing that for 8 - 16+ hours a day. Who will be there to hold my arms up after the first hour or two? Lol

egmccann

Type out a letter with it, then come back. Even voice recognition is iffy after a few decades of development, much less voice *control.* Don't even get me started on how widespread voice control/input would be in an office or call center. As mentioned, some areas will see this as an absolutely revolutionary development - and I am expecting imaging to be one of them. The rest, not so much.

egmccann

"Only a shallow mind could not see the advantage of such system over the traditional mouse/pad, or even the touch screen." Tell me how it will help me type better and more accurately than my 90+ WPM on a keyboard. Tell me how it will be more accurate than the extremely fine control I can use to pick specific points with a mouse. And no, I don't see this helping with gaming overall. Will it open up some areas in some genres? Sure. But it's more suited, there, for the consoles (in a living room or entertainment room, with plenty of room TO move) instead of to PC gaming in general. That's not a "shallow mind." That's being realistic.

CharlieSpencer

Outside of gaming and some niche apps, what mainstream apps used by the majority of consumers and businesses will benefit from this form of input?

egmccann

... on the specific use. Why is a tablet better than a keyboard/mouse? Sometimes it is, sometimes it isn't. Why is a CAD station tablet or a pressure-sensitive tablet/pen combo (such as those from Wacom) better than touch or a keyboard/mouse? Depends on the application. For everyday use (and I mean both basic "I want to get email/surf this web thing/look at that facebook thing" and typing/creating spreadsheets at the office), it's not adding anything better (yet). However, I'm sure there'll be some areas where this will be an absolute breakthrough - 3D imaging, perhaps, taking tablet-esque swipes/pinches/zooms to the next stage, for instance.

CharlieSpencer

but even I draw the line at pointing and waving at empty air like a deranged landing signals officer, guiding in planes only he can see. It's bad enough people already walk around talking to no one, like they're rehearsing for the community theater production of "Harvey".

jos.paglia

Not to mention, what's different between this ("for PC") and the current Xbox version, which can be had for $150 retail and connects via standard USB? There's no software, there's no support... What does the extra $100 get you (not that I want to spend the $150, even)?

Charles Bundy

It is amazing to me that a sub $1K sensor array is available. I'd say this could seriously augment virtual environments and video conferencing. Imagine the array serving as an intuitive interface for teleoperated machinery! (plus you don't have to wear silly tracker hardware)

HAL 9000

No, I personally don't think that it will make much of a difference directly. However, I do see it as the beginning of a new way of interacting with computers, so as a starting point and nothing more it has the potential to result in [b]Interface Improvements[/b] eventually. Even M$ accepts this with the advent of their Kinect development kit, and it's really the first time that they have stepped back from total control of their products, because they know that they simply cannot compete with those out there who are thinking in different directions to them. This could very well be the beginning of M$ moving from a software supplier to a hardware maker. Col ;)

Jaytmoon

You'd get a great bicep and shoulder workout!

mswift

Imagine that your new PC is a cube that is 4 inches wide, 4 inches long, and 8 inches high and weighs under a pound. It projects the picture onto the wall or a flat (less than 1/8 inch thick) screen, and it projects the keyboard onto whatever flat surface the device is resting on. When you move your hand to the "mouse," the screen magnifier enlarges the area you move to so that you can pick a detailed spot with a fingertip, something you can't do with a smartphone or a device with a small screen. There are already some products that do this. The Kinect helps make progress towards these kinds of tiny and powerful devices.

five.cent.family

The PC version is much more accurate than the standard Xbox version. It also lets you track people much closer (40 cm) to the device. I think you have to be about three or four feet away from the Xbox Kinect for it to work properly. It IS new tech, and it will be worth it for developers who already have ideas for how to implement it.

CharlieSpencer

Do you expect this system to cost the same as hardware with similar performance specs but in a more 'traditional' form factor?

mswift

If you have something very small and very powerful, you probably only need one device. Phone/tablet/notebook/laptop/desktop, all gone. The touch-typing thing is the only drawback, and some of that will be mitigated by visual and auditory indicators. With the latest IBM and Intel breakthroughs, this size may go down to 1x1x6 inches in a few generations.

CharlieSpencer

If it's going to do the same thing as a conventional mouse and keyboard, why pay more for it? If you don't need to move the system around, what's the advantage? In terms of a projected keyboard, what do touch-typists gain in compensation for the loss of tactile feedback?