It seems like pop culture is obsessed with the idea of interacting with technology without actually touching a device. Movies such as Minority Report and Iron Man are the frontrunners of this vision: the idea that the future of technology will be decidedly “hands-off.”

That future could be coming sooner than we think. Last week, at its annual I/O developer conference, Google announced Soli, a project that would allow users to interact with their devices using hand gestures performed nearby, without requiring any contact with the device.

“Project Soli is the technical underpinning of human interactions with wearables, mobile devices as well as the Internet of Things,” a Google ATAP (Advanced Technology and Projects) spokesperson said.

Soli was born out of Google’s ATAP group. It’s a fingernail-sized chip that uses radar to read hand gestures and convert them to actions on the device.

So if a user were to touch thumb to forefinger, Soli would read that as a button being pressed. Or, if the user slides a forefinger back and forth across the pad of the thumb, that gesture could operate a slider to adjust volume.
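The mapping the article describes — recognized micro-gestures dispatched to UI actions — can be sketched as a simple event handler. Everything here is hypothetical: the gesture names, the `Volume` class, and the dispatch function are illustrative and not part of any actual Soli API.

```python
# Hypothetical sketch: dispatching recognized micro-gestures to device
# actions. Gesture names and classes are illustrative, not a real Soli API.

class Volume:
    def __init__(self, level=50):
        self.level = level  # percent, clamped to 0-100

    def adjust(self, delta):
        self.level = max(0, min(100, self.level + delta))

def handle_gesture(gesture, volume, presses):
    """Route a recognized gesture to the corresponding UI action."""
    if gesture == "thumb_tap":        # thumb touches forefinger -> button press
        presses.append(True)
    elif gesture == "slide_forward":  # forefinger slides along thumb pad -> slider up
        volume.adjust(+5)
    elif gesture == "slide_backward": # opposite slide -> slider down
        volume.adjust(-5)

volume = Volume()
presses = []
for g in ["thumb_tap", "slide_forward", "slide_forward", "slide_backward"]:
    handle_gesture(g, volume, presses)
print(volume.level, len(presses))  # 55 1
```

The real system would, of course, sit downstream of a radar signal-processing pipeline that classifies the gestures in the first place; this sketch only shows the final dispatch step.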

Unlike the cameras used in other motion-sensing technologies, radar has high positional accuracy and can pick up even slight movements, which makes it better suited to this kind of fine-grained interaction.

“Radar is a technology which transmits a radio wave towards a target, and then the receiver of the radar intercepts the reflected energy from that target,” lead research engineer Jaime Lien said in a video about Soli.

The radar waves bounce off of your hand and back to the receiver, allowing it to interpret changes in the shape or movement of your hand. Radar is also important to the project, according to Soli team lead Ivan Poupyrev, because it can work through materials or be embedded into objects.
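The basic physics behind the radar principle Lien describes can be worked through with two textbook formulas: range from the echo's round-trip delay, and the Doppler shift a moving hand imposes on the reflected wave. The numbers below are illustrative only (the 60 GHz carrier is an assumption typical of millimeter-wave sensing); Soli's actual signal processing is far more sophisticated.

```python
# Back-of-the-envelope radar math: range from echo delay, and the Doppler
# shift from a moving target. Illustrative only; not Soli's actual pipeline.

C = 3.0e8  # speed of light, m/s

def range_from_delay(round_trip_s):
    """Distance to the target: the wave travels out and back,
    so range is half the round-trip path."""
    return C * round_trip_s / 2

def doppler_shift(velocity_mps, carrier_hz):
    """Frequency shift of the echo from a target moving toward the
    radar (positive velocity) or away from it (negative)."""
    return 2 * velocity_mps * carrier_hz / C

# A hand 15 cm away returns an echo after about one nanosecond:
delay = 2 * 0.15 / C
print(round(range_from_delay(delay), 2))   # 0.15 (meters)

# A fingertip moving at 0.1 m/s, assuming a 60 GHz carrier:
print(round(doppler_shift(0.1, 60e9), 1))  # 40.0 (Hz)
```

The Doppler term is what makes radar so sensitive to the tiny finger motions the project targets: even millimeter-per-second movement produces a measurable frequency shift at millimeter-wave carriers.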

The technology is vaguely reminiscent of the theremin, the musical instrument developed in the 1920s by Léon Theremin, but much more intricate. In the Soli video, Poupyrev mentioned that the technology could be used to interact with “wearables, Internet of Things, and other computing devices.”

The potential for Soli in wearables is perhaps the most obvious use case so far. Small screens make it difficult to select certain apps or features, and being able to perform gestures next to the device could make navigation easier and more intuitive.

According to 451 Research analyst Ryan Martin, it matters that a company like Google is getting involved in this space: a project like Soli is important to the wearable and IoT ecosystems as a whole, and it should “be approached from a technology perspective, not a product perspective.”

Companies that focus solely on gesture-based interaction face a risky, volatile market, since the technology will likely end up integrated as a feature rather than sold as a standalone product. Martin said that wrist-based wearables are actually less efficient if users have to touch them, and Soli could be a step toward making them more efficient and usable.

Other potential use cases could be within connected cars or in the augmented reality (AR) and virtual reality (VR) spaces. Imagine an Oculus Rift or Gear VR that supports virtualized “hands” as another input, without a third-party accessory. Although, Martin said, it would probably work best as a complement to another input such as voice or touch.

Using Soli as an input tool is the most obvious use case for now, but the project could provide value as an output technology as well.

“I think the killer application, or use case, long-term is going to be how to take this technology and have it be scanning around to provide context and enable automation that might not even necessitate gesture-based interaction; it might just happen,” Martin said.

Gillette is one of many companies whose factories utilize high-speed cameras to analyze manufacturing processes and equipment to better understand when maintenance or repair is needed. Soli could provide a similar service to advanced manufacturing facilities by consistently reading the machines and documenting their performance.

Time to market will depend on user experience. As a device feature, Soli needs to be reliable and consistent or it will be detrimental to the partner brand or OEM that integrates it.

“Once the technology is able to meet that end, I think that’s when we’ll start to see it baked into products, but right now it’s definitely in its development phase,” Martin said.

According to the Google ATAP spokesperson, the company will be releasing a hardware and software development kit to developers soon. If you want more information about Project Soli, you can contact the team at