An IoT reality check for big data experts and tech manufacturers

Find out three things companies with IoT products and initiatives can do to address IoT-reality interface problems.

In the future, affordable self-driving cars will make it easier for elderly and disabled people to shop and keep appointments. Robots will vacuum our carpets, assist us with our schedules, and pick and pack goods in warehouses. And automated trucks will know exactly where to spread custom-blended fertilizer mixes on specific areas within a single field of corn. But can these IoT innovations do these jobs flawlessly?

As a preeminent example of big data capture and use, the Internet of Things (IoT) is still in the early stages of perfecting the ultimate interface: how well it interacts with the world we see, hear, and touch. No open source or universal interface can guarantee perfect engagement, so ensuring that IoT devices and our physical environments exchange information accurately is paramount, and it's an area that big data analysts need to work on.

SEE: The Power of IoT and Big Data (Tech Pro Research)

Examples of big data challenges with IoT devices

For example, in February 2016 a Google autonomous vehicle successfully deduced that it needed to pull into a center lane to avoid sandbags that it perceived in the right-most lane, but it also anticipated that a bus approaching from behind would slow down to enable the car to shift lanes. The bus driver didn't slow down, and the car smacked into the side of the bus.

SEE: Google's Chris Urmson explains self-driving car crash (CNET)

Here's another example: The robo-vacuum cleaner you purchased will stop as it approaches a wall or a piece of furniture if you have attached sensors to those objects that warn it before it collides with them. But what about that stubborn dirt spot in the middle of the floor that you (not the vacuum) can see requires extra vacuuming?
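
To make that gap concrete, here is a minimal Python sketch of one control step for a hypothetical robo-vacuum. The sensor inputs and thresholds are illustrative assumptions, not any vendor's API: stopping for obstacles that announce themselves is the easy part, while deciding that a spot deserves extra passes requires the robot's own dirt perception.

```python
def clean_step(proximity_cm, dirt_level):
    """One control step for a hypothetical robo-vacuum.

    proximity_cm: distance reported by an obstacle beacon or bump sensor.
    dirt_level:   0.0-1.0 estimate from an onboard dirt sensor; without this
                  perception, the robot has no way to know that the spot you
                  can see needs extra work.
    """
    STOP_DISTANCE_CM = 5        # obstacle avoidance is the easy part
    EXTRA_PASS_THRESHOLD = 0.6  # spot-level perception is the hard part

    if proximity_cm <= STOP_DISTANCE_CM:
        return "stop_and_turn"
    if dirt_level >= EXTRA_PASS_THRESHOLD:
        return "extra_pass"     # linger on the stubborn spot
    return "continue"

print(clean_step(proximity_cm=30, dirt_level=0.8))  # extra_pass
print(clean_step(proximity_cm=3, dirt_level=0.8))   # stop_and_turn: safety first
```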

SEE: Joe Jones: Roomba inventor. Roboticist. Vindicated pioneer. (TechRepublic)

A geometric "grid" can be designed that tells an automated fertilizer spreader how to spread fertilizer over a specific field, and at what points during the spreading it should change the fertilizer mix to compensate for changes in the soil. But how do you adjust for natural features and conditions in the field, such as trees, irrigation ditches, or even an irregular field shape that can't accommodate the physical width of the spreader?
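
One way to picture such a grid is sketched below in Python. The field dimensions, obstacle locations, and mix names are hypothetical; the point is that the prescription has to carry not only the blend for each cell but also a mask for the cells the spreader physically cannot treat.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cell:
    """One grid cell of the field: which blend to apply, or None if impassable."""
    mix: Optional[str]  # e.g. "standard", "high-nitrogen", or None for trees and ditches

def build_grid(rows, cols, default_mix="standard"):
    """Start from a uniform prescription across the whole field."""
    return [[Cell(default_mix) for _ in range(cols)] for _ in range(rows)]

def mask_obstacles(grid, obstacles):
    """Mark cells the spreader must skip: trees, irrigation ditches, irregular edges."""
    for r, c in obstacles:
        grid[r][c].mix = None

def plan_pass(grid, row):
    """Actions for one pass across a row: (column, mix) pairs, skipping masked cells
    so the machine shuts off the spreader instead of treating ground it cannot reach."""
    return [(c, cell.mix) for c, cell in enumerate(grid[row]) if cell.mix is not None]

# Hypothetical 4 x 6 field: a tree at (2, 1) and a ditch running down column 5.
field = build_grid(4, 6)
mask_obstacles(field, [(2, 1)] + [(r, 5) for r in range(4)])
field[2][3].mix = "high-nitrogen"  # soil sampling says this cell needs a richer blend
print(plan_pass(field, 2))
```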

All of these problems must be worked out by ergonomics and "reality interface" specialists who can describe and mediate the many events in a day that humans automatically perceive and adjust for.

SEE: How big data is changing farming (PDF download) (TechRepublic)

3 tips for solving IoT-reality interface problems

On the technical side of IoT, the elements most immediately vulnerable to a failure to adjust for external environmental conditions are IoT's "on the ground" forces: the sensors that directly engage with physical reality. These sensors can, for instance, give you false low tire pressure warnings when you are traveling on a Minnesota highway and the temperature is 10 degrees below zero, because the tire pressure warning system was tested by the manufacturer only at a factory condition of 68 degrees Fahrenheit. Clearly, a broader range of environmental conditions needs to be tested.
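
A hedged sketch of what that adjustment might look like: the numbers and thresholds below are assumptions for illustration, not any manufacturer's actual TPMS logic. It compensates the alert threshold for ambient temperature using Gay-Lussac's law (pressure proportional to absolute temperature at constant volume), so a cold-weather drop in pressure doesn't trip a false alarm.

```python
def expected_pressure(nominal_psi, nominal_temp_f, ambient_temp_f):
    """Gay-Lussac's law at constant volume: pressure scales with absolute temperature.
    Convert gauge psi to absolute psi before scaling, then convert back."""
    ATM = 14.7
    t_nominal_r = nominal_temp_f + 459.67   # Fahrenheit -> Rankine (absolute)
    t_ambient_r = ambient_temp_f + 459.67
    return (nominal_psi + ATM) * (t_ambient_r / t_nominal_r) - ATM

def low_pressure_alert(measured_psi, ambient_temp_f,
                       nominal_psi=35.0, nominal_temp_f=68.0, margin_psi=3.0):
    """Alert only if the tire is below what physics predicts for this temperature,
    minus a safety margin, instead of comparing against the factory-condition value."""
    return measured_psi < expected_pressure(nominal_psi, nominal_temp_f, ambient_temp_f) - margin_psi

# At -10 F, a tire filled to 35 psi at 68 F is expected to read roughly 28 psi.
print(round(expected_pressure(35.0, 68.0, -10.0), 1))
print(low_pressure_alert(30.0, -10.0))   # False: the low reading is explained by the cold
print(low_pressure_alert(24.0, -10.0))   # True: genuinely underinflated
```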

What can companies with IoT products and initiatives do to improve the ultimate IoT-physical reality interface? Here are three places to start.

First, expand the number of test cases for the technology and include more environmental parameters for IoT analytics to work on and respond to.
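
One way to do that, sketched below with pytest's parametrization and a hypothetical simulated sensor, is to run the same factory assertion across an environmental matrix rather than at a single 68-degree condition.

```python
import itertools
import pytest

def expected_pressure(nominal_psi, nominal_temp_f, ambient_temp_f):
    """Same temperature compensation as in the earlier sketch."""
    ATM = 14.7
    return (nominal_psi + ATM) * (ambient_temp_f + 459.67) / (nominal_temp_f + 459.67) - ATM

def simulate_sensor(temp_f, humidity, surface):
    """Stand-in for the device under test; a real suite would read hardware or a simulator."""
    return expected_pressure(35.0, 68.0, temp_f)

# Environmental matrix: each test runs once per combination instead of only at 68 F.
TEMPS_F = [-20, 0, 32, 68, 105]
HUMIDITY = [0.1, 0.5, 0.95]
SURFACES = ["dry", "wet", "icy", "gravel"]

@pytest.mark.parametrize("temp_f,humidity,surface",
                         itertools.product(TEMPS_F, HUMIDITY, SURFACES))
def test_pressure_reading_tracks_temperature(temp_f, humidity, surface):
    """The same assertion as the factory test, repeated across environmental conditions."""
    reading = simulate_sensor(temp_f, humidity, surface)
    assert abs(reading - expected_pressure(35.0, 68.0, temp_f)) <= 2.0
```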

Second, continue to collect a variety of environmental use cases that the IoT technology must interoperate with. This can be done by installing mechanisms that auto-report factors such as tire pressure while driving in extremely hot or cold climates. As these use cases are collected, the IoT technology can be continuously refined to improve how well it interoperates with physical reality.
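
One lightweight way to do this is to have devices emit a small, structured "field report" whenever their readings fall outside the conditions they were validated against. The sketch below shows one possible record format; the field names, validated envelope, and transport are assumptions, not a standard.

```python
import json
import time

# Conditions the sensor was actually validated against (assumed factory envelope).
VALIDATED_TEMP_RANGE_F = (50, 90)

def field_report(device_id, reading_psi, ambient_temp_f):
    """Build a structured use-case record noting whether the device is operating
    outside its validated envelope, so analysts can grow the library of test cases."""
    low, high = VALIDATED_TEMP_RANGE_F
    return {
        "device_id": device_id,
        "timestamp": int(time.time()),
        "reading_psi": reading_psi,
        "ambient_temp_f": ambient_temp_f,
        "outside_validated_envelope": not (low <= ambient_temp_f <= high),
    }

def maybe_report(record, send):
    """Only forward records that represent new environmental territory."""
    if record["outside_validated_envelope"]:
        send(json.dumps(record))

# Example: a tire sensor in a Minnesota winter reports itself for later analysis.
maybe_report(field_report("tpms-front-left", 27.5, -10.0), print)
```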

Third, build in failover mechanisms so that humans can take over if the technology fails. Although California recently proposed state laws that would loosen its stringent requirement that autonomous cars have steering wheels so human drivers can take over, these self-driving vehicles should operate like any other form of IT: They should have a failover mechanism that either lets them continue operating safely on their own or hands control back to a human when conditions overwhelm them.
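
A hedged illustration of what such a failover path might look like in software: the thresholds and mode names below are assumptions, but the pattern of monitoring a heartbeat and a confidence score, then degrading to a human handoff or a safe stop, is a common failover design.

```python
import time
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    HUMAN_TAKEOVER = auto()   # prompt the human driver to take the wheel
    SAFE_STOP = auto()        # no driver available: pull over and stop

HEARTBEAT_TIMEOUT_S = 0.5
MIN_CONFIDENCE = 0.7

def choose_mode(last_heartbeat_s, planner_confidence, driver_present, now_s=None):
    """Stay autonomous while the perception stack is healthy; otherwise fail over
    to the human, and fail safe if no human is available."""
    now_s = time.monotonic() if now_s is None else now_s
    healthy = (now_s - last_heartbeat_s) < HEARTBEAT_TIMEOUT_S and planner_confidence >= MIN_CONFIDENCE
    if healthy:
        return Mode.AUTONOMOUS
    return Mode.HUMAN_TAKEOVER if driver_present else Mode.SAFE_STOP

# Sensor dropout with a driver on board: hand back control rather than guessing.
print(choose_mode(last_heartbeat_s=0.0, planner_confidence=0.9, driver_present=True, now_s=2.0))
```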

Summary

We are still a long way from matching the capabilities of a human brain and imparting that kind of creative thinking to machines. IoT's next technological frontier should be improving flexible, spontaneous responses to external conditions.
