Why the most boring IoT data may actually make the most money

Most people think of IoT as all about smartphones and smart watches, but the smartest money of all may be on dumb, industrial-grade sensors.


The Internet of Things (IoT) is the latest hot tech meme that Silicon Valley and all forward-thinking company CEOs are saluting. As software eats our world, it leaves an exhaust trail of data about consumers and their hardware. There's money in that data. Lots of money. Consulting firm McKinsey estimates that IoT could generate $4 trillion to $11 trillion a year in economic impact within the next decade.

Today, most companies aiming to monetize that IoT data are focused on consumers. Alluvium, a new startup based in Brooklyn, sees more money, and more benefit to society, in mining the data debris of complex industrial applications. It wants to apply machine learning to dirty industrial things, not consumer devices. We're talking fleets of vehicles, offshore oil derricks, giant petroleum processing plants. If data science is the 'sexiest job of the 21st century,' this is the unsexy side that is rarely explored.

Drew Conway, the founder and CEO of Alluvium, is a former member of the intelligence community and co-founder of DataKind, a non-profit dedicated to bridging the data science community and the social sector to enlist data in the service of humanity. Today, he's a leading expert in the application of computational methods for social and behavioral problems at a large scale. I recently spoke with Drew to find out more about Alluvium and why he chose industrial IoT for his new startup.

IoT in the industrial age

TechRepublic: Why industrial? Most startups are all about consumer IoT. Do you really want to compete with GE, the folks who also actually make a lot of these industrial systems?

Conway: The next decade of innovation will come from understanding the data generated outside of the consumer and enterprise web—second generation big data, if you will. If the core value proposition of the first generation of big data was the ability to store and count large quantities of data from the past, the promise of the next generation will be to observe data streams, understand the present, and affect the future.

SEE: Internet of Things: Five truths you need to know to succeed

Industrial organizations have been instrumenting their businesses for decades. Mostly, that data is used for minimal monitoring or reporting, if it's used at all. A sensor will alert the operator when a drill is failing, for example. But there has been little effort to build data systems that manage the complex stew of raw signals coming from disparate proprietary devices and act, in real time, on insights gleaned from that data. We developed technology that can do this. It's not industry specific, so we can work in any vertical, although it is early days today.
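To make the contrast concrete, the "minimal monitoring" Conway describes often amounts to little more than a fixed-threshold check on a single raw signal. Here's a minimal Python sketch of that status quo; the sensor, units, and alarm limit are hypothetical, not drawn from any particular vendor:

```python
# Status-quo "minimal monitoring": one raw reading checked against a fixed
# limit, with no model of the wider system. Sensor and limit are hypothetical.
VIBRATION_LIMIT_MM_S = 7.1  # hypothetical alarm level, in mm/s

def check_drill(vibration_mm_s: float) -> None:
    """Alert the operator if a single raw reading crosses the fixed limit."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        print(f"ALERT: drill vibration {vibration_mm_s:.1f} mm/s exceeds limit")

check_drill(8.3)  # prints an alert; anything below 7.1 mm/s stays silent
```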

The challenge for large industrial incumbents like GE and its Predix Platform is that they want to sell software and hardware. They sell closed environments. But customers never buy from one vendor. You don't buy from one manufacturer or OEM. Big industrial customers run multiple different systems, airplanes with different engines, and so on. The buyer's problem is that none of these closed systems talk to each other.

We think there is going to be standardization in industrial verticals—data standards and protocol standards in manufacturing, automotive and aviation, among many others. It stands to reason that there might be some standardization around how processing happens at the edge, in much the same way that MapReduce and Hadoop became a standard for doing distributed computing on the internet.

Stream, baby, stream

TechRepublic: Can you give a layperson's overview of how your technology works?

Conway: We call the technology "Mesh Intelligence." Learning and information extraction happen at the edge of a complex system where data are being generated. The "mesh" means we disaggregate the big data process into a graphical representation of data flowing together. The computation created by the mesh distills the high-volume stream of data into a feature representation. This stream of information is then used to generate models and understanding of the system, which can happen in disparate graph structures that aren't bottlenecked by the funnel architectures that typically exist in current data center deployments.
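As a rough illustration of distilling a high-volume stream into a compact feature representation, here is a minimal Python sketch of an edge node that summarizes a window of raw readings into a handful of features. The window size and feature set are assumptions for illustration, not Alluvium's actual design:

```python
from collections import deque
from statistics import mean, pstdev

class EdgeFeatureNode:
    """Hypothetical edge node: turns a window of raw samples into a summary."""

    def __init__(self, window: int = 100):
        self.buffer = deque(maxlen=window)

    def ingest(self, reading: float):
        """Buffer one raw reading; emit a feature summary when the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) == self.buffer.maxlen:
            features = {
                "mean": mean(self.buffer),
                "std": pstdev(self.buffer),
                "min": min(self.buffer),
                "max": max(self.buffer),
            }
            self.buffer.clear()
            return features  # four numbers travel downstream, not 100 samples
        return None
```

Downstream models then consume these compact summaries rather than the raw firehose, which is what keeps the graph from funneling everything through a central bottleneck.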

SEE: IoT developers: Master this coding language if you want to thrive

There's this idea in industrial IoT of an "end-to-end machine": you should be able to connect the reasoning drawn from one machine to another. We adopt the graph structure because it allows any input type to fit into the system. We include not just machines, but operators, historical data, even telematics data from fleets in the field, and then all of this becomes part of the mesh. Everything is connected through this graph representation. It's a "share in any direction" mesh intelligence.
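A toy Python sketch of that "share in any direction" idea might look like the following, where heterogeneous sources are vertices and feature summaries can flow along any edge. The node names and publish API are hypothetical:

```python
class MeshNode:
    """Hypothetical mesh vertex: a machine, operator console, or data archive."""

    def __init__(self, name: str):
        self.name = name
        self.neighbors = []   # edges are bidirectional
        self.inbox = []       # (sender, features) pairs received from neighbors

    def connect(self, other: "MeshNode") -> None:
        """Link two nodes so information can flow in either direction."""
        self.neighbors.append(other)
        other.neighbors.append(self)

    def publish(self, features: dict) -> None:
        """Share a feature summary with every neighbor."""
        for node in self.neighbors:
            node.inbox.append((self.name, features))

# Machines, operators, and historical data all join the same graph.
drill = MeshNode("drill-07")
operator = MeshNode("operator-console")
historian = MeshNode("historian-db")
drill.connect(operator)
drill.connect(historian)
drill.publish({"mean_vibration": 4.2})  # both neighbors now hold the summary
```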

The software itself is highly flexible and platform agnostic. This allows sophisticated machine learning algorithms to run on architectures that make no assumptions about CPU power or internet connectivity. We bring the learning to the edge instead of sending everything back to the data center for processing.

Mesh intelligence for the IoT masses

TechRepublic: What types of intelligence does mesh intelligence create that don't exist today?

Conway: I think of it in terms of the hard software engineering problems that need to be solved to manage the complex stew of raw data being generated by sophisticated, high-capital assets. It's taking a large pipe of raw data and turning it into a small pipe of information. That was where we started, and with this foundational architecture in place, our mesh intelligence platform can now perform sophisticated machine learning on this data, in real time, to support decision makers and operators in complex industrial settings. For our early customers, this includes real-time anomaly detection, predictive maintenance, and production optimization.
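As one concrete example of what real-time anomaly detection at the edge can look like, here is a minimal Python sketch using Welford's online algorithm, which maintains a running mean and variance in constant memory, so it fits on modest edge hardware with no round trip to a data center. The z-score threshold is a hypothetical choice, and none of this is Alluvium's actual implementation:

```python
import math

class OnlineAnomalyDetector:
    """Flags readings that sit far outside the stream's running distribution."""

    def __init__(self, z_threshold: float = 4.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations (Welford's algorithm)
        self.z_threshold = z_threshold

    def observe(self, x: float) -> bool:
        """Return True if x looks anomalous, then fold it into the statistics."""
        anomalous = False
        if self.n > 1:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.z_threshold:
                anomalous = True
        # Welford's constant-memory update of the running mean and variance
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```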

TechRepublic: What is the return for operators deploying Alluvium's mesh intelligence?

Conway: Our real-time collective intelligence gives operators global and local insights as the data are generated. At the same time, the mesh intelligence is continually learning from all streams of data, blending machine and human data as it happens. It's an iterative loop that keeps getting smarter.

The payoff in this mesh intelligence can be enormous. If your Uber application fails, you shift to Lyft, call a taxi, or just wait a few minutes and try again. IoT in the world of consumer devices doesn't carry the same consequences as industrial use cases. Direct costs and lost revenues can mushroom frighteningly fast when things get big and complex. Our research shows it costs about $22,000 a minute for a single line stoppage in an automotive plant. The daily cost in lost revenue from a shutdown at an oil and gas refinery can exceed $1.5 million, depending on the size of the facility. In liquefied natural gas (LNG) drilling, the daily cost of unplanned downtime runs about $11 million. That's a line item expense even CFOs at Fortune 500 companies will notice.


About Matt Asay

Matt Asay is a veteran technology columnist who has written for CNET, ReadWrite, and other tech media. Asay has also held a variety of executive roles with leading mobile and big data software companies.
