Predictive tech won’t work without consumer data. That’s the bottom line whether you are a tech company selling smart home cameras and sensors or a consumer buying these devices.

Alexa may have enough baseline data right now to distinguish between a sneeze triggered by dust and one that signals an oncoming illness. But, because smart home devices are still operating independently, Alexa can’t talk to other devices to take action to address a possible case of the flu, like scheduling a telemedicine appointment.

Explaining this kind of anticipatory tech and helping consumers opt in or opt out are the two current challenges facing companies building products and services that could predict a person’s next thought or action. There’s also not enough discussion of how this technology could be used by criminals and trolls.

At CES 2020, CNET hosted a predictive tech panel, "IoT: Moving Into An Anticipatory Tech World," bringing together entrepreneurs leading the anticipatory tech space and privacy advocates.

As CNET Editor-at-Large Brian Cooley described it, each company in the conversation is working under a different level of scrutiny. Doug Clinton, co-founder and managing partner at Loup Ventures, is investing in startups including neurotech companies developing computer-brain interface technology. Some products in the fund's portfolio require FDA approval. Clinton is interested in technology that could use a person's thoughts to take action. Instead of hitting a button on an app to turn down a smart thermostat, a user would think, "Adjust the heat, I'm too hot."

Rana el Kaliouby, co-founder and CEO of Affectiva, is using machine learning and computer vision to understand emotions by analyzing expressions and vocal tones. Her company is a member of The Partnership for AI, a consortium focused on establishing best practices and educating the public on the impact of predictive technology.

SEE: CES 2020: The big trends for business (ZDNet/TechRepublic special feature)

Michelle Turner, senior director of product management for the smart home ecosystem at Google Nest, is operating in the "Wild West," with few formal regulations but high-profile problems as well.

Cindy Cohn, executive director of the Electronic Frontier Foundation, was the consumer advocate on the panel, balancing the best-case scenarios offered by industry insiders with the reality of how predictive tech can be used maliciously. Cohn said that layering technology onto systems that are already unfair often makes them worse.

“What if this data is shared with your insurance company, will you get kicked off your insurance? Will the cops show up at your door?” she said. “When we add technology on top of those systems, we wish those problems away or make them worse.”

Educating consumers about predictive tech

Turner from Nest said that smart home device makers are undergoing a shift, recognizing the growing demand for better privacy controls.

"We know we hold very sensitive data, and we have to protect it," she said. "We can't deliver this anticipatory tech unless we have access to this very sensitive data."

Turner stressed the importance of educating consumers about these devices and creating an easy opt-out system. The panel agreed that this was important, but the open question was who would be responsible for developing and enforcing such a consumer-friendly opt-out system. No one has volunteered to build one, Cohn pointed out.

“What does it mean if it goes wrong and who is responsible?” she said. “Right now there is a lot of shrugging going on when this comes up.”

Cooley pointed out that there are legitimate disagreements about what counts as a good use of data and what counts as a bad one, which makes it especially hard to set a common standard for the industry to follow. Turner from Nest said that individuals should make their own data-sharing decisions.

“Every one of us has a different threshold of what is OK for what we’re willing to share,” she said.

Cohn countered that the public should be setting these standards, not the companies making money from the data.

“We have to have the law and the policy to protect us, and we have to have a society ready to deal with it,” she said.

SEE: Prescriptive analytics research report 2019: Tech leaders open to emerging technology (TechRepublic Premium)

Diversifying the data set

Cohn also brought up the risk of building technologies that limit the customer base to white people in affluent neighborhoods.

“You’re going to really hurt those other people if you don’t anticipate how other people will be hurt by tech developed by middle class or rich people,” she said.

el Kaliouby said Affectiva has addressed algorithmic bias from the start by building a training database drawn from people of different ages, genders, and backgrounds.

“If our data set is just composed of older white guys, our tech is not going to work on people who look like me,” she said.

Clinton said Loup Ventures looks for founders who understand the malicious use cases for predictive tech and address privacy issues directly. He also said that some problems are inevitable.

“You know the components of the system, but you don’t know how the tech will react in the real world and there will be an unknown that we can never anticipate,” he said.

el Kaliouby of Affectiva said she sees the most promise for predictive tech in improving health and wellness.

“Our first application was with autistic kids and understanding non-verbal communications,” she said. “We are also working with Parkinson’s patients and even flagging signs of suicidal intent.”

Clinton said that Loup Ventures is particularly interested in the human ear and in computer-brain interfaces able to collect information from that data-rich part of the body.

Improving competition to protect privacy

Cohn likes the idea of adversarial interoperability, which would let consumers switch seamlessly from one tech service to another.

“Right now you really have to choose your prince and then you are beholden to that prince; things don’t work if you want to leave,” she said.

Cohn also said that the US needs to shore up laws and policies to create more competition to give consumers more choice among tech services.

“You sign on and get tracked all the time, or you don’t sign on at all–there’s a lot of space in there for innovation,” she said.