How tech companies lost control of their products

Quantitative futurist Amy Webb discusses the roots of smartphone addiction and other unforeseen tech problems.

CNET and CBS News Senior Producer Dan Patterson sat down with the Future Today Institute founder and quantitative futurist Amy Webb to discuss the roots of smartphone addiction and other unforeseen tech problems. The following is an edited transcript of the interview.

Dan Patterson: I would be terrified, Amy, but I wasn't paying attention. I was checking my phone while you were talking. And of course, that phone was manufactured in China as well. Let's talk a little bit about my addiction to my smartphone. Why can't I stop?

Amy Webb: Why do you think you can't stop? I have a theory but what do you think?

Dan Patterson: What I actually think is different from what I am supposed to think. As a journalist, I can't put myself into the minds of others; I can only report and use what people tell me. But I think that, intentionally or unintentionally, some very large firms that didn't start as large firms figured out how to do certain things using this fantastic new platform you were speaking about a moment ago. We had this convergence of technologies that happened about a decade ago. And it created a super powerful processing device that I can put into my pocket, and soon it will be ephemeral and around us everywhere; we call that IoT.

And I think that, unintentionally or intentionally, some people figured out that there was a business model tied to my attention. And this wasn't some scheme to get me to use this thing over and over and over. It was just artificial intelligence and many different technologies that tapped into my dopamine systems, these reward systems. And I was rewarded for my behavior, and as technology became more sophisticated over the last decade, my behavior became tied to using one thing I call my smartphone. And now my dopamine systems have been rewired so that I can't do anything without first making sure that this thing is in my pocket. So that's my honest answer. But what I think is less important than what you think, because you study this.

Amy Webb: Yeah, I guess I agree with you. I don't think that any of these companies set out to create a device that was addictive. Rather, I think companies do not spend enough time with uncertainty. Instead, leaders tend to over-predict or under-predict change, especially change relative to them.

Dan Patterson: Because the stakes are very high with uncertainty.

Amy Webb: Sure, and so I don't think these companies are evil. I don't think the big nine is evil. I don't think their leaders are intentionally trying to sabotage humanity or democracy in any way; I really don't think so. I think that how we arrived at now has much more to do with a fundamental lack of planning. And sometimes when you are creating a game-changing, groundbreaking, fantastical new technology, the desire, which I completely understand, is to make the thing work. It is not, at the same time that you're trying to make the damn thing work, to also think through the next-order implications of whatever that thing is. And to some extent, you can't know them in advance.

SEE: Artificial intelligence: Trends, obstacles, and potential wins (Tech Pro Research)

Dan Patterson: Explain what you mean by next-order implications.

Amy Webb: So a great example is Google Maps. You might actually know this. Do you know what Google Maps was when they acquired it? It was called Keyhole. Do you remember?

Dan Patterson: Oh, yeah.

Amy Webb: So a bazillion years ago, there was this thing called Keyhole. And I can remember sitting at an enormous desktop computer.

Dan Patterson: And they were name-checked in that fantastic television show about American politics that we all forgot about, from the late '90s and early 2000s.

Amy Webb: West Wing?

Dan Patterson: West Wing.

Amy Webb: Oh, really?

Dan Patterson: Keyhole is name-checked in the first couple episodes of West Wing like crazy. So is Writely, which Google acquired to create Google Docs and Google Drive.

Amy Webb: I remember sitting with my dad and showing him a satellite view of our house, which I thought was cool. And my father was mortified when he saw this, because he's like, what right do they have to take a picture of our street and show our house? I think the challenge is that when Google acquired Keyhole, the whole thing became Google Maps. The idea was to help us get around, but you can't escape the business opportunities. And this is part of the challenge that we have going forward: the stuff has to make money because it costs money. This is a business, and as you rightly pointed out, in this modern world our attention is currency. There's no way to get around that.

Therefore, what companies ought to be doing is mapping out in advance: if we pursue this path, what are all the catastrophic scenarios that we can think of? Because we cannot control the evolution of this technology. Or better yet, we cannot control how consumers will use and possibly abuse this technology. And the optimistic and pragmatic scenarios too. If some iteration of these scenarios comes to fruition, what are the next decisions that get made? And so forth and so on; you keep going and going and going.

Dan Patterson: Kevin Kelly calls this the Technium. Kevin Kelly, the founding executive editor of Wired magazine, who started his life as a dirty hippie and became a technophile. But the idea is, well, we may make some explicit and some implicit decisions, but technology advances regardless.

Amy Webb: And I guess I would say that is not true, because that would assume that we are all cogs in somebody else's pre-ordained, pre-mapped story that's being told.

Dan Patterson: So you are saying, then, that there is some agency when it comes to the founders, or now the controllers. We're so used to thinking of these big tech companies as, well, the founders make the decisions, and they still do. But what we're also seeing is power ceded from one generation to the other. But-

Amy Webb: Well, there are two things to unpack there, and I'll get to the second one, about agency, in a second. But let me go back to something you just said, which is really interesting. And that is this idea that we believe that the founders are still--

SEE: Technology that changed us: The 1970s, from Pong to Apollo (ZDNet)

Dan Patterson: In control--

Amy Webb: In control and making these critical decisions. They are making some critical decisions. I would argue that the decisions that are being made every single day by all of the people who are working in the trenches are far more important and far more long-lasting. I'll give you an easy example that anybody can understand. When you're training a machine learning or deep learning algorithm, when you're training a new system to do things like recognize an object, that system needs something called a corpus. A corpus is a large set of data. Those data sets don't just appear. And in fact, there are a handful of usual suspects that are used within the industry to do some of that initial training. It's very difficult to get all that data put together in a way that's readable by machines; it takes time, it has to be tagged. There's a lot that goes into it to get the thing ready to use.

Those databases are flawed, they are full of bias and everybody knows this. And yet every single day that somebody uses one of those existing databases, it's a tiny decision, right? But it's a decision that has implications. It is a decision, which ultimately led to a couple years ago, somebody uploading photos of themselves on Google Photos. These were two people of color and Google Photos labeled them as gorillas.
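The mechanism Webb describes can be seen in miniature. The toy corpus below is entirely hypothetical; the point is only that a skewed set of labeled examples rewards a model that reproduces the skew, which is one tiny, easy-to-miss decision with outsized consequences:

```python
from collections import Counter

# Hypothetical toy corpus of (example_id, label) pairs.
# The imbalance is the point: 95 of 100 examples carry one label.
corpus = [("img_%03d" % i, "cat") for i in range(95)] + \
         [("img_%03d" % i, "dog") for i in range(95, 100)]

label_counts = Counter(label for _, label in corpus)

# A degenerate "classifier" that ignores its input and always predicts
# the majority label -- the behavior a skewed training set rewards,
# since it is correct 95% of the time on this corpus.
majority_label = label_counts.most_common(1)[0][0]

def predict(example_id):
    return majority_label

print(label_counts)        # the imbalance baked into the training data
print(predict("img_099"))  # every prediction repeats that imbalance
```

Real training pipelines are far more complex, but the failure mode is the same shape: whatever is over- or under-represented in the corpus quietly becomes the model's default.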

So Google itself isn't racist, right? The people who run it, the founders, aren't racist. And I would argue that the people on the team who built the thing, which ultimately resulted in a person of color being tagged as a gorilla, probably aren't racist either. It was a series of tiny decisions which led to a really bad outcome. The world that we are quickly moving into is one in which many of the decisions are being automated.

And if you reverse-engineer that automated decision-making process into its individual components, that to me is something. So it's not just Sundar Pichai of Google. It's not just Jeff Bezos, right? It's not just the founders and the leaders of these companies who have an overwhelming say in our futures. It's all of the decisions being made by all of the people who work at all of these companies on a daily basis. They have an enormous amount of moral and ethical responsibility that they bring to their work.
