In our age of smart machines, techno-sociologist Dr. Zeynep Tufekci raised a red flag about the implications of digital tools that monitor our online behavior.
"We are in a new age of machine intelligence," said Dr. Zeynep Tufekci at IdeaFestival 2015 in Louisville, Kentucky. "And we should all be a little scared."
"Machines are increasingly a form of intelligence approaching a biological level of complexity," she said.
Tufekci, an assistant professor at the University of North Carolina at Chapel Hill, a New York Times contributor, and a Berkman Center Fellow at Harvard, among other roles, is a self-described "techno-sociologist." Originally from Turkey, she got her start as a computer programmer. She soon recognized the major implications of computing for our lives and has devoted her career to studying the intersection of technology and society.
At Thrivals 8.0, an IdeaFestival program aimed at high-school students, Tufekci distinguished between different kinds of machine intelligence. There's the "smart toaster" kind, she said, and then there's the kind that can see what we're doing. She isn't concerned, for example, about the software in the airplane she flew to Louisville. "I'm concerned," Tufekci said, "about how these machines are watching me." And she laid out a number of reasons why we should all be concerned.
The ways in which our online behaviors are being monitored are often subtle, she said. "They're watching me on my phone. They're watching me on Facebook. They're even watching me when I want to hide," Tufekci said. "Machines are a form of intelligence, and they're being built into everything."
Tufekci takes aim at Facebook, in particular, for mining our behavioral data. Just by monitoring our "likes," she said, Facebook knows an incredible amount about who we are: information we haven't actually disclosed. It often knows our political preferences, our sexual orientation, even whether we may be depressed. The scary part? Facebook may know it before you do. And it can use that information to manipulate you: to sell you things, or to surface content that affects the way you feel and the actions you take.
"We have to be aware that these [algorithms] are nudging us," she said.
Machines, Tufekci warns, are also prone to errors, and they make different mistakes than humans do. "Our human institutions are designed for human weaknesses," Tufekci said. But we haven't yet designed our systems to account for the ways machines fail, because we're just beginning to learn their error patterns.
"We should be a little scared of all these inferences," Tufekci said. "We should be worried that they're making mistakes."
Tufekci has written about the Volkswagen emissions scandal, warning of the dangers of smart machines. "We have to deal with their sneakiness in new ways," she says. "We are entering into a new era of smart machines with tons of ethical questions."
What does the future hold? Tufekci remained hopeful that we will find solutions. She suggested that competition is the most likely catalyst for keeping Facebook honest and forcing it to be more conscientious about its behavior.
"Facebook is only ten years old," she told the audience of mostly high-schoolers. "Here you are! Create an alternative!"
- TechRepublic is covering the nexus of innovation at IdeaFestival 2015 this week
- Volkswagen and the era of cheating software (New York Times)
- Facebook and the tyranny of "like" in a difficult world (Medium)