A brief history of internet counterculture

If you want the good, bad, or ugly, you can find it on the internet.

Dan Patterson, a Senior Producer for CBS News and CNET, interviewed Brian McCullough, host of the daily Techmeme Ride Home podcast, about the dangers of counterculturalism on the internet. The following is an edited transcript of the interview.

Dan Patterson: How did the internet suddenly become a horrible place?

Brian McCullough: It always was. I must have gotten on the internet in '89, '90, and one of the first things that I found, aside from pornography, was The Anarchist Cookbook. There have always been these dark little corners of the internet. There has always been actual crime, and criminals and syndicates have been empowered. That's sort of the thing with all of technology: technology gives and takes away in equal measure. Every Bitcoin advance makes money digital and frictionless across the globe, but it also allows for easier money laundering and things like that. It was always there.

The difference, I think, is social. Now, with social, all those little niches and corners--those dark, damp rocks--can be cultivated, watered, and allowed to thrive in their own little niches. And at the same time, if people actively do it, they can be promoted and broadcast. So these tiny little dark corners that people knew were there, but that weren't in your face, are now in all of our faces.

Dan Patterson: Correct me if I'm wrong, Brian, but I would suspect, with your historical knowledge and your anthropological bent, that you've encountered many of these forces and the specific things some of these people are doing and talking about. I'm kind of skirting around the overt racism, the overt sexism, the violence, the pornography--not the banal things that we dislike, but the actual, overtly horrible things. I also encounter these things. I have data tools, as a reporter, that help me poke into these darker corners.

There are so many people who say, 'Look, I just don't feel it. I don't see it. I don't understand it. Why are all of these people talking about the horrible web? I don't get it.' If I'm somebody who feels as though the internet has not made my life worse, help me understand the specific actions, actors, and forces whose behavior is not just distasteful but downright criminal or really bad.

Brian McCullough: Again, it comes back to the democratization of voices and the ability to broadcast. We're sitting here in CBS News headquarters; 40 years ago, this was a platform that could reach millions of people, and very few could. Now you can be the horrible story out of New Zealand. You can be a mass killer, and you can have a camera on, and you can broadcast horrific hate and violence with the same level of success, in terms of reaching people, that CBS News has.

The problem is that when we say democratization, when we say everyone has a voice, the platforms are all equal now. That can be excellent, in the sense that a brilliant 12-year-old in Ecuador can learn things and disseminate ideas. But it also means a horrible racist somewhere can not only spread their hate but be indoctrinated into terrible things, because there is no functional difference now between what's broadcast by CBS News and what's broadcast by a YouTube channel, or just a live Twitch stream, or whatever.

Dan Patterson: I understand that social platforms help amplify terrible things. Help me understand specifically how algorithms and recommendation engines work in ways that aren't just Netflix telling me the next awesome thing to watch. There are recommendation engines built into Google and YouTube that have been blamed for extremism, or at least for amplifying extremism. Facebook, of course, is built on algorithms, and it has also been blamed for amplifying extremist behavior. What role do recommendation engines and algorithms play in amplifying this type of behavior?

Brian McCullough: This is one of the debates inside Silicon Valley about how things turned bad. When social networks started in the early- to mid-2000s, the idea was that more usage was clearly better--and not just for the bottom line, where more shares, more likes, and more time spent on the platform mean more ads sold. There is that, but it was also assumed that more usage meant people were getting more out of your platform. So all of the engineering done on social networks for the last 15 to 20 years was done in aid of creating greater engagement.

These algorithms are built for engagement--and personally, I think YouTube is the most egregious example of this. You watch a video; you enjoyed that video. They know, because they have all the data: they can see that you watched it all the way through. You watched it a second time; they can see how your mouse traveled. There's even the ability to track your gaze, and things like that. So they say, all right, someone who watched that entire video also liked this video, and they send you another one, and another one, and another one. Their idea serves two masters: more views, more ads, more money for them, but also the assumption has been that if you're engaged, you're enjoying it, so they're doing their job; they're delivering. Like you said, Netflix wants you to always find something interesting to watch.

Unchecked, that can lead you down a rabbit hole. This is literally true--people have written articles about it: you can watch videos about Martin Luther King, and if you follow the suggestion algorithms on YouTube, you can find yourself, within half an hour or two hours, being sent anti-Semitic or racist videos. The problem is that the algorithms were designed for only one metric, so it was always quantity over quality. It was always more engagement over taste.
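To make the "one metric" point concrete, here is a minimal, hypothetical sketch of an engagement-only recommender in Python. None of this is YouTube's actual code; the Video fields, the weights, and the numbers are all invented for illustration. The structural point is that a ranking function that sees only engagement signals has no way to distinguish a lecture from an inflammatory rant.

```python
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    watch_through_rate: float  # fraction of viewers who finished the video
    rewatch_rate: float        # fraction who watched it more than once
    co_watch_score: float      # "viewers of what you just watched also watched this"


def engagement_score(v: Video) -> float:
    # One metric, as described above: the ranker sees only engagement
    # signals. Nothing here models quality, accuracy, or taste.
    # The weights are arbitrary illustrative values.
    return 0.5 * v.watch_through_rate + 0.2 * v.rewatch_rate + 0.3 * v.co_watch_score


def recommend(candidates: list[Video], k: int = 2) -> list[Video]:
    # Quantity over quality: rank purely on predicted engagement.
    return sorted(candidates, key=engagement_score, reverse=True)[:k]


if __name__ == "__main__":
    # Invented numbers; outrage-style content tends to score high on
    # exactly these signals.
    candidates = [
        Video("Measured documentary", 0.45, 0.05, 0.30),
        Video("Inflammatory rant", 0.85, 0.30, 0.70),
        Video("University lecture", 0.55, 0.10, 0.40),
    ]
    for v in recommend(candidates):
        print(f"{v.title}: {engagement_score(v):.2f}")
```

Run in a loop, with each round's candidates drawn from what similarly engaged viewers watched next, a system like this drifts toward whatever maximizes the single metric--the rabbit-hole dynamic McCullough describes.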

Dan Patterson: But Brian, I'm a smart person. I can't possibly be manipulated. And yet you're telling me that I have been. How?

Brian McCullough: When people talk about information silos, this is kind of what they're talking about, because as smart and sophisticated as you might be, you're also self-selecting, in your own way, for the things that give you pleasure, that you agree with, that titillate you. So, in a way, the algorithms aren't forcing you into bad things; they're designed to react to human nature.

Again, I think one of the things Silicon Valley is realizing is that it's not just a one-dimensional problem of designing algorithms for greater engagement. Until that magical world where AI is perfect and solves all of our problems, there needs to be human curation. This gets into the debate about putting your thumb on the scale, and about censorship, but at the same time, never in media was there unfettered broadcasting of ideas.

There were always editors, going back to [Johannes] Gutenberg and the books. There were always people selecting--and not just to hold voices down, though that certainly happened. What Silicon Valley and the engineers who come up with the algorithms have realized, I think, is that there does need to be a human component, because curation always had a value and a purpose.
