
Behavioral marketing: Why Johnny can't opt out

Behavioral advertising can be intimidating if privacy is one of your concerns. Thankfully, we're able to opt out of being tracked. Or are we? Michael Kassner investigates opt-out tools.

Whether we like it or not, behavioral marketing is here to stay. For those who disapprove, I offer solace. We at least have the option to opt out. Or so I thought.

"The current approach for advertising industry self-regulation through opt-out mechanisms is fundamentally flawed."

That quote is the conclusion reached in "Why Johnny Can't Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising" by the research team of Pedro G. Leon, Blase Ur, Rebecca Balebako, Lorrie Faith Cranor, Richard Shay, and Yang Wang, all from Carnegie Mellon University.

Here's more good news:

"Users' expectations and abilities are not supported by existing approaches that limit online behavioral advertising by selecting particular companies or specifying tracking mechanisms to block.

"Users have great difficulty distinguishing between tracking companies. They also lack sufficient knowledge about tracking technology or privacy tools to use existing privacy tools effectively."

Wonder how the research team came to those conclusions? Here's how. They first divided the privacy tools into three groups: opt-out applications, settings built into web browsers, and cookie-blocking software. With the tools grouped, the team then created a survey specifically designed to determine how well volunteers understood each of the methods meant to inhibit tracking.
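The opt-out applications in the first group generally work by setting a persistent opt-out cookie that participating ad networks promise to check before profiling you. As a rough sketch of the mechanism and its well-known fragility (the cookie name and check below are illustrative, not any specific network's):

```python
# Sketch of a cookie-based opt-out, as used by many ad networks.
# The "id=OPT_OUT" cookie name is a hypothetical example.

def should_track(cookies: dict) -> bool:
    """Return False when an opt-out cookie is present."""
    return cookies.get("id") != "OPT_OUT"

# The catch: clearing your cookies also clears the opt-out.
jar = {"id": "OPT_OUT"}
assert should_track(jar) is False  # opted out
jar.clear()                        # user clears cookies
assert should_track(jar) is True   # tracking silently resumes
```

This is one reason the tools are confusing: the very privacy habit of clearing cookies undoes the opt-out.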

We seem to have a problem

The problem appears to be real. But it's unclear to me whether it's user-related or software-related. And I owe it to you, the reader, to find out which.

Thankfully, I recognized one of the authors, Dr. Lorrie Faith Cranor. When I interviewed Dr. Aleecia McDonald for an earlier article, she brought up Dr. Cranor's work several times. So, I contacted Dr. Cranor, mentioning my dilemma. She was happy to clear up any confusion I had. I also had some questions on how the survey was conducted. That's where I started.

Kassner: I would think participant selection is important in a survey like this. How did you go about selecting people?

Cranor: We sought nontechnical participants who were not knowledgeable about privacy enhancing tools, but interested in trying them. Since we were using IE9 on Windows 7 and Firefox 5 on Windows 7 and Mac OS X as our testing platforms, we recruited participants who had experience using one of these operating systems and browser combinations.

We recruited participants from the Pittsburgh region using Craigslist, flyers, and a university electronic message board. Recruitment material directed prospective participants to a screening survey. From those who completed the survey and met our screening criteria, we recruited five participants for each of the nine tools we tested, for a total of 45 participants.

Kassner: You tested nine tools that fit into the three categories mentioned earlier. How were the tests carried out?

Cranor: Each of the 45 sessions was moderated by one of two researchers. Participants were randomly assigned to the tools considering their browser and OS preferences.

We began each session with an interview to gather the participant's perceptions, knowledge, and attitude about online advertising. We then showed the participant an informational Wall Street Journal video about online behavioral advertising.

Next, we asked participants to perform three types of tasks using a computer in our laboratory configured with their assigned Internet browser and operating system.

Installation and Initial Configuration: We provided a simulated email from a friend suggesting they try the assigned tool. After installing it, the participant answered questions designed to measure his or her perception and understanding of the tool.

Configuration of Specified Settings: To evaluate participants' ability to use the tool, we asked each participant to configure their assigned tool to a set of specifications (fairly protective settings) we provided. The participants then answered questions related to configuration.

Fine-Tuning Settings to Resolve Problems: We then asked the participant to perform five browsing tasks with the tool installed and active. We advised the participant to change the tool's settings if need be. Afterward, participants answered questions about their experiences.

Kassner: Now I'd like to look at the survey findings, particularly the following synopsis:

"Users tend to be unfamiliar with most advertising companies, and therefore are unable to make meaningful choices. Users liked the fact that the browsers we tested had built-in Do Not Track features, but were wary of whether advertising companies would respect this preference.

"Users struggled to install and configure blocking lists to make effective use of blocking tools. They often erroneously concluded the tool they were using was blocking online behavioral advertising when they had not properly configured it to do so."
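The Do Not Track feature participants liked boils down to a single HTTP header, DNT: 1, sent with every request. Whether anyone honors it is entirely up to the receiving server, which is why the participants' wariness was warranted. A minimal, illustrative server-side check (the function name is hypothetical) might look like this:

```python
def user_opted_out(headers: dict) -> bool:
    """True when the browser sent the Do Not Track header."""
    # DNT is only a stated preference; an ad server is free to
    # ignore it, which is exactly what participants worried about.
    return headers.get("DNT") == "1"

assert user_opted_out({"DNT": "1"})
assert not user_opted_out({})  # header absent: no preference expressed
```

Compare that with the blocking tools below: a blocker enforces the user's choice locally, while DNT merely asks politely.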

The paper offered several examples in its appendix to back up the conclusions. The configuration window for TACO is one such example.

Here's a test. Do you understand Targeted Ad Networks, Web Trackers, and Cookies well enough to effectively configure the app? Most of the volunteers did not.

Kassner: Professor Cranor, how would you fix these issues? Would you place more emphasis on user education? Do the tools need to be more intuitive?

Cranor: There are many things that can be done to clean up the user interfaces, including removing a lot of jargon, simplifying the interfaces, and making the workflow clearer to users.

Some of the tools have default settings that are not what users expect, so changing these settings would probably help. User education would likely help as well. I think if the ad industry is serious about their opt-out solutions, they need to run ads that explain how this works.

Internet Explorer needs a more holistic approach to privacy protection rather than providing several different privacy tools with different user interfaces that are nearly impossible for most users to figure out. Ultimately, we may need to rethink this approach to privacy protection, in which we ask users to distinguish among hundreds of trackers from companies they have never heard of.

Kassner: I'm betting most of the results were expected. Am I wrong; were there any surprises?

Cranor: We expected to see many of these problems. Still, I was surprised by how frequently people were confused by these tools and had trouble using them. I was especially surprised to see so many people who mistakenly believed they had configured the tools in a highly protective way when, in fact, they had not.

Kassner: One last question; it's from my writing mentor. He understands online behavioral advertising, having proofed several of my articles. In fact, on my insistence, he tried several of the opt-out tools. And, just like your survey participants, he became frustrated.

With what you learned and what you know about online behavioral advertising, what would you suggest he do to avoid being tracked?

Cranor: I personally use Ghostery on a regular basis. I like the fact that it blocks trackers, but also has an easy way for me to monitor what it is blocking and to selectively unblock some trackers when they are needed to prevent the websites I am visiting from breaking.

But, to use Ghostery effectively, you have to make sure you explicitly tell it to block trackers. And, sometimes you have to do a bit of detective work to figure out what you need to unblock when things break.
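The block-then-selectively-unblock workflow Dr. Cranor describes can be sketched as a blocklist/allowlist check on each outgoing request's domain. This is a minimal illustration, not Ghostery's actual implementation; the domains and list contents are made up:

```python
from urllib.parse import urlparse

# Hypothetical lists; real tools such as Ghostery ship curated blocklists.
BLOCKLIST = {"tracker.example", "ads.example"}
ALLOWLIST = set()  # trackers the user unblocked to fix broken sites

def is_blocked(url: str) -> bool:
    """Block a request whose host is a listed domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    for domain in BLOCKLIST - ALLOWLIST:
        if host == domain or host.endswith("." + domain):
            return True
    return False

assert is_blocked("https://cdn.tracker.example/pixel.gif")
ALLOWLIST.add("tracker.example")   # the "detective work": unblock to fix a page
assert not is_blocked("https://cdn.tracker.example/pixel.gif")
```

The allowlist step is the part that trips users up: effective use means noticing a broken page, guessing which blocked tracker it needs, and unblocking only that one.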

Final thoughts

I'd like to thank the research team headed by Professor Cranor. They started the ball rolling; now it's up to us. How should we proceed? Should we tell developers to make the tools more intuitive, or help each other better understand how the tools work? Thoughts?


Information is my field...Writing is my passion...Coupling the two is my mission.
