TechRepublic’s Karen Roby spoke with Catherine Zhu, special counsel at Foley & Lardner, about the changing landscape of data privacy laws. The following is an edited transcript of their conversation.
Karen Roby: When it comes to businesses and data privacy, where do you see businesses making mistakes? What are some of the things they’re not doing, or not considering, that they should be?
Catherine Zhu: I work with a lot of earlier-stage businesses and, I think, depending on the stage, there are different potholes that businesses can run into. I would say on the earlier-stage side, a lot of companies that I work with aren’t thinking about data privacy at the start. Because when you’re starting a company, there are a lot of different things you’re trying to do. You’re trying to get your product to market. You’re trying to get investment money. You’re just trying to get the ball rolling. And it’s easy to push data privacy compliance and principles down the road at that stage.
And I think that makes sense. But I think where it can really come back to hurt a company is when you push it down too far and you’ve built up all these operations and processes and everything without taking data minimization into account, without taking data privacy into account, it’s almost like an accumulation of “privacy debt” in the same way that you can accumulate technical debt, which makes it difficult later on to go back and revise all those processes and operations that are now baked in.
So, I would say, starting off as a company it makes sense to prioritize your resources because you have limited resources, but pushing privacy compliance too far down the road can definitely hurt you.
I think the larger businesses tend to have more resources. For example, the ones that I work with might even have an internal privacy team. And then, it really becomes about staying on top of the rapidly changing regulatory landscape and making sure that your organization is adapting in a timely manner to the changes that are coming, whether in the form of newly passed laws or trends on the regulatory front, and not leaving any gaps there.
Karen Roby: Catherine, about some of the things coming down the pike and what we’re seeing from a regulatory standpoint: Is there anything that’s kind of stood out to you as of late that you think is important to mention?
Catherine Zhu: I think so. On the U.S. side, there has been a lot of regulatory change in the last, I want to say, two years. And before that, in 2018, that’s when Europe passed its big GDPR legislation, which was a huge change not just in European data privacy law, but in the global way of thinking about privacy law, especially for the U.S. In the last two years, these new regulations have been rolling out at a very fast clip, starting with the California Consumer Privacy Act that went into effect in early 2020, which became the most stringent consumer data privacy law in the United States when it was passed. Since then, we’ve seen Virginia pass its own data privacy law, as well as Colorado in the last few months. And in California, there’s actually been a rather significant update to the consumer privacy law that’s going to take effect at the end of 2022.
So, things are changing very quickly. Whereas even three years ago there wasn’t a governing consumer privacy law in the U.S. to look to, we suddenly had a very complicated and stringent one starting in 2020. And now, it’s rapidly evolving into a patchwork of different state laws that need to be accounted for, especially for companies that operate across states.
People are wondering: Is there going to be federal privacy legislation passed so that we don’t have to do a multi-state analysis? That’s an open question. Are more states going to come out with their own consumer privacy laws, like New York, Florida or Washington? That’s also a possibility; those are being discussed. So, keeping track of what’s happening at both the state and federal level has really been a hallmark of the last two years on the U.S. side.
Karen Roby: When we look at consumers, and we’re all consumers, this is something we deserve. There are so many questions out there, and people are confused. They have no idea where their data is going, who’s trading it, and who’s doing this and that with it. Privacy should be of the utmost importance.
Catherine Zhu: Yeah, that’s right. I would say there’s almost been a change in public sentiment. Maybe 5, 10 years ago, people didn’t really care if companies collected their data; maybe the mindset was the more, the better. I think that’s really turned around in these last few years, where people, as well as regulators, and businesses as a result, are thinking, “We actually do need to protect this data. We need to set limitations on the data that’s being collected. We need to minimize the data that’s being collected.” So, there’s really been a shift, both in public sentiment as well as the law.
Karen Roby: Yeah, you can definitely feel that that change has come on. I know just myself, I get really nervous when I’m filling something out and they’re asking questions, and it’s like, “Oh, what are they doing with this?” And understandably, for people who don’t work in this business or really understand tech and data privacy, it’s a lot to take in. Catherine, you recently put together an article regarding dark patterns. Talk a little bit about that. What does it mean? What do people need to know?
Catherine Zhu: As I mentioned earlier, in my legal practice I mostly advise businesses, a lot of them on the earlier-stage side, on data privacy compliance. The dark patterns article really came from sensing a shift in the regulatory atmosphere for data privacy.
I’ll just start with what dark patterns are. Dark patterns have been around for a long time. They’re essentially a design feature that is manipulative. For example, you go on a web app or a mobile app, a pop-up comes up, and it asks you for information. The option to provide that information very much looks like the only option, while the option to not provide it is very small and tucked away somewhere. That’s an example of a dark pattern.
Another dark pattern is when you go into your account for a certain subscription and try to opt out, and it won’t let you; it’s very, very difficult to do. Or some advertisement comes through and asks for your email, telling you you’ll get $25 if you provide it. You put in your email, and then it asks you for your phone number. So, it’s a way that the user interface can be designed to manipulate consumers, either into doing something they didn’t actually want to do, or to prevent them from doing something, like opting out, that they set out to do.
Dark patterns have been around for a long time, but I think they’re becoming more and more problematic as society has become more digitized. The article talks a little bit more about that. And we’ve seen, on the regulatory front, that both federal and state regulators are starting to pay attention to this. At the state level, both the Colorado and California consumer privacy laws ban the use of dark patterns as a legitimate means of getting consent. So, if someone gave you their consent or opted in because you used a dark pattern, like a manipulative interface, that is not going to be considered legitimate under these laws.
At the federal level, the FTC has authority to prosecute companies for deceptive trade practices, and it held a workshop in April of this year specifically analyzing the use of dark patterns. Now, it’s a tricky area, because it’s hard to say what is and isn’t a dark pattern. Sometimes it’s very obvious, but sometimes it’s more subtle. The article also talks about how automated technology that iterates on user input can lead to a proliferation of dark patterns without human intervention. And so, if we’re not cognizant of the impact of these dark patterns, we can easily find ourselves awash in them.
Finally, from a societal standpoint, dark patterns tend to have a disparate impact on different groups, especially historically disadvantaged groups: children, older adults and people who do not have high digital literacy. So, if we allow the unregulated proliferation of dark patterns, there will likely be a disparate impact that re-entrenches existing inequities.
I think, for all of those reasons, dark patterns have really piqued the attention of regulators. And, as a result, businesses need to stay aware of this trend in privacy regulation. It might impact product design, user engagement and a lot of different aspects of a business.
Karen Roby: Catherine, businesses have to stay up to speed on that, as it could impact their products and how they roll things out. I think we’re finally at a point where businesses can’t just put their head in the sand and say, “Well, we didn’t know.” If you’re going to be in business, it’s just like anything else: You’ve got to know the rules and the laws and everything that goes along with them, especially as it relates to people’s private information.
Catherine Zhu: I definitely agree with that, Karen. I think, at this point, data privacy and data protection have really become table stakes, especially if you’re operating a technology business. There’s no way to ignore it now, and there definitely won’t be in the future.