5 things developers should know about data privacy and security

In a post-Cambridge Analytica world, developers are more important than ever to the data privacy and security of the software they build.

In this week's episode of Dynamic Developer, I had a chance to speak with Joshua Kail, the director of public relations at the LA-based PR firm PPLA, about what developers should know about data collection.

Now, if you're wondering why we're talking to someone in PR on a developer show, Josh isn't your typical flack. He has 14 years of experience working in communications and PR with companies that do everything from AI and ML to blockchain, data analytics, security and storage. He was also the senior PR manager for Cambridge Analytica's agency from the end of 2016 through early 2018, so he has firsthand experience with what can go right, and what can go wrong, when it comes to data collection, privacy and security.


These days, Josh is writing about the future of data policy and ownership, which are important issues for everyone involved in the development of a new app or system, from developers to project managers and everybody in between. That's why he's here talking about some of the things developers need to know about data collection. The following is an edited transcript of the interview.

1. Data should be owned by the individual

Bill Detwiler: So Josh, let's get right to it. What is the first thing that you think developers need to know about data collection, maybe that they don't know already or isn't that apparent?

Joshua Kail: Well, I think first, on a general societal level, we're really at a precipice between everything we've considered to be ourselves, who we are and our data, prior to the last election and Cambridge Analytica and everything it exposed, and we really need to decide how we're going to approach our data ownership as individuals, as businesses and as a society. One of the foundational parts of this is how we approach it at the development level, because it's such a tightly interwoven tech issue. One thing we really need to start looking at differently is the separation between data ownership, data privacy and data security, because a lot of times, I feel, when people are talking about this, it all gets balled up into one.

Data security, which is something we should all hopefully be very familiar with, is about an outside evildoer trying to come in and pull data that is controlled by a company on behalf of the individual. What I'm concerned with, and what I think we really need to be focusing on from development onward, is the question of who owns that data to begin with: whether ownership is the inherent right of the corporation or app owner that has access to that data, or of us as individuals. By looking at it from that constructive viewpoint, as far as where ownership lies, I think we can build out from there. My perspective is that the data is owned by the individual, and the assumption that companies or organizations have inherent ownership of it the moment they collect it is the wrong direction to be going in.

2. Developers must be mindful about data collection

Bill Detwiler: I know something that you and I have talked about is being proactive about data ownership and security in the development process. Talk a little bit about that.

Joshua Kail: Sure. When you're talking about proactive behavior around data collection and data ownership at the developer level, it's really being in the mindset, whatever you're developing, whether it's an app, a program, a platform, whatever, of where the information that comes in is coming from, what's retained, what's kept and how it's being used. Now, a lot of that is going to be shaped by regulations like GDPR and the California Consumer Privacy Act, which set legal mandates on how far you can go, but there's still such a wide-open field in terms of interpretation, the likelihood of being caught and the risk-reward calculus of taking that bet or not. We really need to look at it at that fundamental level: How is this data being handled the moment it comes in, and where is it coming from? Can it be forgotten easily? Can it be disregarded? And should you be pulling in all the information that you are, even though you're capable of it?

So be proactive in that. I understand that as a developer you don't have full control over what you're developing. Unless you're doing it for yourself, there are people who hired you to do it; if you're not independent, you have a hierarchy of people at work telling you what to do from the top down. But there are ways of thinking about this so that you can either voice those issues early on and start that conversation, or proactively avoid potential ethical or legal dilemmas in development where the process allows for it.
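Kail's questions, "Where is it coming from? Can it be forgotten easily? Should you be pulling it in at all?", translate directly into code. Here's a minimal sketch of that data-minimization mindset in Python; the `SignupStore` class and its field names are hypothetical, invented for illustration, not part of any real framework:

```python
# Collect only the fields the feature actually needs, even if the
# client sends more, and make deletion ("forgetting") a first-class
# operation instead of an afterthought.

ALLOWED_FIELDS = {"email", "display_name"}  # fields with a stated purpose


class SignupStore:
    """Keeps only allow-listed fields and supports easy deletion."""

    def __init__(self):
        self._records = {}

    def collect(self, user_id, raw_payload):
        # Drop everything we don't have a stated purpose for.
        minimized = {k: v for k, v in raw_payload.items() if k in ALLOWED_FIELDS}
        self._records[user_id] = minimized
        return minimized

    def forget(self, user_id):
        # "Can it be forgotten easily?" -- yes, by design.
        return self._records.pop(user_id, None)


store = SignupStore()
kept = store.collect("u1", {
    "email": "a@example.com",
    "display_name": "Ada",
    "device_id": "abc123",          # capable of collecting it; don't
    "contacts": ["b@example.com"],  # ditto
})
print(kept)  # only email and display_name survive
store.forget("u1")
```

The point isn't the specific class; it's that the decision about what to retain is made explicitly at the point of collection, where a developer can raise it, rather than implicitly by whatever the client happens to send.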

SEE: Open source can thrive in a recession says Drupal creator Dries Buytaert (TechRepublic)

3. A chief data officer (CDO) can help balance the needs of everyone involved in data collection

Bill Detwiler: Who are some of the key people in that process? As you mentioned, developers can sometimes feel like a cog in a big machine; they're just one part of the process. The project managers or product owners might dictate: Hey, here's what we want this product to do. Here are the features we want it to have. Here's how we want it to look. Here's the data we want it to collect. Who are the important people in that process? Or is there one person who's more important than the others in helping developers have a bigger role, or in guiding that process in a way that doesn't get companies into legal or ethical trouble when it comes to data privacy?

Joshua Kail: Right. Obviously it depends on the organization and the scenario you're in. One, you as a developer, being that mind, being that voice, is crucial, because of everyone involved, you're the one who has the power to speak up, put something on the record or make a suggestion. From there going up, I think that's where we're seeing the role of the CDO come into a more understood level, because ideally the CDO has one foot in development, one foot in marketing, and understands the nuances of each. Traditionally you have the developers and the data science side of things and you have the marketing side of things, and they often butt heads. As someone who's worked between those two forces, it's not always the most copacetic of relationships.

So having someone in place who understands the technical challenges, but also understands the application and the potential uses for it, can help. If you have a CDO who understands these nuances, who can work with the legal team but also understands where things are going from a societal perspective, it will help get those foundational data ownership policies and ways of thinking into place. The other thing is, if you look at data security, or general cybersecurity, and what's happened there, it's a good fortune teller for where things could go with data ownership. The developer certainly puts some mindfulness into data security, but it's not their sole function. That's where the rise of these layered-in cybersecurity solutions and vendors comes in, to help fill the spaces where data security fails at the pure development level, whether due to zero-day vulnerabilities or just general gaps, like the evolution of multifactor sign-in options and things along those lines.

And that goes back to the developer being mindful, being mindful of what these options are and what these potential issues are. At this point it's really more about starting the discussion and getting the mentality change, and hard development changes in terms of data collection. But the more you get that going now, over time, hopefully we'll move in a direction different from where we went with cybersecurity, where we have these outside vendors and security becomes a separate, independent thought that gets bolted on, rather than a holistic view of: Here's what I'm creating, here are the challenges we have with it, here are the solutions we can bring in to actually help the end user at every level of their concern.

Bill Detwiler: You brought up the CDO, that chief data officer, someone who actually has not just an overview of the entire process and isn't just working with everyone across the board from legal, development, product, everything, but also has the authority to make decisions.

Joshua Kail: Right.

Bill Detwiler: And can be the arbitrator there.

Joshua Kail: Right.

4. Developers can push the conversation on data ownership

Bill Detwiler: What about the business pressures that are put on individual developers and how do we address that?

Joshua Kail: Sure. This is where reality comes in. An idealist can go in and say change everything, make sure all the blocks are in place, screw whoever's hiring you, do what you know needs to be done and carpe diem. But that's just how you become an unemployed developer rather than a successful one. So again, part of it is raising your voice where appropriate, where you feel comfortable to, but the other part, from an organizational and professional basis, is the need to come together as an industry and say: These are the expectations. These are the ethical protocols we have in place, or that we are pushing forward, so that it becomes more of a forest. You're not an individual developer screaming into the wind; you're unionizing in that most classic and American of senses in order to make a social and a technical change within the system. And this is something I believe is going to be needed.

Again, it's not all on the shoulders of the developer. For this to drive meaningful change, it has to happen at the technical level, but also at the greater societal level, in how we as individuals approach our own data and how we value it. It can start with the developer, because on the greater issue, and I think we've seen this many times over, individuals are less concerned with their data ownership than they should be, for a multitude of reasons, so having it built in, and getting them into that perspective, will also help. But if you come together as an industry, and as we come together as multiple industries all along the way, different cogs in that ultimate machine of whatever is being developed, that's where the change can happen over time.

But the important thing now is to start that momentum so that the opposite doesn't happen. Change in how we approach data ownership and data collection, and ultimately analytics, starts with voices coming forward and with the consolidation of ideas and pressure for change; that's what builds the momentum to move it forward.

You're not an individual developer screaming into the wind, you're unionizing in that most classic and American of senses in order to make a social and a technical change within the system. Joshua Kail


Bill Detwiler: You mentioned that the public often seems less concerned with the security and privacy of their data, until they read a big news story or something happens to them specifically, than some of the folks inside the development process. I see the same trend, but sometimes I think it's because people just don't understand, or don't know, what is actually being collected or how that data is being used. So how important is transparency? We've all seen the multipage, thousand-word EULAs that no one ever reads.

Joshua Kail: Right.

Bill Detwiler: I've been in IT for 20-plus years. I've been in tech media for 20 years. I don't read the end-user license agreements. I don't read all the terms of service. Being a little paranoid, I just assume everything is collected, right?

Joshua Kail: Right.

5. Be transparent about the data you're collecting

Bill Detwiler: But how important is it to be transparent throughout the process and especially at the end with an app that you're building as a developer or just in general?

Joshua Kail: Transparency is key, whether it's heard or not. The user agreements are terrible because of the legal aspect; everyone wants to cover all of their bases, so it muddles things and you're never going to read it. But imagine if that was taken away entirely, if there were no visibility into it at all, so that option was totally gone. When it comes to data ownership on a wider scale, the reason I believe most people don't care as much as they should is that there are no pain points. When my data is being used for targeted advertising or for marketing toward me, I get annoyed, maybe, if an ad pops up too much in my feed, but I don't feel like I had anything taken from me, even though I absolutely did. Because there are no pain points, there's no action.

And Cambridge is a great example, because in January everything with Cambridge really started tumbling down. When the story first broke about all this Facebook data being out there, you would have expected people to drop off Facebook, advertising revenue to fall, all this other stuff. The reality was, you had people posting on Facebook, "Attention Facebook, I don't give you permission to use my data," which means nothing, and there was no measurable drop-off on Facebook, and ad revenue went up. So there's no pain point there, no facilitator for change.

The only place I can point to as an example, and this is on the data security side of things, is when Ashley Madison got hacked. All the people on Ashley Madison became public, and you could look them up in an easy-to-find directory. That was 100% a pain point for anyone in those directories, and while the behavior that led them to Ashley Madison probably didn't change, how they looked at their data and its accessibility online 100% changed, because they felt an immediate consequence of their data being somewhere they didn't control.

So that's the key of it. Transparency comes into play here twofold. One, as long as we still have the option to look through that window, as long as we can see what companies are using our data for, even if we don't feel a pain point, even if we don't care that much, it increases awareness. Then, as everything else elevates the value and importance of this issue, the transparency that was already established, that we have policies and expectations for, becomes exponentially more important, because when our collective sense of how crucial this issue is catches up, that transparency is already in place, and we can see where our data is going and make the necessary changes when the moment calls for it.
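One practical alternative to the thousand-word EULA Detwiler describes is publishing the same facts in a short, machine-readable form. Here's a minimal sketch in Python; the manifest schema, field names and retention values are invented for illustration, not any real standard:

```python
# A data-collection disclosure kept as structured data, so it can be
# rendered as a short human-readable summary AND consumed by tools,
# instead of being buried in legal prose nobody reads.

import json

DATA_MANIFEST = {
    "app": "example-app",
    "collected": [
        {"field": "email", "purpose": "account login", "retention_days": 365},
        {"field": "usage_events", "purpose": "product analytics", "retention_days": 90},
    ],
    "shared_with": ["analytics-vendor (aggregated only)"],
}


def disclosure_text(manifest):
    """Render the manifest as a short, plain-language summary."""
    lines = [f"{manifest['app']} collects:"]
    for item in manifest["collected"]:
        lines.append(
            f"- {item['field']}: {item['purpose']} "
            f"(kept {item['retention_days']} days)"
        )
    return "\n".join(lines)


print(disclosure_text(DATA_MANIFEST))        # for the user
print(json.dumps(DATA_MANIFEST, indent=2))   # same facts, machine-readable
```

Keeping one structured source of truth means the window Kail describes stays open: the summary shown to users can't silently drift from what the app actually declares it collects.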

More Dynamic Developer interviews and stories