Jan. 28 was Data Privacy Day, a date recognized by Congress in 2009. Its purpose for the past 11 years has been to spotlight issues involving consumer and business privacy and data protection to promote greater awareness and education about the concepts.
Every company, large or small, needs to have a principled focus on data privacy in order to safeguard operations and customers. I discussed the topic with Rina Shainski, chair and co-founder of Duality Technologies, a data encryption provider.
SEE: Cheat sheet: Facebook Data Privacy Scandal (free PDF) (TechRepublic)
Scott Matteson: How do policymakers and enterprises around the world view data privacy in 2020? What trends should we be looking out for?
Rina Shainski: Our right to personal privacy, specifically our data privacy, has become an increasingly important issue, driven by the realization that the growing abundance of data (“needle in a haystack”) doesn’t “shield” individuals anymore. In fact, the opposite is true, due to the exponentially growing power of artificial intelligence and machine learning that know how to “feed” on such copious amounts of data.
More stringent regulation has naturally followed, beginning with the introduction of GDPR in 2018 and continuing with CCPA, which went into effect in January of this year. What sets GDPR and CCPA apart from the previous generation of data privacy regulations are the heavy fines that regulators can now impose on non-compliant companies. These potentially harsh sanctions are designed to force businesses to comply, which in turn shapes how they use data. I believe the momentum of data privacy regulation will continue into 2020, as evidenced by federal data privacy initiatives already underway, with tailwinds from the success of existing regulations.
The push for additional legislation is being driven by ongoing reports of intentional and non-intentional misuse of data, risks to privacy, and consumers’ growing lack of trust in data aggregators, combined with dissatisfaction with how their private data is treated. The big challenge that is emerging in 2020 is how to reconcile the benefits of smart services driven by data and AI with individual privacy, which is recognized as a “basic human right.”
SEE: 5 things developers should know about data privacy and security (TechRepublic)
Scott Matteson: What are the challenges involved and solutions available?
Rina Shainski: Technology (AI and ML) is being "blamed" for our current data privacy imbroglio, but technology can help solve it as well. Privacy enhancing technologies (PETs) represent a new, emerging category of technologies and are increasingly being used to protect data privacy while enabling data use. Before PETs emerged, solutions tended to rely mostly on de-identification and anonymization, which usually involved removing personally identifiable information (PII) fields from data sets. However, anonymization has been rendered insufficient by advances in AI and machine learning, which enable re-identification of anonymized data. PETs in the realm of secure computing, such as homomorphic encryption, secure multi-party computation (MPC), zero-knowledge proofs, and differential privacy, are introducing new paradigms for protecting various modalities of data usage. For example, my company, Duality Technologies, enables data science computations to be performed on encrypted data, which allows sensitive data to be analyzed and processed by our customers' partners while remaining protected. That opens up a whole new world of data-driven business and research collaboration, maximizing data utility while protecting data privacy.
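To make the homomorphic-encryption idea concrete, here is a minimal sketch of the Paillier cryptosystem, a classic additively homomorphic scheme: multiplying two ciphertexts yields an encryption of the sum of the underlying plaintexts, so a third party can aggregate encrypted values without ever seeing them. This is an illustrative toy with tiny hardcoded primes, not Duality's product or anything resembling a production implementation; real systems use vetted libraries and 2048-bit-plus keys.

```python
import math
import random

# Toy Paillier key setup -- demo-sized primes only, NOT secure.
p, q = 293, 433
n = p * q
n_sq = n * n
g = n + 1                        # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)     # Carmichael function lambda(n)
mu = pow(lam, -1, n)             # with g = n + 1, L(g^lam mod n^2) = lam mod n

def encrypt(m: int) -> int:
    """Encrypt plaintext m (0 <= m < n) with fresh randomness r."""
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Decrypt ciphertext c: m = L(c^lam mod n^2) * mu mod n."""
    L = (pow(c, lam, n_sq) - 1) // n
    return (L * mu) % n

# Additive homomorphism: a ciphertext product decrypts to the plaintext sum,
# so an untrusted party can add 5 + 7 without seeing either value.
c1, c2 = encrypt(5), encrypt(7)
assert decrypt((c1 * c2) % n_sq) == 12
```

Schemes used in practice (and fully homomorphic variants that also support multiplication) are far more elaborate, but the core property shown here, computing on data while it stays encrypted, is the same one that enables the privacy-protected collaboration described above.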
SEE: Is your data policy ready for California’s new consumer privacy act? (TechRepublic)
Scott Matteson: How does privacy differ between consumers and businesses?
Rina Shainski: Consumers are often the owners and the source of private data. Enterprises are usually custodians, aggregators, and processors of these consumers' data. Data privacy regulation usually applies to enterprises that generate private data, often as a result of providing services, and then aggregate and process it.
Consumers, by using various services, generate a "digital footprint" that can be refined and used by service providers for the benefit of better services, research, and scientific progress in many fields. But the same "digital footprint," when misused, can allow individuals' data to be terribly exploited, which, as we have seen, adversely affects people's lives as well as entire economies and political systems. This has led to a crisis of trust between consumers and enterprises.
SEE: What businesses need to know about the California Consumer Privacy Act (CCPA) (TechRepublic)
Scott Matteson: I see a growing trend among many to abandon social media entirely. Is it too drastic a step? If so, what are some options people can take to be able to use social media with minimal risk?
Rina Shainski: Social media is only part of the story. An individual's "digital footprint" is not confined to social media. For example, our location data is tracked by a plethora of apps running on our smartphones, and app owners trade this highly lucrative information, making their "free of charge" use model something of a misnomer. The same applies to some of our healthcare records, financial information, and other incredibly sensitive data.
Social media tends to expose a greater amount of information about individuals, as people willingly share their personal stories with their social network, while indirectly allowing the social media service provider to trade on their personal information for financial benefit. In return for the "free" services consumers get from social media providers, consumers allow companies to monetize their personal data and content. Social networks that let users pay for their services in return for a guarantee that their personal data will not be monetized could offer a reduced-privacy-risk upgrade to the status quo.
While it is an advantage to receive personalized services and personalized healthcare, which need our data to train the relevant AI applications, we must still maintain the right to keep our data protected. This is the central challenge of the new data economy.
SEE: Cisco study finds huge returns for companies investing in privacy (TechRepublic)
Scott Matteson: How will privacy concerns and remedies be shaped in the future?
Rina Shainski: The Age of Big Data is here, and converting data, including personal data, into economic value is fueling the emerging data economy. AI/ML is the engine accelerating many innovative capabilities that are of enormous benefit to all of us; scientific discovery and medical breakthroughs, to name just two. However, in parallel, our resources and ingenuity must also be invested in protecting data privacy in this new data-driven world, or the data economy will not be able to progress apace. I believe that privacy enhancing technologies will play an exceedingly important role in protecting data privacy while enabling the extraction of value from data through privacy-protected collaborations across ecosystems and data value chains.