Building a slide deck, pitch, or presentation? Here are the big takeaways:
- Peter Thiel’s data consultancy Palantir was used at firms like J.P. Morgan to track nearly every aspect of an employee’s life, in an effort to protect against insider threats.
- Big data vendors like Cambridge Analytica and Palantir offer powerful analytics, but can land companies in regulatory trouble or tarnish their brands.
Data privacy has become a topic of critical concern among tech and business leaders, following revelations that Cambridge Analytica harvested the data of some 87 million Facebook users to build targeted political content.
However, Cambridge Analytica is likely only one of many firms doing similar work. Another company that has engaged in such controversial activity is Palantir, a data analytics consultancy founded by PayPal Mafia member Peter Thiel.
According to an investigative report from Bloomberg, published Thursday, Palantir was used at companies like J.P. Morgan to protect against insider threats. However, the tool was eventually used by an employee to spy on company emails and listen in on employee phone calls, including those of senior executives.
SEE: Big data policy (Tech Pro Research)
Palantir scrapes and combs through large swaths of data, using its technology to analyze the information and make connections between certain networks of people, the report said. It was originally used by the government to track terrorist activity following the September 11, 2001 attacks.
The Los Angeles Police Department (LAPD) also uses Palantir extensively, the report said, to predict which individuals are likely to commit crimes. This list of “chronic offenders” is given to patrol officers, who are instructed to stop those individuals as often as possible, frequently citing them for minor offenses like jaywalking. Data collected on each subject is added to a digital card that some LAPD officers can access without a warrant, the report noted.
Palantir also has a contract with Immigration and Customs Enforcement (ICE), where it helps handle case management for immigrants.
Big data and predictive analytics are used heavily throughout the enterprise in many industries. And while this information provides value in efforts such as targeted advertising and marketing, it also brings many risks to a company that partners with those providing such services.
With the heavy political pressure facing Facebook over its data debacle, it’s very likely that the US will soon see regulations akin to the EU’s General Data Protection Regulation (GDPR) for handling personal data. If that is the case, companies that partner with vendors that perform functions like those of Cambridge Analytica or Palantir may find themselves out of compliance.
Outside of the regulatory or compliance risks, a company also puts its brand and reputation at risk by engaging with such vendors. Many companies, such as Mozilla, pulled advertising from Facebook after the Cambridge Analytica scandal was revealed, so as not to be associated with the controversy. As more becomes known about Palantir’s work, companies that use its services could see their reputations damaged with customers and employees alike.
This risk extends to the big data work a company does in-house as well, Sheryl Kingstone, a research director at 451 Research, told TechRepublic. “They don’t have to go to Cambridge Analytica,” Kingstone said, as they can do similar things with data they have collected internally. Other big platforms from tech companies like Amazon and Google can also be used.
Pointing to a recent 451 Research article, Kingstone noted that customers used to be willing to share their data with a company for “something of value” like personalized offers. Now, however, “the trust is going down,” Kingstone said, as customers are beginning to realize just how much data is out there. And companies will have to work hard to become more transparent if they want to win those customers back.
“As a society we are entering a pivotal moment where we need to decide whether the benefits of sharing our data outweigh the potential downside risks including reduction of privacy and potential behavior manipulation,” said IP Architects president and cybersecurity expert John Pironti. “Individuals like the idea of free services that add value and benefit to their lives, but often do not realize that someone has to pay for those services for them to continue to evolve and be successful. If you are not paying for a product then you are the product.”