Cellebrite, an organization that sells smartphone hacking equipment to law enforcement, military, and government intelligence bureaus, has been hacked, according to a report published by Motherboard. The hackers exfiltrated 900 GB of data from the company through a customer support portal, including customer information, technical documents, support tickets, and usage logs from currently deployed Cellebrite “Universal Forensic Extraction Device” (UFED) systems.

SEE: Mobile hacking firm Cellebrite confirms server breach (ZDNet)

While the data has been made available to Motherboard, as well as passed around among certain individuals on IRC channels, the full trove of data has not been publicly released at this time. Cellebrite’s statement on the hack noted that “…the company is not aware of any specific increased risk to customers as a result of this incident,” and that “The company is working with relevant authorities regarding this illegal action and are assisting in their investigation.”

What exactly is Cellebrite?

Cellebrite is an Israeli company, founded in 1999, which originally produced devices that facilitated the transfer of data between phones; these devices were sold to mobile carriers for use in retail settings when customers upgraded to a new phone. While that business continues, the smartphone hacking subsidiary was started some time afterward.

In 2007, Cellebrite was acquired by Sun Corporation, a Japanese electronics firm perhaps best known in the United States for developing and publishing home video games in the late 1980s and early 1990s, such as Blaster Master and Spy Hunter.

Hackers hacking hackers, or, is turnabout fair play?

The legal and moral justification for selling hacking tools is a tenuous one, particularly when the organizations selling them accept clients that employ less-than-scrupulous methods. The Michigan State Police have previously been challenged over their use of Cellebrite devices in ways that may violate the 4th Amendment.

Outside of the United States, Cellebrite devices have been used in countries with poor human rights records, such as Turkey, Bahrain, and the United Arab Emirates, according to support tickets among the trove of data released to Motherboard. Of note, the 1998 Wireless Telephone Protection Act explicitly prohibits the use or sale of devices capable of cloning SIM cards, a feature that Cellebrite’s UFED systems include.

It should scarcely be surprising that selling access to the information of others raises ire in security research circles, particularly as much of the heavy lifting of this work, finding and documenting software vulnerabilities, is more often done by academics and professionals in security research than by the organizations marketing to them. The hack of Cellebrite is the latest event in an ongoing dispute between organizations that sell packaged exploits of security vulnerabilities and independent security researchers and hackers.

In 2015, 400 GB of data, including source code, was distributed online following a hack of the uncreatively named Italian firm “Hacking Team” by a hacker identified as “Phineas Fisher.” ZDNet reported at the time that “Hacking Team” had previously denied selling spyware to Sudan, while a receipt for €480,000 ($530,000) from Sudan was found among the leaked documents. Among the products the Italian company was selling was open-source code from security researchers such as Collin Mulliner.

SEE: IT leader’s guide to the Dark Web (Tech Pro Research)

The moniker “Phineas Fisher” was likely derived from an earlier hack: the same hacker was apparently responsible for the release of a 40 GB archive of source code and client lists from “Gamma Group,” the vendor of the FinFisher malware package, which was also sold to repressive governments that used it for spying and retribution against domestic political rivals. Reporters Without Borders named Gamma an “Enemy of the Internet” in 2013.

What’s your view?

Should tighter controls be placed on the sale of packaged exploits? Should security researchers adopt a standard license prohibiting the use of proof-of-concept code by vendors of packaged exploits? Share your thoughts in the comments.
