Nearly a dozen employees have quit to protest the tech giant's work for the Defense Department's 'Project Maven,' where AI is used to analyze drone footage.
Since its inception, Google has promoted an employee-inclusive decision-making process and popularized its internal motto, "Don't be evil." But on Monday, about a dozen Google employees resigned in protest of the company's involvement in developing artificial intelligence (AI) software for a Defense Department drone program called Project Maven.
The employees told Gizmodo that since news of Google's involvement broke earlier this year, senior management has been less than forthcoming about how the decision was made, acting as though the issue had been sufficiently addressed by a statement and a few employee listening sessions.
More than 4,000 employees of the company signed a letter last month condemning Google's work with Project Maven and demanding more accountability with how the company deploys its products.
"Google is implementing Project Maven, a customized AI surveillance engine that uses 'Wide Area Motion Imagery' data captured by US Government drones to detect vehicles and other objects, track their motions, and provide results to the Department of Defense," the employees wrote in the letter. "We cannot outsource the moral responsibility of our technologies to third parties. Building this technology to assist the US Government in military surveillance - and potentially lethal outcomes - is not acceptable."
The issue has only gained more steam as technology scholars, academics, and researchers chimed in on the larger implications of AI being weaponized by the US military. A petition signed by 90 academics calls for major technology companies to sign onto an international treaty that would ban autonomous weapons systems.
"With Project Maven, Google becomes implicated in the questionable practice of targeted killings. These include so-called signature strikes and pattern-of-life strikes that target people based not on known activities but on probabilities drawn from long range surveillance footage. The legality of these operations has come into question under international and U.S. law," the academics wrote in the petition. "These operations also have raised significant questions of racial and gender bias (most notoriously, the blanket categorization of adult males as militants) in target identification and strike analysis. These problems cannot be reduced to the accuracy of image analysis algorithms, but can only be addressed through greater accountability to international institutions and deeper understanding of geopolitical situations on the ground."
Google has defended its involvement in the program, saying its technology will handle tedious tasks that waste soldiers' time while also making drone surveillance more accurate.
"An important part of our culture is having employees who are actively engaged in the work that we do. We know that there are many open questions involved in the use of new technologies, so these conversations--with employees and outside experts--are hugely important and beneficial," a Google spokesperson said in a statement after news of Project Maven became publicized last month.
The spokesperson added that the company's work was "intended to save lives" and that Google was developing internal policies to govern complicated decisions involving AI technology and defense contracts.
Both the Defense Department and Google have adamantly denied that the AI will be used in combat situations, though Marine Corps Col. Drew Cukor was quick to add the phrase "any time soon" during a defense-tech conference speech last year.
Google's response to the situation was not enough, according to the former employees who spoke to Gizmodo; they said in the interview that "the strongest possible statement [they] could take against this was to leave."
In addition to the letter signed by more than 4,000 employees and the petition signed by academics, the Tech Workers Coalition created its own petition criticizing Google not just for Project Maven but for doubling down on the controversy by bidding heavily on a contract for the Pentagon's JEDI program, a military effort to integrate cloud computing into its operations.
Google is in competition with Microsoft and other tech giants for a number of Defense Department contracts, and US military officials have repeatedly said publicly that they are in an "AI arms race" with the rest of the world. According to the Wall Street Journal, the Defense Department spent $7.4 billion on technology involving AI last year alone.
But industry stakeholders are already ramping up calls for tech companies to be more transparent about their military work and at least have policies in place to adjudicate decisions of this magnitude.
"According to Defense One, the DoD already plans to install image analysis technologies on-board the drones themselves, including armed drones. We are then just a short step away from authorizing autonomous drones to kill automatically, without human supervision or meaningful human control," the International Committee for Robot Arms Control wrote in its open letter to Google's leaders.
For other tech leaders, the resignations sparked by Google's work with Project Maven are a warning sign of the unrest that can result from a deep disconnect between employees and leadership. Company leaders must be transparent with employees about their goals in order to avoid the conflicts that arise when the two sides' goals are opposed.
The big takeaways for tech leaders:
- About a dozen Google employees quit due to their opposition to Google's involvement with the Defense Department's AI-drone program Project Maven.
- The employees told Gizmodo that Google's leaders had failed to address their concerns about the program and largely ignored their complaints about Google's involvement with the US military in any capacity.