Google has quietly secured a contract to work on the Defense Department’s new algorithmic warfare initiative, assisting with a pilot project that applies its artificial intelligence tools to drone targeting.
The military contract with Google is routed through a Northern Virginia technology staffing company called ECS Federal, obscuring the relationship from the public.
The contract, first reported Tuesday by Gizmodo, is part of a rapid push by the Pentagon to deploy state-of-the-art artificial intelligence technology to improve combat performance.
Google, which has made strides in applying its proprietary deep learning tools to improve language translation and image recognition, has assembled a cross-team collaboration within the company to work on the AI drone project.
The team, The Intercept has learned, is working to develop deep learning technology to help drone analysts interpret the vast image data vacuumed up from the military’s fleet of 1,100 drones to better target bombing strikes against the Islamic State.
The race to adopt cutting-edge AI technology was announced in April 2017 by then-Deputy Defense Secretary Robert Work, who unveiled an ambitious plan called the Algorithmic Warfare Cross-Functional Team, code-named Project Maven. The initiative, Work wrote in an agency-wide memo, is designed to “accelerate DoD’s integration of big data and machine learning” and “turn the enormous volume of data available to DoD into actionable intelligence and insights at speed.”
The first phase of Project Maven, which incorporates multiple teams from across the Defense Department, is an effort to automate the identification and classification of images taken by drones — cars, buildings, people — providing analysts with increased ability to make informed decisions on the battlefield.
“The technology flags images for human review, and is for non-offensive uses only,” a Google spokesperson told Bloomberg. “Military use of machine learning naturally raises valid concerns. We’re actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies.”
The idea is essentially to provide a recommendation tool, so …
via The Intercept