The disclosure that Iraqi insurgents were able to intercept live video feeds from U.S. drones has focused the spotlight on a familiar IT security issue: data encryption.
In a story that's receiving widespread attention, the Wall Street Journal yesterday reported that Iranian-backed groups in Iraq and Afghanistan were tapping into live feeds from Predator drones using a $26 software tool called SkyGrabber, from the Russian company SkySoftware.
The previously little-known product doesn't require Internet connectivity and is designed to intercept satellite-delivered music, photos, video and TV programming for free. Insurgents in Iraq, however, were able to use SkyGrabber to grab live video feeds from unmanned Predator drones because the transmissions were being sent unencrypted to ground control stations.
The fact that a sophisticated, multi-million-dollar aerial surveillance system could be compromised so easily because of a fundamental security oversight is stunning, several security analysts said.
“Frankly, this is shocking to me,” said Ira Winkler, president of the Internet Security Advisors Group. (Winkler is also the author of Spies Among Us and a Computerworld columnist.) “You have one of the most critical weapon systems in the most critical regions transmitting intelligence data unencrypted,” Winkler said.
While the intercepted data is likely to be of limited use to insurgents, it's still valuable, he said. “After all, one of the key attributes is not knowing [that] a Predator is in the area,” said Winkler. “Everyone involved should have known much better.”
The apparent fact that the U.S. military knew of the vulnerability for a decade but assumed opponents wouldn't be sophisticated enough to exploit it is especially troubling, said James Lewis, director and senior fellow at the Center for Strategic and International Studies (CSIS). “The theory is that we encrypt the uplinks so that people can't take over the drone, but that we don't need to encrypt the downlinks,” he said.
“Those sorts of assumptions always get us in trouble,” said Lewis, who earlier this year led a group that developed a set of cybersecurity recommendations for the White House. “You can be sure that the insurgents weren't the only folks watching the feeds,” he said.
Alan Paller, director of research at the SANS Institute, a Bethesda, Md.-based security training institute, said the incident highlights a “systemic problem” permeating most new weapons systems. “The designers see IP connectivity as a great capability enhancer and bring in designers to help them integrate the capability,” Paller said. “But those architects and designers think security is a compliance activity for security professionals and not their job. They are incapable of protecting the systems they design and build.”
The exception that proves the rule is the drones used by the CIA, whose transmissions are properly encrypted and protected, he said. “They understand how cyberattacks work.”
While the spectacular nature of the gaffe puts it in a class of its own, the compromise of the drone feeds is not unlike countless breaches involving the loss of unencrypted corporate data. Though security analysts have long pushed encryption as one of the most effective ways of protecting data, numerous companies have yet to implement the technology, in many cases because they're unwilling to spend the money to encrypt data. At other times, concerns about complexity and key management have contributed to a reluctance to embrace encryption.
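The cipher math itself is rarely the obstacle; as the analysts note, key management is. A toy Python sketch (a one-time pad used purely as an illustration, not a production scheme, with hypothetical function names) shows that scrambling a data frame takes a few lines of code, while safely generating, distributing and storing the key for every message is the real operational burden:

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: XOR the message with a random key of equal length.
    # The cipher is trivial to write; the hard part is that each key may
    # be used only once and must be shared securely with the receiver.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR with the same key undoes the encryption.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"drone telemetry frame")
assert otp_decrypt(ct, key) == b"drone telemetry frame"
```

Real deployments use block or stream ciphers such as AES precisely to avoid needing a fresh full-length key per message, but the key-distribution problem the analysts cite remains.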
John Pescatore, an analyst with Gartner Inc. and a former analyst at the National Security Agency (NSA), said the drone incident stems from some “really bad decisions” made years ago about not encrypting the data sent back by drones. “Confidentiality on that downlink really should have been a mandatory requirement from the start,” said Pescatore, who has ranked the incident in his list of worst encryption failures ever.
Others on Pescatore's list include: a 2006 move by Visa, MasterCard and Amex to hand out RFID cards whose promised 128-bit encryption wasn't actually turned on; Microsoft's attempt to encrypt the user password in Windows CE by simply XORing it with the word Pegasus spelled backwards; and sloppy key-generation procedures by the Germans during World War II that allowed Allied cryptographers to break the Enigma encryption machine.
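The Windows CE case shows why XOR with a fixed, publicly known key is obfuscation rather than encryption: XORing with the same key a second time recovers the plaintext exactly. A minimal Python sketch (the password and helper names here are hypothetical, not Microsoft's actual code):

```python
# The fixed "key": the word Pegasus spelled backwards, as described above.
KEY = "Pegasus"[::-1]  # "susageP"

def xor_obfuscate(text: str, key: str = KEY) -> bytes:
    # XOR each character with the repeating key -- looks scrambled,
    # but anyone who knows (or guesses) the key can reverse it instantly.
    return bytes(ord(c) ^ ord(key[i % len(key)]) for i, c in enumerate(text))

def xor_deobfuscate(data: bytes, key: str = KEY) -> str:
    # XOR is self-inverse: applying the same key again restores the text.
    return "".join(chr(b ^ ord(key[i % len(key)])) for i, b in enumerate(data))

scrambled = xor_obfuscate("hunter2")
assert xor_deobfuscate(scrambled) == "hunter2"  # recovered with no brute force
```

Since the key is a constant baked into the software, there is nothing secret to search for, which is exactly the point of the quote that follows.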
“Note that brute force was not needed in any of these cases,” Pescatore said in a blog post. “These incidents are all just based on dumb operational decisions to either not include, not turn on, or not manage security at all. Sound familiar?”