“Software flaws account for a majority of the compromises organizations around the world experience”
A gang of cyber thieves known as the Carbanak Ring recently made off with hundreds of millions of dollars in an online bank robbery that spanned the globe. They launched their caper with a salvo of malicious e-mails. The very fact that such a simple approach was effective demonstrates how cyber intrusions are enabled by a hi-tech sector which offloads the cost of its sloppy engineering onto the public. Never mind the industry-wide campaign of subversion conducted by government spies. Poor cyber security doesn’t just appear out of thin air. No sir, it’s baked in.
News of the theft broke a few days back in the New York Times, which reported that over 100 banks in 30 countries had been hit by intruders. According to the Times, the thieves gained access to corporate networks by sending bank employees e-mails laced with malicious attachments. This is a technique known as phishing (or spear-phishing if the e-mail recipients are specifically targeted). As an aside, sending a malicious e-mail to a high-ranking executive in hopes of a big payday is known among cyber crooks as whaling.
Anyway, additional facts provided by the computer forensic specialists at Moscow-based firm Kaspersky indicate that the attackers' spear-phishing operation leveraged specially crafted Microsoft Word documents which capitalized on security holes in Microsoft software. For technical wonks in the audience, here are the gory details:
“All observed cases used spear phishing emails with Microsoft Word 97 – 2003 (.doc) files attached or CPL files. The doc files exploit both Microsoft Office (CVE-2012-0158 and CVE-2013-3906) and Microsoft Word (CVE-2014-1761).”
Kaspersky’s findings highlight an issue which the mainstream press is loath to address. While it’s true that people often click on things that they shouldn’t, like infected e-mail attachments, having users shoulder all of the responsibility is a pointless exercise in blaming the victim. Users should be able to open documents. That’s what documents were made for: to be opened. The same holds with regard to passwords and web browser hyperlinks. Software can be implemented to enforce password complexity requirements so that users choose strong passwords. Likewise users should be able to click on web links without ending up on the receiving end of a drive-by download.
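To make the password point concrete, here is a minimal sketch of the kind of complexity check software can enforce on the user's behalf. The specific rules (a length floor plus four character classes) are illustrative assumptions on my part, not any vendor's actual policy:

```python
import re

def is_strong_password(password: str, min_length: int = 12) -> bool:
    """Return True if the password meets some basic complexity rules.

    The thresholds here are illustrative, not a recommended standard.
    """
    if len(password) < min_length:
        return False
    required_patterns = [
        r"[a-z]",         # at least one lowercase letter
        r"[A-Z]",         # at least one uppercase letter
        r"[0-9]",         # at least one digit
        r"[^a-zA-Z0-9]",  # at least one symbol
    ]
    return all(re.search(p, password) for p in required_patterns)

print(is_strong_password("hunter2"))            # rejected: too short
print(is_strong_password("Correct-Horse-42!"))  # accepted: meets all rules
```

The design choice matters: a check like this runs at the point of enrollment, so the burden of choosing a strong password is shared by the software rather than dumped entirely on the user.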
The sad truth is that cyberattacks like the one chronicled by Kaspersky flourish due to sloppy engineering. For instance, intruders commonly rely on previously undisclosed flaws for which no patch exists, known as zero-day bugs, to steal data and wreak havoc. It’s part of the public record that the Stuxnet worm developed by the NSA utilized multiple zero-day flaws, as did the Equation Group’s malware. So go ahead, hector users until they’re paranoid and erect all the digital safeguards you want. Malicious software payloads wielding zero-day bugs will sail through your defenses as if they weren’t even there.
One reason that sub-standard engineering is so commonplace is that security isn’t a genuine priority for most hi-tech vendors. It’s more of a sales pitch, a branding scheme used to entice the more susceptible members of the audience. Why spend money auditing code when it’s more lucrative to simply push new products out into circulation as quickly as possible?
Pssst, hey kid, use our latest chat program. It’s encrypted!
Existing market incentives encourage this stance, as hi-tech vendors are allowed, by law, to treat security incidents as a negative externality. Ever wonder what’s buried in the fine print of most End User License Agreements (EULAs)? Now you know. When a bank is hacked as a result of poorly designed software, it’s the bank that pays to clean up the mess, not the software company which sold them the faulty apps. Until this changes and hi-tech companies are held financially liable for their engineering screw-ups, we can expect the ongoing parade of massive data breaches to continue unabated.
But there’s also another, more sinister reason why cyber security sucks: private sector monoliths like RSA collaborate with spies to construct hidden backdoors. In an effort to steal secrets, the spies at Fort Meade have worked with major American hi-tech companies and gotten them to embed subtle yet intentional flaws in their products.
Some of these backdoors go all the way down to the hardware, where they’re accessed using obscure firmware hacks. As someone who has built rootkits I can attest that the hardware-level stuff is nasty: it can bridge air-gaps, successfully resist eradication, and persist across multiple platforms. The underlying attack vector is so powerful that strong encryption offers little protection. If U.S. spies can manipulate a machine’s firmware, as described in leaked NSA documents, swiping an encryption passcode is a cakewalk.
It’s ironic that U.S. officials complained loudly about Chinese companies embedding backdoors in their products when classified documents reveal the United States as a truly prolific actor in this domain.
During the uproar following the first round of Snowden leaks, President Obama made symbolic gestures about curbing the NSA’s predilection for zero-day bugs, only to leave a gaping loophole for cases which demonstrated “a clear national security or law enforcement need.” So if you’re wondering what’s behind the never-ending stream of high-profile cyberattacks, know this: bad security isn’t an unfortunate accident. It’s a matter of official policy. A top-down scheme that benefits a small circle of spies at the expense of society’s collective well-being. Computer security for the 1%.
Bill Blunden is an independent investigator whose current areas of inquiry include information security, anti-forensics, and institutional analysis. He is the author of several books, including The Rootkit Arsenal and Behold a Pale Farce: Cyberwar, Threat Inflation, and the Malware-Industrial Complex. Bill is the lead investigator at Below Gotham Labs.
Shon Harris, CISSP Exam Guide, Sixth Edition, McGraw-Hill, 2013, p. 297.
This article was originally published by AlterNet.