There’s a computer worm called Stuxnet, discovered within the last few months, which exploits several Windows vulnerabilities (some of which were patched some time ago) as it installs itself on people’s computers. It replicates largely through USB memory sticks rather than over the Internet (though it can also spread through storage devices shared over networks). And it’s something of an odd bird. Its main target isn’t (at least for now) the computers it has compromised, and it’s not trying to enslave them to send spam, collect credit card numbers, or mount attacks on web sites.
It’s specifically designed to attack one particular industrial automation system by Siemens, and it’s made headlines because of how extensive and sophisticated it is. People suspect it’s the product of a government, aimed at industrial sabotage — very serious stuff.
The folks at F-Secure have a good Q&A blog post about it.
There are two aspects of Stuxnet that I want to talk about here. The first is one of the Windows vulnerabilities that it exploits: a vulnerability in .lnk files that kicks in simply by having an infected Windows shortcut show its icon:
This security update resolves a publicly disclosed vulnerability in Windows Shell. The vulnerability could allow remote code execution if the icon of a specially crafted shortcut is displayed. An attacker who successfully exploited this vulnerability could gain the same user rights as the local user. Users whose accounts are configured to have fewer user rights on the system could be less impacted than users who operate with administrative user rights.
Think about that. You plug in an infected USB stick, and you look at it with Windows Explorer. You don’t click on the icon, you don’t run anything, you don’t try to copy it to your disk... nothing. Simply by looking at the contents of the memory stick (or network drive, or CD, or whatever), as you look at its icon and say, “Hm, I wonder what that is. I’d better not click on it,” it’s already infecting your computer. And since most Windows users prior to Windows 7 ran with administrator rights, the worm could get access to anything on the system.
You need to make sure this security update is on your Windows systems.
The other aspect is interesting from a security point of view. From the F-Secure Q&A:
Q: Why is Stuxnet considered to be so complex?
A: It uses multiple vulnerabilities and drops its own driver to the system.

Q: How can it install its own driver? Shouldn’t drivers be signed for them to work in Windows?

A: Stuxnet driver was signed with a certificate stolen from Realtek Semiconductor Corp.

Q: Has the stolen certificate been revoked?

A: Yes. Verisign revoked it on 16th of July. A modified variant signed with a certificate stolen from JMicron Technology Corporation was found on 17th of July.
I’ve talked about digital signatures in my other blog, at some length. When the private keys are kept private, digital signatures that use current cryptographic suites are, indeed, secure. But...
...anyone who has the private key can create a spoofed signature, and if the private keys are compromised, the whole system is compromised. When one gets a signing certificate, the certificate file contains both the private and public keys. Typically, one installs the certificate, then exports a version that contains only the public key, and that exported certificate is made public. The original certificate, containing the private key, has to be kept close.
But that private-key certificate is just a file, and anyone with access to it can give it to someone else. They shouldn’t, but they can. If you can compromise an employee with the right level of access, you can snag the private key and make unauthorized “authorized signatures”.
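To make that asymmetry concrete, here’s a minimal sketch in Python using the third-party cryptography package and a bare RSA key pair. Real code signing uses X.509/Authenticode certificates and a chain of trust, so treat this purely as an illustration; the key names, the sample “driver” bytes, and the choice of RSA with PKCS#1 v1.5 padding are all assumptions for the example, not Stuxnet’s or Realtek’s actual setup.

```python
# Minimal sketch of signing and verification with an asymmetric key pair.
# Assumptions for illustration only: RSA-2048, PKCS#1 v1.5 padding, SHA-256.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The signer generates a key pair and keeps the private half close.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# The "exported" certificate equivalent: only the public key is published.
public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

# Anyone holding the private key -- the legitimate owner or a thief --
# can sign arbitrary data, such as the bytes of a kernel driver.
driver_bytes = b"contents of some kernel driver"  # hypothetical payload
signature = private_key.sign(
    driver_bytes,
    padding.PKCS1v15(),
    hashes.SHA256(),
)

# Verifiers only ever see the public key, and the signature checks out;
# verify() raises InvalidSignature if the data or signature were tampered with.
public_key = serialization.load_pem_public_key(public_pem)
public_key.verify(signature, driver_bytes, padding.PKCS1v15(), hashes.SHA256())
print("signature verifies")
```

The math never cares who is holding the private key; any signature it produces verifies just as cleanly as one made by the key’s rightful owner.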
In most cases, it’s far easier to find corruptible (or unsophisticated) people than it is to break the crypto. And if the stakes are high enough, finding corruptible people isn’t hard at all. The Stuxnet people may well have a host of other stolen certs in their pockets.