Security leads to insecurity?
By rtenhove on Sep 07, 2007
The new machine arrived, with XP SP2 installed, as expected. I set the machine up, carefully purged the unwanted demoware, added the firewall, anti-virus and anti-spyware armor, and plugged it into the network to retrieve its patches from Microsoft. After all, SP2 is fairly old; there have been a lot of patch Tuesdays since 2004!
I pointed IE6 at the Windows Update site, and things didn't go too far before I received an error message that gave me only an eight-digit hexadecimal error code and an offer to search for it on Microsoft's support pages. "Wow, I've found an obscure problem, if they don't have a text description of the error code!", I thought to myself.
It wasn't an obscure problem at all. In the name of security, IE6 was shipped with its security settings preset such that it could not even download the ActiveX controls MS uses to drive the update process. Further, when I found the right MS support pages for the mysterious hex error code, the fix it recommended wasn't quite right. I had to study firewall logs to discover what was going on, and adjust IE6's settings accordingly. After that, things went smoothly. (43 "high priority" updates, by the way, and many of them roll-ups of past updates, all security- or stability-related.)
My first reaction to this little adventure was predictable enough: surely this situation results in a lot of support calls to the computer manufacturer? That has to be expensive, shipping machines with one guaranteed support incident built right in. Not to mention the initial impression the customer must have. It really sounds like somebody forgot to QA the product properly after the security changes were made.
Then I had a more chilling thought: how many people would actually try to fix this problem? How many would even be aware of it? (The new, tighter security configuration includes shipping with automatic updates set to "on", and you have to know where to look to see if the updates are working at all!)
This was all beginning to sound like fodder for the Inside Risks column in the Communications of the ACM. In the name of better security, my new computer was configured to heighten the likelihood that I'd unwittingly use a system with three years' worth of vulnerabilities, many of them exploited by now. By increasing security settings locally (on the PC), they actually broke the larger security system that includes timely patches. Ouch!
This is an instance of a lesson we often forget in the software industry: local optimization can lead to suboptimal global system behavior, often in unexpected ways. Hardening one component in isolation means little if it silently disables the very mechanism the rest of the system depends on to stay secure.