I'm back, on a different PC as the old one's mobo died. There's always hidden impact when one swaps PCs or "just" re-installs, such as lost passwords, bookmarks etc. that were scattered all over MS's messy user profile subtree. So it goes... also, this article is one I found tedious to write, having written the same sort of thing so often before. Once it's done and out of the way, I can get on to more interesting stuff that's come up since!
For software to be "safe", it must behave consistently with the level of risk the user expected to undertake (or avoid). If software is not safe, then it no longer represents the will of the user - and therefore it is not secure, because even if you know who the user is, you are not getting the user behavior your organization expected.
Don't take risks on behalf of the user
Software that acts ahead of user intent has to bear full responsibility for whatever follows as a result of that action. Examples abound: autorunning scripts in arbitrary "documents" or unsolicited email "messages", autorunning CDs as they are inserted, autorunning HTML scripts when "opening" a directory on the hard drive, exposing RPC services to the Internet, creating and exposing hidden "admin" network shares, "touching" arbitrary files on the hard drive as part of background services or persistent handlers, etc.
Display risk information in terms the user understands
Users understand data vs. program, view/read vs. run, Internet vs. my own computer files. Use this level of concept, with a "More information..." button leading to background and technical details.
Pitching this information in "your" language - whether corporate IT-speak about user accounts and the like, or raw tech detail such as file name extensions - helps some folks while alienating others.
Dumbing-down the language so that risk info is lost - hidden file name extensions, the generic "open" concept, blurring data vs. program behavior, displaying the local PC's content as if it were a seamless part of the internet - helps absolutely no-one. Stop doing that, please!
Be bound by the risk information you displayed
If a file is displayed to the user as "ReadMe.rtf" and it's internally a Word .doc with autorunning macros, do not assume an "honest mistake" and take the higher risk of running those macros.
If a file is displayed to the user as a safe-ish file type, but your generic "open" code sees an MZ marker hidden inside that indicates raw code, do not assume an "honest mistake" and run as raw code. The same goes for raw code within .pif and .bat files; if these are not truly .pif or .bat, then generate an appropriate error and abort. Yes, this will break those poorly-written apps, and force them to be fixed.
That is entirely appropriate Darwinian filtering - bad apps must die! We use settings like "Option Explicit" to trap bad code before it's released, while it is cheaper to fix; apply safety sanity checks to trap bad code after release, to limit its market success and spread.
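As a minimal sketch of the idea (in Python; the signature table and function names are illustrative assumptions, not how any real shell is built), a handler could sniff a file's leading bytes and refuse to escalate when the content contradicts the type its displayed name claims:

```python
# Sketch only: a pre-"open" sanity check comparing a file's leading
# bytes against the type its displayed name claims. The signature
# table is illustrative and incomplete; real handlers differ.

MAGIC = {
    b"{\\rtf": ".rtf",            # RTF text begins with {\rtf
    b"\xd0\xcf\x11\xe0": ".doc",  # OLE2 compound file (legacy Word .doc)
    b"MZ": ".exe",                # DOS/PE executable header
}

def sniff_type(path):
    """Return the extension implied by the file's leading bytes, or None."""
    with open(path, "rb") as f:
        head = f.read(8)
    for signature, ext in MAGIC.items():
        if head.startswith(signature):
            return ext
    return None

def safe_to_open(path, displayed_ext):
    """False when the content claims a different type than the name shows.
    The caller should raise an error and abort, not 'helpfully' run it."""
    actual = sniff_type(path)
    if actual is None:
        return True  # unrecognised content: normal handling applies
    return actual == displayed_ext.lower()
```

Note the failure mode: on a mismatch the appropriate response is an error and an abort, never a silent fallback to the riskier interpretation.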
Do not allow content to mis-represent itself
Once again, examples abound: we are supposed to forget about file name extensions and trust icons instead - yet the most dangerous file types of all (.exe, .pif) can set whatever icon they like, and thus spoof any "safe" file type.
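A shell that wanted to resist this sort of spoofing could, as a rough sketch (the extension sets below are illustrative assumptions, nowhere near exhaustive), flag names engineered to read as "safe" once extensions are hidden:

```python
# Sketch only: detect names built to look harmless when extensions are
# hidden, e.g. "Holiday.jpg.exe" displayed as "Holiday.jpg". The
# extension sets are illustrative, not exhaustive.

EXECUTABLE = {".exe", ".pif", ".scr", ".com", ".bat", ".cmd"}
SAFE_LOOKING = {".jpg", ".gif", ".txt", ".rtf", ".doc", ".pdf"}

def is_deceptive_name(filename):
    """True when a safe-looking inner extension masks a trailing executable one."""
    parts = filename.lower().rsplit(".", 2)
    if len(parts) < 3:
        return False  # at most one extension: nothing is being hidden
    inner = "." + parts[1].strip()  # .strip() catches "photo.jpg    .exe" padding
    outer = "." + parts[2]
    return inner in SAFE_LOOKING and outer in EXECUTABLE
```

Such a check only matters alongside showing the true extension in the first place - hiding it is precisely what makes the spoof work.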
For another example, consider pop-ups spawned by web sites that look like internal system dialog boxes. Consider how the "cancel" or [X] UI elements can be coded to execute the material, contrary to user expectations.
Allowing arbitrary web sites to run code on visitors' PCs (and thus "own" them, in terms of Microsoft's "Rule #1" security mantra) is terminally stupid. It will be painful to stop doing that, because the Internet's web developers have grown to depend on the ability to poach user resources and interact deceptively or coercively with them. Pick that fight, and win it.
XP's SP2 was a step in the right direction in beating a retreat from "Ieeeee! Fore!!" stupidity ("all the world's a web page, and we are but icons in the clickstream" or "the network is the computer - so do you feel like troubleshooting a million-processor hydra you can't even access?"). If we can't get the wolves back into Pandora's Box, then (as an industry) at least have the decency to admit you screwed up big time, akin to the wasted years of trying to fly by flapping arms, or cure infectious diseases via leeches or holes drilled through the skull - and listen to what we are trying to tell you. Sometimes the answer is "nay"; don't shoot the messenger for saying so.
07 July 2005