Issue No. 06 - November/December 2009 (vol. 11)
pp. 4-5
Published by the IEEE Computer Society
Arnold W. Bragg, RTI International Inc.
ABSTRACT
If security measures are too strict, users will find ways to circumvent them in order to do their jobs. Hence, systems can be less secure with stringent administrative policies than without them. IT Professional's editor in chief, Arnold W. Bragg, describes this conundrum in his final address to the magazine's readers.
My second term as editor in chief ends with this issue. I've thoroughly enjoyed working with our readers, contributors, editorial and advisory boards, IEEE Computer Society staff, and the many others interested in exploring the interface between engineering and the information technologies. For my valedictory, I'll tell you a story. I've changed the names to protect the innocent.
Several years ago, I attended what was supposed to be a 50-minute panel discussion of issues in statistical computing. The panelists were computer scientists and statisticians; like many in the audience, they were working with healthcare data sets containing tens of millions to tens of billions of observations. The first speaker, a well-known statistician, was uncharacteristically animated.
"You may have heard that I changed jobs six months ago. The company I used to work for was great, but I couldn't do my job any longer. A competitor was pleased to hire me and most of my staff," he stated. Frowning, the moderator rose from her chair. The other panelists exchanged puzzled looks; had someone changed the agenda?
"I worked with data containing personally identifiable information—names, addresses, dates of birth, social security numbers. I understood why this data required a high level of security. A breach would have been catastrophic to my company. We had dozens of healthcare-related contracts, some of them quite large. Any incident involving that data would have jeopardized an entire division. Our competitors—including my current employer—would have had a field day.
"My title was 'senior research fellow.' I did lots of hands-on work using visualization tools. To get to my data, I went through the following steps every morning. Everyone using this data did so," he explained. The moderator had moved nearer the podium; she was still frowning.
The speaker paused, sighed, and continued. "First, I booted my desktop PC. Then, I entered an ID and password for the full disk encryption product, even though my PC contained no sensitive data. I entered another ID and password to log in to my desktop's operating system. I opened a browser window to initiate a virtual private network tunnel, which required another ID and an RSA-tokenized password. I opened a second browser window to log in to a server via the VPN tunnel to get a virtual desktop; this required an ID and password. I opened a third browser window to get to an SSH client on the virtual desktop so I could log in to a secure Linux host with yet another ID and password. The virtual desktop couldn't retain information from my previous session, so I had to enter the server name, port, connection type, and a half-dozen configuration commands every time. Finally, I spawned a secure FTP session so I could move data sets around within the secure realm. This required another ID and password."
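To make that chain a little more concrete, here is a minimal sketch of what the last two steps of such a routine might look like if someone tried to script them; the article names no tools, so the library (paramiko), host name, port, user ID, and file paths below are all invented for illustration, and the preceding interactive steps (disk-encryption password, OS login, RSA-tokenized VPN, virtual desktop) cannot be scripted at all.

    # Hypothetical illustration only; nothing here comes from the speaker's account.
    # Steps 1-4 of the routine are interactive and must already have been completed.
    import getpass
    import paramiko

    host = "secure-host.internal"   # hypothetical secure Linux host reached via the VPN
    port = 22

    # Step 5: SSH to the secure Linux host -- yet another ID and password.
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, port=port, username="analyst",
                password=getpass.getpass("Linux host password: "))

    # Step 6: SFTP within the secure realm -- a further ID and password in the
    # speaker's setup; here the SSH transport is reused for brevity.
    sftp = ssh.open_sftp()
    sftp.put("study_extract.csv", "/secure/incoming/study_extract.csv")
    sftp.close()
    ssh.close()

Even this scriptable tail still prompts for a password, which hints at why the full routine wore the speaker and his staff down.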
He silenced our chuckles with a scowl. "There's more. The desktop timed out after three minutes of no activity. The virtual desktop timed out after 10 minutes. The VPN tunnel timed out after an hour. The Linux session timed out after 30 minutes. The secure FTP session timed out after 15 minutes, even with activity. Moving the mouse or pressing the shift key wouldn't restart all of the countdown timers; some of them required active keystrokes. When one of them timed out, I had to remember and enter the appropriate password. I was constantly moving from one window to another just to keep the timers alive."
He began shouting. "If I muffed any of the passwords twice, I got locked out and had to spend 20 minutes on the phone with tech support! I had to change all those passwords every two weeks! I couldn't use the same password because various components had conflicting password naming restrictions! All my keystrokes were logged! The browser was customized to make it more secure, so I needed a different browser to fill out my timesheet each day! The system was unreliable because of all the potential points of failure! I wasn't allowed to print! It took 25 percent more effort to do anything! If I went to a meeting or out to lunch, it was back to square one! Punched cards were easier than this!" A security guard had moved to the front row; he munched a doughnut and remembered that his water bill used to come on a punched card.
The moderator deftly snatched the microphone from the speaker: "If the other panelists agree, I'd like to give John five additional minutes." She worked for a competitor and wanted to hear more. We all wanted to hear more. Vigorous nods from the panelists (one clearly delighting in the misfortune of his more distinguished colleague), spirited applause from the audience, and a crisp "Please continue; that must have been painful," followed. Frowning, she returned the microphone.
"Painful? Well, yes but that's not the point. It was painful, but so were a dozen other workplace irritants. No, the point is that we were less secure than we had been before all this. I was the custodian of that data. My career was on the line."
He had us. You could have heard a pin drop, even though the floor was carpeted. "Don't you see?" he barked. "We—my staff, me, everyone—could NOT work this way. Sometimes we were forced to … well, I won't go into that. You get the idea." We got the idea, but wanted to hear him confess his crimes. He returned to his seat.
The other panelists waltzed around the issue. The statisticians ruminated over the merits of scrambled compound identifiers. The computer scientists proposed using this technology, that product, those tools. Argument ensued: "This is too expensive!" "That isn't secure!" "Those are open source!" (Was that last one supposed to be a blessing or a curse?) Many in the audience, including the security guard, began moving toward the exits, grabbing what remained of the bagels and Danish on the way out. The moderator thanked the panelists, frowned, and left the stage.
The speaker's former employer, and perhaps the employers of many in the hall that day, had broken several of Scott Culp's "10 Immutable Laws of Security Administration" ( http://technet.microsoft.com/en-us/library/cc722488.aspx), the Second Law most egregiously: "Security only works if the secure way also happens to be the easy way."
"If you turn the [system] into a police state, you're likely to face an uprising. If your security measures obstruct the business processes of your company, your users may flout them. Again, this isn't because they're malicious—it's because they have jobs to do. The result could be that the overall security of your [system] would actually be lower after you implemented more stringent policies." [S. Culp, ibid.]
My spies tell me that the speaker's former company hasn't been compromised—yet—but it doesn't win as many healthcare IT contracts as it used to. Perhaps the 25 percent security surtax has made it less competitive. The Second Immutable Law says it has almost certainly made the company less secure.
Arnold W. Bragg is this magazine's soon-to-be-departing editor in chief. Contact him at awbragg@yahoo.com.