Monday, September 17, 2007

Design and Change in Highly Secure Corporate Settings (a brief reflection)

The question posed was: how do you design and develop in highly secure corporate settings? Are there standards? And what do you do when the corporate environment makes it too difficult to meet those standards?

As consultants we took standard security measures and went one step farther: everyone was required to lock their computers (all laptops) when leaving even briefly for a bio-break; all documents were closed; printed matter of any kind was turned face down, placed in locked cabinets, or shredded; no internal email, text, or images could be forwarded to external addresses; and Pretty Good Privacy (PGP) encryption was used for FTP or files transferred over the Web to third parties.
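In practice the PGP step was just a matter of encrypting each file to the third party's public key before it ever left our machines. Here is a minimal sketch of that step (not our actual scripts), assuming GnuPG is installed and the recipient's public key - the hypothetical partner@example.com - has already been imported:

```python
# Sketch: encrypt a file to a third party's PGP/GnuPG public key before transfer.
# Assumes the `gpg` command-line tool is installed and the recipient's public key
# (here the hypothetical partner@example.com) has already been imported and trusted.
import subprocess

def encrypt_for_partner(path: str, recipient: str = "partner@example.com") -> str:
    """Encrypt `path` to `recipient`'s public key; return the path of the .gpg file."""
    out_path = path + ".gpg"
    subprocess.run(
        ["gpg", "--batch", "--yes",   # non-interactive; overwrite any existing output
         "--encrypt",
         "--recipient", recipient,
         "--output", out_path,
         path],
        check=True,                   # raise an error if gpg fails
    )
    return out_path

# Only the encrypted artifact is handed to FTP or a Web upload; the plaintext never travels:
# encrypted = encrypt_for_partner("design_spec.pdf")
```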

At the end of the day, all printed materials were removed from tables and desks, and all laptops went home with their users. The building itself was a cellular dead zone - basically a Faraday cage. As a scrum/agile team we shared a single phone, which eliminated all but the most important and direct calls, such as arranging to be picked up from work.

This was not only the norm in the environment but specified in our contracts. Developers were required to undergo a urinalysis (commonly called a 'pee test') prior to getting hired - but as it turned out that was not a requirement of the main company. One incoming lead refused on legal/privacy/moral grounds and was transferred to another subcontracting firm where invasion of privacy was not promoted. Even better, the new subcontracting firm was honest, and with the transfer came a $5-an-hour raise. (He converted to full time almost immediately.)


As PM/team lead I requested that everything be removed from all public and private working spaces, which worked well. Design/dev topics were not discussed casually in public, outside of our working environment, or with anyone beyond the team.

In prior organizations (which go unmentioned here) I encountered extreme difficulty explaining why one should use security at all, what the role of PGP was, and why to use it (they had regulations against using any kind of encryption!). I insisted on testing Web security in application design. Finally my request went to an internal review board (the audit committee), and it gained backing for the spend (about a million dollars to fix back-end problems) using the following argument: "How many years do you want your CEO to spend in jail for breaking privacy laws under HIPAA because the UI allows misuse?" and so on.

Some corporations are so far behind the curve on technology that it is a struggle to work with them. I found shared terminology (a common language) and a safe phrase that worked - to a point - in convincing them to change: "As your consultant I would not be doing my job if I neglected to point out X..."



A large part of being successful involved getting others on board through explanation and education: what reasonable security is (security audits during testing, for Web apps and applications), what it isn't (pee tests for one class of workers), and the risk assessment behind each.

The first time an employee told me that he was doing a 'pee test' I thought it was some kind of software test for back-end stuff that I'd never heard of. He had to repeat himself - it was embarrassing. Then several others stepped up to say they had undergone urinalysis too.

No one even wants to say "pee test," much less do it; it just does not seem professional. If what you are doing is highly specialized and you hold other people's lives in your hands - piloting the Space Shuttle, say, or coding landing software for planes - or there are some really good reasons, such as being observed coming to work apparently stoned, then okay, that makes sense. But for designers and developers working on software projects, for the most part it's a scary and unnecessary invasion of privacy, with a questionable effect on security.

Usually, fighting against established business practices that no longer make sense is a waste of time, because the wave of change itself eventually swamps the environment, flattening all prior notions of what should be used or done and of what the standard processes and procedures are.

Change in secure environments should naturally be practical, doable, and applied uniformly for good reasons - not because "we have always done that," "it's the rule," or "I am just following orders" - but for logical reasons that provide the level of security actually needed, even if the practice has to change or be set as a standard in the future. I have found that the topic of change and security is an especially difficult one, which people resist for many reasons.

(photos in this article shot by Linda Lane, 2007)
