I had a teacher once say that IT is riddled with TLAs (Three-Letter Acronyms). He thought it was hilarious. It wasn't until I really started looking into IT, and security especially, that I realized he was right. In the realm of technology there are some acronyms most people know, such as HTTP, IP, and PC, but once you add security it turns into something you would expect in your alphabet soup: PCI-DSS, SOX, FISMA, ISO, HIPAA, HITECH, UDP, TCP, CERT, IR, XSS, CSRF, PWN, IPsec, SCADA, and the list goes on. I am sure you could guess some of them, but the first six are probably the most debated. The Payment Card Industry Data Security Standard (PCI-DSS), Sarbanes-Oxley (SOX), the Federal Information Security Management Act (FISMA), the International Organization for Standardization (ISO; don't ask, the acronym doesn't make sense to me either), the Health Insurance Portability and Accountability Act (HIPAA), and the Health Information Technology for Economic and Clinical Health Act (HITECH) are some of the security standards that businesses have to worry about.
Aside from these are any internal audits that companies have to pass. Many times all of this adds up to one thing: confusion. Take a company that handles its employees' healthcare records, holds a federal contract, and is publicly traded. That company has to deal with parts of HIPAA and HITECH as well as FISMA and SOX. You would think these standards would correlate and go hand in hand, but they were all developed independently, so they have different requirements. This is where security professionals are most challenged: whether they are securing a network or auditing one against these standards, there is a challenge.
Most often, a company trying to meet the requirements of these standards does one of two things: it either does the bare minimum to meet the requirements right before the deadline, or it essentially puts everything behind security and does things that make the company more secure from its point of view, but at the cost of usability. Now, I have written before about how usability and security need to go hand in hand, so that isn't the angle I want to take right now.
My main point is that when companies think they have to choose between security and usability, it not only makes life hard for users; it creates a situation where users do things to get around the security measures, thereby opening security holes that weren't accounted for. Examples of this include writing down usernames and passwords, saving them in the browser, copying documents to a USB drive, and trusting links that may not be legitimate. While this can be mitigated with good user training, there is no need to put that burden on the users, especially when, if the company is compromised because of those workarounds, the company still ends up paying the fine, which can amount to millions of dollars.
Basically, my suggestion is for all companies to stop securing servers and networks and start securing information. That leads to looking at the data they are protecting, not at what is holding it. It might also force them to walk through what users will actually do once their applications and network are set up and working. Hopefully this will allow them to start truly incorporating both usability and security into their business.
As a side note, if you are interested in the true cost of a security breach, there is a research project I was a part of a few years ago that was presented at a conference. The video quality is kind of poor, but the information is valid. I didn't present it, but I did work on the external costs, those aside from any possible fine that comes with a security breach. http://vimeo.com/5384048