Why do we need Information Management?

I am guessing most of the readers of this blog are in the University of Washington’s Master of Science in Information Management (MSIM) program.  For those who aren’t, the MSIM program focuses on connecting Technology, People, and Information.  I am sure you have heard the statistics about how much information is out there.  With the advancement of the internet, the amount of information in the world has exploded.  All of this is well and good, but the problem arises when you try to make sense of that information.  I was recently watching a TED talk that, without meaning to be, was basically an overview of what the MSIM program is.  The talk, given by Thomas Goetz, focuses on two things: first, the use of fear to accomplish things, and second, the idea that more medical problems could be solved not by better medicine but by better information presentation.

As a security professional, the idea that fear isn’t the best way to relay information was something I hadn’t considered before.  If you have heard any sort of talk about computer security, you have heard that a hacker can steal your identity, your bank account, and, with a little effort, your first-born.  Okay, so I am exaggerating a little, but every talk I have given or heard about computer security has been about the negative effects of not securing your network.  Then, after giving presentations about how no system is ever truly secure, security professionals wonder why executives haven’t approved their expanded budgets.  I believe we, as security professionals, are going about this all wrong.  Instead of focusing on how impossible security is, we need to start focusing on how we can make the network better overall with the enhancements that security brings.  In this realm I have found that UX people, for the most part, do a good job.  When they present a new website design, they don’t just describe how little traffic the site gets and how confusing the current User Interface (UI) is and then sit down.  They quickly go over some of the problems with the current UI and then go on to show how well their new UI will work and what it can bring to the table.  Now, this might just be an issue for security professionals, but I have a feeling it isn’t.  Overall, as professionals, we need to focus on an idea that has been thrown around this blog: the Value Added principle.  Focus on what value you are going to add to the company and how much it will help in the short and long term.

Now, as a final point, this doesn’t only apply to people who are already working.  If you are looking for a job, focus on what you can do for the company.  If you can get the other person even a little bit excited about what you could do for them, or about the potential you have to help their company, you will stay in their mind.  And believe me, the more good things you give the interviewer to remember you by, the better.

Now, I realize this may not be new to most of you, but I found the talk incredibly interesting.  I have a link to it below in case anyone is interested.  What are your thoughts?  Is it better to go all positive?  Are there any drawbacks to focusing only on the positive?  Or is it better to talk about a combination of fear and potential?

Wikileaks and weak links

Photo of an unlocked gate padlock

This post is about Wikileaks, without being about Wikileaks. We know the most recent Wikileaks release was an overwhelmingly large set of data, generated by a fairly low-ranking intelligence analyst, and contained potentially sensitive information. The aspects of the Wikileaks scandal that fascinate me, however, are the human and organizational factors affecting data security.

Why did Bradley Manning do it? He must have known he would be subject to a long prison sentence (at best), and he made no effort to hide his actions. Assuming he was acting rationally, the benefits he imagined from doing so outweighed the prospect of certain punishment. Manning must have evaluated the volume and nature of the data at his disposal – data owned by his organization, effectively the U.S. government – and chosen to place his individual motivations above those of the organization to which he belonged.

His own Wikipedia page and various media reports describe Manning’s “disillusionment,” and some opinion pieces paint him as “disgruntled.”

Disgruntled at the age of 23?

This fact points to the causes of the leak: it’s a people problem, more than an information problem. Security clearances – that is, how many eyes need to see the information – are part of the picture, but the solution is not about security clearances. Safeguarding organizational data such as that shared in the Wikileaks event is ultimately a management issue, for the following reasons:

A change in employee behavior is a crucial signal to management. It would surprise me if Manning’s behavior changed overnight from unassuming analyst to data thief. A good manager should look for changes in employee behavior that signal a shift in attitude. Furthermore, a manager should ensure he has enough information to act on if restricting or revisiting information flows becomes necessary, particularly in the event that an employee’s risk profile changes.

Digital natives exhibit different workplace values than their older counterparts. At 23, Manning is a digital native. Individuals under the age of 30 have grown up with technology in a world where a sense of possession is poorly defined in digital terms. Digital natives have a different notion of right and wrong in sharing information than previous generations of workers, even when information is proprietary to their organizations. Generation Y is also less loyal to organizations, and expects authority figures to earn their respect, rather than commanding it automatically. (I realize the Army is a very special kind of organization; however, the military cannot claim to be modernizing for warfare in the Information Age and expect to preserve outdated management philosophies, particularly when recruiting overwhelmingly from the digital natives demographic.)

Technology itself distracts from the human issues. Security specialists discuss access protocols and authentication procedures, but focusing on such issues is like staring at the end of someone’s finger when she points to a mountain in the distance. Internal data leaks are a real threat, but they are also perpetrated by people. The Information Age is changing the relationship between people and organizations. Adding to the urgency of the problem, today’s technological capabilities allow people to share and act on information as quickly as they think to do so. When “think it – do it” is the norm, it is important for an organization’s management to communicate expectations about information use and dissemination and to assess and monitor, in an honest way, the risks associated with information flows.

The landscape of information behavior is undergoing a major shift, and technology is merely an enabler of behavior. An individual’s ability to act impulsively, and with powerful tools that can execute enormously impactful actions digitally, should prompt organizations to manage closely the human aspects of internal security threats. It takes one weak link in an organization – unmonitored, disillusioned – to commit a destructive act with sensitive data. Although individuals should be empowered to make ethical, informed decisions when acting on behalf of their organizations, management culture must continue to adapt to the new Information Age, and its digital natives.

Photo by -Tripp-. Used in accordance with a Creative Commons 2.0 license.

Too many TLAs

I once had a teacher say that IT is riddled with TLAs (Three-Letter Acronyms).  He thought it was hilarious.  It wasn’t until I started really looking into IT, and security especially, that I realized he was right.  In the realm of technology there are some acronyms that most people know – HTTP, IP, PC, and so on – but when you add security it turns into something you would expect in your alphabet soup: PCI-DSS, SOX, FISMA, ISO, HIPAA, HITECH, UDP, TCP, CERT, IR, XSS, CSRF, PWN, IPSEC, SCADA, and the list goes on.  I am sure you could guess some of them, but the first six are probably the most debated.  The Payment Card Industry Data Security Standard (PCI-DSS), Sarbanes-Oxley (SOX), the Federal Information Security Management Act (FISMA), the International Organization for Standardization (ISO – don’t ask, it doesn’t make sense to me either), the Health Insurance Portability and Accountability Act (HIPAA), and the Health Information Technology for Economic and Clinical Health (HITECH) Act are some of the security standards that businesses have to worry about.

On top of these are any internal audits that companies have to pass.  Many times all this adds up to one thing: confusion.  Take a company that handles its employees’ healthcare records, holds a federal contract, and is publicly traded.  This company has to deal with parts of HIPAA and HITECH as well as FISMA and SOX.  You would think these standards would correlate and go hand in hand, but they were all developed independently, so they have different requirements.  This is where security professionals are most challenged, whether they are securing their own network or auditing one against these standards.

Most often, a company trying to meet the requirements of these standards does one of two things: it either does the bare minimum to meet the requirements right before the deadline, or it essentially puts everything behind security, doing things that make the company more secure from its point of view but at the cost of usability.  Now, I have written before about how usability and security need to go hand in hand, so that isn’t the angle I want to take right now.

My main point is that when companies think they have to choose between security and usability, they not only create a hard time for users but also push users to work around the security measures, thereby creating security holes that weren’t accounted for.  Examples include writing down usernames and passwords, saving usernames and passwords in the browser, saving documents to a USB drive, and trusting links that may not be legitimate.  While this can be mitigated with good user training, there is no need to put that burden on the users, especially since, if the company is compromised because of those workarounds, it still ends up paying the fine, which can amount to millions of dollars.

Basically, my suggestion is for all companies to stop focusing on securing servers and networks and start securing information.  That shift puts the attention on the data they are protecting rather than on what is holding it.  It might also force them to walk through what users are actually going to do once their applications and networks are set up and working.  Hopefully this will allow them to start truly incorporating both usability and security into their business.

As a side note, if you are interested in the true cost of a security breach, there is a research project I was a part of a few years ago that was presented at a conference.  The video is of fairly poor quality, but the information is valid.  I didn’t present it, but I did work on the external costs – those aside from any possible fine that comes with a security breach. http://vimeo.com/5384048


Putting the Organization in Info Management

Cheeky screenshot of text exchange with a big, uncaring bank
You know this is a dramatization of the event because Big Bank doesn't answer texts after 5pm

Information management can be described using a couple of different but fairly similar models. The University of Washington’s iSchool depicts a triangle-shaped model of Information, People, and Technology. However, our readers might notice this blog examines the intersection of People, Information, Technology and Organizations (this model is explained in greater detail by Ping Zhang and Robert I. Benjamin in a paper titled “Understanding information related fields: a conceptual framework”).

We’re square rather than triangular, if you will.

Why do we add organizations? Because gathering and acting on information changes fundamentally in an organizational context. And sometimes, information behavior within an organization can be downright bizarre or frustrating.

Here’s an example: I went out to dinner a few weeks ago with friends, and my debit card was declined (happily, the waiter did his best to not treat me like a deadbeat). Since the card was declined for no obvious reason, I had a mystery on my hands. Unfortunately, customer service representatives (CSRs) at the national bank where I have my checking account were stumped as well.

Two weeks later – after three calls to the 1-800 customer service line, two trips to the local branch, and a dozen fact-finding missions through the online banking portal – my debit card was still not operational, and I had been told it might be because the number had been stolen.

Think about all the failures in my interaction with the bank: I had several types of contact with different outlets of the organization, and none of them were satisfactory.  At least three CSRs were unable to access my account because I had opened my account in a different state (each of the representatives did sheepishly suggest I could open another account at the branch and then they could help me; I declined those offers).

I can do without naming the bank because this isn’t meant to be a Consumerist-type rant. But I think the episode does bring to light the irrational and haphazard information strategies organizations seem to employ. As a person and a consumer, I scratch my head when representatives of the bank cannot answer my questions or help me understand what is happening with my account access. But the madness of the situation also affects the bank’s employees: imagine the exasperation of working a front-line CSR job and routinely having one’s hands tied on a significant number of common issues.

But for the organization, this information strategy is working on some level. I imagine – the finance industry being particularly yoked by multiple layers of regulation – this national bank has designed its policies and procedures to serve up a savory dish of compliant operational spaghetti. Somewhere, a satisfied auditor completes an X on a checklist when a CSR in Washington cannot access my account, what with its Massachusetts provenance.

This is operational reality in the modern banking industry, and I mostly understand why my encounter with the bank was so dissatisfactory. However, I think such encounters are at the very least opportunities for learning in organizations. Holistic customer experience (a process of design that includes all touch points in dealing with customers, or even vendors, of organizations) should focus on tasks vital to customers at all service delivery points.

Here at infoscussion, we believe the information management model has four facets – and that an organization’s needs can be separate from, but equal to, those of the people involved with the organization.

The good, the bad, and the censored?

Empty Library Shelves

I got in trouble at the library once.

This story is pretty old, since it starts with a card catalogue (and by that, I mean skinny drawers containing actual typewritten cards). I was 10 years old, and looking for Judy Blume novels I hadn’t already read in the “Author” card catalogue. I found a book titled Forever.

When I lugged my stack of books up to the circulation desk, the librarian on duty was the dour one with a tough stance on talking. She slapped through the books, inserting due-date cards, and held up the copy of Forever: “Does your mother know you’re reading this?” she asked.

“No,” I said. And then I thought: Do I need to hide this book under my bed?

It wouldn’t have mattered. The librarian called my mother on the phone later that day. When Mom asked me about the book, I thought I was in trouble, which seemed pretty lame; I had skimmed the novel earlier and found almost no good parts. You know what I mean.

Mom explained that I was not in trouble. “I told the librarian there was no problem,” she said. “If there’s a book available for checkout, and you remembered your library card, then it’s your privilege to read it.”

A lesson imparted to me when I was young (and not just in this instance) was the importance of having unlimited access to everything the library offered. It also gave me the notion that it is impossible, and probably a little dangerous, to judge books or resources or information objects as being inherently “good” or “bad.”

In the digital information age – where 10-year-olds no longer stand on tiptoe to search through card catalogues – we seem to have access to boundless information through simple Google searches. Because of the access we have online, there are situations in which modifying information searches based on people’s needs is reasonable or necessary. For example, Google users can adjust search “safety” settings. In addition, “lifestyle” search engines, such as ImHalal.com, fulfill a specialized user need.

But some information access issues have the potential to affect all users. Last week, a bill called the Combating Online Infringement and Counterfeits Act (COICA) was introduced in the Senate (full text is here). The bill would enable both the courts and the Attorney General to blacklist Internet domain names; the vague wording of the legislation seems to threaten a slippery slope to government censorship. Just as the librarian made a value judgment about what information was good for me, COICA appears to expand the power of government to judge the utility of information for all users in the U.S.

This sort of legislation should inspire us to think about democratization of and access to information, and the role government should take in the information age. While I believe intellectual property should be protected, the scope of this Senate bill makes me uncomfortable. It is not clear to me why a government-mandated domain blacklist is necessary if we have a vibrant digital community capable of policing its own, as well as (mostly) responsible content hosts willing to comply with relevant laws.

I do not intend to make a political statement with this post, but developments like COICA remind me of a recent statement by Google’s CEO, Eric Schmidt: “Washington is an incumbent protection machine,” Schmidt said. “Technology is fundamentally disruptive.”  As users in the digital age, we should be aware of the gains to information access privileges we have made, and work to preserve the benefits of better access to information.

Photo by Lasse C.  Available under a Creative Commons Attribution-Noncommercial license.