Too many TLAs

I had a teacher once say that IT is riddled with TLAs (Three-Letter Acronyms).  He thought it was hilarious.  It wasn’t until I started really looking into IT, and security especially, that I realized he was right.  In the realm of technology there are some acronyms that most people know: HTTP, IP, PC, and so on.  But when you add security, it turns into something you would expect in your alphabet soup: PCI-DSS, SOX, FISMA, ISO, HIPAA, HITECH, UDP, TCP, CERT, IR, XSS, CSRF, PWN, IPSEC, SCADA, and the list goes on.  I am sure that you could guess some of them, but the first six are probably the most debated.  The Payment Card Industry Data Security Standard (PCI-DSS), Sarbanes-Oxley (SOX), the Federal Information Security Management Act (FISMA), the International Organization for Standardization (ISO; don’t ask, it doesn’t make sense to me either), the Health Insurance Portability and Accountability Act (HIPAA), and the Health Information Technology for Economic and Clinical Health Act (HITECH) are some of the security standards that businesses have to worry about.

On top of these are any internal audits that companies have to pass.  Many times all this adds up to one thing: confusion.  Take a company that handles its employees’ healthcare records, holds a federal contract, and is publicly traded.  This company has to deal with parts of HIPAA and HITECH as well as FISMA and SOX.  You would think these standards would correlate and go hand in hand, but they were all developed independently, so they have different requirements.  This is where security professionals are most challenged, whether they are securing a network or auditing one against these standards.

Most often, a company trying to meet the requirements of these standards does one of two things: it either does the bare minimum to meet the requirements right before the deadline, or it essentially puts everything behind security, doing things that make the company more secure from its point of view but at the cost of usability.  Now, I have written before about how usability and security need to go hand in hand, so that isn’t the angle I want to take right now.

My main focus is that when companies think they have to choose between security and usability, they not only create a hard time for users, they create a situation where users work around the security measures, thereby opening security holes that weren’t accounted for.  Examples include writing down usernames and passwords, saving credentials in the browser, copying documents onto USB drives, and trusting links that may not be legitimate.  While good user training can help with this, there is no need to put that burden on the users, especially since, if the company is compromised because of those workarounds, the company still ends up paying the fine, which can amount to millions of dollars.

Basically, my suggestion is for all companies to stop securing servers and networks and start securing information.  That shifts the focus to the data they are protecting rather than what is holding it.  It might also force them to walk through what users will actually do once their applications and network are set up and working.  Hopefully, that will allow them to start truly incorporating both usability and security into their business.

As a side note, if you are interested in the true cost of a security breach, there is a research project I was part of a few years ago that was presented at a conference.  The video quality is rather poor, but the information is valid.  I didn’t present it, but I did work on the external costs, those aside from any possible fine that is part of a security breach. http://vimeo.com/5384048


Putting the Organization in Info Management

Cheeky screenshot of text exchange with a big, uncaring bank
You know this is a dramatization of the event because Big Bank doesn't answer texts after 5pm

Information management can be described using a couple of different but fairly similar models. The University of Washington’s iSchool depicts a triangle-shaped model of Information, People, and Technology. However, our readers might notice this blog examines the intersection of People, Information, Technology and Organizations (this model is explained in greater detail by Ping Zhang and Robert I. Benjamin in a paper titled “Understanding information related fields: a conceptual framework”).

We’re square rather than triangular, if you will.

Why do we add organizations? Because gathering and acting on information changes fundamentally in an organizational context. And sometimes, information behavior within an organization can be downright bizarre or frustrating.

Here’s an example: I went out to dinner a few weeks ago with friends, and my debit card was declined (happily, the waiter did his best to not treat me like a deadbeat). Since the card was declined for no obvious reason, I had a mystery on my hands. Unfortunately, customer service representatives (CSRs) at the national bank where I have my checking account were stumped as well.

Eventually, two weeks later – after three calls to the 1-800 customer service line, two trips to the local branch, and a dozen fact-finding missions through the online banking portal – my debit card was still not operational and I had been told it might be because the number had been stolen.

Think about all the failures in my interaction with the bank: I had several types of contact with different outlets of the organization, and none of them were satisfactory.  At least three CSRs were unable to access my account because I had opened my account in a different state (each of the representatives did sheepishly suggest I could open another account at the branch and then they could help me; I declined those offers).

I can do without naming the bank because this isn’t meant to be a Consumerist-type rant. But I think the episode does bring to light the irrational and haphazard information strategies organizations seem to employ. As a person and a consumer, I scratch my head when representatives of the bank cannot answer my questions or help me understand what is happening with my account access. But the madness of the situation also affects the bank employees: imagine the exasperation of working a front-line CSR job and having one’s hands tied routinely in a significant number of common issues.

But for the organization, this information strategy is working on some level. I imagine – the finance industry being particularly yoked by multiple layers of regulation – this national bank has designed its policies and procedures to serve up a savory dish of compliant operational spaghetti. Somewhere, a satisfied auditor completes an X on a checklist when a CSR in Washington cannot access my account, what with its Massachusetts provenance.

This is operational reality in the modern banking industry, and I mostly understand why my encounter with the bank was so dissatisfactory. However, I think such encounters are at the very least opportunities for learning in organizations. Holistic customer experience (a process of design that includes all touch points in dealing with customers, or even vendors, of organizations) should focus on tasks vital to customers at all service delivery points.

Here at infoscussion, we believe the information management model has four facets – and that an organization’s needs can be separate from, but equal to, those of the people involved with the organization.

Lessons Learned on Design, Analysis, and Agile

Sticky notes clustered together for an agile planning meeting

When I was at InfoCamp a couple weeks ago, Erin Hawk and Samantha Starmer both spoke about how they’re working to integrate design into the Agile Software Development process.  This got me thinking about some of the successes and failures I’ve seen in Agile projects and how those related back to the integration of the analyst role into those projects.

Since Agile was a methodology created by developers for developers, it tends to be fairly development-centric, and a lot of Agile implementations neglect both design and requirements processes.  As someone who has been in the development, analysis, and design roles of systems implementations, I have seen Agile applied with a variety of results.  Overall, the projects that successfully integrated designers and analysts into the Agile process tended to have much better results.

On many projects, developers shouldn’t be the primary group interacting with stakeholders (that’s another post altogether).  Integrating designers and analysts into an Agile project helps ensure that the end product meets both business and user needs the first time (i.e., on time and on spec). Not to mention, it makes the developers’ job easier by cutting through a lot of the ambiguity around a project and reducing the number of times developers have to change what they’ve made.  Unfortunately, since Agile stresses speed and iteration over carefully planned designs, integrating design and analysis into the process isn’t necessarily easy.

Here are four of the lessons I’ve learned on how to succeed in Agile development from an analyst’s point of view.

1 – Be an agile waterfall

Agile development focuses on rapidly developing, testing, and releasing a product, which has a tendency to squeeze out designers and analysts. However, many projects and products are so complex, and have so many stakeholders, that they require specialists (a.k.a. designers and analysts) to handle that complexity.

The best Agile projects I’ve worked on kept all the phases of the Systems Development Life Cycle (SDLC), but compressed them into small, fast chunks.  By reducing the project to a cluster of discrete but related deliverables, i.e., user stories, you can keep all the phases of the SDLC while still moving at a rapid, iterative pace.

2 – Use staggered phases to keep designers & analysts ahead of developers

One of the main reasons Agile has gained so much popularity is that developers and organizations became fed up with lengthy requirements-gathering and design processes.  Such processes tend to produce specs that are out of date by the time the developers get around to them.

By having designers and analysts work one phase (or two phases at most) ahead of the developers, organizations can deliver just-in-time specifications.  Ideally, the development team can put together a rough set of initial stories and prioritize work accordingly at the beginning of the project.  After that, an initial sprint (called sprint zero by some) can give the designers and analysts time to get ahead of the developers, while the developers set up dev environments and take care of any other start-up items.  Sprint length will vary from organization to organization, but in my experience two weeks is a good length of time for each group to do a thorough job on their current tasks.

3 – Coordination & communication is key

I can’t stress enough that everybody needs to be in constant communication throughout the process.  As analysts are working on requirements, they should be giving developers a heads up on what’s coming down the pipe.  This keeps the developers abreast of changes, and there will always be changes.  While developers are programming a current set of tasks, analysts can provide feedback and integrate any changes the developers want back into the requirements documents.  A good project manager is often key to this process, as project managers are skilled at keeping up communications and preventing the geeks from siloing up.

4 – You’re either all-in or you’re all-out

Either everybody is working on the project, or nobody should be.  One of the biggest wastes of time is when development resources get pulled from a project but the analysts and designers keep going.  Even in the best case, it takes six months for the developers to come back, and keeping the rest of the team on the project will lead to a waterfall-style mountain of obsolete design documents.  In the worst case, the project gets cancelled, wasting all the effort of the design and analysis teams. If one team gets pulled from the project, management should sideline the project until everyone can get back to it.

Those are my 2 cents on the matter, and I’d love to hear about your experiences in integrating design and analysis into Agile projects.

Business vs IT

After reading the last few posts by nickmalone and Jordan, I started to think back on the companies I have either observed or been employed by, and I realized one thing: there’s a large disconnect between the technical people and the business people.  Now, I realize some of you already know this, but hear me out; I think this causes more problems than many realize.  NickMalone’s post is a great example of what can happen when that gap isn’t addressed, and Jordan’s advice of value-added statements is a great way to start fixing the problem.

As many of you know, I have my Bachelor’s in Information Systems, which is more of a pure Information Technology degree, as it had very few business classes.  I was very happy just learning how to program and network computers.  The more I learned about networking specifically, the more I thought I knew about succeeding in the business environment.  In the last quarter of my Bachelor’s degree I took an Advanced Oracle Database class.  That class introduced me to a whole new thought process behind IT: that IT is there to solve business problems.  Before that, I hadn’t really thought about how IT related to business.  Now, some of you may be saying, of course IT is there to solve business problems, what else would it be doing?  But I want you to think about any current or past IT project you may be on.  What was its purpose?  There are always pat answers about improving user experience or improving the way the business functions, but what was really the main goal behind the project?  What was it going to do to help that particular company succeed?

That, I believe, is the main point behind Nick and Jordan’s last posts, and the fundamental problem in IT departments.  Too many IT professionals can make computers do amazing things, but they have forgotten that IT is there to help businesses succeed, not the other way around.  I am sure there are cases where it is different, but even for companies that specialize in IT consulting or software design, every IT system should have a purpose and should be directly tied to a business function.  Once that business goal or function is identified and focused on, I believe IT projects will run more smoothly and stop having the business-needs-versus-personal-needs issues that NickMalone talked about.

Now, I don’t mean to suggest this is all a problem with bottom-level IT people.  When was the last time you heard about an executive of a company wanting to implement some new technology they read or heard about?  Maybe they heard about social networking and told their direct reports to start making a business case for putting it on the company intranet.  Here again is another face of the same problem.  You shouldn’t make business cases for technology; you should make “technology cases” for business problems.  It might be a slight adjustment, but think about how many times technology gets implemented without proper planning and fails.  If executives, managers, and underlings alike were to start the planning and implementation phases of every project by linking everything back to specific business problems, businesses would spend less and be more productive overall.

Now, I realize I don’t have the years of experience that others reading this blog do, so I ask for your thoughts.  When was the last time you started a project that failed?  Did you know the main purpose behind the project?  If you did know it, was that purpose met?  Have you seen a difference between projects that relate directly back to business goals and ones whose business goals are unclear?  This doesn’t fully address the change-management side of things, but I believe that if employees truly understood how a specific technology implementation helped not only them but the business as a whole, you would see less pushback.  And believe me, being in security, I know about users pushing back on new technology implementations, but that is a whole different post.

This feedback goes to 11

Guitar amplifier

I used to believe a person could learn more about management from a bad boss than from a good boss: it is easier to articulate what is missing from a working relationship than to notice the efforts of a good manager. Now, I think the truth is that a person always has expectations for a working relationship. The gap between expectation and reality is where learning, through constructive feedback, takes place.

When I left the workplace to attend graduate school full-time, I had a great boss. He was a great boss because he was a master of feedback: timely, thoughtful, economical, progressive feedback. Feedback is a personal information exchange we engage in throughout the working day; because of this, I would argue that maintaining a healthy feedback routine (outlined below) with other individuals is the foundation of a good working relationship.

Good feedback is timely. Work is a series of unending interruptions. It is natural to feel pestered by an employee or team member asking for feedback, but it is also important to support the priorities of the organization. Often, if I procrastinate on giving feedback, it has to do with not managing my own time well, or – this is worse – feeling I do not know the subject matter well enough to give meaningful feedback. In the latter case, my feedback should be: “I’m not the right person for an answer;” a polite “no” can also be appropriate feedback.

Good feedback is thoughtful. If I am the right person to give feedback, then it is my responsibility to really examine the item or issue in front of me. My former boss was good about this: his feedback contained questions that demonstrated he had thought about the item. Alternatively, if he had not set aside sufficient time to look at the item, he would set a time when we could sit down together. Courtesy creates goodwill.

Good feedback is economical. I mean this in two ways. First, feedback needs to be exactly as long as it needs to be. A good feedback routine gives each party a chance to clarify points, if needed, but it is a matter of personal discipline to make sure all points are salient. The second element of economical feedback means that it should be intended to maximize future efforts: it is better to determine at 10% completion that a project adds negligible value than to let sunk costs pile up. Employees and project teams will experience more satisfaction when they know all efforts are regularly analyzed to ensure added value.

Good feedback is progressive. There should be a common thread linking all feedback sessions, particularly between a manager and his employee. If a manager is unable to both criticize shortcomings of a project and praise improvements over time, he is either criticizing too much (which can paralyze an employee’s continued improvement) or leaving out recognition of improvement. As an added benefit, progressive feedback all but guarantees that an employee will know where she stands at regularly-scheduled formal reviews.

No one executes good feedback perfectly all of the time (certainly not me). And everyone experiences an occasional Terrible, Horrible, No Good, Very Bad Day that will derail the best intentions. In the long term, however, simply being mindful about the feedback one gives and receives goes a long way to improve working relationships.

Photo by Andres Rueda. Available under a Creative Commons 2.0 Generic license.

What I Learned at Info Camp

Word cloud of session descriptions for InfoCamp: Information, Design, Research, User, IA, etc.

Last weekend, I attended InfoCamp, a community-organized “unconference” for people excited about information and how we manage and consume it.  This was its 4th year in Seattle, and I must say, it blew me away.

At first, I was pretty skeptical of the whole “unconference” idea.  There weren’t any predefined topics or sessions other than the keynote and plenary speakers and all the breakout sessions were run by conference participants.  I thought this meant the breakout sessions would be poorly prepared, and thus I wouldn’t like them.  As it turned out, there was a pretty good mix of highly polished presentations and more spontaneous ones.  The polished ones were good, and the spontaneous ones seemed to morph into enjoyable and informative Q&A sessions.

Overall InfoCamp gets an A in my book and I’m certainly going back next year.  And who knows, maybe I’ll even volunteer to run a session myself.  Without further ado, here’s a quick synopsis of the sessions I attended.

Content Management Strategy – More than just words

Vanessa Casavant gave an interesting and entertaining talk about how content management strategy fits into an organization.  The answer seemed to be right in the middle.  She explained that the role of a content strategist is to ensure that the information and features being published by an organization clearly align with the organization’s mission, and that they aren’t sending mixed signals about the organization to the end consumer of that information.

My key takeaways were that content on the site needs to be in alignment with the organization’s strategy, and that messaging should be consistent across the board rather than a haphazard spray of content all over the organization’s website.

DAM Systems for Creative Agencies

Tracy Guza gave an overview of how creative agencies (mostly advertising firms) can employ Digital Asset Management (DAM) systems to manage their pictures.  This included file storage, searchability, metadata, and the whole nine yards.

I was particularly interested in this session because I work for a large stock photography company, and I wanted to learn more about how creative agencies use and manage the images they buy from us.  It was enlightening to learn some of the challenges facing a firm that manages 50,000 images, which are very different than the challenges my company faces in managing 14 million images.

Economics of Online Advertising

Jeff Huang led a discussion on how the economics of search advertising work, including the auction and fraud prevention.  Much of this was old hat to me, but it was interesting to get questions answered by an expert who has worked at all three of the big search engine companies.

Priority Inbox

Ario Jafarzadeh, one of Google’s designers on Gmail, gave a talk about the design process behind Priority Inbox and all the iterations they went through to get to the current design.  It was pretty interesting to hear him describe all the decisions they made, from what type of icons to use to whether they should use a video or a written document to explain the feature to users (it turns out they decided to make one of each).

This session was like manna falling from the sky.  I wrote a post about two weeks ago describing how much I liked Priority Inbox and speculating as to why it had taken so long for someone to do this.  I got my question answered, or at least got his answer to it.  He explained that email is so personal that email providers have to get this sort of thing exactly right or else they’re going to upset their users pretty badly.  Apparently the machine learning (AI) behind this was so complicated that it took Google until now to build it. While I’m sure that’s true, I still think part of the answer is that nobody thought of doing that for email, or at least that nobody was willing to invest the time and money to make it happen until now.

If you’re interested in the keynote and plenary speeches, which were both good, you can find out more here.

The good, the bad, and the censored?

Empty Library Shelves

I got in trouble at the library once.

This story is pretty old, since it starts with a card catalogue (and by that, I mean skinny drawers containing actual typewritten cards). I was 10 years old, and looking for Judy Blume novels I hadn’t already read in the “Author” card catalogue. I found a book titled Forever.

When I lugged my stack of books up to the circulation desk, the librarian on duty was the dour one with a tough-on-talking stance. She slapped through the books inserting due-date cards and held up the copy of Forever: “Does your mother know you’re reading this?” she asked.

“No,” I said. And then I thought: Do I need to hide this book under my bed?

It wouldn’t have mattered. The librarian called my mother on the phone later that day. When Mom asked me about the book, I thought I was in trouble, which seemed pretty lame; I had skimmed the novel earlier and found almost no good parts. You know what I mean.

Mom explained that I was not in trouble. “I told the librarian there was no problem,” she said. “If there’s a book available for checkout, and you remembered your library card, then it’s your privilege to read it.”

A lesson imparted to me when I was young (and not just in this instance) was the importance of having unlimited access to everything the library offered. It also gave me the notion that it is impossible, and probably a little dangerous, to judge books or resources or information objects as being inherently “good” or “bad.”

In the digital information age – where 10-year-olds no longer stand on tiptoe to search through card catalogues – we seem to have access to boundless information through simple Google searches. Because of the access we have online, there are situations in which modifying information searches based on people’s needs is reasonable or necessary. For example, Google users can adjust search “safety” settings. In addition, “lifestyle” search engines, such as ImHalal.com, fulfill a specialized user need.

But some information access issues have the potential to affect all users. Last week, a bill called the Combating Online Infringement and Counterfeits Act (COICA) was introduced in the Senate (full text is here). The bill would enable both the courts and the Attorney General to blacklist Internet domain names; the vague wording of the legislation seems to threaten a slippery slope to government censorship. Just as the librarian made a value judgment about what information was good for me, COICA appears to expand the power of government to judge the utility of information for all users in the U.S.

This sort of legislation should inspire us to think about democratization of and access to information, and the role government should take in the information age. While I believe intellectual property should be protected, the scope of this Senate bill makes me uncomfortable. It is not clear to me why a government-mandated domain blacklist is necessary if we have a vibrant digital community capable of policing its own, as well as (mostly) responsible content hosts willing to comply with relevant laws.

I do not intend to make a political statement with this post, but developments like COICA remind me of a recent statement by Google’s CEO, Eric Schmidt: “Washington is an incumbent protection machine,” Schmidt said. “Technology is fundamentally disruptive.”  As users in the digital age, we should be aware of the gains to information access privileges we have made, and work to preserve the benefits of better access to information.

Photo by Lasse C.  Available under a Creative Commons Attribution-Noncommercial license.