Thursday, September 15, 2011

Risk Model Inversion

Over the last few years I have had several discussions with colleagues, co-workers and friends around turning their mental models of risk management upside down (at least as far as it applies to taking risks within a particular job or consulting engagement).

Way back in 2003 I had just accepted a new job at a Fortune 500 company and did "the unexpected": I took a big risk in my first project. Coming from the outside, I was surprised to learn that my employer at the time had not been using conventional UX practices or deliverables in its projects. I went to both my teammates and my leadership and told them that the best way for my project to be successful was to create wireframes and perform a paper-prototype test on the interface (neither of which they had ever seen before). I did this knowing one basic truth:

Starting at day 1 in any job or engagement, your ability to try new things or to get change-oriented requests approved decreases over time. Stated another way: People don't like to crush the spirit of the new guy/gal.

The enthusiasm and energy of the new recruit is a cherished asset that will erode over time. Most clients, managers and co-workers, in my experience, unconsciously seek to extend that honeymoon period of rose-colored glasses by allowing the new teammate to demonstrate their capabilities (i.e., "give them enough rope to hang themselves").

Many people I know have followed their instincts in the opposite direction, choosing to "play it safe and establish themselves" in a new role before "shaking things up". In my experience, this rarely works, and it works even less the higher you go in your career. Case in point: a colleague told me that a friend of his, the president of his firm, was unable to make the changes he thought were necessary to improve his shop's overall chances of success. This was curious to me, as I sometimes have a hard time understanding why autonomy is not pushed down along with responsibility within organizations. In response to my question, he summed it up thusly:

"He has been in his job 2 years and has not been able to meet the one part of the established success criteria. Given his perceived shortcomings, why would the owner/board take the risk now?"

Most leaders put someone in a position to fill some gap. Whether it's solving a problem or taking advantage of an opportunity, the wishful-thinking perception of the leader (especially the higher you go) is that the new resource will be "fire and forget" (i.e., give basic direction and then hear the good news at the end of the project or the engagement).

The dynamics in the large organizations I have worked with tend toward rewarding those who perform without flaw rather than those who project strength. This bias causes any flaw to be magnified and used as the catch-all reason for not moving into unfamiliar territory (because humans and animals alike equate the unfamiliar with the uncomfortable). This discomfort with the unfamiliar tends to be smaller in magnitude than the discomfort of crushing the spirit of the new guy, but, depressingly, larger in magnitude than the discomfort of saying no to the established person, even one with a good track record (see chart below).
The point where the two discomforts come close enough to flip the decision maker's bias tends to arrive somewhere between 90 and 180 days. I believe this is because most of these decisions are made based on qualitative relationship dynamics. The better you get to know a person, the more comfortable you become pushing back. In other words, familiarity breeds contempt, where contempt equals comfort in being the source of disillusionment or disappointment.
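The crossover dynamic above can be sketched as a toy model. Every number here is hypothetical, chosen only to land the flip inside the 90-to-180-day window the post describes; nothing is measured.

```python
# Toy model of the two discomforts described above. All constants are
# invented for illustration; only the crossover behavior matters.

def discomfort_of_saying_no(day: int) -> float:
    """Decision maker's discomfort in refusing a request: starts high
    for a newcomer ("don't crush the new guy") and decays as
    familiarity grows."""
    return 10.0 * (0.99 ** day)

def discomfort_of_unfamiliar(day: int) -> float:
    """Discomfort of moving into unfamiliar territory: roughly constant
    over time (the `day` parameter is unused, kept for parallelism)."""
    return 3.0

def bias_flip_day() -> int:
    """First day on which refusing becomes the easier choice, i.e. the
    two discomfort curves cross."""
    day = 0
    while discomfort_of_saying_no(day) > discomfort_of_unfamiliar(day):
        day += 1
    return day

print(bias_flip_day())  # → 120, inside the 90-180 day window
```

The takeaway of the sketch matches the post: whatever the exact rates, a decaying "spare the new guy" discomfort crossing a flat "fear of the unfamiliar" line puts the highest change tolerance at day one.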

I'm not sure I know how to solve this, in fact I am sure that I don't know. I only claim to have a method to identify a time to take risks when the tolerance for change is greatest.

Rating Time:

Playing it safe at the beginning: Garbage

Taking timely risks intelligently: Like it

"Remember that postcard Grandpa sent us from Florida of that Alligator biting that woman's bottom? That's right, we all thought it was hilarious. But, it turns out we were wrong. That alligator was sexually harassing that woman." - Homer Simpson

Wednesday, September 7, 2011

True value


Those who work with me know that I am often wont to say: "The best value good user experience consulting can bring to executives of medium to large enterprises is sleep at night." In my experience, those executives are plagued by a downward-spiral dynamic that leaves them awake at night searching for answers.

The CIOs ask themselves: "Why do they hate me?"

The CMOs, and many other business leaders, ask themselves: "Why don't they get it?"

The dysfunctional spiral I am referring to is so pervasive in corporate America that many a professional has given up on trying to improve things. People don't see any way out. It's been this way forever. It was this way before most employees started working for the enterprise and it'll be this way after most of them change jobs or retire. There are many contributing scenarios that result in this dynamic and I'll attempt to illustrate a few of them here:

Scenario 1: Quantitative success can still mean qualitative failure.

Marketing/Business Executive: "I have a great idea. Let's build a new system to automate process X. We'll save a million dollars annually!"

IT Executive: "That is a great idea. Let's start the process"

Marketing/Business Executive: "ugh!"

<insert corporate business case budgeting processes here>

Marketing/Business Executive: "OK! My team has made a business case. I am allocating X dollars in budget. Get to it!"

IT Executive: "Great! Let's start the requirements gathering process"

Marketing/Business Executive: "ugh!"

<insert requirements gathering processes here>

IT Executive: "My team has gathered requirements. Sign off here and we can start building it!"

Marketing/Business Executive: "Do I have any other choice?"

IT Executive: "Don't worry. Your team helped make the requirements. The system will do all of the things it says in the SRS."

Marketing/Business Executive: "OK. I guess."

<insert development processes here>

Marketing/Business Executive: "My team tells me that the system isn't what they were led to believe."

IT Executive: "My team tells me that the system meets all the requirements."

Marketing/Business Executive: "My team tells me nobody is going to use this thing."

IT Executive: "That's not my problem."

Marketing/Business Executive: "ugh"

Scenario 1 | Epilogue
IT teams more often than not judge success or failure in quantitative terms and use a checklist-like approach to define success. This language aligns well with how most business executives think, so projects float along until someone figures out the project is a failure.

The developers and contractors are labeled as incompetent. Major blame is put on the nature of the organization itself, as it is in no position to make any effort to raise the level of talent in the workforce.

The bodies are then hidden and crime scene cleaned up so nobody important gets a bad performance review (but that's a story for another day).

The known contributing factors to this dynamic are as follows:

  1. The deployment centered methodology that is central to corporate culture in America - this orientation creates a development philosophy that believes that a wrong product served on time is sufficient.
  2. The complete lack of understanding or appreciation that corporate leaders have developed with regard to the skills and activities necessary to create quality experiences (agile methodology has shown some promise here, but as it does not attempt to hit the dysfunction at its root, only time will tell)
  3. The IT bias towards functionalism and left-brained thinking. The idea that function is not only superior to form, but that form is irrelevant compared to function creates the space for the above scenario to start.
  4. The root as I see it (and yes, this is a recurring theme for me): contempt for others. Contempt bleeds out as a lack of respect for the perspectives, thoughts, methods, time, and effort of others. Much of American culture, business or not, falls into a narrative cycle wherein everything should be simple and clean. If it is not simple and clean, then someone else must be thinking incorrectly.

Scenario 2 | Scene 1: The shuffle.

Marketing/Business Executive: "I have a great idea. Let's build a new system to automate process X. We'll save a million dollars annually!"

IT Executive: "That is a great idea. My development team can do it."

Marketing/Business Executive: "I've been down that road before. I want to outsource it."

IT Executive: "No! That will cost much more! Let our team do it!"

Marketing/Business Executive: "Well, alright. But it needs to be done next quarter and it can't cost more than X"

IT Executive: "No problem"

Scenario 2 | Scene 2: The deal.

IT Executive: "I saved this project from being outsourced. Don't screw it up."

Team: "With this deadline and budget restriction, we can't afford any training and we can't bring in any experts"

IT Executive: "I don't care to know how many bubbles are in a bar of soap. You asked me to keep the development work in-house and I did it. Now don't screw it up."

Team: "Ugh."

Scenario 2 | Scene 3: The flop.

Marketing/Business Executive: "This isn't what I wanted"

IT Executive: "Yes it is. It got done within the time limit and met the budget."

Marketing/Business Executive: "Ugh"

Scenario 2 | Epilogue
Business executives share the same misguided bias towards quantitative measures that IT personnel do; it just has a different set of targets: money and time. For some reason, executives can't seem to get on the same page about the realities that face them (e.g., time constraints of the marketplace, skill constraints of the teams, the need for collaborative design work throughout a project lifecycle, the inherent risks in the waterfall model of traditional SDLCs, etc.). Given these realities, it's no wonder that three quarters of IT projects fail. The wonder is that the ratio is not higher.

The known contributing factors to this dynamic are as follows:

  1. The American business paradigms that elevate short term results above all (this has been discussed in detail by people all over the world for more than 30 years).
  2. The missing roles for research and design disciplines within large corporations (another story for another day).
  3. The rampant practice of empire building within corporate America (I think this one has roots in American culture more than anything else).
  4. Fear of change (this is part of the human condition) 

What I find very curious is that people don't apply the same singular focus on budgets and time outside of work. People, in my experience, are far more willing to bend personal deadlines and budgets to get what they really want. For some reason, the business community has failed to admit that the current dynamic is inherently broken and that the rules, and the very structure of the game, need to change in order to fix it.

I do believe that the injection of UX perspectives is a step in the right direction. However, I believe this step can only reach its potential impact when UX professionals in combination with IT and business professionals separate needs from positions. Focusing on needs rather than positions is the only way, in my experience, to bring the warring tribes together before they kill their projects or one another.

Separating needs from positions isn't as hard as it seems, and while it's not the sole province of UX, UX seems very well positioned to drive the dialogue. Ultimately, it requires a curious, empathic mind in search of authentic motivations. This is what, for me, separates UX from interface design: a desire to understand the answer to a question simple to pose but hard to answer: "Why?"

Rating time:

Typical American business/IT culture: Garbage

"I want to share something with you: The three little sentences that will get you through life. Number 1: Cover for me. Number 2: Oh, good idea, Boss! Number 3: It was like that when I got here." - Homer Simpson


Saturday, September 3, 2011

Contempt Hall of Fame

Comcast and Blockbuster have been one-upped by Allstate. While not as brilliant as Comcast's outsourcing of system integration to its customers, in terms of displaying blatant contempt for its customers, Allstate has got it going on like Donkey Kong! The highlight came when an Allstate manager literally claimed that "Allstate is not accountable for the promises made by its employees". Talk about guts! Telling a customer that anything they or any other employee says to you is meaningless! Allstate must have hired a senior official from the US State Department. No other organization knows so well that the only sure way to avoid dialogue is to systematically shut down the conditions necessary for dialogue to occur.

Think about it: what can you say after this? Given that any actual resolution has no future value whatsoever, the possibility of a fruitful discussion is nil after this basic statement.

My specific disagreement with Allstate is not the relevant factor in this case. The real importance to me is the formalism (an emphasis on ritual and observance of dogma, rather than its meaning) rampant in US companies, and how it is destroying customer service inside and outside of enterprises. I saw a brilliant presentation from Netflix the other day which highlights the flip side of formalistic, process-centric cultures.

Many organizations teach process as a thing to be worshipped, separate from the intended state to be arrived at via adherence to the process. This fundamental flaw creates organizational dissonance (a context wherein an organization has internal discomfort based on misalignment of its processes and cultural mores). You'll see this when a company representative says that they want to help you because they believe you are right, but that they cannot because a process prevents them (side note: they never actually say this...it's usually articulated as "I'm sorry. I can't do that", which infuriates me more, because they don't actually mean either of those two sentences, but there does not seem to be much I can do about the fact that actual communication is a dying art form).

This is not the root of the problem, however. The real root is a lack of commitment to hiring "the right" people (i.e., smart people who are aligned with the vision and values of the company). Both the Netflix presentation and the well-respected business book "Good to Great" explain the concept very well. Process-centric cultures are created to lessen the effect of bad or mediocre hiring decisions. The long and short of it is explained in two steps:
  1. Hire good people who are aligned with your values
  2. Trust them to make decisions aligned with your values

An old boss of mine described me and another process-breaking colleague thusly: "The difference between you two is that he will act first and beg forgiveness later and that you will act first and then deny that forgiveness is needed at all." This is a pretty accurate description of me - In professional contexts I'm usually trying to act in a manner that is in direct alignment with the long term goals of my leadership.

The sorts of organizations that rely on formalism have not grasped a great lesson from the military: "Commander's Intent". When my leadership asks me to do something, I ask an annoying list of detailed questions to suss out what they are actually trying to achieve, and I use this understanding of intent to make the appropriate calls on the field. Arbitrary rules are for employees who cannot be trusted to make decisions appropriately. The question I ask of these organizations is this: "If you can't trust your people to make reasonably good decisions, then why did you hire them?"

Rating time:

Rigid processes that sap the spirit and passion from an organization - CRAP

"Lisa, if you don't like your job you don't strike. You just go in every day and do it really half-assed. That's the American way." - Homer Simpson

Thursday, March 24, 2011

What's holding back UX

I don't know if my peers would agree with me.

On second thought I'm pretty sure they would disagree.

Actually, the more I think about it, they might even be skeptical about my claim to be their peer.

Which brings me back full circle to the title of this post - What's holding back UX.

UX is a niche discipline. UX strategy is a niche within a niche. Consulting organizations barely recognize UX at all. Some would argue with this point, but I believe that's because the big organizations that have UX professionals more often than not put them in the role of interaction designer and call it UX. Only a few companies in private industry have begun to allow the discipline into their organizations, and a ridiculously small minority have actually created an operationalized talent and role structure to support it. Most companies who hire Information Architects ram them into some other label in their existing structure because HR doesn't recognize the need for the distinction.

Is this because the discipline is "new"?

That just doesn't ring true to me. When businesses started bringing programmers into their companies in the '60s and '70s, I don't think they were shy about calling them programmers or systems analysts or some other unique label. And even if they did not know what to call them, it did not take the dozen years that UX has been a formal discipline to create a unique title taxonomy within industry.

So why is UX a niche discipline? I have a unique theory.

Pretentiousness.

I have been formally working in the field since 1999 and was an avid reader of HCI books back in 1993. To this day I still get judged all the time by other UX professionals because I came to UX from technology. In 1999, because I was an engineer, other UX practitioners assumed that I not only lacked the ability to design usable interactions, but went so far as to say that listening to my input was by definition a waste of time. While not all designers or researchers or strategists or visual designers treated me like this (big props to my peeps who worked with me on cancer.org and vitaminshoppe.com), the vast majority went out of their way to snub me and every other software engineer or architect I worked with because we were, in their eyes, not educated in design. This attitude is still rampant today, and may be even more so with the hordes of graduates from formal HCI programs across the country.

There are too many flaws and horrible repercussions of this to name, but I'll go over my top 5:

1) Not all engineers or non-designers are alike. Many of us actually care that something will be adopted and used. While I whole-heartedly agree that there are way too many technologists who are biased by how much perceived effort it takes to write code to make something work in a particular way, it is not universal. And I truly believe that if anyone took the time to show engineers the math behind why making something useful, usable and desirable is the right thing to do for the business, they would be on the bandwagon cheering the loudest. Every software geek I have worked with has been easily converted to user-centered methodology once they understood why it was superior during the discovery, concept and design phases of a project. The numbers are just too compelling for a geek to deny.

2) The exclusionary attitude scuttles the whole philosophical premise of UX - people count and deserve to be treated in a way that makes them feel respected. You can't be taken seriously as a practitioner who supposedly cares about people's perception when your demeanor towards your teammates is so arrogant.

3) Revolutionary breakthroughs in any discipline only come from those who can see past the conceptual boundaries that hold back transformative progress. Non-designers have something designers lack: ignorance of the conventions of the design industry. The very reason their input is met with disdain is the reason they should be embraced.

4) UX needs more allies. We have to fight to get in on strategy and concept. And sometimes even have to fight during the design process. The more allies UX has, the less adversarial the process will be, and subsequently more opportunities, acceptance and success will follow.

5) Work actually can be fun. The most fun projects I have worked on are the ones where the collaborative multi-disciplinary process was set up as "play time". Whether it's a design slam or an ideation workshop, collaborative projects are more fun and more often successful (duh...teams that like and respect each other more often than not produce better work).

Some readers may argue that software geeks can often be this way too. I agree, but it's just not as pervasive in my experience.

One seemingly esoteric ingredient that I believe has led us all to this place of pretentiousness is, surprisingly enough, the semantics of the disciplinary labels themselves. When you label disciplines and people as "Creatives", "Designers" and "Technologists", or other variations on these themes, it is an implicit slight to the people on the outside. Are UX professionals who use crazy-hard applications not technical? Are software geeks who solve ridiculous challenges not creative designers? This may be heresy in the field, but I believe that being more careful in how we create and apply these labels will go a long way toward tearing down the adversarial boundaries between the disciplines.

Rating time:

Pretentiousness of disciplines: Crap

Acceptance of outside perspectives: Like it

"If you really want something in life you have to work for it. Now quiet, they're about to announce the lottery numbers." - Homer Simpson

Monday, January 10, 2011

Official prognostication

I'm a relatively avid user of LinkedIn.com and I can easily say it falls into the bucket of "like it" for me.

This does not mean, however, that the site cannot be improved. I had an idea a couple of weeks back that I believe to be inevitable and it would not surprise me if LinkedIn is the first to do it.

I have given and received my fair share of recommendations on LinkedIn, but I know that they are pretty close to worthless. It is my belief that the recommendations are only really used by the people receiving them, and not at all in the way they seem to be intended: to allow a third party to gauge whether they would like to work with or hire the person being recommended.

Think about it for a second. Do the recommendations actually have the supposed effects that they are intended for?
  • Do recruiters or hiring managers or clients really look to see if a person has recommendations? Maybe, but I doubt it.
  • Do recruiters or hiring managers or clients actually read recommendations? Maybe, but I highly doubt it.
  • Do recruiters or hiring managers or clients take the recommendations into consideration when making a hiring decision? Maybe, but I completely doubt it.
Why would any of these effects take place when everyone knows that the person being recommended can filter all the recommendations anyway? Everyone already knows what your friends would say about you in an open forum if you had complete editorial control.

So if they don't have these effects what effects do they have? I'll throw a couple I find much more likely out there:
  • The recommendation alerts people in the network of the person being recommended that they are a flight risk from their current job.
  • The receiver of the recommendation is likely to post a recommendation for the giver of the recommendation as a gesture of gratitude to the giver.
  • That's it. I'm stumped after that.
So I got to thinking....what would actually result in the desired end state of influencing a person's ability to get an opportunity? Then it hit me.

Create an open rating ecosystem for people just like movies on Rotten Tomatoes or products on Amazon.

Now before you think I'm completely insane, please hear me out.

In my scenario, Person A (let's call her Alice) could set ratings and recommendations on her profile to 1 of 3 levels, where Alice's profile calls out which model she has set.
  1. Authenticated - any person who is willing to have their identity linked to the rating can say whatever they want, rate the person as a professional on some sort of scale (as simple as 1 to 5 stars, or as complex as a multi-dimensional rating system, it doesn't really matter for the purposes of this topic) and Alice cannot edit or remove it.
  2. Connected - any person who fits the criteria in number 1 and is already connected to Alice can say whatever they want, rate the person as a professional on some sort of scale (as simple as 1 to 5 stars, or as complex as a multi-dimensional rating system, it doesn't really matter for the purposes of this topic) and Alice cannot edit or remove it.
  3. Filtered - Alice gets final say before anything hits her ratings page.
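The three levels above amount to a small visibility policy. Here is a minimal sketch of that policy; all names (`RatingPolicy`, `Profile`, `submit`, and so on) are my own invention for illustration, not any real LinkedIn API.

```python
# Hypothetical sketch of the three-level rating model described above.
from dataclasses import dataclass, field
from enum import Enum

class RatingPolicy(Enum):
    AUTHENTICATED = 1  # any identified user may post; owner cannot edit or remove
    CONNECTED = 2      # identified AND connected to the owner; owner cannot edit or remove
    FILTERED = 3       # owner approves each rating before it is published

@dataclass
class Rating:
    author: str   # authenticated identity of the rater
    stars: int    # 1-5 here; could be multi-dimensional, per the post
    comment: str

@dataclass
class Profile:
    name: str
    policy: RatingPolicy
    connections: set = field(default_factory=set)
    ratings: list = field(default_factory=list)   # published, owner cannot remove
    pending: list = field(default_factory=list)   # awaiting owner approval

    def submit(self, rating: Rating) -> str:
        """Route an incoming rating according to the owner's policy."""
        if self.policy is RatingPolicy.CONNECTED and rating.author not in self.connections:
            return "rejected: not a connection"
        if self.policy is RatingPolicy.FILTERED:
            self.pending.append(rating)  # owner gets final say
            return "held for approval"
        self.ratings.append(rating)      # published immediately
        return "published"

alice = Profile("Alice", RatingPolicy.CONNECTED, connections={"Bob"})
print(alice.submit(Rating("Bob", 5, "Great teammate")))     # published
print(alice.submit(Rating("Mallory", 1, "Flame attempt")))  # rejected: not a connection
```

Note how the design choice falls out of the sketch: under Authenticated and Connected the owner never touches `ratings`, which is exactly what makes those levels credible to a third party.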
In this model, people would have the option to open themselves up to criticism from their peers and what would be so wrong about that?

Would flame wars happen? Maybe. But I really don't think so, as people would fear retaliation.

The worst I can see happening is that a couple of so-so reviews might get out and would be outweighed by the community at large.

What I think would really happen is 2 things:
  1. Over time, people would be compelled to at least move to Connected, in that allowing yourself to be open to criticism would be perceived by others (e.g., hiring managers and recruiters) as a sign of a stronger professional, and because a person with a Filtered ratings page would be perceived as maybe having something to hide.
  2. People would be MUCH more selective about who they connected with because any actual connection could then say anything about them (this could be counter-balanced by having a contextual connection model which is a post for another day).
One additional thing that could happen, to a smaller degree, is that people who chose "Authenticated" would be perceived differently: as either really stupid, really brave, or really good at not ruffling other people's feathers.

As long as there was a good way to visualize the ratings and reviews for people, I believe recruiters and hiring managers would come to depend upon them in the same way consumers depend on online ratings and reviews of products and services.

I think this model will be out there sooner rather than later. Whether it's on a professional site or a dating site, it's just a matter of time.

Rating Time:

Filtered rating systems: Garbage
Open rating systems: Like them

"If something goes wrong at the plant, blame the guy who can't speak English" - Homer Simpson

Sunday, August 22, 2010

It takes two to tango

The amount of coverage that has been given to JetBlue flight attendant Steven Slater is not surprising to me at all. What is surprising to me is that no one has hit upon what I believe is at the core of the hostile encounter.

Some articles blame Slater as an individual, more articles and pundits speak to the trend in air-rage, some blame the downgrades in service. While I understand the thinking behind all of these, I don't think any of them have identified the core dynamic that is occurring repeatedly.

It is my belief that what we are seeing is a variant of the Stanford Prison Experiment, wherein flight, airline and airport personnel from terminal to terminal have been placed in a pseudo prison-guard role and passengers are cast as near-prisoners.

The authoritarian undertones from flight, airline and airport personnel are palpable. Quite often, the communication borders on contempt. Things that could be phrased as polite requests for cooperation are worded as mandates from an all-powerful machine. The security checks further the metaphor, and the end result is rebellion.

The solution to this problem does not lie in spot fixes like returning peanuts to flights, but rather in analysis and increased training, from organizational behavior professionals, across the entire air-travel ecosystem. Air travel workers need to be able to recognize confrontation and hostility and be trained to both avoid and defuse them.

In the meantime, try a little bit of niceness (OMG! I used "nice" in a positive way! I'm on record as hating the word!) and stop blaming passengers for responding with hostility to cramped spaces combined with overt and rigid authority - it's human nature.

RATING TIME

Prison guard mentality - Garbage

Steven Slater's dramatic exit - I hate to admit it, but I like it

"Maybe, just once, someone will call me 'Sir' without adding, 'You're making a scene.'" - Homer Simpson

Saturday, August 7, 2010

Why poor design seems to be the rule in business.

I've been working as a consultant advocating good UX design for more than 15 years now, and one thing has pervaded almost every interaction with executive management: I'm constantly asked to justify the time and expenditure required for good design practices. Nobody ever asks for a business case to justify the poor design practices that are systemic in corporate IT. My guess is that people do not recognize that the lack of an intentional design is still a design. It's just (usually) a poor one.

In thinking about this topic again and again, I think I've had a revelation. I now understand exactly why this attitude is the rule.

It is a habit learned over the last 50 years.

It takes a person about 66 days to form a habit. I could not find any research on how long it takes for an industry to form a practice.

Think about it. When computers first entered business environments, most people did not interact with them; most people interacted with the artifacts that computers could produce and with the minority of people who could program the computer using punch cards. Do you remember punch cards? Have you seen them in documentaries? This is where the habit started. At that time the equation was very simple:

Cost to design and create a new interface system more usable than a punch-card reader > Cost to train the people who interface with the computer

This was abundantly true for so many reasons:
  • The people who interfaced with the computer in the time of punch-card readers were super geeks and punch-card logic came easily to them
  • The people who interfaced with the computer in the time of punch-card readers were very few in numbers
  • The concept for other possible interfaces did not even exist yet
As time progressed and command line interfaces became the norm, this equation held. The number of people who interfaced with the computer increased ever so slightly, the types of people using them did not shift at all, and a small group of people saw the possibility of graphical interfaces, but the numbers still overwhelmingly favored training.

As time progressed even further and WIMP interfaces (thank you Xerox!) became the norm, this equation still held. The number of people who interfaced with the computer increased a little more rapidly, the types of people interfacing with them began to shift as people who used computers in grade school hit the work force, and a different, but still small, group of people saw the possibility of putting standardized graphical frameworks on top of information systems, but the numbers still overwhelmingly favored training.

Time moved on yet again, and web browsers have now become the norm (thank you Mozilla!). Despite the fact that the equation has finally shifted, most businesses do not even realize the basis on which the original decision was made. It's not anyone's fault. There is no "big book of corporate assumptions" lying around that people are supposed to check every couple of years. Just like a habit, the mode of operating has become unconscious. When executives ask for the business case for good design, I do not believe they realize that the basis for the question itself has completely changed.

  • The number of people who interface with computers in business or consumer settings is rapidly approaching 100%.
  • The types of people who interface with computers have dramatically shifted in ways beyond thinking styles: people of all ages now access computers, and a new generation has entered the workforce - a generation of workers who don't view their employers as bosses, but as easily replaceable organizations entering into a trade agreement with them.
  • Useful, usable and desirable interfaces and experiences are readily conceivable (thank you Amazon & Apple!)
The equation has changed!

The required investment in user experience pales in comparison to the amount required to train an entire population of job-hopping workers and fickle consumers.
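The inequality running through this post can be sketched as a toy break-even comparison. Every number below is invented purely for illustration; only the shape of the comparison, and how it flips as the user population grows, is the point.

```python
# Toy break-even model of "cost to design a better interface" versus
# "cost to train everyone who uses it". All figures are hypothetical.

def training_cost(num_users: int, cost_per_user: float) -> float:
    """Total cost to train everyone who must interface with the system."""
    return num_users * cost_per_user

def design_wins(design_cost: float, num_users: int, cost_per_user: float) -> bool:
    """True when investing in a more usable interface beats training."""
    return design_cost < training_cost(num_users, cost_per_user)

# Punch-card era: a handful of specialist super geeks, so training is cheap
# and the equation favors training over design.
print(design_wins(design_cost=500_000, num_users=20, cost_per_user=2_000))   # False

# Today: nearly everyone is a user, and job-hopping means retraining is
# perpetual, so the same design investment wins easily.
print(design_wins(design_cost=500_000, num_users=5_000, cost_per_user=500))  # True
```

The sketch makes the post's argument concrete: the decision rule never changed, but the inputs did, and a habit formed on the old inputs keeps producing the old answer.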

The first step in breaking the habit is admitting we have a problem. If we are to remain economically viable we must challenge our base assumptions.

RATING TIME:

Non-intentional design habit - Garbage

Turning over a new leaf - Like it

"Oh, people can come up with statistics to prove anything, Kent. 14% of people know that." - Homer Simpson