Innovation Games by Luke Hohmann, image via the IG website
This is actually an adaptation of an exercise from a book called Innovation Games: Creating Breakthrough Products Through Collaborative Play by Luke Hohmann. In it, the author describes an activity for understanding customer priorities called "20/20 Vision":
When you're getting fitted for glasses, your optometrist will often ask you to compare two potential lenses by alternately showing each of them. Although it may take some time, eventually you'll settle on the set of lenses that are best for your eyes. You can use a variant of this approach to help your customer see which priorities are best for them.
Start by writing one [product] feature each on large index cards. Shuffle the pile and put them face down. Take the first one from the top of the pile and put it on the wall. Take the next one and ask your customers if it is more or less important than the one on the wall. If it is more important, place it higher, if it is less important, put it lower. Repeat this process with all your feature cards, and you'll develop 20/20 vision for what your market really wants.
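The procedure Hohmann describes is essentially an insertion sort: each new card is compared against the cards already on the wall, from the top down, and slotted in at the first spot where the customer judges it more important. A minimal sketch in Python (the function and feature names here are my own illustrations, not from the book, and the `is_more_important` callback stands in for actually asking the customer):

```python
def rank_features(features, is_more_important):
    """Build a ranked wall of cards by pairwise insertion.

    `features` is an iterable of card labels; `is_more_important(a, b)`
    returns True if the customer says card `a` outranks card `b`.
    """
    wall = []  # index 0 = highest priority, like the top of the wall
    for card in features:
        placed = False
        for i, existing in enumerate(wall):
            if is_more_important(card, existing):
                wall.insert(i, card)  # slot it in above the first card it beats
                placed = True
                break
        if not placed:
            wall.append(card)  # less important than everything so far
    return wall

# Hypothetical example: a hidden priority score stands in for customer judgment
priority = {"search": 3, "export": 1, "offline mode": 2}
ranked = rank_features(list(priority), lambda a, b: priority[a] > priority[b])
print(ranked)  # → ['search', 'offline mode', 'export']
```

The number of comparisons grows with the square of the number of cards in the worst case, which is why the book warns it "may take some time" with a long feature list.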
I should point out that I don't consider this the most reliable method for determining the exact order in which customers would prioritize a specific set of attributes or factors. A question like that is best answered with a quantitative survey of individual customers, especially if business decisions will be made based on the results.
But the main idea behind it led me to think that it could be modified into something a bit more fine-tuned and methodologically appropriate for a project I was working on a few months ago. I wanted to understand how people defined certain aspects of the customer experience for a particular client's industry. I still tried to get a general idea of how our participants prioritized the factors, but only to see if the data matched up with some quantitative findings gleaned from previous survey research.
The project: understanding the ideal customer experience
The attributes being researched consisted of elements of the customer experience that had already been deemed important by prior research. Since we already had our list of attributes, this was a fitting opportunity to further explore what each attribute actually meant in context, and if there were any differences in definition between participants.
For reasons of confidentiality, I can't share the actual list of attributes from the project. So, I came up with my own example. Say you work in the healthcare provider industry and you have a list of five customer experience/customer service attributes you want to know more about. Here is the hypothetical list of attributes:
- expert knowledge
- clear communication
- short wait times
- empathy and compassion
- ease of scheduling
Modifications to the original exercise / creating the blocks
In order to better suit the research topic/questions and my preference for participatory methods, I made some modifications to the activity before incorporating it.
First, I thought it would be better to involve the participants more by having them do the actual "arranging" of the list of attributes, rather than having me stand up at the front of the room doing it for them with index cards. This would allow them to take ownership of every aspect of the process, except for actually choosing the attributes at hand (although I did allow them to discard or add blocks if desired).
I imagined it might also be fun to split up the group into two teams - not to compete against each other, but just to see what similarities and differences might result and to make logistics easier (three people per group rather than six). This way I would also have more people contributing to the post-activity debrief.
Finally, I thought, why do we have to use paper index cards (as described in the Innovation Games instructions)? How two-dimensional. Why not use something more interesting and tactile that adds to the hands-on, collaborative feel? I couldn't think of anything better suited for the purpose than big, sturdy blocks of some sort. I searched the city up and down for giant plastic children's building blocks, plastic containers, and everything in between, but none of these seemed to exist.
So, I paid a visit to the local home improvement store and had an employee cut a couple of 12-foot pieces of lumber into blocks about the size of a standard brick (I can't quite recall what type of lumber it was). Yes, they were heavy, but they looked nice and have a virtually infinite shelf life, since the attributes I glued to them can easily be removed or taped over. I was also lucky that my groups were taking place in locations I would be driving to rather than flying, so I could just throw the blocks in the back of the car instead of having to check them as extra baggage.
Getting the blocks cut
After I had my blocks, I printed two sets of the list of ten customer service factors in a simple black font with colored backgrounds. I sanded the rough edges of the blocks, then glued on the print-outs, taping them down for extra protection.
Blocks, glue and sanding sponge
Two sets of blocks with customer experience factors, with additional blank blocks
Conducting the activity
Fast-forward to about halfway through the first focus group. After spending a significant amount of time discussing some of the other topics in my guide, I finally got to the blocks activity, and here is how it went.
I explained that they would now be thinking about some customer experience factors related to the particular industry at hand using an activity involving blocks. I split the group up into two teams, one for each set. The blocks were arranged in a random order on tables in the back of the room, facing away so that our discussions prior to the activity would not be biased by the attributes written on the blocks.
I instructed them to go to the tables with the blocks and, as a team, to go through the factors and place them in order of importance, with the most important factor stacked at the top, the second most important factor under that, and so on.
I emphasized the importance of everyone's participation and of discussing their individual thoughts and feelings, and that they should try and come to a consensus if there were any disagreements. If they were unable to do so, they should put the block in question to the side. I also gave them the option of removing blocks they didn’t agree with, and provided them with extra blocks in case they came up with new attributes.
The customer experience factor blocks could be arranged in a number of ways.
As the teams worked, I observed their interactions, listened in on their conversations for important insights, and asked probing questions. If the conversation was stagnant, or if one person was making all of the decisions on where to put the blocks, I would try to facilitate some discussion. I made notes of any really interesting points of discussion or contention to remind myself to bring them up during the debriefing, or asked the participants to do so themselves. I also took photos of the process and of the blocks in their final order for inclusion in the final report (to showcase the participatory nature of the activity).
After about ten minutes, I had the teams stack their blocks facing the main table, and had everyone return to their seats for a debriefing on the activity they had just completed. Starting with the first team, I asked a series of questions that played off the activity and gave me a better sense of their thoughts on the factors, including these:
- What made you all decide that these were the top three most important factors?
- What makes X important?
- How did these three end up at the bottom of the stack?
- What does this factor provide that the others don’t?
- Were there any missing factors? Any that should be taken out or are interchangeable?
I see the blocks activity as an exercise in social interaction and a useful catalyst for revealing how people define the ideal customer experience. All of the talking, sharing of opinions, debating, and collaborating provided insights unattainable by more traditional market research approaches. In the end, it wasn’t about the order in which the blocks ended up, but the conversation that resulted from the exercise. Plus, it was way more fun than using paper index cards, let alone relying on the old standby of a two-way conversation.
The findings from the project provided a deeper, more insightful perspective to previous phases of research, and ultimately allowed my client to make more informed decisions around specific ways to meet customer expectations and tailor customer experiences to each of the attributes. The client also used the findings from this project and previous phases to design a customer experience survey to gauge how well they are faring against industry competitors.