welcome to the opinionlab blog


here, it’s called the Voice of Customer Action Pledge


I’ve had Forrester on my mind for the last few days. In that time, the venerable research firm has had several extremely interesting things to say about the Voice of Customer industry. In a series of presentations at the Forrester Customer Experience Forum in New York and in a published report on the “State of Voice of the Customer Programs,” Forrester made some statements about our industry that made me sit up and take notice.

According to Forrester, Voice of the Customer programs are at an inflection point. Though vital to improving the customer experience, most Voice of the Customer programs fail to reach their full potential. In particular, they struggle to drive actions from the insights they generate, struggle to be fully embedded in the broader companies they serve, and (as a natural result of the previous two statements) they struggle to show financial impact.

Forrester ended its latest published report with two bold statements:

1) VoC programs must expand beyond their narrow reputation as passive listening and measurement stations, to become active parts of broader Customer Experience Management efforts;

2) even in businesses that can put a precise value on customer experience improvement (not always a guarantee), VoC programs that cannot demonstrate tangible return on investment will wither and die–victims of a business climate where every manager is looking to wipe out inefficient spend.

Forrester is right: Voice of the Customer programs cannot be passive. Listening, measuring, collecting, surveying–those activities, though important, aren’t enough anymore. It’s time for a new paradigm to take hold: an action paradigm. A Voice of the Customer program should be wedded to the following process:

1) VoC program identifies actionable items emerging from real-time, contextual feedback from engaged customers.

2) These actionable items pave the way for concrete fixes aimed at improving the customer experience.

3) These concrete fixes yield tangible business results (such as increased sales, lower churn, or lower support costs).

4) The VoC program proves its worth to the organization and becomes relied upon by multiple teams as a critical decision support tool.

Step 1 requires three critical elements: 1) the data has to be real-time; 2) the feedback has to emerge from engaged customers, opting-in to the process rather than being intercepted; 3) the data has to be married to rich contextual information–on which set of pages did the experience occur? What device was the user on? Which manager was on shift at the time of the experience?
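As a sketch, the three elements of Step 1 can be captured in a single feedback record that marries the rating to its context. The field names and the simple triage rule below are illustrative assumptions, not OpinionLab’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Illustrative sketch only: field names are assumptions, not OpinionLab's schema.
@dataclass
class FeedbackRecord:
    rating: int                  # quantitative rating, e.g. on a 1-5 scale
    comment: str                 # open-ended comment from an opted-in customer
    page_url: str                # context: which page the experience occurred on
    device: str                  # context: e.g. "mobile" or "desktop"
    shift_manager: Optional[str] = None  # context for offline channels, if any
    submitted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)  # real-time capture
    )

def is_actionable(record: FeedbackRecord, threshold: int = 2) -> bool:
    """Flag low-rated feedback that carries a comment as a candidate action item."""
    return record.rating <= threshold and bool(record.comment.strip())
```

Capturing page, device, and shift context up front is what lets the later steps turn a complaint into a concrete, assignable fix.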

But it’s at the end of Step 1 that the rubber meets the road. That’s where the Action Pledge comes in. Will the company take action and make the concrete fixes necessary to improve the experience and generate tangible results? Do they trust the data? Do they believe in the findings enough to drive change? Are they confident enough to go from saying “Oh my, that feedback is interesting; let me file it away” to “Holy crap, I better do something about this now, otherwise we might be in serious trouble”?

The Action Pledge is a firm commitment to acting on the Voice of the Customer and not letting it fall on deaf ears. The Action Pledge is something OpinionLab clients commit to right upfront: it’s a sacred promise aimed at driving constant improvement to the customer experience. Sometimes these improvements are about closing the loop with customers: fixing support issues, website errors, poor customer service. Sometimes these improvements are much more fundamental: they might affect pricing strategy, brand positioning, even HR policies. Either way, the Action Pledge turns a VoC program from a passive feedback station into an active platform for transforming the company and re-orienting it around the customer.

When a company takes the Action Pledge, it’s impossible to turn around and say that the VoC program had no impact. From Day One, actionable insights are catalyzing meaningful improvements to the customer experience. That’s when Step 3 kicks in: our clients are keen to measure the impact of their fixes, capitalizing on those that produce recurring results and fine-tuning those that don’t always work the first time around. That’s how programs build momentum, changing the way stakeholders engage with customers and putting the Voice of the Customer firmly at the heart of the company’s decision-making procedures.


timing is everything

Tick tock. Tick tock.

There never seems to be enough time in the day. We’re always rushing from one place to another, wishing just once that the ticking hours would slow down. This is especially true when it comes to our businesses and websites. If only we had the time to explore every single user path and click, and to ask every single customer how and where they found our website. If you are actually doing this, I commend you and want to know how you find the time. If you’re like the rest of us, you’re working with the time you’ve got.

When exploring ways to move the customer satisfaction needle on your website, I urge you to forget about time. Embrace the fact that you’ll never finish searching for the big web payoff, and look for tools and tricks of the trade that can help you save time.

Make Changes. See Changes

It’s a classic frustration with customer satisfaction metrics: despite changes and updates to a website, ratings don’t move. Most customer satisfaction tools are clunky and slow-moving by nature, lacking the ability to pinpoint specific problem areas or track the impact of changes made to individual pages. Speed and agility matter for both small and large websites: the more complex a website is, the less useful a site-wide tracking measure becomes as an indicator of how the site is actually performing. If you’ve got two pages and a few links, a site-wide measure might reveal the depth of tactical information you’re seeking–but my guess is those sites are few and far between. To uncover issues, replicate problems, and construct solutions on the fly, businesses must combine granular, page-specific data with contextual diagnostics and the ability to structure unstructured data.

The point is that feedback collection methodologies should be built for speed and agility: instant customization of comment card questions for individual pages, without compromising the integrity of the data, lets site managers take action on key areas as the data comes in. Detailed VoC data and real-time reporting that tracks in-the-moment interactions uncover a wealth of information that traditional reporting techniques just can’t.

Timing really is the name of the game.



the “r” word

The very word strikes fear into the heart of the average joe marketer. It’s public enemy no. 1; in some remote corners of the world the mere mention of it causes children to cry, and spoken out loud it’s even been known to crash websites in one fell swoop. While a redesign is viewed with the same fuzzy associations as the Black Plague, to us it’s as simple as dispelling the common folklore surrounding the “r” word.

With more than twenty-five million ratings and comments from thousands of websites, and a decade of experience, we’ve tracked quantitative and qualitative user feedback from around the world. That means we’re well qualified to bust any redesign myth by tracking well-trafficked, public websites through their own redesign processes. We’ve even identified some interesting patterns in users’ responses that can be extrapolated to any business’ website.

Don’t jump the gun. Before we even jump into what our data indicates, remember that sustainable effects from any web enhancement do not emerge immediately; take your time collecting feedback and data, since user responses typically do not stabilize until the discovery and exploration phases are complete. It can sometimes take months for these patterns to emerge. For this reason, tracking metrics are imperative for measuring the sustainable effects of a website redesign. If you do decide to use a pre-post design rather than a tracking measure, assess your frequency of repeat visits and allow ample time before conducting a post-mortem on the results.

Monitor User Response

Any website doctor must monitor measures of user reaction to assess comparative satisfaction with the new design versus the old. We suggest several aggregate measures for this purpose:

  • Volume of ratings
  • Mean page rating across all rated pages
  • % of negative comments
  • An OpinionLab proprietary metric called the Site Opinion Index (SOI), a measurement that translates a website’s ratings on content, design, usability, and overall satisfaction into an index that ranges from one to one thousand. SOI is designed to be a more discriminating measure than an average page rating for a website.
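For illustration, the first three aggregate measures are easy to compute from raw feedback. The real SOI formula is proprietary, so the 1-to-1,000 index below is only a naive stand-in that rescales the mean page rating:

```python
from statistics import mean

def aggregate_measures(feedback):
    """Compute aggregate redesign-tracking measures from a list of
    (page, rating, comment_sentiment) tuples, where rating is assumed
    to be on a 1-5 scale and sentiment is a 'positive'/'negative' label.
    The 'soi_like_index' is an illustrative stand-in, NOT the real SOI."""
    negatives = [s for _, _, s in feedback if s == "negative"]
    # group ratings by page, then average the per-page means
    pages = {}
    for page, rating, _ in feedback:
        pages.setdefault(page, []).append(rating)
    mean_page_rating = mean(mean(v) for v in pages.values())
    return {
        "rating_volume": len(feedback),
        "mean_page_rating": round(mean_page_rating, 2),
        "pct_negative_comments": round(100 * len(negatives) / len(feedback), 1),
        # naive stand-in: linearly rescale a 1-5 mean onto a 1-1000 index
        "soi_like_index": round(1 + (mean_page_rating - 1) / 4 * 999),
    }
```

Tracked weekly through a redesign, these four numbers are enough to see whether the new design is actually moving the needle.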

Know the stages

When examining the shape of data throughout a redesign, keep in mind the several stages that make up the redesign cycle.

  • Pre-launch is the ground zero phase; it’s how your website is performing across all your aggregated measures before you begin your redesign. Don’t be tempted to skip this benchmark.
  • In the two-to-eight-week discovery period, visitors initially experience your redesign efforts. The mix of qualitative and quantitative data at this stage usually reveals dawning recognition on the part of users, as the initial efforts to improve the website begin to register.
  • Next, users explore the redesign, usually through repeat visits to the site, generating a stream of open-ended comments. Users sign off on the new and improved design by addressing overlooked opportunities, functionality, and expectations.
  • Finally, it’s important to remember your site is a living, breathing organism. Never stop measuring Voice of Customer, listening to what customers need and want and translating that feedback into tactical and strategic plans of action.

Regardless of the chosen metric, measures need to be tracked through each stage of the redesign cycle. Look for long-term redesign insights to emerge, taking care to allow enough time to collect data from users. While it may seem like you need a medical degree to tackle a redesign, and lots of coffee for sleepless nights, with careful analysis and time you’ll find that redesign efforts (and a little knowledge of SOI voodoo) aren’t really that scary after all.


Beware the Power Pukers

It’s as simple as the title suggests. Any organization can collect copious amounts of data and puke it back out in real time. But vendors still line up to present their power data pukes as the latest, greatest, and most unique evolution in business intelligence.

I’ll take one nugget of real, actionable insight–something meaty that I can act on–over any number of bland, colorless reports, charts, tables, or dashboards.

We live in an era of superfluous data. The first decade of the web era was about the battle to liberate the voices of real people. Well, that battle has been won, and now we have too much. We are overwhelmed. We suffer from what Clay Shirky has called filter failure. The volume of data has overpowered our basic analytical capabilities. The center cannot hold; the system breaks down, the levees crack, and we drown in meaningless information. A friend of mine works as a web analyst at a major Canadian telco. In a perverse twist on Avinash Kaushik’s famous 10/90 rule, he spends about 10% of his time surfacing insights and 90% of his time wrestling with a convoluted array of reports, charts, and dashboards from myriad suppliers. How productive is that?

As a marketer, give me something simple, intuitive, and concrete. Give me something I can sink my teeth into, something I can action, something that will help me make more money. Anything else, and you’re just power puking out data.


Sixty top-trafficked US websites can’t be wrong

Most audits suck.

OpinionLab is anything but traditional; we actually chose to conduct an audit. Of our own free will.  Take a moment to let our life choices sink in.

We took the sixty most heavily trafficked websites in the United States (as identified by Quantcast’s audience-measurement service) and examined each website’s feedback methodology.

We wanted to know what exactly these sites were doing, and if there were any shared insights which we could then apply to any Voice of Customer (VoC) program. Each website’s feedback methodology was observed, recorded and benchmarked against best practices for capturing VoC.

We already know a web-based VoC feedback system should do the following:

  • Be visible from every page of the website
  • Be non-intrusive and user invoked
  • Be clearly visible and available 24/7
  • Always be located in the same place, on the same web page
  • Accept simple, quantitative ratings as well as open-ended comments
  • Be accessible with a single mouse click
  • Not require the user to provide personal information
  • Not require users to leave web pages they’re visiting

When we compared general VoC practices to these top visited sites, we were happy to find that all but one of the audited websites accepted user feedback. That in and of itself speaks volumes, but we wondered if these sites really were bringing their A-game when it comes to VoC collection techniques.

As a collective group:

  • Three of every four audited sites relied on a “Contact Us” link as their primary VoC collection method. These links either launched an online form or a pre-addressed email for the user to complete.
  • 98% of audited websites collected some form of site-wide, open-ended feedback from visitors. This is great news for the VoC space and users everywhere.
  • Only about half of the sixty most visited websites make the feedback link easy to locate, regardless of the type of link used. We recommend placing the link in the same place on each page to increase user feedback volume.
  • Keeping with best practices, 59% of sites allow users to access feedback links from every page.
  • However, half of all sites then require users to provide personal information when leaving feedback.
  • 54% of all sites place the feedback link at the same location on every page.
  • In keeping with general VoC practices, about half of all sites return the user to the appropriate page after feedback is submitted.

While many of these sites appreciate the value of collecting feedback, I wouldn’t say they are receiving top grades in the VoC feedback category. That being said, there are always shining stars of every class that not only follow best practices, but also help write them.

Maybe audits aren’t so bad after all.



we all work on commission

“…in a digital world where everything can be measured, we all work on commission. And why not? If you do great work and it works, you should get rewarded. And if you don’t, it’s hard to see why a rational organization would keep you on.

You don’t have to like this era of hyper-measurement, but that doesn’t mean it’s not here.”

Prescient words penned a while back by Seth Godin. Those of us who do business on the web have grown used to hyper-measurability. Every click, every link, every landing page, every post, every microsite, every call-to-action–we know the relevant metrics and we know how to make the ROI computation. When a page sucks, it gets re-built; when an ad fails to yield click-thrus, it gets yanked; when a CTA isn’t delivering results, we shift it.

Those of us born and bred on the web are totally accustomed to thinking like this. The real shock will come when this mode of thinking hits our brethren in the offline world–the store managers–and we’re just about there. Soon, technology will permit us to measure the efficacy of every aisle, every placement, every display, every employee interaction. What a leap from what’s in place now! Brick-and-mortar locations currently have access to two sets of data: sales volume and foot traffic. It’s not easy to derive much insight from those metrics. If a store is lagging in sales or struggling to bring in traffic, where does the blame go? It’s probably meted out arbitrarily by a district manager, who relies on things like “intuition,” “his gut,” or “his experience”–all thoroughly subjective criteria that have nothing to do with data.

Soon, there will be data where only subjectivity existed before. Soon, the employee at the big box retailer who completely pissed you off and left you swearing you’d never set foot inside that store again will be held as accountable as the landing page that fails to deliver. Those of us who’ve spent years in digital are used to this standard of accountability–for the rest of corporate America, it’ll be a colossal wake-up call.


Make sense of scary data

Change is scary.

When first starting a VoC program, it’s natural to feel a bit apprehensive about the changes ahead, unsure of exactly how to filter the rich data streaming in and even more unsure of what to do with all the insights you’re receiving. But don’t worry about data that goes bump in the night; we’re here with a few guiding principles as you develop your VoC program.

Implementation 101

  • Making it easy for visitors to find the feedback link may seem intuitive but you’d be amazed how easy it can be to overlook this crucial step. Floating links are more visible than a link in the header or bottom border.
  • Keep it simple. The beauty of comment cards is the feedback they gather; but if questions are confusing or so long that a respondent has to scroll down, feedback is inhibited.
  • “Slow and steady wins the race” doesn’t always apply in the game of customer feedback. When it makes sense to, act as soon as possible on relevant feedback.
  • The more you know… well, the more you know. Set up alerts and monitors to route feedback to the right people, including automatic data delivery to yourself.

Analyzing the data

  • If it looks out of the ordinary, it probably is. Look for spikes in the volume of feedback, and investigate what external or internal influences could have caused these abnormalities.
  • Make a VoC road map, noting where you want to be and how you’ll get there. Establishing metrics can make or break a VoC program; after all, you have to know what the data is measuring before you can make sense of it. We recommend mapping weekly:
    • site mean ratings
    • % top-two-box overall rating
    • % of negative comments
    • ratings volume and comment volume
    • measures like the Site Opinion Index, our proprietary measure of website performance, which can be monitored monthly. All of these measures can be found in your OpinionLab dashboards, in the toolkit.
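As a rough sketch of that weekly mapping, assuming a 1-5 rating scale and a toy negative-comment rule (real sentiment classification would be far more involved):

```python
from collections import defaultdict
from datetime import date

def weekly_metrics(feedback):
    """Map feedback into the weekly measures suggested above.
    `feedback` is a list of (day, rating, comment) tuples, where `day` is a
    datetime.date, rating is on an assumed 1-5 scale, and comment may be "".
    Comments starting with '-' count as negative: a toy stand-in for
    real sentiment classification."""
    weeks = defaultdict(list)
    for day, rating, comment in feedback:
        weeks[day.isocalendar()[:2]].append((rating, comment))  # key: (year, week)
    report = {}
    for week, items in sorted(weeks.items()):
        ratings = [r for r, _ in items]
        comments = [c for _, c in items if c]
        report[week] = {
            "mean_rating": round(sum(ratings) / len(ratings), 2),
            "pct_top_two_box": round(100 * sum(r >= 4 for r in ratings) / len(ratings), 1),
            "pct_negative_comments": round(
                100 * sum(c.startswith("-") for c in comments) / len(comments), 1
            ) if comments else 0.0,
            "rating_volume": len(ratings),
            "comment_volume": len(comments),
        }
    return report
```

Bucketing by ISO week keeps the week-to-week trend lines comparable, which is exactly what the road map needs.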

Trending Data

  • Healthy amounts of data usually average about 75 responses a week, and trends should be viewed week to week. If you aren’t receiving this amount, or are receiving a lot more, don’t panic; response volumes vary from industry to industry and company to company.
  • Don’t judge a data book by its cover. Collect a few weeks or even months of data before making any sweeping changes or adjustments.

Setting Targets and Benchmarks

  • Utilize all available metrics and data, but make sure you establish key performance indicators (KPIs) that address your objectives. If you need a little help in this area, our team of experts can assist. We’re kind of like shining beacons of light that way.
  • Compare apples to apples. Or oranges, if you prefer. When comparing data, view selected metrics like mean rating and % positive or negative comments across pages within domains to identify high- or low-performing pages or domains.
  • Remember to make it your own. You can create custom reports within our dashboard, set up alerts and monitors, and even export data to highlight your primary metrics and the data that matters most.

Collecting feedback reveals a lot of data. But with these tips and tricks, data will funnel to the proper places, trends can be spotted with lightning speed, and you’ll capture the strategic and tactical insights that power your customers’ experience.

Makes you feel a little safer at night, doesn’t it?


A Fire Hose of Analytical Insights

The ACCELERATE conference came to Chicago last week, and I joined a handful of OpinionLab colleagues in heading down to the Gleacher Center for the event.

Billed as a “virtual ‘fire hose’ of analytical insights and observations from many of the best minds in the business,” ACCELERATE is a great venue for mingling with fellow data-driven business people while collecting ideas and information from some strong analytical minds.

One example came from April Wilson of Digital Analytics 101: the idea that small business is the new “big” for analysts. With many SMBs already collecting data, analysts have an opportunity to help shape and mold these organizations into data-driven businesses, enabling entrepreneurs to understand and leverage analytics into intelligent strategic decisions.

Another impactful insight came from Joan King of Crate & Barrel, who discussed the power of integrating OpinionLab and Tealeaf to get an overall, accurate picture of customer voice. Her tip: don’t be afraid of negative information, as these pieces of feedback are often worth the most.

Speaking of feedback, I especially enjoyed another opportunity to see the OpinionLab speaker feedback solution in action. In the course of this day-long show, we collected over 650 pieces of in-the-moment feedback from audience members. And Tim Wilson and April Wilson each won a $500 prize for collecting the best feedback in their categories.
