Category: user experience
Shamir Duverseau

Maximize Data with Lean Thinking

Build. Measure. Learn. Those three words are at the core of Lean methodology, a way of doing business that incorporates elements from Six Sigma, agile development, design thinking, and other sources. Lean methodology is a modern business application of an approach with a longer history in manufacturing, originating in the Toyota Production System in the 1950s. It has since been used by successful startups and large corporations alike, across industries. Lean’s continuous improvement cycle enables companies to make meaningful progress by getting the best use of customer data and intelligence.

When it comes to the digital experience, Lean thinking can be a tool of immeasurable power. From acquiring qualified traffic to converting those prospects into customers to retaining those customers to build lifetime value, a Lean viewpoint can help optimize every touchpoint of the customer journey. This is especially true in considered-purchase industries, which is why Lean is now at the heart of how we at Smart Panda Labs help our clients drive customer lifetime value.

Here’s how.

Build

Everyone knows that building products and services that meet customer needs is a primary goal of any business. But customer needs are varied and nuanced, requiring answers to a long list of questions. If you wait until you can answer every question at once, or worse, assume you already know the answers, you risk high costs and wasted time at best. At worst, you risk the failure of an initiative, a division, or an entire organization.

This is why the term “minimum viable product,” or MVP, has become so popular and so important. A tenet of Lean and Agile methodologies, an MVP is a product with just enough features to satisfy early customers and provide feedback for future product development. Each iteration of this streamlined product or service is meant to answer a question or two, or meet a set of demands, but not all demands at once.

We have learned the value of MVPs for our clients’ products as well as our own. So, we build new services and processes not as faits accomplis but as MVPs, in order to ensure that we are meeting client needs.

Measure

Objectivity does not come easily to modern-day organizations. While gathering unbiased data is becoming easier, there remains a persistent risk of a biased interpretation of the data.

Lean accounts for this through customer-centric experimentation and measurement, allowing customer interactions and feedback to live at the center of the story. Actionable metrics inform whether your customer is experiencing your product in the way you hypothesize, or if you need to pivot. Either way, customer data and creative intelligence are guiding your decisions, thus maximizing the results.

Our own actionable metrics include feedback from our clients. How do they feel our innovation is helping them? Is it making things easier or harder? Is it aiding them in meeting goals or communicating with teams? The answers to these questions, along with many others, help us know whether or not we are moving in the right direction. And these decisions can be based on real feedback, not simply cool ideas that we fall in love with but that bring no benefit to the client.

Learn

“If you cannot fail, you cannot learn.” Eric Ries, the author of The Lean Startup, makes this simple but important point. Not everything works out the way you envisioned. Lean tells us that with every failure comes a wonderful opportunity to learn and iterate. The key is to embrace the opportunity.

For example, one of our clients engaged us to run an experiment on their website. The first test we helped them run failed miserably and quickly. It was designed to be a quick win … but turned out to be far from it. However, the learnings from this failure informed another experiment that proved impactful both in its effect on business goals (adding seven figures of incremental revenue for the year) and in the additional customer insights it yielded.

Failure itself can’t be the primary concern; whether or not we are learning from our failures is what matters. We use our learnings to improve products and services on behalf of our clients, and also to improve the client experience we provide. What makes us better at our jobs also makes for better relationships.

Build. Measure. Learn. This is the backbone of how we harness data and creative intelligence to help our clients drive value from their customers, and it is becoming the method by which we serve our clients, period. If you are reading this, you are more than likely someone’s client. Should you expect any less?

 

Key Takeaways:

  • Lean methodology is a continuous improvement approach that enables companies to make meaningful progress by getting the best use of customer data and intelligence.
  • A key tenet of Lean is the “minimum viable product,” or MVP, which encourages the release of a product with just enough features to satisfy early customers and provide feedback for future product development.
  • Lean also emphasizes customer-centric experimentation and measurement, so that customer data and creative intelligence are guiding decision making.
  • Lean tells us that with every failure comes a wonderful opportunity to learn and iterate. The key is to embrace the opportunity.
  • As applied to digital marketing strategy, a Lean viewpoint can help optimize every touchpoint of the digital experience—from acquiring qualified traffic to converting those prospects into customers to retaining those customers to build lifetime value.
  • Lean, with its backbone of Build, Measure, and Learn, is now at the heart of how we improve products and services for clients. It also informs how we improve the overall experience we provide our clients.
Category: user experience
Cheryl Myers

Small Step For Conversion, Giant Leap For UX

This is not our first public display of affection for testing—we love tests, and we love to talk about them. For example, we’ve shared a compelling testing case study on Optimizely’s blog.

Now, it’s my turn.

As a specialist in user experience (UX) and content strategy at Smart Panda Labs, I have developed a close relationship with testing. This invaluable strategy places data at the heart of digital decision-making, providing insights that inform all kinds of choices. For example, the test I’m about to share with you was not designed to focus on your typical testing metric, conversions, but rather to understand how design changes can measurably impact the way customers interact with a user interface.

50 (ok, 2) Shades of Grey

A large real estate client had a series of landing pages, designed by a third party, showing floor plans of available luxury apartments. The landing pages were clean and minimal, and a series of numbered circles indicated how many floor plan options were available to view. However, while a medium grey represented the active floor plan, the circles that represented the subsequent plans were presented in a lighter shade of grey. When the client asked us to improve the design to help increase user interaction, we immediately gravitated to the circles.

We felt that the contrast between the circle representing the actively displayed floor plan and the other circles was not strong enough; the minimal contrast made it seem as if the circles weren’t clickable and were instead merely visual cues of the number of available floor plans. We pitched a test to see if greater color contrast would achieve the desired goal.

It’s important for me to mention that we didn’t want to measure the effect this change had on the time users spent on the site, as time on site can be both good and bad (users can spend more time because they are engaged or because they are confused and searching!). So, we focused purely on interactions—were they or were they not looking at all the floor plan options?

[Images: Baseline, Variation 1, and Variation 2 of the floor plan module]

We tested the original floor plan module design against two new variations that each created more contrast, one using a slightly darker shade of grey and one that incorporated a maroon color. This test rapidly showed that the darker grey and the maroon were outperforming the original. Interestingly, the new variations were in a dead heat with each other, each showing almost a 12% improvement over the original. Users were clearly more likely to view additional floor plans when there was less ambiguity.

[Chart: Comparison of Baseline and Variation results]
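For readers curious about the arithmetic behind a result like this, here is a minimal sketch in Python of how an interaction-rate lift and a basic two-proportion z-test can be computed. The visitor and interaction counts below are hypothetical placeholders, not our client’s actual numbers, and testing platforms such as Optimizely report this kind of statistic for you; the sketch simply shows what a roughly 12% lift means in terms of raw counts.

```python
from math import sqrt

def interaction_lift(baseline_interactions, baseline_visitors,
                     variant_interactions, variant_visitors):
    """Compare interaction rates between the baseline and one variation.

    Returns the relative lift of the variation over the baseline and a
    two-proportion z-score (roughly, |z| > 1.96 suggests ~95% confidence
    that the difference is not just noise).
    """
    p_base = baseline_interactions / baseline_visitors
    p_var = variant_interactions / variant_visitors
    lift = (p_var - p_base) / p_base

    # Pooled proportion and standard error for a two-proportion z-test.
    p_pool = (baseline_interactions + variant_interactions) / (
        baseline_visitors + variant_visitors)
    se = sqrt(p_pool * (1 - p_pool)
              * (1 / baseline_visitors + 1 / variant_visitors))
    z = (p_var - p_base) / se
    return lift, z

# Hypothetical counts for illustration only -- not the actual test data.
lift, z = interaction_lift(baseline_interactions=420, baseline_visitors=1000,
                           variant_interactions=470, variant_visitors=1000)
print(f"Relative lift: {lift:.1%}, z-score: {z:.2f}")
```

Framing the metric as a simple proportion (visitors who viewed additional floor plans, out of all visitors to the page) keeps the result easy to interpret and hard to misread.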

This test demonstrates how testing can be used not only to optimize for conversions but to help solve creative problems, like UX design and user interaction. A simple change, paired with a deliberate testing strategy, can help make the subjective objective and inform future design decisions. A test like this can also help make the case for simple and relatively inexpensive optimizations that improve an experience, perhaps while other large-scale changes are in progress or when time and resources are limited.

CRO and UX Go Hand in Hand

The moral of the story: small changes are easy to test and can lead to significantly more engagement and, therefore, a better overall user experience. These improvements can make a site or campaign better attuned to the user’s needs and the user journey more enjoyable. This approach, in turn, can prove to be a great supplement to your CRO efforts. In other words, focus on the person and all else will follow.

Key Takeaways

  • Use testing to help make subjective decisions. Allow the visitor’s opinion to carry the most weight.
  • People need to have a good experience before they can make a decision or convert. Use testing to optimize not only the decision but also the experience.
  • Make sure the goals of your test are clear and unambiguous—don’t test for results that could be misinterpreted.