Inform Your Design with Data: Part 3

Posted in New Relic News, 9 January 2014

In parts one and two of this series, we talked about setting up and running a remote user research project to learn what was causing attrition between signup and deploy (i.e. the installation process to get started with New Relic). In this post, I’ll share what we learned, some of the steps we’ve taken to address these issues, and some follow-up findings.

What We Learned

No product is perfect. Understatement of the year, right? Research is a valuable way to understand which of your product’s imperfections have the biggest impact on your customers. It can also give you great insight into which elements of your designs are working well. Every single time I’ve talked with our customers and, more importantly, observed them using our product, I’ve learned a ton. This time we ended up with three big findings:

1. People sign up to figure out what New Relic is and does.

Software analytics and application performance monitoring (APM) are both relatively new and were previously inaccessible to most non-enterprise organizations that weren’t willing to a) spend a lot on an on-premise solution or b) build something themselves. As a result, our potential customers aren’t always sure what to expect from New Relic. What does it do? How is it different from Google Analytics or Mixpanel?

Software Analytics

Rather than looking for the information on our website, we found that many people were signing up for New Relic to answer those questions. In some cases, they wanted answers that weren’t directly addressed on the external site, and some customers skipped the website entirely; they just wanted to get their hands on the product to understand it. The problem was that some people would sign up, find themselves unable or unmotivated to get through the deployment process, and still have no new information to help them decide whether to use New Relic.

2. People want to make sure they’re on the right track throughout the deployment process, and need more signposts and feedback along the way.

For those who did start down the deployment path, many would hesitate and second-guess their progress if the instructions didn’t match their environment 100% (which happens frequently, given the number of frameworks that can run New Relic). There weren’t any suggestions for how to verify particular steps, and if they made it to the end of the instructions only to find it still wasn’t working, it was hard to figure out where they had gone wrong.

3. It was too easy to get lost in the instructions and documentation, which made it hard for users to find the information they needed.

Adding to the confusion, the instructions for an agent might branch off to alternate instructions in the help pages for other frameworks. (The agent is software you install alongside your application to send performance data to New Relic.) Those branches weren’t always called out clearly, so people would try the instructions on the page and only later discover that they needed a different set entirely. We also found that the instruction sets had been written around whichever framework’s instructions looked easiest, not the frameworks that were most common.
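
To make that framework sprawl concrete, here is a minimal sketch of what “installing the agent” can look like for a bare WSGI app using the Python agent’s initialize and wrapper calls. It is purely illustrative — sorry, purely illustrative: the config filename and app are placeholders rather than anything from our instructions, and the equivalent steps for Django, Rails, or .NET look quite different, which is exactly where people were getting lost.

    # Illustrative sketch only: a bare-bones WSGI app instrumented with the
    # New Relic Python agent. Framework-specific instructions differ a lot.
    import newrelic.agent

    # Load agent settings (license key, app name) from a config file;
    # 'newrelic.ini' is the conventional name, but yours may differ.
    newrelic.agent.initialize('newrelic.ini')

    @newrelic.agent.wsgi_application()  # wraps the app so requests are timed and reported
    def application(environ, start_response):
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [b'Hello from an instrumented app']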

Redesign of Installation Pages

Based on our findings, we brainstormed a variety of ways to address the problems we had discovered.

We changed the workflow, page layout, and styling for our install instructions. As an experiment, we gave the .NET agent instructions a more wizard-like treatment that customizes the steps based on choices the user makes at the beginning. To start, we introduced a simplified welcome page that set expectations about what was needed to deploy, provided links to information about what New Relic is and some key FAQs, and added a “Send to My Developer” button for the less technical users who were signing up but unable to move forward on their own.

Welcome to New Relic

Our marketing team continually experiments with optimizing our storefront site, so we’re always trying new things to inform potential customers about our offerings.

We addressed some of the “am I done?” and “is it working?” questions by updating the final step of the deployment instructions to indicate how long users should expect to wait. We also added troubleshooting links at the bottom of the instructions and a walkthrough that lets new users tour the product while they wait for their data to start reporting anything interesting.

Data in 5 Minutes
Results

So how did our changes fare in the real world? Based on some follow-up interviews, the new design seems much clearer from a usability standpoint: people just whizzed through setup. The redesigned .NET experience went particularly smoothly; one Python user who stumbled across it commented, “I wish this were for Python!”

Tell us about your .NET app

We also have some usage and deployment data to supplement our understanding of how successful (or not) the changes have been. The “Send to My Developer” button wasn’t implemented until after our follow-up interviews, but its usage data shows fairly steady, if low-volume, use since it was released. This suggests we may be reaching some of the people who were never going to be able to install on their own.

On the other hand, when we A/B tested the new deployment instructions and compared conversion rates, the two performed very similarly, with the old design slightly in the lead. That comparison test wasn’t run for long, so the findings are suggestive at best.

New Design/Old Design

At this point, our sample sizes from the A/B test and the follow-up interviews are too small to draw solid conclusions, and the results are mixed: usability seems somewhat improved with the new designs, but conversions may have taken a hit. Lower conversion is a problem for our business, particularly if the drop isn’t offset by higher-quality conversions.
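
To illustrate why small samples leave a test like ours inconclusive, here is a rough two-proportion z-test sketch with made-up numbers (not our actual conversion data): even a full percentage-point gap between designs is indistinguishable from noise at modest traffic volumes.

    import math

    def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
        """Two-sided z-test for the difference between two conversion rates."""
        p_a = conversions_a / visitors_a
        p_b = conversions_b / visitors_b
        pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
        z = (p_a - p_b) / se
        # Two-sided p-value from the standard normal CDF
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return p_a, p_b, z, p_value

    # Hypothetical traffic and signups -- NOT our real numbers
    old_rate, new_rate, z, p = two_proportion_z_test(90, 1000, 80, 1000)
    print(f"old: {old_rate:.1%}  new: {new_rate:.1%}  z = {z:.2f}  p = {p:.2f}")
    # p is roughly 0.42 here: with a thousand visitors per variant, a 9% vs. 8%
    # difference is well within what random chance alone could produce.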

We were really hoping the new designs would make a big difference to both usability and conversions so I could share a wonderful success story with you. Alas, the real world doesn’t always provide such neat, happy endings. What we did end up with is more learning and new paths to explore. We’ll be talking with marketing about the results of the A/B testing and the follow-up interviews. From there, we’ll work out which measures of success are acceptable on both sides of the equation so we can keep iterating on the designs and testing until we see some improvement, or determine that what we have is as dialed in as it’s going to get.

Key Takeaways

If you’ve been following this series, here are some universals we’ve learned throughout the process that you can employ in design pursuits of your own:

  • Customize long or complicated instructions based on user choices.
  • CTAs can push your users to act too quickly if you’re not careful.
  • If your conversions aren’t what you’re expecting (or wanting), that’s one place to start investigating.
  • Experiment early and often.
  • A/B testing isn’t always conclusive. If the changes aren’t intended to produce purely numbers-driven results, use other methods to test.
  • UX and marketing teams need to work together to find the sweet spot between user and business needs.

About the author

From interaction design and research to singing opera or welding giant metal art pieces, Beth is a renaissance woman at heart. To really make her day, engage her in a good philosophical debate on the intricacies or nature of something.
