Going Pro

Welcome to Volume 2 of The UX Newsletter. After a lengthy hiatus to move into a new office (and build new stuff, which we cover in this issue), we're happy to resume sharing tales of UX design, development, and research.

We're getting back in the groove with a couple of stories about the development of MailChimp Pro, our new feature set for quickly growing businesses with large lists. In this issue, developer Mardav Wala goes behind the scenes of Multivariate Testing—sharing details about the UI, usability testing, and code considerations his team worked through.

We follow that with data analyst Emily Austin's look at sifting through and prioritizing customer research to determine feature sets and demands for a Pro product. We've written about UX research in previous issues, but here, Emily shares the analysis that comes after the research is collected. We conclude with links of interest from around the web.

Editors: Fabio Carneiro, Gregg Bernstein, Laurissa Wolfram-Hvass
Code: Alex Kelly

The Multiverse

Mardav Wala, Developer

We started 2015 with a MailChimp navigation redesign and the promise of "even bigger and better things." Since then, we’ve continued releasing product refinements and feature updates, and we kept our promise with the announcement of MailChimp Pro.

In the past few months, I’ve worked with Product Designer Michaela Moore and Staff Engineer Guan Liao to build MailChimp Pro’s Multivariate Testing feature. Beyond building a feature that no other email service provider offers, one of the most challenging and fun parts of this project was implementing an interactive setup screen that lets users select variables resulting in up to 8 combinations.
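The combinatorics behind that setup screen are straightforward: the number of test combinations is the product of the values chosen for each variable, capped at a maximum. Here's a minimal sketch in Python — the variable names and the cap of 8 come from the description above, but the function itself is purely illustrative, not MailChimp's implementation:

```python
from itertools import product

MAX_COMBINATIONS = 8  # cap described in the article

def build_combinations(subject_lines, from_names, send_times):
    """Return the Cartesian product of the selected variable values,
    enforcing the combination cap."""
    combos = list(product(subject_lines, from_names, send_times))
    if len(combos) > MAX_COMBINATIONS:
        raise ValueError(
            f"{len(combos)} combinations exceeds the cap of {MAX_COMBINATIONS}"
        )
    return combos

# 2 subject lines x 1 from name x 2 send times -> 4 combinations
combos = build_combinations(
    ["Subject A", "Subject B"], ["MailChimp"], ["9am", "5pm"]
)
print(len(combos))  # 4
```

The cap is what makes the UI challenge interesting: the interface has to communicate, as the user selects values, how close they are to the limit.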

The interface of the Pro Multivariate Testing feature

Our design challenge was creating a Multivariate setup and selection process while maintaining MailChimp’s simple, easy-to-use style. To tackle this, we implemented design ideas in the browser so we could interact with a realistic prototype, learn from it, and continue refining.

The new design was a chance to experiment with flexbox layouts in the application. This gave us much more control over things like overall page structure and the alignment and order of UI elements than we’ve traditionally had when using CSS floats and conventional layouts. We also had the opportunity to implement the D3 JavaScript library, which lets us use SVG for displaying data.

After a few iterations of fully functional prototypes, built on dozens of high-fidelity mock-ups (for just one screen) and scores of copy changes, we were ready to reveal the feature internally and gather feedback for improvements.

Design iterations of the variable selection interface

Internal Testing and Feature Improvements

Having actual users test your products is important, but when you’re still working through things like basic workflows and usability, colleagues in different departments can provide immediate, objective feedback. MailChimp Email Marketer Brad Gula, Integrations Lead Kale Davis, and Employee Events Coordinator Ashley Wilson agreed to be our internal proxies for “Pro customer” testing.

Brad represents a true Pro user—he sends to a list of 9 million MailChimp users and is constantly experimenting and using A/B testing. As a side project, Kale sends the popular Hacker Newsletter to around 30,000 subscribers. He routinely uses A/B testing to experiment with subject lines and send times on 2 equal halves of his list, then studies his reporting and engagement to inform future campaign decisions. Ashley sends internally to 400+ MailChimp employees. She doesn’t segment, A/B test, or use any of our more advanced MailChimp features. While Ashley’s use isn’t “typical” Pro behavior, her feedback was equally valuable—we want our tools to be powerful, but they should also be approachable and easy to use.

Including feedback messaging

We were thrilled to see Brad, Kale, and Ashley breeze through Multivariate’s initial variable-selection step, especially since it’s very different from the rest of the app. Our testers helped make it even better by suggesting we add specific, contextual messaging to the page. For example, we added a message to let users know when they’ve hit the maximum number of testing variables. We also created a message informing users that their test will automatically be distributed across the entire list when they choose to test multiple send times.

Appropriate feedback messaging helps guide users through an otherwise complex process

Re-ordering steps

Our internal testers also helped us reconsider steps in the Multivariate workflow that were confusing. Originally, when users wanted to test multiple variations of email content, they cycled through this sequence of steps:

  1. Start at Content Setup screen
  2. Select email variation to edit
  3. Describe/name content variation (something to help identify that particular variation on the Reporting page)
  4. Select template
  5. Design and create content (using MailChimp’s standard drag and drop editor)
  6. Return to Content Setup screen
  7. Repeat steps 2-6 until all content variations for the test are complete

By observing our testers, we saw that step 3 came too soon and was confusing for users—we were asking them to describe an email version they hadn’t even created yet. So we moved that step to later in the process—after creating the version, but before moving on to the next variation.

Adding a description for a content variation

Working through feature ideas

We also talked with our internal testers about a part of Multivariate Testing that wasn’t complete at the time—reporting and results. Based on these conversations, we incorporated a visual link performance feature into the final product. This helps users compare how the same links perform in different content variations.

Visual link performance in MailChimp Pro

Bonus: A/B Testing With Content

From the very beginning, we planned to create an integrated flow for both A/B and Multivariate Testing—meaning that the Multivariate interface (for Pro users) and A/B testing interface (for all other users) would look more or less the same.

All free and paid (non-Pro) MailChimp accounts have access to MailChimp’s standard A/B testing. Once an account upgrades to Pro, Multivariate Testing capabilities are made visible in the existing A/B testing interface. All this is possible with modular and extensible components.

By sharing visual and functional elements between A/B and Multivariate Testing, we were also able to allow A/B users to test up to 3 variations, instead of the usual 2. And all MailChimp users can now test content variations in addition to subject line, from name, and send time.

Pro-inspired changes made their way into MailChimp's A/B testing feature

Ready, Set, Pro!

Multivariate Testing for MailChimp Pro is the result of months of research, designing and building, QA (Quality Assurance), and testing. For the last 5 weeks we’ve been making changes to the UI and refactoring code—all to make Multivariate Testing the highlight of MailChimp Pro. Now that it’s released, we’re listening closely to our customers so we can make it even better!

Does this type of experimentation and collaboration interest you? We're hiring.

Decisions with Data

Emily Austin, Data Analyst

At MailChimp, we’re all about empowering our users. We believe that products should be both powerful and easy to use—and we’re constantly working to deliver experiences that meet both of these criteria.

Enter MailChimp Pro, a set of enterprise-level features that enable MailChimp users to gain a more detailed understanding of their audience. Think of it as a set of data science tools for non-data scientists.

So how did we uncover which features were most important to our users? We started with one simple question:

Who The Heck Is Going To Use Pro, Anyway?

When the Data Science team began this research, we had a nebulous idea of what a MailChimp Pro customer would look like. Rather than moving forward on conjecture, we wanted to use our vast store of customer insights to shape our recommendations of not only what to build, but who we were building for. After all, once you understand who your users are, it’s easier to envision the types of tools they need. Our first order of business was determining how a MailChimp Pro customer differs from all other MailChimp users.

Fortunately, in addition to other research methods, we regularly survey users to learn more about how they use our app and how we can improve it. This was our first stop on the road to understanding what MailChimp Pro would look like.

Customer Segmentation, MailChimp Style

We combed through recent surveys to categorize user feedback by feature. For example, some users desired more advanced reporting tools. Others asked for the ability to stop a send after scheduling it. And a great many wanted additional A/B testing options. Each time we got feedback about a feature, we added it to a spreadsheet and tagged it for future analysis.

After categorizing responses from nearly 20,000 users, we had several customer segments we could compare to each other—and to the MailChimp population as a whole. We analyzed account data (list size, company age, user industry, etc.) for each segment to develop a clearer picture of users interested in each feature.
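As a rough illustration of that workflow, here's how tagged survey responses might be grouped into feature segments and compared against the population as a whole. The field names, feature tags, and numbers here are hypothetical stand-ins, not MailChimp's actual data:

```python
from collections import defaultdict
from statistics import median

# Hypothetical tagged survey responses: (user_id, feature_tag, list_size)
responses = [
    ("u1", "advanced_reporting", 120_000),
    ("u2", "stop_delivery", 4_500),
    ("u3", "ab_testing", 85_000),
    ("u4", "advanced_reporting", 250_000),
    ("u5", "ab_testing", 60_000),
]

# Group account data (here, just list size) by the feature each user asked for
segments = defaultdict(list)
for _user_id, feature, list_size in responses:
    segments[feature].append(list_size)

# Compare each segment's median list size against the whole population's
population_median = median(size for _, _, size in responses)
for feature, sizes in sorted(segments.items()):
    ratio = median(sizes) / population_median
    print(f"{feature}: n={len(sizes)}, median list size {median(sizes):,}, "
          f"{ratio:.1f}x population median")
```

The same grouping extends to any account attribute (company age, industry, and so on); the point is that each feature request becomes a segment you can profile.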

Segmenting our data, based on customer characteristics

There was just one problem: any time you ask people what they want, you risk getting an artificially inflated sense of demand. A customer might tell you they love the idea of Multivariate Testing, but that doesn’t mean they’ll buy it or even use it if it’s available. In order to prioritize which Pro features to build and figure out who’d buy them, we had to get creative.

Hacking Together A Demand Estimate

Any time someone closes their MailChimp account, we ask them to complete a short exit survey. While all customer feedback is valuable, this information is particularly useful, because it helps us 1) understand what causes someone to stop using MailChimp, and 2) take action on those reasons. It’s one thing to ask people what they’d like to have, but it’s another entirely to see people close an account because you don’t offer something that’s important to them.

For Pro, we analyzed exit survey responses from users who were similar to our previously defined customer segments. This gave us insight into which features should be at the top of our list, and which ones could wait for a future iteration of Pro.

For example, if a customer tells us they’re closing their account because they want the ability to test content, we can take that as reasonably strong evidence that content testing is important to them. Not only that, but it’s important enough that they’re willing to exert energy to set up an entirely different system to get that functionality. Bonus points if they mention the service they’re switching to, because then we can figure out what specific content testing functionality may have attracted this former customer.

Putting A Price Tag On Pro

After figuring out our Pro audience and what they need, we had to determine how much to charge for Pro features. This is a scary challenge! Price too high, and nobody buys. Price too low, and you lose revenue opportunities.

We reviewed exit survey responses from the same group that matched our target customer segments, specifically looking for responses that mentioned the service a user was switching to.

We then used each user’s list size in our system to estimate what they’d pay with the new service (on average, 8 times more than they were paying us). This helped us understand price elasticity and experiment with pricing scenarios for MailChimp Pro.

Want to experiment with pricing your product like we did? We created a rudimentary pricing calculator. Open the doc, make a copy for yourself, and go nuts. Enter different values into the green boxes to see how price impacts demand. Upper bound is the maximum estimate of demand, and lower bound is the minimum. Happy experimenting!

We developed a pricing calculator to experiment with demand and profits
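In the same spirit as that spreadsheet, here's a toy version in Python. It assumes a constant-elasticity demand curve, with a pair of elasticity bounds producing the upper and lower demand estimates; the model and every number below are illustrative assumptions, not the figures behind MailChimp Pro's pricing:

```python
def estimated_demand(price, base_price, base_demand, elasticity):
    """Constant-elasticity model: demand scales with (price / base_price) ** elasticity."""
    return base_demand * (price / base_price) ** elasticity

def revenue_scenarios(prices, base_price, base_demand, elasticity_bounds):
    """For each candidate price, return (price, lower revenue, upper revenue)."""
    sensitive, insensitive = elasticity_bounds  # e.g. (-1.5, -0.5)
    rows = []
    for price in prices:
        lower = estimated_demand(price, base_price, base_demand, sensitive)
        upper = estimated_demand(price, base_price, base_demand, insensitive)
        rows.append((price, lower * price, upper * price))
    return rows

for price, rev_lo, rev_hi in revenue_scenarios(
    prices=[99, 149, 199],
    base_price=99,       # current price point (hypothetical)
    base_demand=1_000,   # buyers at the current price (hypothetical)
    elasticity_bounds=(-1.5, -0.5),
):
    print(f"${price}/mo: estimated revenue ${rev_lo:,.0f} to ${rev_hi:,.0f}")
```

Raising the price narrows demand under both bounds, but revenue can move either way depending on how price-sensitive the segment turns out to be, which is exactly the "too high vs. too low" tension described above.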

Where Do We Go From Here?

Releasing a new product is hard. Like all research and product development, MailChimp Pro was a learning process for us. We were in uncharted territory, answering questions we’d never been asked before. But like everything we do at MailChimp, we’ll continue to learn, iterate, and dial it in.

Emily will speak at length on this topic at SuperNova South 2015 in Atlanta, on October 6.
Enjoyed this issue? Share it and start a conversation.

Around the Web

The First One [emailthepodcast] is the pilot episode of Fabio Carneiro's email design podcast.

Take on texting while driving [ustwo] with the folks from Monument Valley.  

First Time User Experiences [firsttimeux] is a collection of onboarding flows.

Ten Principles for Good Design [vitsoe] are industrial designer Dieter Rams’ rules of thumb.

The Psychology of Simple [medium] asks why it’s so hard to get “simple” right.

Gifsicle [lcdf] is a command-line tool for working with GIF images and animations.

Get better at making things people want [medium].

U.S. Web Design Standards [gsa] is a set of UI components and styles for government websites.

MailChimp® is a registered trademark of The Rocket Science Group, which is located at
675 Ponce de Leon Avenue NE, Suite 5000, Atlanta, GA 30308 USA.

You’re receiving this email because you subscribed at theuxnewsletter.com. If you don’t want to hear from us anymore, you can unsubscribe, though we’ll be sad to see you go.