Issue 10 \\ Survey Says

Here on the UX research team, we spend a lot of time analyzing MailChimp user feedback, studying customer analytics, and puzzling over research data. And while all of that is important (and we love it!), we also realize there’s a lot to learn by stepping back and analyzing our own team’s processes and actions.

This week, UX researchers Fernando Godina and Laurissa Wolfram offer a glimpse into one of our recent studies and share how our focus on user details and tailored content gave us better survey and interview results than we expected.

Survey Says...


by Laurissa Wolfram & Fernando Godina
Most of our projects stem from our curiosity about MailChimp customers and why they do things a particular way. This project was no different. We set out to learn more about how our customers move data between MailChimp and other systems, and we began by asking ourselves a few simple questions:

  • Where does their data live?
  • Why did they choose product X as the home for all their data?
  • How does their data move between systems?
  • How can MailChimp improve this process?

After a series of user interviews, we had a lot of information but didn’t see any real trends. Using some of this interview data, we narrowed our focus to specific questions we could ask in a survey.

Writing our survey questions was a collaborative process among the UX research team. First, we came up with a list of questions and shared them as a Google doc that the rest of the team could edit. We knew what we wanted to ask in general, but we still debated and discussed the best word choice and tone of voice.

Finding the Right People

While work on the survey questions continued, Fernando started pulling together a list of folks we could send the survey to.

Since we wanted to learn specifically about how people manage and move data between MailChimp and other systems, we didn’t want to blast our entire customer list with this survey. To start, we looked at one of our large-scale surveys that went out to hundreds of thousands of MailChimp users at the beginning of 2013. This general survey has proven to be super valuable for our research this year, and we’re still combing through it to find specific segments of users we want to learn more about.

For our survey, we filtered down to 5,150 people who told us they import lists “every day,” “once in a while,” or “often.”
Once we had our target audience, we imported their contact information into a MailChimp list and began creating our campaign.
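
If you’re curious what that filtering step looks like in practice, here’s a minimal sketch in Python. It’s illustrative rather than our actual tooling: it assumes the 2013 survey answers live in a CSV export, and the file and column names (like "import_frequency") are hypothetical.

    import csv

    # Sketch: pull the import-frequency segment out of the big 2013
    # survey export. File and column names here are hypothetical.
    TARGET_ANSWERS = {"every day", "once in a while", "often"}

    with open("2013_survey_export.csv", newline="") as src, \
         open("import_segment.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["import_frequency"].strip().lower() in TARGET_ANSWERS:
                writer.writerow(row)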

Focusing on Email Content

Users tend not to respond well to surveys unless they care about the topic and can connect with it. Our contact list was made up of people who need to import regularly, so we hypothesized that if we let them know we’re specifically interested in learning about importing to make this process better for them, we’d get a higher response rate in the survey.

Trial Run: Sending out a Pilot Email

Before sending the survey to all 2,626 people on the list (our 5,150-person segment shrank once we removed anyone we didn’t have a full name for; more on that in the takeaways below), we put it through a dry run with a small segment of our list—just in case we needed to rephrase any questions. We also wanted to make sure that SurveyMonkey was pulling the MailChimp customer IDs, names, and email addresses the way it should—otherwise we wouldn’t know which answers came from which customer. We selected 100 random users from our list and tagged them as VIPs. Although this was a hacky way to do things, it let us easily segment these users from the rest of the list. After we sent out the test email, the VIP tags alerted us when people opened and clicked our campaign—which was handy for our small pilot group, but would have been overkill for thousands of respondents.
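
For anyone who wants to replicate the random draw, here’s a rough sketch in Python. The filenames and the fixed seed are our illustration, not what we actually ran; inside MailChimp we simply flagged the sampled folks as VIPs.

    import random

    # Sketch: draw 100 random people from the segment for the pilot.
    # Filenames and the fixed seed are illustrative.
    random.seed(42)  # make the draw reproducible

    with open("import_segment.csv") as f:
        header, *rows = f.read().splitlines()

    pilot = random.sample(rows, 100)
    pilot_set = set(pilot)
    remainder = [r for r in rows if r not in pilot_set]

    with open("pilot_group.csv", "w") as f:
        f.write("\n".join([header, *pilot]) + "\n")

    with open("main_group.csv", "w") as f:
        f.write("\n".join([header, *remainder]) + "\n")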

Based on the answers we received from the pilot survey, we were able to see how our questions were interpreted (or misinterpreted). For example, our pilot prompted us to rework Question 7. We initially asked, “Where are we falling short?” and at least 10 respondents told us they didn’t understand the question. We realized this question was far too broad, and we revised the question to read, “Is there a task you’ve tried to accomplish using MailChimp with another system but haven’t been able to?” Out of our remaining respondents, only 2 people told us they didn’t understand.

We also got some valuable feedback from our respondents by designating our head of user experience, Aarron Walter, as the reply-to address. This gave folks a direct channel to respond to us with feedback or questions, and proved useful when a couple of users alerted Aarron to some technical issues with our survey. We were able to immediately fix the problems before the survey was sent to our entire list.

A/B Testing Subject Lines

After the pilot, we wanted to test one last thing. We’re all about experimenting and learning at MailChimp, and we wanted to try out a couple of different subject lines for effectiveness. The subject line for our pilot email was pretty bland: “Tell us about how you use MailChimp.” We had a hunch that if we could make our subject more specific and relevant to our users, they’d be more likely to open our email and click through to the survey.

We wanted to tell people, “Hey, you import a lot of information into MailChimp, and we want to talk to you!” but we needed to find the best way to communicate that. So we settled on two subject lines and A/B tested them to see which one would receive the most opens and clicks. Here are the subject lines we tested:

  • <<First Name>>, we noticed you're doing a lot of list importing in MailChimp. How can we make list importing in MailChimp easier?

  • How can we make list importing in MailChimp easier?

In the first subject, we address the customer and casually mention that we’ve noticed something specific about their workflow—regular importing. In the second, we still mention importing, but it’s framed generally and in the form of a question—it’s an appeal to our customers for help.

The first subject was a gamble, and we wondered whether or not we could pull it off. We knew that if we were careless in our word choice, we could come across as creepy. For example, if we had just changed one word and said “<<First Name>>, we see you're doing a lot of list importing in MailChimp,” our subject could have taken on a pretty yucky tone. We wanted our recipients to think, “Hey! They noticed me and they actually care about what I’m doing,” and not “Ew. Is MailChimp watching me?”

After 24 hours, the results from the A/B campaign came in. The first, more direct subject—which specifically called attention to something our customers do on a regular basis—“won” by an 8-point margin in open rates, and it was sent off to the remainder of the list.
Campaign A won with a 52% open rate and a 44% click rate; Campaign B had a 44% open rate and a 36% click rate.
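
For the stats-minded: whether an 8-point gap is bigger than chance depends on how many people were in each arm of the test, which we haven’t spelled out here. A two-proportion z-test is one quick way to sanity-check an A/B winner; the sample sizes below are made up purely for illustration.

    from math import erf, sqrt

    def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
        # Pooled two-proportion z-test on open rates.
        p_a = opens_a / sends_a
        p_b = opens_b / sends_b
        pooled = (opens_a + opens_b) / (sends_a + sends_b)
        se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
        z = (p_a - p_b) / se
        # Two-sided p-value from the standard normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    # Hypothetical arm sizes (~300 sends each) with the observed open rates.
    z, p = two_proportion_z(opens_a=156, sends_a=300, opens_b=132, sends_b=300)
    print("z = %.2f, p = %.3f" % (z, p))  # z = 1.96, p = 0.050

With roughly 300 sends per arm, the observed 52% vs. 44% split lands right around p = 0.05—a reminder to keep test arms as large as you can afford.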

Final Survey Results

Overall, the final open rate for our campaign was 52%, with a 44% click rate.

By comparison, our large survey sent out at the beginning of 2013 to 574,525 customers had a higher open rate of 59.4%, but only a 6.6% click rate. There are still a few people opening our latest survey, and we expect the open rates will end up being pretty comparable.

The click rate blew us away, and we were happy to get the feedback from our recipients. The big jump in response rate came from making an impersonal process, like taking a survey, customer-specific and tailored to the users we needed to hear from.

Follow-Up Interviews

We spent the next few days digging through the survey data, trying to pull out important themes and trends. As we read through the responses, we flagged customers who described workflows or problems that we wanted to know more about. Out of our 761 respondents, we pulled together a list of 40 customers that we really wanted to talk to.

We sent each of them an email that specifically mentioned some notable things from their survey responses that we wanted to discuss.

The responses we got really took us by surprise. Laurissa, for example, sent out emails to 19 people and heard back from 9 within the first couple of hours and 3 more over the next few days. Typically, we’re happy if we hear back from 2 people for every 10 interview request emails we send out.

Almost everyone was excited to talk to us, and several people told us how much they appreciated that we were interested in their particular problem.

What We Learned

Tailoring content for a particular audience is a long (and sometimes tedious) process, but without that time and effort we might not have gotten such detailed answers from the specific customers we needed to reach. Sure, we could have targeted all 3.5 million customers. And yes, we could have sent out a generic follow-up email after the survey, asking respondents if we could talk to them. But by thoughtfully focusing on a target audience and very specific content, we were able to reach the right people, rather than the most people.

Takeaways

  1. Narrow your target audience: We sent our initial survey to a very targeted list of folks who we knew were importing on a regular basis.

  2. Refine the details: If possible, we wanted to address our users by name, so we made sure we had a first and last name for each email on our list. If we couldn’t find a user’s name in our MailChimp system, we removed that person from the list. This narrowed our list even further.

  3. Be specific and direct: In our survey subject line and body, we clearly addressed an action that we knew was a regular part of our recipients’ workflows. In our follow-up emails, we specifically referenced something in their survey response—and even quoted them directly.

  4. Be human: The campaign was sent from a real person and the reply-to email reached a real person. (Be careful not to take this too far by personalizing your email’s “From” name. The From name should be quickly recognizable, usually your company name.) While we couldn’t speak with each of our respondents individually, our follow-up interviews and conversations let us talk to some of our customers individually and learn from them.

  5. Move fast: We sent follow-up emails for interviews within a week after the survey was sent. It was still fresh on our respondents’ minds!

Wait, There's More

Still hankerin’ for more info about sending more thoughtful and effective survey emails to a more selective audience? Our CEO Ben also talks about this in a blog post he wrote last year on Reducing Irrelevance by using segmentation.

“… great projects, like great careers and relationships that last, are gardens. They are tended, they shift, they grow. They endure over time, gaining a personality and reflecting their environment. When something dies or fades away, we prune, replant and grow again.”

- Seth Godin

Ask Us Anything

We want this newsletter to be a dialogue. If you have any questions for our team about pattern libraries, design research, or even how Federico ate some pavement on his bike last week, send them in. Seriously: hit reply and ask us anything. We'll try to answer every email and maybe even share our conversation in future newsletters.