CE: First off, can you tell me a little bit about what you do?
CT: Sure! I help our Director of Growth with conversion rate optimization. We do this by experimenting at different touchpoints of the customer’s journey.
Optimizely is our primary tool for experimentation, and we use analytics tools like Google Analytics, FullStory, and Crazy Egg to determine which variation wins and which loses. In addition to setting up processes for more efficient experimentation, I collect and analyze data before and after running each experiment.
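That before-and-after analysis typically comes down to comparing conversion rates between a control and a variant. As a minimal sketch (not their actual tooling — Optimizely and Google Analytics handle this for them), here is how a two-proportion z-test for a conversion-rate difference can be computed with only the Python standard library; the function name and example numbers are hypothetical:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors for the control.
    conv_b / n_b: conversions and visitors for the variant.
    Returns (z statistic, two-sided p-value).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 120/2400 control conversions vs 150/2400 variant.
z, p = two_proportion_z_test(120, 2400, 150, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A result with a small p-value (conventionally below 0.05) is what lets a team call a winning variation rather than relying on opinion.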
CE: Can you walk me through a typical experiment you might run?
CT: A good example would be our Account Settings page, which we redesigned in an effort to reduce cancellations. I started the experimentation process by collecting baseline data to understand how users interacted with the live design.
Crazy Egg helped me determine how users were engaging with the live design, and for this particular experiment it showed that there was little to no engagement on the Account Settings page. In fact, the only engagement Crazy Egg captured was users clicking on value prop icons that weren’t even clickable.
The Snapshot told me that users wanted to learn more about what else they could do with their membership and about those features, but they couldn’t get to that information!
So I worked with my designer to create a page where the icons were clickable and contextual links sat under each value proposition, encouraging users to learn more about those features.
CE: What was the result of that redesign experiment?
CT: Tragically, users in the experiment didn’t click on the icons at all. This really surprised us, because we had based the experiment design on data from a previous Snapshot. We had hypothesized that making the icons clickable and directing users to learn more about our features would result in more engagement and fewer cancellations.
But it ended up just confusing our users. Perhaps it was information overload, or simply too much content: a later Crazy Egg Snapshot from the experiment showed that users don’t even scroll down far enough to read all the value props.
CE: So are you and the Director of Growth the primary users of the tool, or are there other teams or people you pull into your experiments?
CT: All the stakeholders, from our CMO to our QA engineers, see Crazy Egg’s snapshots. It is part of our experimentation culture to be data-driven. Crazy Egg is a great way to visualize data so that it is easy to digest for everyone across the company. It is also a great tool to quantify user behavior.
CE: Do you come from a growth marketing background, or is this something you’re just getting into now?
CT: I was first introduced to the ability to quantify marketing in my first tech job. I was a BDM for a predictive analytics tool primarily used by marketing teams in the retail/eCommerce space. Then I ended up helping a friend start a health tech company, leading their sales and marketing efforts, so I had to learn SEO/SEM and social media marketing on my own.
From there, I learned how to use other analytics tools like GA (Google Analytics) and Mixpanel.
Before tools like Crazy Egg and FullStory, we couldn’t really determine whether one image, color, or layout worked better than another. Now we can experiment with these subjective designs and quantify user behavior on these artistic elements.
CE: It’s just so cool because it's not opinion; it’s data. You can’t argue with it.
CT: Right! Another good example is an experiment with an image of an iPad that we had on our homepage. Users would click on the iPad and the icons surrounding it, perhaps thinking a video would play, but there was no video. That told me users were expecting to see something and didn’t get anything, so we had set the wrong expectations for them. Crazy Egg has helped us experiment based not only on user psychology but on user behavior.