UX @ Red Gate

Failing to optimise a product page for downloads!

Categories: Design, Website

I recently worked on a short project to try to optimise sales of one of our products. This isn’t something we’ve done much of in the past. We run occasional A/B tests on the website, but they’re usually to optimise marketing campaigns or sign-ups.

We typically develop our products by working out, from user feedback, competitor analysis, market research and innovations from the team, what we could do to improve the product. To a certain extent, we trust that doing the ‘right thing’ (improving the user experience or functionality of the application) will ultimately improve sales.

We spend surprisingly little of our attention on the trial and download experience. In this project we decided to try a slightly more cynical approach: could we earn the company a lot more money by tweaking a few things here and there? Were there any choke points that we could remove that would suddenly increase downloads and, subsequently, sales?

We kicked off the project by first trying to identify all the touch points in our existing evaluation and sales processes. These ranged from marketing campaigns, product pages, download forms, installers, sales contacts and automatic emails through to the product itself.

After gauging the relative importance, potential upswing and effort required, we decided to focus our efforts on the product page and on using the application for the first time. Here I’m just going to talk about the experiment we ran on the product page, but we’ll write about the in-application A/B test in the future too.

Our product pages are the bread and butter of our website. They’re the gateway to finding out about a product, downloading it and ultimately buying it. They receive over half the page views and are where we invest the vast majority of our effort.

Screenshot of old product page

Our existing product page – April 2012

Despite that, we felt that the main index pages for each product looked tired, failed to provide a quick overview of the product, and didn’t prioritise the information very well. We thought that with a bit of love we’d be able to freshen up the design, increase downloads, promote our other products better, improve the information design and reduce the maintenance overhead by removing extraneous content.

Screenshot of new product page

Our proposed new product page, with re-prioritised content, simpler layout and sticky nav bar.

We were wary that some A/B tests we’d tried in the past had had no measurable impact, so we decided to go all out and try a radical design for our first iteration. We decided not to worry about consistency with our other product pages. If the page was a resounding success then perhaps we’d update all our other pages or tone it down; if it failed, we could just throw it away. The metric we decided to track in the A/B test was downloads – how many people clicked the download button and then successfully filled out the form. We’d have loved to track the funnel right the way to the end, to see whether users who saw A were more likely to buy than those who saw B, but we couldn’t due to low data volumes and potential privacy issues.

We kept an eye on other metrics too, including adding to cart, but were less concerned by those as the decision to buy our products is predominantly influenced by the evaluation experience, rather than the product page itself.

Before running the test we had no benchmark conversion rate and we had no idea what kind of uplift to expect. This made working out how long we needed to run the test to observe significance rather difficult! (It’s important to decide up front how long you’re running the test, and not to stop as soon as your A/B testing tool tells you the improvement is significant, as this can lead to false positives. Read Evan Miller’s article if you want to understand the maths behind this.)
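
To make the peeking problem concrete, here’s a rough simulation sketch – not anything we actually ran, and the traffic figures are made up purely for illustration. Both variations convert at exactly the same rate, so every “significant” result is a false positive; checking the test daily and stopping at the first significant reading produces far more of them than looking once at a pre-agreed end point.

```python
import numpy as np
from scipy.stats import norm

# Toy illustration of the peeking problem: identical conversion rates in both
# buckets, so any "winner" is a false positive. All figures are made up.
rng = np.random.default_rng(42)
p, visitors_per_day, days, runs = 0.20, 200, 14, 2000

def significant(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-sided, pooled two-proportion z-test."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = np.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    if se == 0:
        return False
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2 * norm.sf(abs(z)) < alpha

stop_at_first_signal, look_once_at_end = 0, 0
for _ in range(runs):
    a = np.cumsum(rng.binomial(visitors_per_day, p, days))  # cumulative conversions, variation A
    b = np.cumsum(rng.binomial(visitors_per_day, p, days))  # cumulative conversions, variation B
    n = np.arange(1, days + 1) * visitors_per_day           # cumulative visitors per bucket
    daily = [significant(a[i], n[i], b[i], n[i]) for i in range(days)]
    stop_at_first_signal += any(daily)   # declare a winner the first day the tool says "significant"
    look_once_at_end += daily[-1]        # only check once, when the pre-agreed test length is up

print(f"false positive rate, peeking daily : {stop_at_first_signal / runs:.1%}")
print(f"false positive rate, fixed horizon : {look_once_at_end / runs:.1%}")
```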

In the end we settled upon 2 weeks. This turned out to be too short and the test failed to reach significance. There was an observed improvement of around 10%, but as we’d not reached significance this could easily have been noise. We’d underestimated the existing conversion rate (~19.5%) and overestimated the effect of our ‘improvements’!

We therefore decided to run the test again, after revising our calculations for how long we’d need to reach significance based on the observed results of the first test. This time, instead of fixing a duration, we decided to stop the test once 4000 people were in each bucket – this turned out to take about 4 weeks.
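
For anyone wanting to reproduce that kind of calculation, a minimal sketch is below. It uses the numbers mentioned in this post (a ~19.5% baseline and the ~10% relative uplift we observed first time round) together with a conventional 5% significance level and 80% power – assumptions for illustration, not necessarily the exact figures we plugged in ourselves.

```python
from math import ceil
from scipy.stats import norm

# Illustrative inputs: baseline and uplift come from the post; the
# significance level and power are conventional assumptions.
baseline = 0.195           # existing page's download conversion rate
relative_uplift = 0.10     # the ~10% improvement observed in the first test
alpha, power = 0.05, 0.80

p1 = baseline
p2 = baseline * (1 + relative_uplift)
z_alpha = norm.ppf(1 - alpha / 2)   # two-sided test
z_beta = norm.ppf(power)

# Standard two-proportion sample-size approximation, per variation.
n = ((z_alpha + z_beta) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2
print(ceil(n), "visitors needed in each bucket")
```

The required bucket size is very sensitive to the uplift and power you assume, which is part of what makes this calculation hard to get right up front.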

The results second time round: no significance at all. In fact, the new design was observed to slightly underperform the old one. We were shocked! We’d have been less surprised by the new design decreasing conversions than by it leaving them completely unchanged.

Google Optimizer results

Our final Google Optimizer results. While it looked to begin with as if the original might outperform our new page, the lines converged over time.
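
To give a feel for what “no significance” means here, the sketch below runs the basic check on purely hypothetical bucket counts (the real figures lived in Google Optimizer and aren’t reproduced in this post): with 4000 visitors per bucket, a difference of a fraction of a percentage point comes with a confidence interval that comfortably spans zero.

```python
from math import sqrt

# Hypothetical bucket counts for illustration only; not the real test data.
n_a, conv_a = 4000, 790    # original page: visitors, completed downloads
n_b, conv_b = 4000, 775    # new page: visitors, completed downloads

p_a, p_b = conv_a / n_a, conv_b / n_b
diff = p_b - p_a

# Unpooled standard error and a 95% confidence interval for the difference.
se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
low, high = diff - 1.96 * se, diff + 1.96 * se

print(f"observed difference: {diff:+.2%}")
print(f"95% CI: [{low:+.2%}, {high:+.2%}]")  # spans zero, so consistent with pure noise
```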

What we think this probably implies is that users arriving on this page have already more or less made up their minds about whether they’re going to download or not. The number of people clicking download is already very high (~30%). There’s a fairly large drop-off on the next page, but still nearly 20% of users arriving on the product page end up downloading.

Average time on page did go up from around 45 seconds to 2 minutes. It’s probably unwise to read too much into this, though, as it could simply be a symptom of moving to a single-page design rather than of increased engagement. Clicks through to the shopping cart decreased, although in the new design nearly a third of those clicks were for the bundle rather than just the standalone product.

So what did we conclude and what do we do next?

We still think there’s plenty of scope for improving our product pages. This just wasn’t the right page in the funnel for trying to increase downloads. If that was our aim, we’d have been better off directing our efforts at marketing campaigns or the download form. However, we’re sure we can still improve our customers’ experience, better promote other products and reduce our content overheads with a better product page.

We’ve taken the new page down for now, but we’ll use what we’ve learned when we revisit the product pages in the future. Next time we’ll focus on improvements for our users and just use A/B testing to make sure we haven’t inadvertently caused conversions to plummet!

If nothing else, this was a useful exercise to show that tweaking our product pages isn’t suddenly going to double sales. It’s also made me even more sceptical of the tests posted to sites like whichtestwon.com.

  • http://twitter.com/AdamAntacid Adam

    Interesting post – I especially like the conclusion: ‘use A/B testing to make sure we haven’t inadvertently caused conversions to plummet!’. I think the new product page looks much better and is kinder to the user (less cluttered, information arranged logically etc) so it needn’t be necessary to prove an uplift in conversion, rather just prove it doesn’t perform worse!

  • Guest

    Why don’t you guys remove the requirement to enter an email address? Why put up obstacles? And besides, how many g3rhgwrwgrw@gergwergw.com emails do you get in your db?

  • Anonymous

    Do you mean in the download process?

    We have run several experiments in the past and the results have been inconclusive. We’ll definitely revisit the form again in the future. There’s a balance between end user experience, number of downloads and lead quality to be struck.