SuperBreak Homepage Evolution

Posted 12th February 2017, in UI / Visual

Design, qualitative research and an AB test of SuperBreak's homepage: the first stage in a series of iterative enhancements to continually optimise the homepage's performance with a measured, user-centred approach, resulting in a +3.72% conversion rate uplift.

Photo of the SuperBreak homepage displayed on a monitor

Involvement:

  • Conception
  • Research
  • Design
  • AB Test Development
  • Stakeholder Management

Background

Existing homepage version before redesign

The SuperBreak homepage is naturally one of the highest-traffic pages on the website. In user testing we had often received feedback that it was ‘busy’, and anecdotally it looked dated and visually unappealing.

The large majority of users interact with the search form, so the busy layout seemed to be getting in the way of the primary function of the page. The excess information seemed to act as ‘noise’, adding to the cognitive load of using the page and, arguably, introducing an element of choice overload.

The desktop version is also served to tablets, and because the content was crammed into three columns, the interface was small and awkward to use on a tablet.

There were also many SEO considerations that would need to be preserved, and as the homepage is one of the key pages on the website, stakeholder expectations would need to be managed.

Any changes would need to be measured to protect, and hopefully increase, revenue; this would also help justify any development effort. Previous AB tests on the homepage had failed, so there were learnings to take from those. I had also conducted around 20 previous qualitative tests of category page layouts through usabilityhub.com for a different project, so I knew the new layout was perceived favourably by users.

The new homepage needed to focus on the primary function of the page and be more visually appealing. A more appealing design would instil trust in users, who would likely have little awareness of SuperBreak as a brand. Focusing users on search would also give them a better chance of finding available breaks, improving their experience.

Most of our users are based in Northern England, so a large inspirational image of our key destination would appeal to the majority of our users and traffic sources. This could later be personalised, putting us in a position to make further improvements strategically.

Proof of Concept

When we had previously AB tested the homepage through an agency, they had said that the full-width, tabbed search form layout was not possible as an AB test by manipulating the existing HTML with CSS and jQuery via Optimizely, so this version of the homepage had never been tested. I thought it was possible, so I created a Tampermonkey script that manipulated the page almost entirely with CSS, both to prove the case to stakeholders and to see whether it was in fact feasible as an AB test. The proof of concept worked. I also tweaked much of the content to shift focus onto our most popular products and enable users to drill down to key category pages. A sketch of the approach follows below.
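
As a rough sketch of the proof-of-concept approach: a Tampermonkey userscript can inject CSS overrides on top of the live page in much the same way an Optimizely variation would. The @match URL, selectors, class names and image path below are hypothetical stand-ins, not SuperBreak's actual markup.

    // ==UserScript==
    // @name         SuperBreak homepage proof of concept (sketch)
    // @match        https://www.superbreak.com/*
    // @grant        none
    // ==/UserScript==

    (function () {
        'use strict';

        // Hypothetical selectors and image path; stand-ins for the real page structure.
        var css = [
            '.home-columns { display: none; }',                                   /* hide the old three-column content */
            '.search-panel { width: 100%; max-width: 960px; margin: 0 auto; }',   /* full-width search form */
            '.search-panel .tabs { display: flex; }',                             /* tabbed search options */
            'body.home { background: url("/img/hero.jpg") center / cover no-repeat; }'
        ].join('\n');

        // Inject the overrides as a <style> element, mimicking how an AB test
        // variation layers CSS changes on top of the existing page.
        var style = document.createElement('style');
        style.textContent = css;
        document.head.appendChild(style);
    }());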

SuperBreak v1 design

Qualitative Testing

Even though I had qualitatively tested similar designs in the past and was intending to AB test the new design anyway, I conducted some qualitative research. This was to give the design the best possible chance of performing well in the AB test. I also needed to build the case and evidence for why the new design would improve users’ perception of the website: at the time the AB testing programme was limited and getting a slot on the AB roadmap was hard, so I would need a solid case to aid my cause and to ensure it would actually work.

Five Second Test

I asked 15 UK users aged over 30 the same set of questions for the control and v1 designs. I set the scenario: “Imagine you are looking for a short break away and land on this site…”. They would then see the design for five seconds, followed by some questions. This helps gauge people’s immediate reaction when landing on a page; if the reaction is bad, they are more likely to bounce, or to gradually lose confidence in the website and eventually exit. It would also help determine whether they understood SuperBreak’s proposition instantly.

Rate the quality of this page between 1 (worst) and 5 (best).

Control: 3.8
V1: 3.6

Did the brand appear trustworthy?

Control: 80% yes
V1: 100% yes

What product do you think this company sells?

Control: 20% mentioned ‘short breaks’
V1: 47% mentioned ‘short breaks’

Surprisingly, the two designs performed similarly on perceived quality, but promisingly v1 scored higher for trust and seemed to more accurately convey what the website sells.

Which design do you prefer?

I then asked a more direct question of 50 UK users. Given that part of the objective was to improve the overall visual appeal, this test would highlight whether people preferred the look of the new design and might therefore be more attracted to using it.

Control: 26%
V1: 74%

You’ve landed on this site looking for a short break away. Where would you click next?

Click test, control version

Control response time: 27.6 seconds

Click test, v1

V1 response time: 16.9 seconds

The response time was much faster for v1, suggesting users could understand the page and complete their task more quickly: the page information was easier to absorb in a shorter space of time, reducing cognitive load. More users also interacted with the search options, perhaps showing that they were being missed on control.

Photo of a laptop displaying SuperBreak homepage version 1

AB Test and Conclusion

After writing up my report and sharing it with stakeholders, the next step was to AB test the homepage. In this case I finished off the code for the variation and a third party ran the test for us; though I have developed and analysed many tests myself, my role often varies.

The qualitative research did suggest that the new homepage would be successful, but as we believe in a test-driven process we still decided to test it, both to be sure and to help justify any development cost involved in implementing it.

The AB test won, increasing conversion by +3.72%. We also measured a number of other metrics, which gave us valuable insight into how users interact with the page.
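
As a rough illustration of how a result like this is read, the reported figure is the relative uplift in conversion rate between variation and control, and a two-proportion z-test is one common way to check the difference is not just noise. The visitor and conversion counts below are made-up placeholders, not the real test data.

    // Hypothetical visitor and conversion counts, for illustration only;
    // not the actual test data.
    var control   = { visitors: 1000000, conversions: 21500 };
    var variation = { visitors: 1000000, conversions: 22300 };

    var crControl   = control.conversions / control.visitors;     // 2.15%
    var crVariation = variation.conversions / variation.visitors; // 2.23%

    // Relative uplift in conversion rate, the figure reported as a percentage.
    var uplift = (crVariation - crControl) / crControl;           // about +3.72%

    // Two-proportion z-test as a rough significance check.
    var pPooled = (control.conversions + variation.conversions) /
                  (control.visitors + variation.visitors);
    var stdErr  = Math.sqrt(pPooled * (1 - pPooled) *
                  (1 / control.visitors + 1 / variation.visitors));
    var z = (crVariation - crControl) / stdErr;

    console.log('Uplift: ' + (uplift * 100).toFixed(2) + '%, z = ' + z.toFixed(2));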

Strategically, this put us in a position to run further tests, iteratively enhancing and addressing other areas of the page.

Since it was implemented, we have run several successful AB tests on the background image, as well as adding reassurance and USP information to the page.