A/B Testing and Search CRO – A Lesson from Seasalt Cornwall
In this talk, LJ Hazzard and Jana Lindner from Seasalt Cornwall discuss how they used A/B testing to improve their user experience and increase website conversion rates.
We were glad to welcome LJ Hazzard and Jana Lindner from Seasalt Cornwall to the Discovered London event. Seasalt is a modern lifestyle brand based in Cornwall with a focus on quality, longevity, and design, and has been trading for over 40 years with a strong online and high street presence. In this conversation, Jana and LJ discuss their decision to test their search bar and the hypothesis that making it more prominent would encourage more usage, ultimately improving their user experience and driving higher conversion rates.
As Seasalt's Senior User Experience Manager and a UX tutor at Falmouth University, LJ works closely with all of Seasalt's digital teams to conduct research and testing, creating engaging experiences that meet the needs of the business. Jana, on the other hand, is the Senior Digital Trade Manager and is responsible for optimizing website conversion rates.
Together they work to deliver the best possible user experience and have invested heavily in the digital transformation of this renowned brand, including onboarding new technology such as Klevu for search and merchandising and Insider for marketing. If you want to know more about how A/B testing helped them find the best solution for their website, keep reading.
Research, hypothesis, action
In designing and testing a new search bar for their website, the team followed a thorough and methodical process. First, they researched industry best practices and looked at what their competitors were doing. Then, they designed a variety of different search bar options and implemented them through a testing platform.
They measured success by defining specific goals, such as the number of clicks and purchases, and analyzed the test results to determine a winner. The winning design was then added to the website as a permanent feature. The team started with desktop A/B testing, then moved on to mobile, which was more complex.
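If you are wondering what "defining goals and analyzing the results to determine a winner" can look like in practice, here is a minimal, generic sketch of comparing one variant against a control on a click-based goal using a two-proportion z-test. The metric names, session counts, and significance threshold are illustrative assumptions only; Seasalt ran their tests through their own testing platform, not this script.

```python
# Illustrative sketch of picking an A/B test winner on a click-based goal.
# All figures and names below are hypothetical, not Seasalt's data or tooling.
from math import sqrt, erfc

def conversion_rate(conversions: int, sessions: int) -> float:
    """Share of sessions that hit the goal (e.g. a search click or purchase)."""
    return conversions / sessions

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pooled * (1 - p_pooled) * (1 / n_a + 1 / n_b))
    z = (conversion_rate(conv_a, n_a) - conversion_rate(conv_b, n_b)) / se
    return erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal tail

# Hypothetical session and goal counts for the control and one variant.
control = {"sessions": 20_000, "search_clicks": 1_150}
variant = {"sessions": 20_000, "search_clicks": 1_320}

p_value = two_proportion_z_test(
    variant["search_clicks"], variant["sessions"],
    control["search_clicks"], control["sessions"],
)
uplift = (conversion_rate(variant["search_clicks"], variant["sessions"])
          / conversion_rate(control["search_clicks"], control["sessions"]) - 1)

print(f"Relative uplift: {uplift:.1%}, p-value: {p_value:.4f}")
if p_value < 0.05:
    print("Variant beats the control at the 5% significance level.")
```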
Desktop A/B testing
When it came to desktop, the team faced several challenges such as poor visibility and lack of an inviting prompt to encourage shoppers to use search.
They came up with a variety of solutions, including increasing the size of the search bar, using contrasting colors to draw attention, and changing the position of the bar. They then created different variants of these solutions and tested them all to determine which one performed the best.
The team recognized that there is no one-size-fits-all solution and that what works for one brand may not work for another. That’s why they tested multiple variants, including subtle changes like color adjustments, to determine which one was the most effective. By defining clear goals and closely monitoring the results, the team was able to make informed decisions and ultimately improve their website’s search function.
A clear winner for desktop search users
Testing and analyzing various design variants can have a significant impact on the overall success of a campaign. During the April–May testing period last year, overall search usage rose by 5.5% from the testing alone.
The testing revealed that one variant outperformed the others, with all the right elements in place, such as good location, visibility, and contrast, resulting in increased clicks. Interestingly, it wasn’t the variant that was expected to win, demonstrating that assumptions may not always be correct.
The winning design was added to the development backlog to be permanently released on the site. Meanwhile, a new campaign was built in Insider with 99% of traffic allocated to the winning variant, which ran until the new search experience was released in September. Since the release of the winning design, sessions with search have increased 9% YoY, and there has been a 16% increase in revenue from those sessions.
It is impressive how much impact testing and analyzing different design variants can have on a campaign's success. The results are also a reminder not to lean too heavily on assumptions and to stay open to unexpected outcomes: by implementing the winning design, the team increased both search usage and revenue.
Mobile A/B testing
The Seasalt team considered how to optimize the mobile search experience. Their hypothesis was that exposing the search bar would encourage more usage. However, they knew that they had to get the UX right.
One of the challenges with mobile devices is a lack of visibility. The search icon is often hidden among other icons in the header bar, making it difficult for customers to find and use quickly. The team tested various designs, including changing the placement of the search icon and using a search bar with a white or gray background.
However, they had to consider usability as well. They didn’t want to annoy users with a search bar that disrupted their browsing experience. By considering the behaviors and goals of mobile users, the team was able to create a UX design that encouraged search usage without compromising the user experience.
Search & Discovery Demo & Consultation
Uncover growth opportunities for your ecommerce business
After 27 days of testing, they found that all variants with the search bar pinned open showed an uplift in search sessions, resulting in a 27% increase in search usage. However, there was no clear winner between the variants. So they kept the campaign running, turned off the control group, and narrowed the test to the two better-performing variants. They then selected the variation that provided the best user experience and brand value and are currently working to implement it on the site.
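One way to read "no clear winner": when two variants both lift search usage but land close together, the gap between them can sit entirely inside its confidence interval, so the data alone can't separate them. The quick check below uses made-up counts (again, not Seasalt's data) to show that situation, where the sensible next step is exactly what the team did: keep testing, then decide on UX and brand fit.

```python
# Illustrative check of whether two variants are statistically distinguishable.
# Session and conversion counts are hypothetical, not Seasalt's data.
from math import sqrt

def diff_confidence_interval(conv_a: int, n_a: int, conv_b: int, n_b: int, z: float = 1.96):
    """95% confidence interval for the difference in conversion rates (A minus B)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    diff = p_a - p_b
    return diff - z * se, diff + z * se

# Two pinned-open variants, both lifting search usage but close to each other.
low, high = diff_confidence_interval(1_310, 20_000, 1_280, 20_000)
print(f"95% CI for the difference: [{low:+.4f}, {high:+.4f}]")
if low < 0 < high:
    print("The interval includes zero: no clear winner, so keep testing "
          "or decide on user experience and brand fit instead.")
```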
Final results and key takeaways
Since the start of the A/B test, the company saw a 42% YoY increase in sessions with search and a 47% increase in revenue from those sessions. The results highlight the importance of A/B testing in optimizing the customer experience and increasing revenue for businesses.
There are a few key takeaways that businesses should keep in mind when running A/B tests:
- Identify a clear problem and have a clear plan for what you are testing and why. This aligns the team and gives everyone a clear goal to focus on.
- Consider the device you are testing on. Experiences differ between desktop and mobile, so consider the needs of your users and their behaviors on each device. What works for one device may not work for another, so be sure to test multiple ideas.
- Start small, and then decide whether smaller iterative tests will serve you better than a larger redesign.
- Have an open mind and be willing to learn from these tests. Tests may fail, but the information they provide is what enables informed decisions.
By breaking down A/B testing into the basics and focusing on a clear problem and key business goal, businesses can make meaningful improvements without getting overwhelmed.
Big thanks to Jana and LJ from Seasalt Cornwall for this insightful talk at Discovered! If you want to know more about how Seasalt Cornwall uses Klevu for search and category merchandising, feel free to check out our case study.