
How A/B Testing Can Increase Your Conversion Rates

Posted by Paul Kaye on Jul 23, 2015 10:30:00 AM

Some solutions are absolute when it comes to finding what works: the right part for the fix, the right key for the door, the right power supply for the device. Then there are the solutions that can be continually improved by making experience-informed adjustments over time. Recommender engines belong firmly in the second category. The more data an engine is fed over time, the more on-target its results will be at pleasing users and converting them into customers. Amazon, widely known for its conversion successes, continues to report dramatic sales increases year after year. Fortune has attributed the secret of that growth to “the way Amazon has integrated recommendations into nearly every part of the purchasing process from product discovery to checkout.”

But getting recommendations right means investing enormous attention in metrics and in improving them. Amazon regularly employs data scientists for exactly this role: spending time understanding the metric that needs to be maximized and the business context for that maximization. Data scientists call this the optimization objective, notes an Analytics Magazine article on recommendation engines, and they stay focused on a single one. But not every company has an Amazon-sized bench of data scientists to run the A/B plays that drive such profitable improvements. Most depend on their marketing department to create an A/B plan that can find and fix recommendation shortcomings.
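To make the idea of a single optimization objective concrete, here is a minimal sketch in Python of how an A/B test might compare two recommender configurations on one metric, conversion rate, using a standard two-proportion z-test. The variant names and traffic numbers are hypothetical, chosen purely for illustration; this is not any particular vendor's methodology.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test on the conversion rates of variants A and B.

    Returns the z statistic; |z| > 1.96 indicates significance at ~95%.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: A is the current recommender configuration,
# B is the tweaked configuration under test.
z = two_proportion_z_test(conv_a=480, n_a=20_000, conv_b=560, n_b=20_000)
print(f"lift: {560/20_000 - 480/20_000:.4%}, z = {z:.2f}")  # z ≈ 2.51, significant
```

The point of fixing a single objective is that every variant gets judged by the same number; secondary metrics can still be monitored, but the decision rule stays simple.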

It’s not a question of whether A/B testing works.

It’s a question of how you get there.

A/B testing is nothing new to marketers, who run comparative tests on email marketing campaigns all the time. But testing the call-to-action or image of an email campaign is a far less sophisticated process than testing the performance of a recommendation engine and making metrics-based tweaks.

But when it comes to recommendation engines, taking ownership of the kind of sophisticated A/B testing required to drive incremental upticks in profitability shouldn’t be entered into lightly. As one analytics Ph.D. says of testing and improving the profitability of recommendation results, “A/B testing simply has to be baked in.”

Why selecting recommendation engines with built-in A/B testing is the way to go.

Converting retail visitors into buyers is becoming an increasingly precise science. The comparatively clunky user-engagement models of the past let retailers make “close-enough” recommendations that delighted the uninitiated with their predictive power and accuracy. Today, users expect recommendation engines to serve more than a purely predictive purpose: retail users want discovery, the ability to rapidly explore diverse items, and the preservation of their privacy. This complicated web of intent and interaction is why built-in recommender A/B testing is more desirable than an A/B plan crafted in-house by marketers.

Strands A/B Testing can increase conversion by several percentage points.

Companies deploying Strands Retail have access to scientifically designed and executed A/B testing that optimizes revenue and sales production. With a machine-learning-driven system like Strands Recommender, A/B testing should be running at all times. Simply applying a recommendation engine to a retail site will increase conversion rate significantly, and unfortunately many marketers stop there. Improvements to the system can still be made over time; they just happen more slowly and require a much more refined awareness of the activities that can effect profitable change. Data scientists are far more capable (and focused) than marketers at creating the kind of A/B testing that can squeeze out another 0.002% of conversion rate. Small? Yes, but more profitable.
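Always-on testing of a learning system implies that each visitor must land in the same variant on every visit, or the metrics become noise. A common way to achieve this is deterministic hash-based bucketing; the sketch below uses hypothetical identifiers and is a generic illustration of the technique, not Strands’ actual mechanism.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into variant 'A' or 'B'.

    Hashing user_id together with the experiment name gives the same
    user the same variant on every visit, while different experiments
    get statistically independent splits.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits onto [0, 1).
    bucket = int(digest[:8], 16) / 2**32
    return "A" if bucket < split else "B"

# Hypothetical usage: route a visitor to the baseline or tweaked recommender.
print(assign_variant(user_id="visitor-1138", experiment="rec-widget-test"))
```

Note that a lift as small as 0.002% only becomes statistically detectable with very large traffic volumes, which is another reason this kind of testing runs continuously rather than as a one-off campaign.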

What else is different about Strands A/B testing

  • Fine-tuning is done on Strands servers (SaaS), so marketers don’t need to do it themselves.
  • Complex algorithms are hosted on Strands servers rather than on the client side.
  • Though the Strands solution itself produces impressive conversion results, Strands A/B testing and fine-tuning maximizes conversion rate (up to a 5-10% increase expected with this “set-it-and-forget-it” strategy).

Increasing conversion through A/B testing is a standard but evolving practice, especially as it applies to sophisticated machine-learning solutions like Strands Recommender. If you’re interested in finding out how Strands Retail A/B testing can improve your conversion metrics:

Contact Us


Topics: Personalization
