This document describes techniques to consider when performing an A/B test of Google Maps Platform Place Autocomplete and Address Validation APIs.
A couple of benefits of using Place Autocomplete and Address Validation API are as follows:
- Improved customer experience: By providing your customers with real-time suggestions for addresses and places, you can help them complete their checkout more quickly and easily. This can lead to a better customer experience.
- Improved data accuracy: Place Autocomplete and Address Validation API can help you improve the accuracy of your customer data. This can be especially critical in ecommerce, as accurate address data is relied upon for the successful delivery of packages.
To improve the quality of your addresses, run an A/B test to evaluate which validation solution best meets your needs. This gives you a chance to quantitatively decide which product is best suited for your use case.
An A/B test is a way to compare two versions of a web page or app against each other. It is a type of controlled experiment that is used to determine the effect of a change to a variable on a measurable outcome.
To perform an A/B test, create two versions of a page or app, one as a control and the other with the measurable change. You then show these versions to different users and measure how they interact with them. The version that performs better is the winner.
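As a minimal sketch of how users might be split between the two versions, assuming each user has a stable identifier, you could hash that identifier into a bucket deterministically. The `assignVariant` helper and the 50/50 split below are assumptions for illustration, not part of any Google Maps Platform API:

```typescript
// Minimal sketch of deterministic A/B bucketing (a hypothetical helper, not part
// of any Google Maps Platform API). A stable user ID always maps to the same variant.
type Variant = "control" | "treatment";

function hashToUnitInterval(userId: string): number {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return hash / 0xffffffff; // normalize to [0, 1]
}

function assignVariant(userId: string, treatmentShare = 0.5): Variant {
  return hashToUnitInterval(userId) < treatmentShare ? "treatment" : "control";
}

// Example: route the user to the checkout with or without the new address experience.
const variant = assignVariant("user-12345");
console.log(`Showing the ${variant} checkout`);
```

Deterministic bucketing keeps a returning user in the same variant for the duration of the test, which keeps the measurements clean.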
System Architecture Overview
Let's look at A/B testing Address Validation in an ecommerce use case. The architecture diagram below shows how a customer would interact with your commerce experience, allowing you to determine the more effective validation strategy.
[System Context] A/B Testing Address Validation
The systems involved when A/B testing the value of the Address Validation API.
The A/B testing process
When you are thinking about the overall A/B testing process, there are four stages to consider.
- Prep - Identify testing requirements, scope and timescale.
- Build - Implement the Place Autocomplete and Address Validation API in an environment to run the test against.
- Run - Collect metrics whilst the test is running, until significant results are gained or time has expired.
- Analyze - Compare the results with the hypothesis and identify next steps.
We'll discuss each of these in turn.
Prep
Deciding on A/B testing requirements
Initial discovery
Ask yourself: Why are you adding or changing an address validation provider? For example, using Place Autocomplete:
- Saves time: You don't have to type out the entire name of a place when you can just start typing and see suggestions appear.
- Reduces errors: If you misspell the name of a place, Place Autocomplete will still suggest the correct place.
There are many benefits to address validation, including:
- Improved delivery rates: Address validation can help to improve delivery rates by ensuring that mail and packages are sent to the correct address. This can save businesses time and money, and improve customer satisfaction.
- Improved data quality: Address validation can help improve data quality by identifying and correcting errors in addresses. This can improve the accuracy of marketing campaigns and other data-driven initiatives.
Deciding on hypothesis
Decide on your hypothesis to test. Here are two examples:
1. Conversion rate
When you add a type-ahead solution, it is usual to see a slight increase in conversion rates, and this is a good metric to track. If you are switching your type-ahead solution from another provider, then a flat conversion rate should be expected. If the conversion rate goes down, the first thing to check would be the implementation.
Conversion rate is important, but it may not tell the whole story. An address validation solution is designed to catch poor-quality addresses at the point of entry, and may add some natural friction to address capture in some scenarios. This could lead to a drop in overall conversion rates, but that shouldn't necessarily be seen as a bad thing. The orders left uncompleted after adding address validation may have been associated with poor-quality address data that would have cost the business money through delivery chargebacks.
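To make that trade-off concrete, here is a back-of-the-envelope sketch comparing the margin lost to a small conversion dip against the chargeback cost avoided by catching bad addresses. Every figure and the helper itself are illustrative assumptions, not data from any real test:

```typescript
// Hypothetical back-of-the-envelope check: is a small conversion dip offset by the
// delivery chargebacks avoided? Every input here is an illustrative assumption.
interface TradeOffInputs {
  monthlyCheckouts: number;      // checkouts started per month
  conversionDrop: number;        // e.g. 0.002 = 0.2 percentage-point drop
  averageOrderMargin: number;    // margin lost per missed order
  badAddressesPrevented: number; // poor-quality addresses caught per month
  costPerFailedDelivery: number; // chargeback / reshipment cost per bad address
}

function netMonthlyImpact(i: TradeOffInputs): number {
  const marginLost = i.monthlyCheckouts * i.conversionDrop * i.averageOrderMargin;
  const chargebacksAvoided = i.badAddressesPrevented * i.costPerFailedDelivery;
  return chargebacksAvoided - marginLost; // positive → validation pays for the friction
}

// 400 * 15 - 50,000 * 0.002 * 12 = 6,000 - 1,200 = 4,800 per month in this example.
console.log(netMonthlyImpact({
  monthlyCheckouts: 50000,
  conversionDrop: 0.002,
  averageOrderMargin: 12,
  badAddressesPrevented: 400,
  costPerFailedDelivery: 15,
}));
```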
2. Reduction in poor quality addresses
This is where a good address validation solution can really shine. By implementing Address Validation, you should expect to see a reduction in poor quality address data.
If you are comparing a new solution to an existing one, it may be tempting to just compare the 'good address' match rates, and select the service that provides a higher match rate. This can be misleading because one service may be providing more false positives than the other.
Instead, the more impactful metric is to compare the successful outcome of using the address data. Taking ecommerce as an example, the desired outcome of capturing an address would be the eventual successful delivery of a package.
Build
Now for the exciting part! It's time to build a new solution for your customers. We already have a handy guide to implementing Place Autocomplete and the Address Validation API in an ecommerce checkout, and we recommend checking it out while completing this step.
Even if you're not building specifically for ecommerce, a lot of the information is still relevant, especially the guidance on determining address quality from the output of the Address Validation API.
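As a rough sketch of what the treatment variant of a checkout might wire together, the snippet below attaches Place Autocomplete to an address input and then sends the captured address to the Address Validation API. It assumes the Maps JavaScript API is loaded with the places library (plus the @types/google.maps declarations for TypeScript), uses a placeholder YOUR_API_KEY, and the acceptance rule at the end is only an example you should adapt to your own quality criteria:

```typescript
// Sketch of the treatment-variant checkout. Assumes the Maps JavaScript API is
// loaded with the "places" library and that YOUR_API_KEY has both the Places API
// and the Address Validation API enabled.

// Attach Place Autocomplete to the address input on the checkout form.
const input = document.getElementById("address-line-1") as HTMLInputElement;
const autocomplete = new google.maps.places.Autocomplete(input, {
  types: ["address"],
  fields: ["address_components", "formatted_address"],
});

autocomplete.addListener("place_changed", () => {
  const place = autocomplete.getPlace();
  // Populate the rest of the form from place.address_components here.
});

// On submit, send the captured address to the Address Validation API.
async function isAddressAcceptable(addressLines: string[], regionCode: string): Promise<boolean> {
  const response = await fetch(
    "https://addressvalidation.googleapis.com/v1:validateAddress?key=YOUR_API_KEY",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ address: { regionCode, addressLines } }),
    },
  );
  const { result } = await response.json();
  // The verdict summarizes address quality; this acceptance rule is only an example.
  return Boolean(result.verdict.addressComplete) && !result.verdict.hasUnconfirmedComponents;
}
```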
Architecture Diagram
An example of the containers that could be used to build an A/B test in an ecommerce environment is below:
[Execution Environment] A/B Testing Address Validation
The important applications, services, and data stores in the key systems powering the architecture.
Validating the implementation
A poorly implemented solution will produce unreliable test results. Before running the A/B test, it's important to first validate the solution with a small user group to ensure it works as expected. This could be internal QA testers and/or a selected group of external testers who you trust to give constructive feedback.
Run
Ramping up slowly
Even with the solution validated, it's still a good idea to ramp up the test slowly, starting with a small group of users. By doing this, bugs or other issues can be caught early and quickly addressed without affecting a large percentage of your users.
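One way to do this, building on the hypothetical `assignVariant` helper sketched earlier, is to drive the treatment share from a remotely controlled value that you raise in stages. The endpoint and config shape below are assumptions; use whatever feature-flag or remote-config mechanism you already have:

```typescript
// Hypothetical staged rollout: the treatment share is read from a remote config
// endpoint and only increased once the current stage has run cleanly.
const rolloutStages = [0.01, 0.05, 0.2, 0.5]; // 1% → 5% → 20% → 50% of traffic

async function currentTreatmentShare(): Promise<number> {
  // Replace with your own feature-flag or remote-config service.
  const response = await fetch("/config/address-validation-rollout");
  const { stage } = await response.json(); // e.g. { "stage": 1 }
  return rolloutStages[Math.min(stage, rolloutStages.length - 1)];
}
```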
Full test
Once the solution has been tested by a small group of users and any problems have been addressed, we can ramp up to a full A/B test. This doesn't necessarily have to be a true 50/50 split of traffic, but the two groups should be comparable in size and drawn from randomly selected live usage.
Capturing Metrics
During the test, ensure that you capture the data needed to support your hypothesis. An A/B testing platform can ease both this data collection and the later analysis. Google Maps Platform also collects API usage metrics that may be of use; check out this page to learn more about using our reporting tools. A sketch of tagging these metrics with the test variant follows the list of suggestions below.
Some suggested metrics are as follows:
Place Autocomplete
- Conversion rate: Has the conversion/completion rate of your form improved from having no autocomplete solution previously?
- Tool interaction: Are more users successfully interacting with Place Autocomplete compared with the previous solution?
Address Validation
- Delivery success: Has there been a reduction in failed deliveries due to address quality?
- Address changes: Has there been a reduction in the number of address change charges you have received from couriers?
- Residential vs commercial: Has there been an improvement in capturing residential vs commercial data? (select markets only)
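Whichever metrics you choose, every captured event needs to record which variant the user saw so that outcomes can be attributed correctly. A minimal sketch of such an event is shown below; the event shape, endpoint, and `logEvent` helper are assumptions, and in practice you would use whatever your A/B testing or analytics platform provides:

```typescript
// Hypothetical analytics event tying checkout outcomes back to the A/B variant.
// The event shape, endpoint, and transport are assumptions; use whatever your
// A/B testing or analytics platform provides.
interface CheckoutAddressEvent {
  variant: "control" | "treatment";   // which experience the user saw
  orderId: string;
  autocompleteUsed: boolean;          // did the user pick a Place Autocomplete suggestion?
  validationAccepted: boolean;        // did Address Validation accept the address as entered?
  timestamp: string;
}

function logEvent(event: CheckoutAddressEvent): void {
  // sendBeacon survives page navigation, which suits checkout flows.
  navigator.sendBeacon("/analytics/checkout-address", JSON.stringify(event));
}

logEvent({
  variant: "treatment",
  orderId: "order-987",
  autocompleteUsed: true,
  validationAccepted: true,
  timestamp: new Date().toISOString(),
});
```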
Analyze
Now that the test is over, it's time to analyze the results against the original test criteria and hypothesis. If you used an A/B testing platform to complete the process, some information may already be available to you.
Going back to the Reduction in poor quality addresses section above, you can also use other metrics that may not have been captured by the A/B testing platform. One example is the rate of failed deliveries in each testing scenario, with example data such as this:
| | Solution A | Solution B |
| --- | --- | --- |
| Failed deliveries | 1.75% | 1.23% |
Looking at the basic example above, it is clear that for this use case, Solution B would be the better choice.
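If your A/B testing platform can't assess significance for a metric joined from outside data like this, a simple two-proportion z-test is one way to check whether a difference of this size is likely to be real rather than noise. The sketch below is illustrative, and the order counts in the example call are assumed figures, not real results:

```typescript
// Minimal two-proportion z-test for comparing failed-delivery rates between the
// two solutions. The order counts in the example call are assumed, not real data.
function twoProportionZ(failuresA: number, totalA: number,
                        failuresB: number, totalB: number): number {
  const pA = failuresA / totalA;
  const pB = failuresB / totalB;
  const pooled = (failuresA + failuresB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pA - pB) / se; // |z| > 1.96 is roughly significant at the 95% level
}

// 1.75% vs 1.23% failed deliveries, assuming 20,000 delivered orders per group.
const z = twoProportionZ(350, 20000, 246, 20000);
console.log(`z = ${z.toFixed(2)}`); // ≈ 4.3, so the difference is unlikely to be chance
```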
Conclusion
We hope this guide has given you enough information to get you started on your A/B testing journey! While it has used examples from the ecommerce space, the same basic principles can be applied across the board. Pinpoint the successful outcome of having good quality address data in your business, and track that as your main hypothesis.
We've included the links mentioned in the guide again below, as suggested further reading.
Happy testing!
Next Steps
Download the Improve checkout, delivery, and operations with reliable addresses Whitepaper and view the Improving checkout, delivery, and operations with Address Validation Webinar.
Suggested further reading:
- Address Validation for Ecommerce Checkout
- Place Autocomplete Documentation
- Address Validation API Documentation
- Google Maps Platform Reporting
Contributors
Principal authors:
Henrik Valve | Google Maps Platform Solutions Engineer