Illustration by Ibrahim Rayintakath
Last week, we shared new data that helps businesses understand the conversion and revenue benefits of offering different payment methods. We tested the impact of more than 50 different payment methods and found that when at least one additional relevant payment method beyond cards was dynamically surfaced, businesses on average saw a 12% increase in revenue and a 7.4% increase in conversion. In particular, businesses saw the biggest uplift when offering popular local payment methods, digital wallets, and bank debits.
Now that we've shared the results, we want to share how we ran the experiment. As we explain in the rest of this post, we had to test millions of combinations of payment methods while ensuring that we maintained a consistent shopping experience for customers. We'll walk you through some of the key decisions we made, and lessons we learned, that can help you better understand how to measure the impact of payment methods for your business.
We wanted our experiment to measure the impact on conversion and revenue of adding one or more payment methods. We accomplished this with a randomizing algorithm that randomly chose a set of payment methods to withhold from customers during checkout.
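To make the setup concrete, here is a minimal sketch of what this kind of randomization could look like. The payment method names and function are our own illustration, not Stripe's implementation.

```python
import random

# Noncard payment methods enrolled in the experiment (illustrative list).
EXPERIMENT_METHODS = ["klarna", "ideal", "affirm", "sepa_debit", "alipay", "wechat_pay"]

def choose_withheld_methods(max_withheld: int = 3) -> set[str]:
    """Randomly pick a subset of payment methods to hide at checkout."""
    k = random.randint(1, max_withheld)
    return set(random.sample(EXPERIMENT_METHODS, k))

# The customer sees everything except the withheld set.
withheld = choose_withheld_methods()
shown = [m for m in EXPERIMENT_METHODS if m not in withheld]
```

On its own, this re-rolls the randomization on every page load, which is exactly the consistency problem described next.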
The randomness was important for making the experiment results reliable and unbiased, but at the same time, we didn't want it to create inconsistent experiences for customers. For example, if a business's customer needed to refresh the page or exit their session on any of our payment surfaces and then return later, we wanted to be sure they saw the same set of payment methods they had initially. (This was important from a customer experience perspective and also from an experimental one: it would have introduced noise into the data if the same customer, making essentially the same purchase, had seen two different sets of payment methods.)
Keeping payment methods consistent across related sessions was easy for transactions on Stripe Checkout because each customer session has a unique session ID that persists even when a customer refreshes or exits the page. On the other hand, with Payment Links, a new customer session (and a new session ID) is created each time the page is loaded or refreshed.
To keep payment methods consistent across related sessions on Payment Links, we built a composite string of identifiers, including the customer's UserAgent and IP address.*
We combine these identifiers into a single string and use a hash function to assign a number to the string; because the hash is deterministic, the same inputs always produce the same number, while the assignment is still effectively random across customers. Then, we gather all the payment methods being tested in the experiment, compute all possible combinations of them, and assign a number to each combination. Finally, we match the hashed number with a combination number and hold back (do not display) that combination. This approach allows us to keep the randomness our experiment needs while maintaining consistency for customers who refresh the page.
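Here is a minimal sketch of that mapping, assuming a handful of hypothetical payment methods and identifier inputs; the real identifiers, hash, and combination logic are Stripe's internals.

```python
import hashlib
from itertools import combinations

# Noncard payment methods enrolled in the experiment (illustrative list).
EXPERIMENT_METHODS = ["klarna", "ideal", "affirm", "sepa_debit", "alipay", "wechat_pay"]

def all_withhold_combinations(max_size: int = 2) -> list[tuple[str, ...]]:
    """Enumerate every combination of methods that could be withheld."""
    combos = []
    for size in range(1, max_size + 1):
        combos.extend(combinations(sorted(EXPERIMENT_METHODS), size))
    return combos

def withheld_for_session(user_agent: str, ip_address: str, payment_link_id: str) -> tuple[str, ...]:
    """Deterministically map a composite identifier string to one withheld combination.

    The same inputs always hash to the same number, so a customer who refreshes
    the page is assigned the same combination and sees the same payment methods.
    """
    composite = "|".join([user_agent, ip_address, payment_link_id])
    digest = hashlib.sha256(composite.encode("utf-8")).hexdigest()
    combos = all_withhold_combinations()
    index = int(digest, 16) % len(combos)  # hashed number -> combination number
    return combos[index]
```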
After we'd figured out how to balance payment method randomization with customer experience, we had to ensure that our experiment ran over a large enough sample size to uncover meaningful patterns. In particular, we wanted to be able to see how the impact of adding noncard payment methods varied based on where a customer was located and the device they were using to make a purchase. This would provide more useful insights for our users, but it also meant ensuring we had meaningful subsamples for each device type and location we wanted to control for. To solve this, we followed the same pretesting approach that is used in randomized controlled trials.
Pretesting is considered the gold-standard approach in clinical trials, but businesses often skip it in their own experiments because it's time-intensive to implement. We decided to devote the internal resources to it because we wanted to be confident in the insights we shared with users.
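The post doesn't enumerate the pretesting steps, but in randomized controlled trials this stage typically centers on a power analysis: deciding, before the experiment runs, how many sessions each segment needs for an uplift to be detectable. The sketch below is purely illustrative, with made-up baseline and target conversion rates, and uses a standard statsmodels calculation rather than Stripe's actual procedure.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Hypothetical numbers: baseline conversion of 5.0% and a hoped-for lift to 5.4%.
baseline_rate = 0.05
treated_rate = 0.054

# Cohen's h effect size for the difference between two proportions.
effect = proportion_effectsize(treated_rate, baseline_rate)

# Minimum sessions per arm for 80% power at a 5% significance level.
n_per_arm = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,
    power=0.8,
    ratio=1.0,
)
print(f"Need roughly {int(round(n_per_arm)):,} sessions per arm in each segment")
```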
We wanted the experiment to assess not just the impact of adding payment methods, but also how payment methods perform differently based on factors like customer location and the industry in which the business operates. With more than 50 payment methods in our experiment and buyers in 200 countries, this meant testing more than 2 million payment method and country combinations.
This added complexity as we began to analyze the results. We had to navigate millions of data points to identify the results that were most meaningful to businesses. Doing this manually would have resulted in days of analysis by a team of data scientists. Instead, to analyze the data at scale, we used AI to build a causal forest model. This model used a decision tree-like format to gradually split the data into smaller and smaller segments to find the conversion uplift for each.
The causal forest model automatically decides how to split the data based on all the different criteria from our experiment. For example, the tree could split the data by business location or industry. For each split, the model asks whether it has enough information to calculate the conversion uplift: if yes, it calculates the uplift, and if not, it splits again on another factor to get more information.
The data in this diagram is illustrative and does not represent any findings from our experiment.
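We don't know which tooling Stripe used to build its causal forest, but the open-source econml library offers a comparable estimator. The sketch below, with entirely made-up feature names and simulated data, shows the general shape of the workflow: fit the forest on the treatment (payment method shown vs. withheld), the outcome (conversion), and the segmentation features, then read off uplift estimates per segment.

```python
import numpy as np
import pandas as pd
from econml.dml import CausalForestDML

# Illustrative experiment data: one row per checkout session.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "converted": rng.binomial(1, 0.05, 10_000),      # outcome Y
    "method_shown": rng.binomial(1, 0.5, 10_000),    # treatment T (shown vs. withheld)
    "is_mobile": rng.binomial(1, 0.6, 10_000),        # segmentation features X
    "buyer_country": rng.integers(0, 5, 10_000),      # encoded country
    "industry": rng.integers(0, 8, 10_000),           # encoded industry
})

X = df[["is_mobile", "buyer_country", "industry"]].values
Y = df["converted"].values
T = df["method_shown"].values

# The causal forest repeatedly splits the data on the segmentation features
# and estimates the conversion uplift (treatment effect) within each segment.
model = CausalForestDML(discrete_treatment=True, n_estimators=500, random_state=42)
model.fit(Y, T, X=X)

# Heterogeneous uplift estimates: one per session, varying by segment.
uplift = model.effect(X)
print("Average uplift for mobile sessions:", uplift[df["is_mobile"].values == 1].mean())
```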
We built hundreds of these causal trees, which allowed us to dive deeper into segments and share more tailored results with users.
This experiment is just one example of how we're helping you optimize your checkout. You can now turn on more than 40 payment methods through a single integration with Stripe's Optimized Checkout Suite. We also launched our no-code payment method A/B testing tool to allow you to run your own experiments (and validate any learnings we share). More importantly, the AI models built into the Optimized Checkout Suite handle the logic for displaying eligible payment methods for each transaction, removing the need to know and encode specific eligibility requirements.
To learn more about payment methods or get access to Stripe's Optimized Checkout Suite, read our docs or get in touch with an expert from our team. And if you find this work exciting, come join us.
*We collect UserAgent and IP address based on the user's instruction and where permitted by applicable law and our Privacy Policy.