Conservation of Intent: The hidden reason why A/B tests aren’t as effective as they look

When a +10% isn’t really a +10%
OK, this is an infuriating startup experience: You ship an experiment that’s +10% on a step in your conversion funnel. Then your revenue/installs/whatever goes up by +10%, right?

Wrong :(

Turns out it usually goes up a little bit, or maybe not at all.

Why is that? Let’s call this the “Conservation of Intent” (Inspired by the Law of the Conservation of Momentum 😊)

The difference between high- and low-intent users
Of all the users coming in, only some are high-intent. It’s hard to increase that intent just by making a couple of steps easier – that mostly pushes more low-intent users through. Tactical things like moving buttons above the fold, optimizing headlines, and removing form fields are great, but those increases won’t drop directly to your bottom line.

In other words, the total amount of intent in your system is fixed. Thus the law of the conservation of intent!

This is why you can’t add up your A/B test results
If you’re at a company that A/B tests everything and then announces the great results – that’s wonderful, of course, but just run the thought experiment of summing up all of those A/B test wins. Then look at your top-line results. Rarely do they match.

The most obvious way to see this is to test something high up in the funnel – say, the landing page a new user hits, or an email that a re-engaged user opens – and watch how a big lift at the top of the funnel flows down unevenly. Each step of friction burns off the low-intent users as they flow through step by step.
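
To make this concrete, here’s a back-of-the-envelope sketch in Python. All the numbers are made up for illustration – the point is just the shape of the result: the experiment “wins” +10% on an early step, but because the incremental users are almost entirely low-intent, the purchase lift is a fraction of a percent.

```python
# Toy model (all numbers made up): a "+10%" win on an early funnel step
# mostly adds low-intent users, so very little of it reaches the bottom line.

HIGH = {"n": 300, "passes_step1": 1.00, "buys_later": 0.60}   # high-intent segment
LOW  = {"n": 700, "passes_step1": 0.143, "buys_later": 0.03}  # low-intent segment

def purchases(extra_low_intent_through_step1=0.0):
    """Purchases, given how many extra low-intent users a test pushes past step 1."""
    high_through = HIGH["n"] * HIGH["passes_step1"]
    low_through = LOW["n"] * LOW["passes_step1"] + extra_low_intent_through_step1
    return high_through * HIGH["buys_later"] + low_through * LOW["buys_later"]

# Baseline: 300 high-intent + ~100 low-intent users pass step 1 (~40% step-1 conversion).
baseline_step1 = HIGH["n"] * HIGH["passes_step1"] + LOW["n"] * LOW["passes_step1"]

# The "+10% winner": step-1 conversion goes ~40% -> ~44%, i.e. ~40 extra users through.
# High-intent users were already getting through, so the extras are all low-intent.
extra = 0.10 * baseline_step1

base, variant = purchases(), purchases(extra)
print("step-1 conversion lift: +10%")
print(f"purchase lift:          {variant / base - 1:+.1%}")   # about +0.7%, not +10%
```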

Be skeptical of internal results, but more importantly, external case studies too
If you’re at a big company and another team publishes a test result, make sure you agree on the actual final metric you’re trying to impact – whether that’s revenue, highly engaged users, or something else – and always review the result against that metric.

Similarly, this is a reason to be skeptical of vendors and 3rd parties with case studies claiming they’ll increase your revenue by X just because they increased an ad conversion rate (or whatever) by X. In these kinds of misleading case studies – often presented at conferences – vendors get to cherry-pick not only the best examples that reinforce their case, but also the metric that was most impacted! Be skeptical and don’t be fooled.

Unlock increases to the bottom line
First, understand what’s really blocking your high-intent users. Those are the ones who’d like to flow all the way through the funnel, but can’t, for whatever reason. For Uber, that meant things like payment methods, app quality (especially on Android!), the forgot-password flow, etc. If you can’t pay or can’t get back into your account, then even if you use the app every day, you might switch to a different app that’s less of a pain in the ass.

Also, you can focus your experiments. You obviously get real net incremental increases in conversion the further down the funnel you go. By that point, the low-intent folks have burned off. You’re closer to the bottom line. Look at the steps right around your transaction flow – for ecommerce sites that might be the process of reviewing your cart and adding your shipping info, or the request-invoice flow for SaaS products, etc. Think about high-intent scenarios, for example when you hit a paywall or run out of credits/disk space/resources/etc. All of these can be optimized, and the gains will hit the bottom line quickly.
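
As a contrast, here’s the same made-up model with the +10% win applied to the final checkout step instead. By that point the remaining users are mostly high-intent, so nearly the whole lift shows up in purchases:

```python
# Same toy numbers as the earlier sketch: by the time users reach checkout,
# the low-intent crowd has mostly burned off, so a win there lands almost 1:1.

at_checkout = {"high": 300, "low": 100}     # users who made it to the final step
buy_rate    = {"high": 0.60, "low": 0.03}   # illustrative checkout conversion rates

base   = sum(at_checkout[s] * buy_rate[s] for s in at_checkout)          # ~183 purchases
lifted = sum(at_checkout[s] * buy_rate[s] * 1.10 for s in at_checkout)   # +10% checkout win
print(f"purchase lift: {lifted / base - 1:+.1%}")   # +10.0%, vs ~+0.7% for the step-1 win
```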

Make sure your roadmap reflects reality
When it comes to your product roadmapping, yes, you can definitely brainstorm and ship a bunch of +10% wins, but you need to add a discount factor to your spreadsheets to reflect reality. You can’t just add up all your results.
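
Here’s a sketch of what that discount might look like in roadmap math – the 30% “intent discount” is purely illustrative, not a recommended number:

```python
# Hypothetical roadmap math: stacking per-test wins naively vs. applying a discount.
# The 0.3 "intent discount" below is an illustrative placeholder, not a real benchmark.

projected_lifts = [0.10, 0.08, 0.05, 0.12]   # local conversion wins from the roadmap
INTENT_DISCOUNT = 0.3   # assume only ~30% of each local win survives to the bottom line

naive = 1.0
discounted = 1.0
for lift in projected_lifts:
    naive *= 1 + lift
    discounted *= 1 + lift * INTENT_DISCOUNT

print(f"naive plan:      +{naive - 1:.0%}")       # ~+40%
print(f"discounted plan: +{discounted - 1:.0%}")  # ~+11%
```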

When you focus on low-intent folks, you’ll have to get creative to build their intent quickly. Things like being able to try out the product, or getting their friends into the product – these are the “activation” steps that generate intent. Here’s a great place to start – a highly relevant essay on getting users more psych’d, guest-written by Darius Contractor from the Dropbox growth team.

Conservation of Intent
Many of you have directly experienced the “Conservation of Intent” but now you have a name for it! It’s tricky.

This is a reflection of how working on product growth is really a combo of psychology and data-driven product work. You can’t just look at this stuff in a spreadsheet and assume that a lift in one place automatically cascades through the rest of the model.

[Originally tweetstormed at @andrewchen – follow me for future updates!]

