Chapter 9 of 10

Module 9: Measuring ASO Impact on Revenue, Not Just Installs

Tie your ASO work to business outcomes by tracking the full funnel from impression to revenue using App Analytics and third-party tools.

15 min read

Step 1 – Why ASO Has to Be Measured in Revenue, Not Just Installs

Most beginners judge App Store Optimization (ASO) success by installs alone. But installs are only the middle of the funnel.

To make smart business decisions, you need to connect:

> Storefront behavior → Installs → In‑app behavior → Revenue

Key idea: Two keywords or creatives can generate the same number of installs but very different revenue.

  • Keyword A: 1,000 installs → $0.20 ARPU → $200 revenue
  • Keyword B: 600 installs → $1.00 ARPU → $600 revenue

If you optimize only for installs, you’d favor Keyword A. If you optimize for revenue, you’d prioritize Keyword B.
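You can verify this comparison with a few lines of Python, using the numbers from the example above:

```python
# Revenue comparison for the two hypothetical keywords above.
keywords = {
    "Keyword A": {"installs": 1000, "arpu": 0.20},
    "Keyword B": {"installs": 600, "arpu": 1.00},
}

for name, k in keywords.items():
    revenue = k["installs"] * k["arpu"]
    print(f"{name}: {k['installs']} installs x ${k['arpu']:.2f} ARPU = ${revenue:.2f}")
```

Despite 40% fewer installs, Keyword B produces three times the revenue.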

In this module you will learn how to:

  1. Read App Store metrics (impressions, product page views, conversion rate).
  2. Connect them to in‑app metrics (retention, ARPU, LTV).
  3. Attribute revenue to product pages, Custom Product Pages (CPPs), keywords, and campaigns.
  4. Build a simple ASO revenue dashboard and basic benchmarks.

We’ll focus mostly on Apple’s ecosystem (App Store Connect App Analytics + SKAdNetwork-era attribution) but the logic also applies to Google Play with its own tools.

Step 2 – Map the Full Funnel: From Impression to Revenue

To measure ASO impact on revenue, you need a clear funnel model.

1. Storefront funnel (App Store layer)

Using App Store Connect → App Analytics:

  • Impressions – How many times your app appears in search results, Today tab, Browse, or as an ad.
  • Product Page Views (a.k.a. taps) – How many users opened your product page.
  • Conversion Rate (CR) – Percentage of product page views that result in a download.

Basic storefront funnel:

> Impressions → Product Page Views → Installs

2. In‑app funnel (Product & monetization layer)

Using in‑app analytics (e.g., Firebase, Amplitude, Mixpanel) and/or MMPs (e.g., AppsFlyer, Adjust, Branch, Singular):

  • Activation – Reaching a first key action (e.g., sign‑up, first level completed).
  • Retention – Users coming back on D1, D7, D30, etc.
  • Monetization – In‑app purchases, subscriptions, ads.

Key metrics:

  • ARPU (Average Revenue Per User) = total revenue / number of users.
  • LTV (Lifetime Value) = total revenue you expect from a user over their lifetime.

3. Combined ASO → Revenue funnel

The full picture:

> Impressions → Product Page Views → Installs → Activation → Retention → Revenue (ARPU, LTV)

Your goal in this module: learn how to track and compare this funnel for different keywords, creatives, and CPPs so you can prioritize what truly drives revenue.
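The full funnel can be sketched as an ordered list of stage counts, from which you compute the drop-off between each pair of stages. The numbers below are illustrative placeholders, not benchmarks:

```python
# Combined ASO -> revenue funnel with illustrative (made-up) stage counts.
funnel = [
    ("Impressions", 50_000),
    ("Product Page Views", 5_000),
    ("Installs", 1_000),
    ("Activated (first key action)", 600),
    ("Retained D7", 250),
]

# Compute the step-to-step conversion rate between adjacent stages.
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    rate = next_count / count
    print(f"{stage} -> {next_stage}: {rate:.1%}")
```

Seeing each stage's conversion side by side makes it obvious where different keywords or creatives diverge.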

Step 3 – Example: Calculating Key Funnel Metrics

Let’s walk through a simple example to make the metrics concrete.

Imagine you run an iOS meditation app. For the last 7 days you see in App Store Connect (for Search results only):

  • Impressions: 50,000
  • Product Page Views: 5,000
  • First‑time Downloads: 1,000

In your in‑app analytics for the same cohort of users:

  • Paying users: 100
  • Total revenue from this cohort: $800 (including trials that converted within the period)

1. Tap‑through rate (TTR)

Sometimes also called the view‑through rate.

```text
TTR = Product Page Views / Impressions
    = 5,000 / 50,000
    = 0.10 → 10%
```

2. Conversion rate (store page → install)

```text
CR = Installs / Product Page Views
   = 1,000 / 5,000
   = 0.20 → 20%
```

3. ARPU (Average Revenue Per User)

```text
ARPU = Revenue / Installs
     = $800 / 1,000
     = $0.80 per user
```

4. LTV (simplified, period‑based)

If you only look at this 7‑day window, your 7‑day LTV for this cohort is also $0.80. In practice, LTV is modeled over a longer horizon (e.g., 90 days or 1 year), but this simple snapshot is already useful for comparing ASO variants.

Insight:

  • A/B testing a new icon that increases CR from 20% to 24% at the same ARPU would directly increase revenue from this traffic by ~20%.
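A quick Python check, using the Step 3 numbers (5,000 product page views, $0.80 ARPU), confirms the size of that uplift:

```python
# Revenue impact of lifting conversion rate from 20% to 24% at constant ARPU.
page_views = 5_000
arpu = 0.80  # from the Step 3 example

revenue_before = page_views * 0.20 * arpu   # 1,000 installs -> $800
revenue_after = page_views * 0.24 * arpu    # 1,200 installs -> $960

uplift = revenue_after / revenue_before - 1
print(f"Revenue uplift: {uplift:.0%}")
```

A relative CR increase of 20% (from 20% to 24%) translates one-to-one into a 20% revenue increase when ARPU holds constant.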

Step 4 – Connect App Store Data to In‑App Revenue (Tools & Setup)

To tie ASO to revenue, you must connect App Store analytics with in‑app analytics/attribution.

1. Apple tools

As of early 2026, key Apple tools are:

  • App Store Connect → App Analytics
    • Metrics: Impressions, Product Page Views, Conversion Rate, Retention, Sales, Proceeds.
    • Dimensions: Source (Search, Browse, App Referrer, Web Referrer), Country/Region, Device, Custom Product Page, etc.
  • Custom Product Pages (CPPs)
    • Each CPP has a unique URL you can use in Apple Search Ads and other campaigns.
    • App Analytics reports performance per CPP, including conversion and revenue.
  • SKAdNetwork (SKAN 4.x)
    • Apple’s privacy‑preserving attribution framework.
    • Aggregated, delayed postbacks; you typically read them through your MMP or ad network dashboards.

2. Third‑party tools (very common in industry)

  • MMPs (Mobile Measurement Partners) – e.g., AppsFlyer, Adjust, Branch, Singular
    • Attribute installs and revenue to campaigns, CPPs, and sometimes keyword themes.
    • Provide cohort reports (D1/D7/D30 revenue, retention, ROAS).
  • Product analytics – e.g., Firebase, Amplitude, Mixpanel
    • Track in‑app events (signup, level complete, purchase).
    • Build funnels and retention curves.

3. Practical setup checklist

For each ASO experiment (new icon, screenshots, keyword focus, CPP):

  1. Name variants clearly (e.g., `CPPFitnessNewYear_2026`).
  2. Ensure each CPP is used in distinct campaigns in Apple Search Ads or other channels.
  3. In your MMP or analytics tool, tag campaigns with the CPP name or keyword theme.
  4. Align time windows when comparing App Store data and in‑app revenue (e.g., same 7‑day or 30‑day cohort).

This setup lets you say things like: “CPP A for ‘weight loss’ traffic drives 30% higher 30‑day LTV than CPP B for generic ‘fitness’ traffic.”

Step 5 – Thought Exercise: Which Variant Actually Wins?

You run an experiment with two Custom Product Pages for your language learning app. Both target Apple Search Ads users searching for Spanish‑related keywords.

You collect 14‑day data:

CPP Variant A – Generic

  • Impressions: 40,000
  • Product Page Views: 4,000
  • Installs: 1,200
  • Revenue from this cohort (14 days): $720

CPP Variant B – Focused on "Spanish for Travel"

  • Impressions: 25,000
  • Product Page Views: 3,000
  • Installs: 900
  • Revenue from this cohort (14 days): $990

Your task:

  1. Calculate for each variant:
  • Tap‑through rate (TTR)
  • Conversion rate (CR)
  • ARPU (14‑day)
  2. Decide which variant is better for the business, and explain why.

Write down your answers before moving on.

---

Suggested solution (check yourself)

Variant A

  • TTR = 4,000 / 40,000 = 10%
  • CR = 1,200 / 4,000 = 30%
  • ARPU = $720 / 1,200 = $0.60

Variant B

  • TTR = 3,000 / 25,000 = 12%
  • CR = 900 / 3,000 = 30%
  • ARPU = $990 / 900 = $1.10

Interpretation:

  • Variant B has fewer installs (900 vs 1,200) but much higher ARPU ($1.10 vs $0.60).
  • For every 1,000 installs, B would generate almost 2x the revenue.

Conclusion:

  • If ad costs and traffic quality are similar, Variant B is the better business choice, even though it delivers fewer installs.
  • This is exactly why ASO should be optimized for revenue, not just installs.
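If you want to double-check the arithmetic, the suggested solution can be reproduced in a few lines of Python using the experiment's numbers:

```python
# Recompute the metrics from the CPP experiment above.
variants = {
    "A (Generic)": {"impressions": 40_000, "views": 4_000, "installs": 1_200, "revenue": 720.0},
    "B (Spanish for Travel)": {"impressions": 25_000, "views": 3_000, "installs": 900, "revenue": 990.0},
}

for name, v in variants.items():
    ttr = v["views"] / v["impressions"]     # Tap-through rate
    cr = v["installs"] / v["views"]         # Conversion rate
    arpu = v["revenue"] / v["installs"]     # 14-day ARPU
    print(f"{name}: TTR {ttr:.0%}, CR {cr:.0%}, ARPU ${arpu:.2f}")
```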

Step 6 – Segmenting by Keyword Themes and Campaigns

To find high‑value cohorts, you should segment your data by keyword themes and campaigns, not just by country or device.

1. Why keyword themes?

Individual keywords can be noisy and overlapping. Group them into themes that reflect user intent.

Example for a fitness app:

  • Weight loss intent: “lose weight”, “fat loss workout”, “weight loss app”
  • Muscle building intent: “build muscle”, “strength training app”
  • Home workout intent: “home workout”, “no equipment workout”

You can then compare:

  • TTR and CR in App Store Connect (per keyword or ad group)
  • ARPU, LTV, retention in your MMP/analytics (per campaign or ad group mapped to a theme)

2. Practical setup in Apple Search Ads

In Apple Search Ads (ASA):

  • Create ad groups aligned with keyword themes.
  • Link each ad group to a relevant CPP.
  • Use consistent naming conventions (e.g., `USSearchWeightLoss_CPP1`).

Then you can:

  • Pull App Store metrics per ad group (impressions, TTR, CR).
  • Use your MMP to see revenue and LTV per ad group.

3. Interpreting results

You might find that:

  • "Weight loss" theme → lower CR but higher ARPU.
  • "Home workout" theme → higher CR but lower ARPU.

That insight guides you to:

  • Bid more aggressively on high‑LTV themes.
  • Tailor CPP creatives to the specific intent (e.g., show weight‑loss transformations vs home‑workout convenience).

The same logic applies on Google Play using Google Ads keyword campaigns and store listing experiments, even though the interfaces differ.

Step 7 – Simple ASO Revenue Dashboard Logic (Spreadsheet or Code)

You can build a basic ASO revenue dashboard in Excel/Google Sheets or in code. Below is a simple Python‑style pseudocode that mirrors what you’d calculate in a spreadsheet.

```python
# Example: evaluate ASO performance for different CPPs or keyword themes.

aso_data = [
    {
        "name": "CPP_Generic",
        "impressions": 50_000,
        "product_page_views": 6_000,
        "installs": 1_500,
        "revenue": 900.0,  # in USD
    },
    {
        "name": "CPP_WeightLoss",
        "impressions": 30_000,
        "product_page_views": 4_500,
        "installs": 1_300,
        "revenue": 1_300.0,
    },
]

for row in aso_data:
    impressions = row["impressions"]
    views = row["product_page_views"]
    installs = row["installs"]
    revenue = row["revenue"]

    ttr = views / impressions if impressions else 0       # Tap-through rate
    cr = installs / views if views else 0                 # Conversion rate
    arpu = revenue / installs if installs else 0          # Average revenue per user
    rev_per_impression = revenue / impressions if impressions else 0

    print(f"=== {row['name']} ===")
    print(f"TTR: {ttr:.2%}")
    print(f"CR: {cr:.2%}")
    print(f"ARPU: ${arpu:.2f}")
    print(f"Revenue per impression: ${rev_per_impression:.4f}\n")
```

What to replicate in a spreadsheet:

For each row (CPP, keyword theme, or campaign), create columns:

  • `TTR = Product Page Views / Impressions`
  • `CR = Installs / Product Page Views`
  • `ARPU = Revenue / Installs`
  • `Revenue_per_Impression = Revenue / Impressions`

Then sort by `Revenue_per_Impression` or `ARPU` to see which ASO elements drive the most value.

This is a simple but powerful way to:

  • Compare CPPs
  • Compare keyword themes
  • Compare seasonal campaigns (e.g., New Year, Back to School)

Even if you don’t code, understanding this logic helps you structure your Google Sheets dashboard correctly.

Step 8 – Quiz: Connecting Store Metrics to Revenue

Answer this question to check your understanding of which metric matters most for prioritizing ASO changes.

You test two new icons for your app. Icon X increases installs by 25% with no change in ARPU. Icon Y keeps installs flat but increases ARPU by 40%. All other factors (cost, traffic volume, seasonality) are similar. Which icon is better for long‑term ASO performance and why?

  1. Icon X, because more installs always mean more revenue in the long run.
  2. Icon Y, because higher ARPU means each user is more valuable, even if install volume is unchanged.
  3. Both are equally good because one improves installs and the other improves ARPU.

Answer: B) Icon Y, because higher ARPU means each user is more valuable, even if install volume is unchanged.

Icon Y is better for long‑term ASO performance because **ARPU directly reflects revenue per user**. If installs stay constant but ARPU grows by 40%, your total revenue from the same volume of traffic increases by 40%. Icon X only increases installs by 25% with no change in user value, which is a smaller revenue gain. Optimizing ASO for **revenue and LTV**, not just installs, is the goal.

Step 9 – Flashcards: Key Terms for Revenue‑Focused ASO

Use these flashcards to review the most important concepts from this module.

Tap‑Through Rate (TTR)
The percentage of **impressions** that turn into **product page views**. Formula: Product Page Views / Impressions.
Conversion Rate (CR)
The percentage of **product page views** that become **installs**. Formula: Installs / Product Page Views.
ARPU (Average Revenue Per User)
Average revenue earned per user in a given period. Formula: Total Revenue / Number of Users (or Installs).
LTV (Lifetime Value)
The total revenue you expect to earn from a user over their entire lifetime using the app. Often modeled over 90 days or longer.
Custom Product Page (CPP)
A variant of your App Store product page with different screenshots, promo text, etc., each with a unique URL. Used to target specific audiences or campaigns and measured separately in App Analytics.
High‑value cohort
A group of users (e.g., from a specific keyword theme, CPP, or campaign) that generates **above‑average ARPU or LTV**, even if their install volume is smaller.
Revenue per Impression
Total revenue divided by total impressions. Shows how much revenue each impression is worth and helps compare ASO variants on a like‑for‑like basis.

Step 10 – Mini Project: Design Your Own ASO Revenue Analysis

Apply what you’ve learned by outlining a mini ASO revenue analysis for a hypothetical app.

Pick one app type (e.g., language learning, meditation, fitness, budgeting). Then answer these prompts:

  1. Define your main ASO experiment
  • Example: “Test a new CPP focused on ‘exam preparation’ vs generic CPP for a language learning app.”
  2. List the metrics you will track at each stage
  • Storefront: Impressions, Product Page Views, TTR, CR.
  • In‑app: D1/D7 retention, ARPU (7‑day or 30‑day), LTV proxy.
  3. Describe how you will segment users
  • By keyword theme (e.g., “exam prep”, “travel”, “business English”).
  • By CPP or campaign name.
  4. Explain your success criteria
  • Example: “We will consider the new CPP successful if 30‑day ARPU is at least 20% higher than the control, even if installs are slightly lower.”
  5. Sketch a simple dashboard (in words)
  • What rows? (e.g., `CPP_ExamPrep`, `CPP_Generic`)
  • What columns? (Impressions, TTR, CR, Installs, Revenue, ARPU, Revenue/Impression).

Write your answers in a document or notes app. If you have access to App Store Connect or a sandbox project, try to replicate this structure with real or sample data.

This exercise mirrors how growth and ASO teams work in industry today: tying store changes to revenue outcomes and using that data to prioritize the next iteration.

Key Terms

Attribution
The process of determining which source (keyword, campaign, ad, CPP, etc.) should get credit for an install and the revenue that follows.
Impressions
The number of times your app appears on the App Store (e.g., in search results, Today tab, or Browse) during a selected period.
Product Page Views
The number of times users open your app’s product page on the App Store after seeing it somewhere in the store.
Conversion Rate (CR)
The ratio of installs to product page views, showing how effectively your product page turns visitors into users. CR = Installs / Product Page Views.
LTV (Lifetime Value)
An estimate of the total revenue a user will generate for your app over their entire lifetime, often modeled over a fixed horizon like 90 or 365 days.
Revenue per Impression
A metric that divides total revenue by total impressions to show how much revenue each impression is worth, useful for comparing ASO variants.
Tap‑Through Rate (TTR)
The ratio of product page views to impressions, indicating how often users tap to view your page after seeing your app. TTR = Product Page Views / Impressions.
Custom Product Page (CPP)
A customizable version of your App Store product page with unique creatives and a shareable URL, allowing you to target specific audiences and measure performance separately.
ARPU (Average Revenue Per User)
Average revenue generated per user or per install in a given time window. ARPU = Total Revenue / Number of Users.
MMP (Mobile Measurement Partner)
A third‑party attribution platform (e.g., AppsFlyer, Adjust, Branch, Singular) that helps track where installs and in‑app events come from and how much revenue they generate.