Programmatic Measurement
Track and optimize programmatic campaign performance effectively.
Core Metrics
Essential programmatic performance metrics.
Delivery Metrics:
├── Impressions: Ads served
├── Reach: Unique users
├── Frequency: Impressions per user
├── Win Rate: Bids won / bids made
└── Fill Rate: Impressions / requests
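The delivery ratios above are simple divisions over raw ad-server counts; a minimal sketch (the function name and all numbers are illustrative):

```python
# Delivery metrics from raw counts; function name and numbers are hypothetical.
def delivery_metrics(impressions, unique_users, bids_made, bids_won, ad_requests):
    return {
        "reach": unique_users,
        "frequency": impressions / unique_users,   # impressions per user
        "win_rate": bids_won / bids_made,          # bids won / bids made
        "fill_rate": impressions / ad_requests,    # served / requested
    }

m = delivery_metrics(impressions=500_000, unique_users=125_000,
                     bids_made=2_000_000, bids_won=400_000,
                     ad_requests=550_000)
print(m["frequency"], m["win_rate"])   # 4.0 0.2
```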
Engagement Metrics:
├── CTR: Clicks / impressions
├── Viewability: % viewable impressions
├── Completion rate: Video completes / starts
├── Engagement rate: Interactions / impressions
└── Time-in-view: Seconds visible
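The engagement metrics are likewise rates over counts; a sketch with illustrative names and numbers:

```python
# Engagement metrics as rates; all names and counts are hypothetical.
def engagement_metrics(impressions, clicks, viewable, video_starts, video_completes):
    return {
        "ctr": clicks / impressions,                     # clicks / impressions
        "viewability": viewable / impressions,           # % viewable impressions
        "completion_rate": video_completes / video_starts,
    }

e = engagement_metrics(impressions=200_000, clicks=400,
                       viewable=130_000, video_starts=50_000,
                       video_completes=37_500)
print(f"CTR {e['ctr']:.2%}, viewability {e['viewability']:.0%}, "
      f"completion {e['completion_rate']:.0%}")
```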
Efficiency Metrics:
├── CPM: Cost per 1,000 impressions
├── CPC: Cost per click
├── CPV: Cost per view
├── CPCV: Cost per completed view
└── CPA: Cost per acquisition
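Each efficiency metric is spend divided by a unit of delivery; a minimal sketch (helper names and dollar figures are illustrative):

```python
# Cost-efficiency metrics as unit costs; helper names are hypothetical,
# spend is in dollars.
def cpm(spend, impressions):
    return spend / impressions * 1_000   # cost per 1,000 impressions

def cpc(spend, clicks):
    return spend / clicks                # cost per click

def cpcv(spend, completed_views):
    return spend / completed_views       # cost per completed view

def cpa(spend, conversions):
    return spend / conversions           # cost per acquisition

# $5,000 spend over 1M impressions is a $5.00 CPM
print(cpm(5_000, 1_000_000))
```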
Outcome Metrics:
├── Conversions: Goal completions
├── Revenue: Sales generated
├── ROAS: Return on ad spend
├── LTV: Lifetime value
└── Incrementality: True lift
Benchmark Reference
Industry Averages:
├── CTR: 0.1-0.3% (display)
├── CTR: 0.5-1.5% (native)
├── CTR: 0.5-2.0% (video)
├── Viewability: 60-70%
├── Video completion: 70-80%
└── CPM: $1-$10 (open exchange), $10-$30 (premium/PMP)
Attribution Approaches
Measuring programmatic's contribution.
Attribution Models:
├── Last Click
│ └── 100% credit to final click
├── First Click
│ └── 100% credit to first touch
├── Linear
│ └── Equal credit all touches
├── Time Decay
│ └── More recent = more credit
├── Position-Based
│ └── 40/20/40 first/middle/last
└── Data-Driven
└── Algorithm-based
View-Through Attribution
View-Through Conversions:
├── User sees ad (no click)
├── User converts later
├── Conversion attributed to view
└── Window: 1-30 days typically
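The view-through logic above can be sketched as a window check that de-duplicates against clicks (a click always outranks a view); function names and the 7-day default are illustrative:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of view-through attribution with click de-duplication.
def attribute(conversion_time, view_time=None, click_time=None, window_days=7):
    if click_time is not None and click_time <= conversion_time:
        return "click"                    # de-duplicate: clicks win
    if view_time is not None:
        window_end = view_time + timedelta(days=window_days)
        if view_time <= conversion_time <= window_end:
            return "view"                 # view-through conversion
    return None                           # outside the window: no credit

view = datetime(2024, 3, 1)
print(attribute(datetime(2024, 3, 4), view_time=view))    # "view"
print(attribute(datetime(2024, 3, 4), view_time=view,
                click_time=datetime(2024, 3, 2)))         # "click"
print(attribute(datetime(2024, 3, 20), view_time=view))   # None
```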
Best Practices:
├── Use shorter windows (1-7 days)
├── De-duplicate with clicks
├── Compare to holdout groups
├── Don't over-credit
└── Validate with incrementality
Cross-Device Attribution
Cross-Device Challenge:
├── User sees ad on mobile
├── User converts on desktop
└── Connection needed
Solutions:
├── Deterministic (login-based)
│ └── Same user logged in
├── Probabilistic (modeled)
│ └── Statistical matching
├── Device graphs
│ └── Third-party connections
└── Panel-based
└── Representative sample
Viewability & Attention
Beyond impressions to actual visibility.
Viewability Standards (MRC):
├── Display: 50% pixels for 1 second
├── Video: 50% pixels for 2 seconds
├── Large format: 30% pixels for 1 second
└── Industry avg: 60-70%
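The MRC thresholds above reduce to a two-condition check per format; a sketch with hypothetical names:

```python
# MRC viewability thresholds: minimum share of pixels in view and
# continuous seconds required, by format. Names are illustrative.
MRC_THRESHOLDS = {
    "display": (0.50, 1.0),       # 50% of pixels for 1 second
    "video": (0.50, 2.0),         # 50% of pixels for 2 seconds
    "large_format": (0.30, 1.0),  # 30% of pixels for 1 second
}

def is_viewable(pixels_in_view_pct, seconds_in_view, ad_format="display"):
    min_pct, min_secs = MRC_THRESHOLDS[ad_format]
    return pixels_in_view_pct >= min_pct and seconds_in_view >= min_secs

print(is_viewable(0.60, 1.2))                      # True
print(is_viewable(0.60, 1.5, ad_format="video"))   # False: video needs 2 seconds
```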
Viewability Factors:
├── Ad position (above fold better)
├── Page layout
├── User behavior
├── Ad format
├── Device type
└── Publisher quality
Attention Metrics
Beyond Viewability:
├── Time-in-view: Seconds visible
├── Hover time: Active engagement
├── Scroll depth: Position reached
├── Attention units (vendor-specific)
└── Quality exposure
Attention Providers:
├── Adelaide (AU metric)
├── Lumen (eye tracking)
├── TVision (CTV attention)
└── Moat/Oracle attention
Incrementality Testing
Measure true advertising impact.
Incrementality Definition:
├── Conversions caused by ads
├── vs. conversions that would have happened anyway
├── True causal impact
└── Gold standard measurement
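As defined above, incrementality is the lift over the baseline conversion rate; expressed as a one-line sketch (names and rates are illustrative):

```python
# Incremental lift: conversions caused by ads, relative to the baseline
# that would have happened anyway (the control group's conversion rate).
def incremental_lift(test_conv_rate, control_conv_rate):
    return (test_conv_rate - control_conv_rate) / control_conv_rate

# 2.3% test vs. 2.0% control is a 15% incremental lift
lift = incremental_lift(0.023, 0.020)
print(f"{lift:.0%}")
```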
Testing Methods:
├── Holdout Tests
│ ├── Control: No ads
│ ├── Test: Ads served
│ └── Compare conversion rates
├── Geo Tests
│ ├── Test markets: Ads
│ ├── Control markets: No ads
│ └── Compare outcomes
└── Ghost Bidding
├── Bid but don't serve
├── Track conversions anyway
└── Compare to served
Test Design
Holdout Test Setup:
├── Sample size: Statistical power
├── Duration: Enough conversions
├── Randomization: True random
├── Measurement: Identical tracking
└── Analysis: Statistical significance
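One common way to run the analysis step, checking whether a holdout result is statistically significant, is a two-proportion z-test. This is my choice of test, not one the text prescribes; names and counts are illustrative:

```python
import math

# Two-proportion z-test for a holdout experiment: is the exposed group's
# conversion rate significantly different from the holdout's?
def two_proportion_z(conv_test, n_test, conv_control, n_control):
    p_t = conv_test / n_test
    p_c = conv_control / n_control
    p_pool = (conv_test + conv_control) / (n_test + n_control)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_test + 1 / n_control))
    return (p_t - p_c) / se

# Hypothetical 90/10 split: 2,300 conversions from 90,000 exposed users
# vs. 200 from a 10,000-user holdout. |z| > 1.96 ~ significant at 95%.
z = two_proportion_z(2_300, 90_000, 200, 10_000)
print(round(z, 2), "significant" if abs(z) > 1.96 else "not significant")
```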
Example:
├── Test: 90% of audience sees ads
├── Control: 10% holdout (no ads)
├── Measure: Conversion rate difference
├── Calculate: Incremental lift
└── Validate: Statistical significance
Reporting Framework
Structure performance reporting effectively.
Report Cadence:
├── Real-Time
│ ├── Pacing (budget, impressions)
│ ├── Alerts (brand safety)
│ └── Active optimizations
├── Daily
│ ├── Performance summary
│ ├── Top/bottom performers
│ └── Action items
├── Weekly
│ ├── Full metrics review
│ ├── Optimization log
│ └── Test results
├── Monthly
│ ├── Executive summary
│ ├── Trend analysis
│ ├── Learnings
│ └── Recommendations
└── Quarterly
├── Strategic review
├── Incrementality results
└── Planning forward
Dashboard Structure
Executive View:
├── Spend vs budget
├── Key KPIs (CPA, ROAS)
├── Trend charts
└── Highlights/concerns
Performance View:
├── All metrics detail
├── Segment breakdowns
├── Audience performance
├── Publisher performance
└── Creative performance
Optimization View:
├── What's working
├── What's not working
├── Test results
├── Recommendations
└── Action log
Pro Tip: Don't rely solely on platform-reported conversions. Implement incrementality testing at least quarterly to understand true advertising impact. Many "conversions" would have happened anyway—knowing your real incremental lift changes how you allocate budget.