1. Before you begin
Consider what's most important to your business based on your customer types and use cases, and ensure your integration and experiment reflect those priorities. Criteria could include:
Customer type: large versus small advertisers, agencies, vertical type, geo footprint
Campaign objectives and conversion types: user acquisition, customer retention, purchases, revenue
Use cases: reporting, ROI analysis, bid optimization
2. Use cases
We often see summary reports used for reporting and event-level reports used for optimization (and possibly for reporting as auxiliary data). To maximize measurement capabilities, combine event-level and summary reports, for example following Google Ads' methodology and Privacy Sandbox optimization research.
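One way to combine the two report types can be sketched as follows: use summary-report totals (less noisy in aggregate) to calibrate a per-event value that event-level reports alone cannot carry precisely. This is a minimal sketch under assumed field names (`campaign_id` and per-campaign totals), not a prescribed methodology.

```python
# Sketch: distribute a summary-report aggregate value across the
# event-level conversions attributed to each campaign.
# Field names ("campaign_id") and the report shapes are illustrative.

def calibrated_value_per_event(event_reports, summary_totals):
    """Return an estimated per-conversion value for each campaign,
    dividing the summary-report total by the event-level count."""
    counts = {}
    for event in event_reports:
        campaign = event["campaign_id"]
        counts[campaign] = counts.get(campaign, 0) + 1
    return {
        campaign: (total / counts[campaign] if counts.get(campaign) else 0.0)
        for campaign, total in summary_totals.items()
    }
```

For example, two attributed event-level conversions for a campaign with a summary-report total of 100 yield an estimated value of 50 per conversion.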
Check that the source-side and trigger-side keys you plan to use make sense for your use cases
An example key structure to start with is one that includes all dimensions you want to track; based on the output, you can test alternative key structures.
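The split between source-side and trigger-side key pieces can be sketched as bit-packing dimensions into the 128-bit aggregation key, where the API ORs the two pieces together. The dimensions and bit widths below (campaign ID, geo, conversion type) are illustrative assumptions, not a required layout.

```python
# Sketch: packing reporting dimensions into an aggregation key.
# Bit widths are illustrative; the full key space is 128 bits.
GEO_BITS = 8
CONV_BITS = 4

def source_key_piece(campaign_id: int, geo: int) -> int:
    # The source registration sets the high bits of the key.
    return (campaign_id << (GEO_BITS + CONV_BITS)) | (geo << CONV_BITS)

def trigger_key_piece(conversion_type: int) -> int:
    # The trigger registration contributes the low bits;
    # the API combines the pieces with a bitwise OR.
    return conversion_type

full_key = source_key_piece(campaign_id=42, geo=3) | trigger_key_piece(1)
```

Starting from one key structure that covers all dimensions you care about, you can compare the noisy output against alternative packings (e.g. fewer, coarser dimensions) before settling on one.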
Test various epsilon values within the Aggregation Service and form a perspective on the resulting noise-utility trade-off.
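The effect of epsilon can be sketched with a simple Laplace-noise model: the noise scale is the contribution budget divided by epsilon, so smaller epsilon means more noise per bucket. This is an illustrative model for building intuition, not the Aggregation Service's exact mechanism.

```python
import math
import random

CONTRIBUTION_BUDGET = 65536  # L1 contribution budget per source

def noise_scale(epsilon: float) -> float:
    # Laplace scale b = sensitivity / epsilon: halving epsilon doubles noise.
    return CONTRIBUTION_BUDGET / epsilon

def noisy_value(true_value: float, epsilon: float, rng: random.Random) -> float:
    # Inverse-CDF Laplace sample; illustrative, not the service's mechanism.
    u = rng.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return true_value - noise_scale(epsilon) * sign * math.log(1 - 2 * abs(u))
```

Running this over your own bucket values at several epsilon settings gives a rough preview of how much relative error to expect at each privacy level, before committing real traffic.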
Batching strategy
Develop a full understanding of the impact of different batching frequencies (e.g. hourly, daily, or weekly) and of how reports are batched (e.g. by advertiser × scheduled report time). Additional details are in the Developer Docs and the Agg Service Load Testing guidance
Test with at least one batching frequency and one advertiser
Test different combinations of batching frequencies and report dimensions, and identify optimal settings for your use cases
This may also include using different strategies per advertiser or per group of advertisers (e.g. small, medium, large). See the MTG Agg Service Load Testing guidance
Minimize report loss by adjusting your batching strategy to account for potentially delayed aggregatable reports
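The batching points above can be sketched as grouping aggregatable reports by advertiser × scheduled-report day, and only releasing a day's batch once a delay allowance has passed, so late-arriving reports are not dropped. The report fields and the 24-hour allowance are illustrative assumptions.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative grace period for late-arriving aggregatable reports.
DELAY_ALLOWANCE = timedelta(hours=24)

def batch_reports(reports, now):
    """Group reports into one batch per (advertiser, day).
    A day's batch is released only after the day has closed plus
    the delay allowance, to minimize report loss."""
    batches = defaultdict(list)
    for report in reports:
        day = report["scheduled_report_time"].date()
        day_close = datetime.combine(day, datetime.min.time()) + timedelta(days=1)
        if now >= day_close + DELAY_ALLOWANCE:
            batches[(report["advertiser"], day)].append(report)
    return dict(batches)
```

The same structure extends to per-advertiser strategies: e.g. keying the grouping by (advertiser tier, frequency) instead, so small advertisers accumulate weekly batches while large ones batch daily.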
Last updated 2024-01-29 UTC.