TL;DR:
- An attribution window is the defined time period after a user clicks or views an ad during which subsequent conversions can be credited to that marketing touchpoint—typically ranging from 1 to 90 days depending on sales cycle length and business model (B2C vs B2B).
- Misaligned attribution windows across platforms (Meta’s 7-day default vs Google’s 30-day default) create cross-platform reporting discrepancies where total attributed conversions exceed actual conversions by 30-60%, inflating ROAS and masking true channel performance.
- Optimizing attribution windows to match actual customer behavior—analyzing time-to-conversion data and setting windows that capture 85-90% of legitimate conversions without overcounting delayed organic conversions—improves attribution accuracy by 25-40% and enables rational cross-channel budget allocation.
What Is an Attribution Window?
An attribution window (also called a conversion window or lookback window) is the maximum time period following a marketing interaction—such as an ad click or impression—during which a conversion event (purchase, form submission, app install) can be attributed back to that original touchpoint.
If a user clicks a Facebook ad on Monday and converts on Thursday (3 days later), a 7-day attribution window would credit the conversion to Facebook. If that same user waited 10 days to convert, only attribution windows of 10+ days would assign credit—shorter windows would treat this as an organic or unattributed conversion.
Attribution windows operate separately for click-through and view-through conversions. Click-through attribution windows measure time from ad click to conversion (typically 7-30 days). View-through attribution windows measure time from ad impression without click to conversion (typically 1-7 days, since passive ad exposure has weaker influence than active clicks). Platforms configure both independently: Meta defaults to 7-day click / 1-day view; Google Ads defaults to 30-day click / 1-day view; GA4 defaults to 90 days for most conversions.
Test LeadSources today. Enter your email below and receive a lead source report showing all the lead source data we track—exactly what you’d see for every lead tracked in your LeadSources account.
Why Attribution Windows Matter for Accurate ROI Measurement
Attribution windows fundamentally determine which conversions get credited to paid marketing versus organic channels, directly impacting reported ROAS, CAC, and budget allocation decisions.
Setting windows too long inflates attributed performance by claiming conversions that would have occurred organically. A 90-day attribution window credits a Facebook ad clicked in January for a March purchase—but did that January ad truly drive the March conversion, or did the customer research independently, receive email nurture, see organic social mentions, and decide to buy based on accumulated brand exposure unrelated to the original ad? Research from attribution platforms shows that 30-40% of conversions credited in extended attribution windows (60+ days) would have occurred without the attributed touchpoint, representing false positive attribution.
Setting windows too short undercounts paid marketing impact by missing legitimate assisted conversions. B2B software purchases averaging 45-day sales cycles can’t be properly measured with 7-day windows—prospects clicking Google Ads, researching for 3-4 weeks, then converting receive no paid search credit under short windows despite search ads clearly initiating the journey. This systematic undercounting leads to underinvestment in channels that drive early-stage awareness but convert beyond short window thresholds.
Cross-platform window discrepancies create reporting inflation where attributed conversions exceed actual conversions. If Meta (7-day window), Google (30-day window), and LinkedIn (90-day window) all claim credit for the same customer’s conversion through different touchpoints at different times, your dashboard shows 300 attributed conversions from 100 actual customers—each platform legitimately claiming credit within its window settings, but collectively overstating performance by 200%. CFOs examining aggregate ROAS across platforms see mathematically impossible totals (4.2x + 3.8x + 3.1x = 11.1x reported ROAS from 2.8x actual blended ROAS) and lose confidence in marketing measurement entirely.
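The mismatch above can be reproduced with a few lines of arithmetic. The spend figure is an illustrative assumption (equal spend per platform) chosen so the per-platform ROAS values match the example:

```python
# Per-platform reported ROAS from the example above, each measured
# within that platform's own attribution window.
reported_roas = {"meta": 4.2, "google": 3.8, "linkedin": 3.1}
spend_per_platform = 10_000  # illustrative assumption: equal spend

# Naively summing per-platform ROAS double counts conversions that
# multiple platforms claim within their respective windows.
naive_total_roas = sum(reported_roas.values())  # 11.1x "reported"

# Blended ROAS instead uses deduplicated revenue from the order system.
actual_revenue = 84_000  # 100 real customers, each counted once
blended_roas = actual_revenue / (3 * spend_per_platform)  # 2.8x actual

print(f"summed platform ROAS: {naive_total_roas:.1f}x")
print(f"actual blended ROAS:  {blended_roas:.1f}x")
```

The gap between the two numbers is the overlap a CFO will eventually notice; the audit in the standardization steps below quantifies it.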
How Attribution Windows Work Across Platforms
Each advertising platform implements attribution windows differently, with varying defaults and configuration options.
Meta (Facebook/Instagram) Attribution Windows: Default is 7-day click / 1-day view. Configurable options include 1-day click, 7-day click, and (introduced in 2024, for video ads) 1-day engaged view, which requires a minimum watch time before an impression qualifies. Conversion credit flows to Meta if the action occurs within the window after the last Meta interaction. Meta changed its default from 28-day click / 1-day view to 7-day click in 2021, reducing reported conversions for most advertisers by 15-30% overnight: a window compression reflecting iOS ATT privacy restrictions and an industry shift toward shorter, more conservative windows.
Google Ads Attribution Windows: Default is 30-day click / 1-day view for Search, Display, Video. Configurable from 1 to 90 days for clicks. Google’s longer default versus Meta creates systematic cross-platform discrepancies: campaigns with 20-day average time-to-conversion show strong Google ROAS (30-day window captures delayed conversions) but weak Meta ROAS (7-day window misses same conversions), driving the incorrect conclusion that “Google performs better” when window mismatch, not channel quality, explains the gap.
GA4 Attribution Windows: Default is 90 days for most conversion events (configurable to 30, 60, or 90 days; acquisition events such as first_visit use a separate 7- or 30-day window). GA4 uses a data-driven attribution model that distributes credit across touchpoints within the window and counts each conversion only once. That deduplication means GA4 typically reports fewer platform-sourced conversions than the platforms report for themselves, even though its 90-day default exceeds Meta’s 7-day and Google’s 30-day windows: a reverse discrepancy where platform dashboards show inflated performance versus third-party GA4 measurement.
Configuring Optimal Attribution Windows by Business Type
Optimal attribution window length depends on actual customer conversion lag patterns, which vary dramatically by business model and product complexity.
Fast-Cycle E-commerce (Impulse Purchases, <$100 AOV): 1-7 day windows align with same-day or next-day purchase patterns for apparel, beauty, consumables. 85-90% of conversions occur within 3 days of initial click. Recommendation: 7-day click / 1-day view matches Meta default; longer windows overcount by crediting ads for delayed organic purchases driven by retargeting, email, or repeat intent unrelated to initial ad exposure.
Considered-Purchase E-commerce ($100-$500 AOV): 14-30 day windows capture research phases for electronics, home goods, moderate-ticket items. Average time-to-conversion: 7-12 days. Recommendation: 30-day click / 7-day view balances capturing legitimate delayed conversions against overcounting. Test 14-day vs 30-day to measure incremental conversions gained—if 30-day window adds <10% conversions beyond 14-day, shorter window reduces inflation with minimal undercounting.
High-Ticket E-commerce (>$500 AOV): 30-60 day windows for furniture, jewelry, luxury goods with extended consideration. Average time-to-conversion: 18-30 days. Recommendation: 60-day click / 14-day view captures authentic long-cycle purchases. Validate with cohort analysis: track conversions by days-since-click to identify where curve flattens—if 90% of conversions occur within 45 days, 60-day window is optimal; extending to 90 days adds marginal conversions with high false-positive risk.
B2B SaaS (SMB, Transactional): 30-60 day windows for low-touch, self-serve SaaS ($50-$500 MRR). Sales cycles: 14-35 days from awareness to free trial to paid conversion. Recommendation: 45-day click / 7-day view captures multi-week evaluation typical of SMB software buying while avoiding 90-day overcounting where initial ad exposure has decayed and later touchpoints (webinars, sales calls, peer references) become primary drivers.
B2B SaaS (Mid-Market/Enterprise): 90-180 day windows for complex, sales-assisted deals ($5K-$50K+ ACV). Average sales cycle: 60-120 days involving multiple stakeholders, demos, procurement. Recommendation: 90-day click / 30-day view for paid channels that initiate pipeline, but supplement attribution with multi-touch models and CRM integration to credit multiple touchpoints across extended journeys. Single-window attribution systematically fails for enterprise B2B—120-day deal influenced by Google Ad (day 1), webinar (day 30), SDR outreach (day 60), and demo (day 90) cannot be accurately credited to Google within any single attribution window; requires full journey tracking like LeadSources.io provides.
Common Attribution Window Configuration Mistakes
Mistake 1: Using platform defaults without analyzing actual time-to-conversion. Meta’s 7-day default optimizes for Meta’s reporting needs (shorter windows reduce iOS ATT undercounting), not your business reality. If 40% of conversions occur 8-15 days post-click, 7-day window systematically undercounts by missing this tail. Solution: Export conversion data, calculate distribution of days-from-click-to-conversion, set window to capture 85-90th percentile while excluding extreme outliers (60+ day lags likely represent organic re-engagement, not ad influence).
Mistake 2: Mismatched windows across platforms creating false performance comparisons. Comparing Meta (7-day) ROAS to Google (30-day) ROAS is apples-to-oranges—Google’s longer window mechanically inflates its reported performance for products with 15-25 day purchase consideration. Solution: Standardize attribution windows across all platforms to enable valid cross-channel comparison; set all to 30 days or all to 14 days based on your median time-to-conversion, then rerun historical analysis to establish baseline performance under consistent measurement.
Mistake 3: Ignoring view-through attribution entirely or setting unrealistic view windows. Disabling view-through (0-day view window) undercounts upper-funnel awareness campaigns where impressions drive later direct/organic conversions. Conversely, 7-day view windows overcount—did that banner impression 6 days ago truly influence today’s purchase, or did the customer journey through 15 other touchpoints render the original impression irrelevant? Solution: a 1-day view window for display/video is industry best practice; extend to a 3-day view only for high-frequency retargeting where ad exposure actively reminds high-intent users.
Mistake 4: Setting windows based on campaign type rather than customer behavior. Some advertisers use 30-day windows for prospecting and 7-day for retargeting, assuming retargeting converts faster. Reality: retargeting audiences include mix of immediate-converters (same-day) and delayed-converters (8-14 days who need multiple retargeting exposures)—short windows miss the delayed segment. Solution: Analyze time-to-conversion by audience segment, not campaign type; data may reveal prospecting and retargeting have similar lag distributions requiring uniform windows.
Mistake 5: Never updating attribution windows as business evolves. Attribution windows set in 2021 may no longer reflect 2026 customer behavior—supply chain delays, competitive dynamics, or product line changes alter purchase timelines. A brand that initially sold $50 impulse products (7-day window appropriate) expanding into $500 considered-purchase items needs window extension to avoid undercounting new product line. Solution: Quarterly review of time-to-conversion distributions; adjust windows if median lag shifts >20% or if 85th percentile moves beyond current window threshold.
Standardizing Attribution Windows for Cross-Platform Measurement
Achieving consistent cross-platform attribution requires deliberate window alignment and reporting governance.
Step 1: Calculate actual time-to-conversion distribution. Export last 90 days of conversion data with timestamp of first paid click and conversion timestamp. Calculate days-between for each conversion. Generate distribution: X% convert within 1 day, Y% within 7 days, Z% within 30 days. Identify 85th percentile—window length where 85% of conversions fall within. This becomes your standardized window target.
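Step 1 can be sketched in a few lines of pandas. The lag values below are made-up stand-ins for your real export of click and conversion timestamps:

```python
import pandas as pd

# Hypothetical 90-day export: days between first paid click and conversion,
# one value per conversion. In practice, compute this as
# (conversion_ts - click_ts).dt.days from your exported timestamps.
lag_days = pd.Series([0, 0, 1, 1, 2, 3, 3, 5, 6, 8, 10, 12, 14, 18, 21, 25, 33, 45])

# Share of conversions captured within candidate windows.
for window in (1, 7, 30):
    share = (lag_days <= window).mean()
    print(f"within {window:>2} days: {share:.0%}")

# The 85th percentile of the lag distribution is the standardized window target.
target = lag_days.quantile(0.85)
print(f"85th percentile lag: {target:.0f} days")
```

With this toy data the 85th percentile lands in the low twenties, which per Step 2 would round up to a 30-day window on every platform.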
Step 2: Configure all platforms to matched windows. If 85th percentile is 21 days, set Meta to 30-day click (closest available option above 21), Google to 30-day click, LinkedIn to 30-day click. Accept that no platform offers exact 21-day granularity; choose consistent rounding (all to next available increment: 30 days). Document standard in measurement framework: “All paid channels measured on 30-day click / 1-day view attribution windows effective Q1 2026.”
Step 3: Implement third-party unified tracking. Platform-native attribution will always have some discrepancy due to technical differences (Meta’s pixel vs Google’s gtag vs LinkedIn’s Insight Tag). Deploy unified tracking via LeadSources.io or similar attribution platform that captures all paid touchpoints with consistent window logic applied uniformly—every channel measured with same 30-day window against same conversion definition, eliminating platform-specific measurement quirks.
Step 4: Establish reporting standards with window-adjusted metrics. When comparing platforms, always note window settings in reports: “Meta: 3.2x ROAS (30-day window); Google: 2.8x ROAS (30-day window); LinkedIn: 2.1x ROAS (30-day window).” This transparency prevents misinterpretation and builds CFO confidence that reported metrics are methodologically sound. If business requirements force different windows (compliance reasons, technical limitations), report “window-adjusted ROAS” that normalizes performance to common baseline.
Step 5: Monitor attribution inflation via total-conversions audit. Sum attributed conversions across all platforms for a given month. Compare to actual total conversions from your CRM or order system. If attributed conversions exceed actual by >15%, you have meaningful overlap/inflation indicating windows are too long or multiple platforms are claiming same conversions. Tighten windows by 25% (e.g., 30-day to 21-day) and remeasure; optimal window minimizes inflation while maximizing coverage of legitimate conversions.
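A minimal sketch of the Step 5 audit, assuming hypothetical monthly totals:

```python
# Hypothetical monthly totals: platform-attributed conversions vs. CRM truth.
attributed = {"meta": 140, "google": 120, "linkedin": 60}
actual_conversions = 250  # deduplicated total from CRM / order system

total_attributed = sum(attributed.values())
inflation = total_attributed / actual_conversions - 1

print(f"attributed: {total_attributed}, actual: {actual_conversions}")
print(f"inflation: {inflation:.0%}")
if inflation > 0.15:
    print("Exceeds 15% threshold: tighten windows ~25% and remeasure")
```

Here 320 attributed conversions against 250 actual ones is 28% inflation, well past the 15% threshold that signals overlapping windows.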
Advanced Attribution Window Strategies
Dynamic Windows by Channel Type: Instead of universal 30-day window, apply differentiated windows reflecting each channel’s typical conversion lag. Branded search (1-3 day median lag) gets 7-day window; generic search (5-10 day lag) gets 14-day window; display awareness (15-25 day lag) gets 30-day window. This granular approach reduces inflation from channels with immediate intent (brand search doesn’t need 30 days to prove value) while properly crediting upper-funnel channels with longer influence cycles.
Decay-Weighted Attribution Within Windows: Rather than binary all-or-nothing credit within the window, apply time-decay weighting. A conversion 1 day post-click receives 100% credit; 15 days post-click, 40%; 29 days post-click, 10%. This probabilistically discounts older interactions that likely had diminishing influence, reducing overcounting while keeping the window long enough to capture delayed conversions. Implement via a third-party attribution platform or your own warehouse (e.g., on GA4’s BigQuery export); GA4 retired its native time-decay model in 2023, and time-decay is not available in native Meta/Google Ads reporting.
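A minimal sketch of exponential time-decay weighting. The half-life is a tunable assumption, not a standard; a roughly 10-day half-life approximates the illustrative 100%/40%/10% figures above:

```python
def decay_credit(days_since_click: float, half_life_days: float = 10.0) -> float:
    """Exponential time-decay weight: credit halves every `half_life_days`.

    The half-life is an assumption to calibrate against your own
    time-to-conversion data, not an industry constant.
    """
    return 0.5 ** (days_since_click / half_life_days)

# Credit for conversions 1, 15, and 29 days post-click.
for d in (1, 15, 29):
    print(f"day {d:>2}: {decay_credit(d):.0%} credit")
```

Summing these weights instead of raw conversion counts gives a decay-adjusted conversion total per channel.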
Cohort-Based Window Optimization: Analyze time-to-conversion separately for new vs returning customers. New customers often have longer research phases (18-25 day lag) requiring extended windows; returning customers convert quickly (2-5 day lag) from targeted offers. Apply 30-day windows to prospecting campaigns targeting new customers, 7-day windows to retention campaigns targeting existing customers—matching windows to actual behavior by segment rather than using one-size-fits-all approach.
Seasonal Window Adjustments: Purchase timelines compress during high-intent periods (Black Friday, holidays, end-of-quarter for B2B) and extend during slower periods. Shorten windows to 14 days during Q4 holiday season when conversion lag averages 4-6 days; extend to 45 days during Q1-Q2 when consideration lengthens to 20-30 days. This seasonal calibration prevents Q4 performance inflation (long window crediting ads for urgent organic purchases) and Q1 undercounting (short window missing extended research phase).
Attribution Window Best Practices
Match windows to median conversion lag, not maximum. If 50% of conversions occur within 10 days and 90% within 30 days, but outliers extend to 120 days, set window at 30 days (90th percentile). Don’t extend to 120 days to capture rare outliers—those extreme lags likely represent organic re-engagement where original ad had minimal influence.
Shorten windows during iOS ATT and privacy-constrained environments. iOS 14.5+ restrictions reduce trackability of conversions beyond 7 days due to SKAdNetwork limitations and ATT opt-out rates. Even if Android/desktop conversions support 30-day tracking, iOS conversions measured via SKAdNetwork have 7-day maximum windows—mixing both creates inconsistent measurement. Consider platform-wide 7-day window to maintain uniform measurement across all device types rather than fragmenting into device-specific windows.
Use longer click windows and shorter view windows. Industry best practice: click windows 3-5x longer than view windows (e.g., 30-day click / 7-day view or 14-day click / 1-day view). Clicks represent active engagement with clear intent signal; impressions are passive exposure with weaker influence. Balanced asymmetry properly weighs click-driven versus view-driven conversions without overcounting weak impression influence.
Validate window accuracy with incrementality testing. Run holdout experiments where control groups see no ads for 30 days while test groups see normal ad exposure. Measure conversion lift in the test group. If the test group shows 25% lift over control after 30 days, your 30-day attribution window is directionally correct—conversions within that window are genuinely ad-influenced. If lift is only 8%, your window is too long and you’re crediting organic conversions to ads; tighten to 14-day and retest.
Document window rationale for stakeholder alignment. Attribution window debates often devolve into political arguments (“sales wants longer windows to credit early touchpoints; marketing wants shorter windows to optimize faster”). Document data-driven rationale: “30-day window selected because 87% of conversions occur within 30 days per Q3 2025 analysis; median conversion lag is 12 days; 30-day window balances coverage (87%) against inflation risk from extended tail.” Data trumps opinions in window configuration decisions.
Frequently Asked Questions
What is the difference between an attribution window and a lookback window?
The terms are often used interchangeably, but attribution window specifically defines when a marketing touchpoint can receive credit for a conversion (7 days after click = 7-day attribution window), while lookback window refers to how far back in time an analytics system checks for relevant interactions when attributing a conversion.
In practice: attribution window is forward-looking from the touchpoint (click happens Monday, conversion must occur by next Monday for 7-day window). Lookback window is backward-looking from the conversion (conversion happens Friday, system looks back 30 days to find eligible clicks for 30-day lookback). Both measure the same time relationship but from opposite directions. Most platforms use “attribution window” terminology; Adobe Analytics and some enterprise tools use “lookback window.” Functionality is identical—the time period within which attribution can occur.
Why do Meta and Google report different conversion totals for the same campaign period?
Four primary causes: (1) Attribution window mismatch—Meta defaults to 7-day click while Google defaults to 30-day click, meaning Google credits conversions up to 30 days post-click that Meta excludes after day 7, inflating Google’s count. (2) View-through attribution differences—Meta includes 1-day view-through conversions; Google also includes 1-day view-through but applies its own viewability threshold, changing which impressions qualify. (3) Tracking methodology—Meta’s pixel and Google’s gtag fire independently, with slight timestamp differences causing attribution conflicts when both tags compete to claim the same conversion. (4) Cross-device tracking limitations—Meta and Google have different cross-device graphs (Meta links via logged-in Facebook users; Google links via logged-in Google accounts); conversions attributed to different devices create apparent discrepancies.
Solution: Standardize windows across platforms (set both to 30-day or both to 7-day), use third-party attribution tool as single source of truth (LeadSources.io, Segment, Rockerbox) that applies uniform logic to all platforms, and accept 10-15% discrepancy as normal measurement variance rather than perfect platform agreement.
How do I choose the right attribution window length for my B2B SaaS business?
Analyze actual sales cycle data from CRM: calculate average days from first paid touchpoint (ad click) to closed-won deal. If average is 45 days with standard deviation of 20 days, most conversions occur between 25-65 days. Set attribution window at 60-90 days to capture this range.
Validate by examining conversion-lag distribution: export all closed deals from last quarter with timestamp of first marketing interaction and close date. Plot histogram of days-between. If 80% of deals close within 60 days, 60-day window is appropriate. If distribution is bimodal (fast 0-20 day deals + slow 80-120 day deals), consider segmenting: transactional SMB deals get 30-day window; enterprise deals get 120-day window. Crucially, B2B attribution requires multi-touch journey tracking beyond single windows—LeadSources.io’s full session tracking captures the complete path across multiple touchpoints that single attribution windows cannot measure, providing true B2B attribution clarity.
Should I use different attribution windows for prospecting versus retargeting campaigns?
Conventional wisdom suggests yes—prospecting has longer conversion lags (awareness → consideration → purchase spans weeks), retargeting has shorter lags (high-intent users convert within days). However, data often contradicts this assumption: retargeting audiences include both immediate-converters (same-day purchase) and delayed-converters (8-14 days requiring multiple retargeting exposures before conversion), creating bimodal distribution similar to prospecting.
Best practice: Analyze time-to-conversion separately for prospecting and retargeting audiences. If retargeting median lag is 3 days vs prospecting median lag of 15 days, differentiated windows (7-day retargeting, 30-day prospecting) make sense. If medians are similar (both ~12 days), use uniform windows to simplify reporting and avoid artificial performance differences caused by measurement rather than true campaign effectiveness. Most advertisers discover that uniform windows across campaign types, differentiated by channel (search vs display) or product (low-ticket vs high-ticket), provides better measurement than campaign-type segmentation.
How do attribution windows affect reported ROAS, and how can I calculate true ROAS?
Longer attribution windows mechanically inflate ROAS by crediting more conversions (including some that would have occurred organically). If 7-day window shows $10K attributed revenue and 30-day window shows $14K attributed revenue from identical $5K spend, reported ROAS shifts from 2.0x to 2.8x—a 40% increase driven purely by measurement, not actual marketing effectiveness.
Calculate window-independent true ROAS via incrementality testing: run a holdout experiment with a test group (ads) versus a control group (no ads), measure lift, calculate incremental revenue (test revenue – control revenue), and divide by spend. If $5K spend generates $8K incremental revenue (based on test vs control), true ROAS is 1.6x regardless of attribution window settings. Compare this true ROAS (1.6x) to reported ROAS under different windows (2.0x at 7-day, 2.8x at 30-day) to quantify window-driven inflation. The optimal window is the one where reported ROAS most closely approximates true incremental ROAS; in this example, the 7-day window (2.0x) overstates by 25% and the 30-day window (2.8x) by 75%, so you might test a 3-day window to see whether it comes closer to the 1.6x true ROAS.
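The arithmetic can be packaged as a small helper. The test/control revenue figures are hypothetical, chosen to yield the $8K incremental revenue from the example above:

```python
def incremental_roas(test_revenue: float, control_revenue: float, spend: float) -> float:
    """True (incremental) ROAS from a holdout test: revenue lift per dollar spent."""
    return (test_revenue - control_revenue) / spend

# Hypothetical holdout results producing the example's $8K incremental revenue.
true_roas = incremental_roas(test_revenue=20_000, control_revenue=12_000, spend=5_000)

# Window-driven inflation of reported ROAS relative to true ROAS.
for window, reported in ((7, 2.0), (30, 2.8)):
    overstatement = reported / true_roas - 1
    print(f"{window}-day window: reported {reported}x overstates by {overstatement:.0%}")
```

Running this prints a 25% overstatement for the 7-day window and 75% for the 30-day window, matching the example figures.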
What happens when a user clicks multiple ads from different platforms within my attribution window?
This creates attribution conflict where multiple platforms legitimately claim the same conversion within their respective windows. Example: User clicks Facebook ad Monday, Google ad Wednesday, LinkedIn ad Friday, converts Sunday. If all platforms use 7-day windows, all three credit the conversion—Facebook (7 days since Monday), Google (4 days since Wednesday), LinkedIn (2 days since Friday). Your aggregate reporting shows 3 attributed conversions from 1 actual conversion, inflating total attributed performance by 200%.
Platform-native reporting cannot resolve this conflict—each platform only sees its own touchpoints. Solutions: (1) Use last-click attribution within window—only the final paid touchpoint before conversion (LinkedIn in example) receives credit, eliminating double-counting but ignoring assist value of earlier touchpoints. (2) Use fractional multi-touch attribution—divide credit across all touchpoints within window (Facebook 33%, Google 33%, LinkedIn 33% in example), maintaining aggregate accuracy while acknowledging multiple influences. (3) Deploy unified attribution platform (LeadSources.io, Segment) that sees all touchpoints across platforms and applies consistent deduplication logic, eliminating platform-level inflation. Option 3 is gold standard for cross-platform measurement accuracy.
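Options 1 and 2 can be sketched as follows, using the Monday-to-Sunday example above (platform names and lags are illustrative):

```python
from collections import defaultdict

# One customer's paid touchpoints before a single conversion, as
# (platform, days_before_conversion) pairs in chronological order.
touchpoints = [("facebook", 6), ("google", 4), ("linkedin", 2)]
window_days = 7

# Touchpoints eligible for credit under a uniform 7-day window.
eligible = [platform for platform, lag in touchpoints if lag <= window_days]

# Option 1: last-click within window -- only the final touchpoint is credited.
last_click_credit = {eligible[-1]: 1.0}
print(last_click_credit)  # {'linkedin': 1.0}

# Option 2: fractional (linear) multi-touch -- credit split across touchpoints.
fractional_credit = defaultdict(float)
for platform in eligible:
    fractional_credit[platform] += 1 / len(eligible)
print(dict(fractional_credit))  # each platform gets ~1/3
```

Either option caps total credit at 1.0 per conversion, which is exactly what independent platform dashboards cannot do.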
How frequently should I review and update my attribution window settings?
Quarterly reviews at minimum, with trigger-based reviews for major business changes. Quarterly cadence enables detection of gradual shifts in time-to-conversion patterns due to competitive dynamics, seasonal trends, or evolving customer behavior. Compare Q1 conversion-lag distribution to Q4 prior year; if median lag shifted from 12 days to 18 days, attribution window may need extension from 30 to 45 days to maintain coverage.
Trigger-based reviews required for: (1) Product line expansions—adding $500 considered-purchase products to $50 impulse-purchase catalog requires window extension for new line. (2) Market expansions—entering new geographies with different purchase behaviors may necessitate region-specific windows. (3) Platform changes—Meta updating attribution options (2024 introduction of “engaged view” default) requires evaluation of new setting impact. (4) Significant conversion rate changes—sudden 30%+ drop in attributed conversions may indicate window is no longer capturing customer behavior; investigate whether actual lag patterns shifted or window was always too short but masked by volume.