Sharing Results with Stakeholders
Overview
Effectively communicating marketing mix modeling results requires tailoring your presentation to your audience's technical background, business priorities, and decision-making needs. This guide provides strategies for sharing MixModeler outputs with different stakeholder groups.
Know Your Audience
Executive Leadership
Priorities: Bottom-line impact, strategic direction, budget allocation
Preferred Format: High-level summary, visual charts, clear recommendations
Technical Level: Minimal statistics, focus on business implications
Key Questions:
- Which channels drive the most revenue? 
- Where should we invest more/less? 
- What's the expected ROI of budget reallocation? 
Marketing Teams
Priorities: Channel performance, campaign optimization, tactical decisions
Preferred Format: Detailed attribution, channel comparisons, time-series trends
Technical Level: Moderate - understands marketing metrics and some statistics
Key Questions:
- How do my channels compare to each other? 
- Which campaigns performed best? 
- What's the optimal budget split? 
Finance Teams
Priorities: ROI, cost efficiency, budget justification
Preferred Format: Numerical tables, cost-benefit analysis, variance explanations
Technical Level: High numerical literacy, less marketing context
Key Questions:
- What's the ROI of each channel? 
- Can we justify current marketing spend levels? 
- Where can we cut costs with minimal impact? 
Data Science / Analytics Teams
Priorities: Methodology, statistical validity, model quality
Preferred Format: Full technical details, diagnostics, assumptions, limitations
Technical Level: High - understands statistics, modeling, diagnostics
Key Questions:
- Did the model pass diagnostic tests? 
- What are the assumptions and limitations? 
- How robust are the results? 
Preparing Exports for Different Audiences
For Executives
Create Summary Excel:
- Export full model to Excel 
- Add new "Executive Summary" sheet at the beginning 
- Include only:
  - Top 5-10 most important coefficients
  - Simple ROI calculations
  - Clear recommendations
- Use visual formatting (colors, bold, conditional formatting) - see the scripted sketch after the example sheet
- Add 1-2 key charts (channel contribution pie chart, ROI bar chart)
Example Summary Sheet:
| Channel | Spend | Coefficient | ROI | Recommendation |
| --- | --- | --- | --- | --- |
| TV | $50,000 | 3.25 | 6.5x | Increase |
| Digital | $30,000 | 2.18 | 7.3x | Increase |
| Print | $20,000 | 0.45 | 2.3x | Maintain |
| Radio | $15,000 | 0.12 | 0.8x | Reduce |
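If you script the post-processing step, a summary sheet like the one above can be added to the exported workbook automatically. A minimal sketch using openpyxl - the file name, values, and highlight colors are illustrative, not part of MixModeler's export format:

```python
from openpyxl import load_workbook
from openpyxl.styles import Font, PatternFill

# Illustrative file name - replace with your actual MixModeler export.
wb = load_workbook("mixmodeler_export.xlsx")

# Insert the summary sheet at the front of the workbook.
ws = wb.create_sheet("Executive Summary", 0)

headers = ["Channel", "Spend", "Coefficient", "ROI", "Recommendation"]
rows = [
    ["TV",      50000, 3.25, "6.5x", "Increase"],
    ["Digital", 30000, 2.18, "7.3x", "Increase"],
    ["Print",   20000, 0.45, "2.3x", "Maintain"],
    ["Radio",   15000, 0.12, "0.8x", "Reduce"],
]

ws.append(headers)
for cell in ws[1]:
    cell.font = Font(bold=True)

# Simple visual cue: green fill for "Increase", red fill for "Reduce".
fills = {"Increase": "C6EFCE", "Reduce": "FFC7CE"}
for row in rows:
    ws.append(row)
    rec_cell = ws.cell(row=ws.max_row, column=len(headers))
    if row[-1] in fills:
        rec_cell.fill = PatternFill("solid", start_color=fills[row[-1]])

wb.save("mixmodeler_export_with_summary.xlsx")
```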
Talking Points:
- "Our analysis shows digital and TV deliver the highest ROI" 
- "We recommend shifting $5K from radio to digital for 15% revenue increase" 
- "Model explains 85% of revenue variation, very strong fit" 
For Marketing Teams
Create Marketing Dashboard:
- Export with decomposition included 
- Focus on Group Decomposition sheet 
- Create time-series charts showing:
  - Each channel's contribution over time
  - Seasonal patterns
  - Performance trends
- Add channel comparison tables
- Include variable-level details for campaign analysis
Example Marketing View:
- Stacked area chart of contributions over time 
- Table of average weekly contribution by channel 
- Campaign-level performance (if using campaign variables) 
- Optimization recommendations with specific spend allocations 
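A minimal sketch of the stacked area view, assuming the decomposition has been exported to a CSV with one column per channel - the file and column names are placeholders:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder file: weekly decomposition with one column per channel.
decomp = pd.read_csv("decomposition.csv", parse_dates=["week"], index_col="week")
channels = ["Base", "TV", "Digital", "Print", "Radio"]  # illustrative column names

fig, ax = plt.subplots(figsize=(10, 5))
ax.stackplot(decomp.index, [decomp[c] for c in channels], labels=channels)

ax.set_title("Attributed revenue by channel over time")
ax.set_ylabel("Revenue")
ax.legend(loc="upper left")
fig.tight_layout()
fig.savefig("contribution_over_time.png", dpi=150)
```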
Talking Points:
- "TV drives 35% of attributed revenue on average" 
- "Digital performance improved 20% in Q4 vs Q3" 
- "We see clear seasonality in December - 40% higher effectiveness" 
For Finance Teams
Create Finance Package:
- Export full model 
- Highlight Model Statistics sheet 
- Create ROI calculation sheet (a worked sketch follows the example sheet below):
  - Coefficient × average spend / average KPI
  - Cost per incremental unit (revenue, conversion, etc.)
  - Marginal ROI calculations
- Add variance analysis
- Include sensitivity scenarios
Example ROI Sheet:
| Channel | Avg Spend | Coefficient | Attributed Revenue | Cost per Incremental Unit | ROI |
| --- | --- | --- | --- | --- | --- |
| TV | $50,000 | 3.25 | $162,500 | $0.31 | 325% |
| Digital | $30,000 | 2.18 | $65,400 | $0.46 | 218% |
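The figures in the sheet above follow directly from coefficients and average spend. A minimal pandas sketch of the calculation, treating each coefficient as incremental revenue per dollar and ROI as attributed revenue divided by spend (swap in your own exported values):

```python
import pandas as pd

roi = pd.DataFrame({
    "channel": ["TV", "Digital"],
    "avg_spend": [50_000, 30_000],
    "coefficient": [3.25, 2.18],   # incremental revenue per dollar of spend
})

roi["attributed_revenue"] = roi["coefficient"] * roi["avg_spend"]
roi["cost_per_incremental_dollar"] = roi["avg_spend"] / roi["attributed_revenue"]
roi["roi_pct"] = roi["attributed_revenue"] / roi["avg_spend"] * 100

print(roi.round(2))
# TV:      attributed_revenue=162500, cost_per_incremental_dollar=0.31, roi_pct=325
# Digital: attributed_revenue=65400,  cost_per_incremental_dollar=0.46, roi_pct=218
```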
Talking Points:
- "Every dollar in TV generates $3.25 in revenue" 
- "Model R² of 0.85 means highly reliable estimates" 
- "Reallocating 20% of print budget to digital projects +$12K revenue/month" 
For Technical Teams
Provide Complete Package:
- Full Excel export (all sheets) 
- PDF diagnostic reports for all tests 
- Documentation of:
  - Data sources and preparation
  - Variable transformations applied
  - Prior specifications (if Bayesian)
  - Model selection rationale
- Code/methodology notes
Include:
- Diagnostic test results 
- Convergence metrics (Bayesian) 
- Residual analysis 
- Multicollinearity assessment 
- Assumptions and limitations 
Talking Points:
- "Model passes all diagnostic tests (normality, autocorrelation, heteroscedasticity)" 
- "VIF values all below 5, no multicollinearity concerns" 
- "Bayesian model converged (R-hat < 1.01, ESS > 1000)" 
Creating Effective Presentations
PowerPoint Structure
Slide 1: Executive Summary
- Key findings (3-5 bullets) 
- Top recommendation 
- Expected business impact 
Slide 2: Methodology Overview (1 slide only)
- What is MMM (2-3 sentences) 
- Data used (time period, channels included) 
- Model quality (R², sample size) 
Slides 3-4: Channel Performance
- Bar chart of coefficients or ROI by channel 
- Table with key metrics 
- Performance ranking 
Slides 5-6: Attribution Over Time
- Stacked area chart from decomposition 
- Trend insights 
- Seasonal patterns 
Slide 7: Recommendations
- Specific actions (increase X, decrease Y) 
- Expected impact with numbers 
- Implementation timeline 
Slide 8: Q&A / Appendix
- Technical details 
- Methodology notes 
- Diagnostic results 
Visualization Best Practices
Use Bar Charts for comparing channels:
- Simple, clear comparison 
- Easy to rank performance 
- Intuitive for non-technical audiences 
Use Stacked Area Charts for attribution over time:
- Shows contribution dynamics 
- Reveals seasonal patterns 
- Communicates total and breakdown simultaneously 
Use Pie Charts sparingly:
- Overall attribution share 
- Budget allocation 
- Only when 3-7 categories 
Avoid:
- Complex scatter plots (unless technical audience) 
- Statistical diagnostic charts (appendix only) 
- 3D charts (harder to read) 
Color Coding Strategy
Consistent Channel Colors:
- Assign each channel a color 
- Use same colors across all charts 
- Match decomposition group colors in MixModeler 
Example Color Scheme:
- TV: Blue 
- Digital: Orange 
- Print: Green 
- Radio: Red 
- Base/Other: Gray 
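The simplest way to keep colors consistent is to define the channel-to-color mapping once and reuse it in every chart. A minimal matplotlib sketch - the hex codes are arbitrary stand-ins for the scheme above:

```python
import matplotlib.pyplot as plt

# Single source of truth for channel colors, reused across all charts.
CHANNEL_COLORS = {
    "TV": "#1f77b4",         # blue
    "Digital": "#ff7f0e",    # orange
    "Print": "#2ca02c",      # green
    "Radio": "#d62728",      # red
    "Base/Other": "#7f7f7f", # gray
}

roi = {"TV": 6.5, "Digital": 7.3, "Print": 2.3, "Radio": 0.8}

fig, ax = plt.subplots()
ax.bar(list(roi.keys()), list(roi.values()),
       color=[CHANNEL_COLORS[c] for c in roi])
ax.set_ylabel("ROI (x)")
ax.set_title("ROI by channel")
fig.savefig("roi_by_channel.png", dpi=150)
```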
Performance Indicators:
- Green highlight: High performers, increase budget 
- Yellow highlight: Moderate performers, maintain 
- Red highlight: Low performers, reduce budget 
Storytelling with Data
Structure Your Narrative
1. Set Context
- "We analyzed 2 years of data across 8 marketing channels" 
- "Goal: Understand which channels drive revenue most effectively" 
2. Present Findings
- "Model shows 85% of revenue variation explained by marketing activities" 
- "Top 3 channels: Digital (7.3x ROI), TV (6.5x ROI), Events (5.8x ROI)" 
3. Reveal Insights
- "Digital effectiveness increased 40% after campaign refresh in Q3" 
- "TV has strong holiday seasonality - 60% more effective in Q4" 
4. Make Recommendations
- "Shift $50K from low-ROI print to high-ROI digital" 
- "Increase TV spend 20% during Q4 holiday season" 
5. Quantify Impact
- "Expected revenue increase: $180K annually" 
- "Improves overall marketing ROI from 4.2x to 5.1x" 
Use Analogies and Metaphors
For Coefficients:
- "For every dollar spent on TV, we generate $3.25 in revenue - like a 225% return on investment" 
For Adstock:
- "TV advertising is like a wave - the effect builds over 3-4 weeks, then gradually fades" 
For Saturation:
- "Digital ads show diminishing returns - doubling spend doesn't double results" 
For Model Fit:
- "The model explains 85% of revenue changes - like having an 85% accurate crystal ball" 
Addressing Common Questions
"How accurate is this?"
Answer:
- "The model R² of 0.85 means it explains 85% of revenue variation" 
- "We validated results with statistical tests - all passed" 
- "Typical prediction error is ±8%, well within acceptable range" 
"Why didn't you include [X channel]?"
Answer:
- "We included all channels with consistent, reliable data" 
- "Channels with limited data or recent launches analyzed separately" 
- "Can add new channels in next model iteration as data accumulates" 
"These ROI numbers seem high/low"
Answer:
- "ROI reflects incremental impact, not total impact" 
- "Numbers align with industry benchmarks for [sector]" 
- "Results validated against actual spend and revenue data" 
"Can we trust the recommendations?"
Answer:
- "Model passed all diagnostic tests for statistical validity" 
- "Results consistent across multiple model specifications" 
- "Recommend testing with gradual budget shifts, monitor results" 
"What about external factors?"
Answer:
- "Model includes seasonality, trends, and control variables" 
- "Isolates marketing impact from other business drivers" 
- "Regular updates will capture changing market conditions" 
Handling Sensitive Information
What to Share Externally
Safe to Share:
- Relative channel performance (rankings) 
- ROI ratios and multiples 
- Model methodology and approach 
- Directional recommendations 
Redact Before Sharing:
- Absolute spend amounts 
- Actual revenue numbers 
- Specific coefficients (if proprietary) 
- Competitive intelligence 
Creating Anonymized Versions
Technique 1: Percentages
- Convert absolute spend to % of total 
- Report relative contributions 
- Use indexed values (base year = 100) 
Technique 2: Ratios Only
- Report ROI multiples 
- Share efficiency metrics 
- Provide performance rankings 
Technique 3: Illustrative Scenarios
- "If we shift 10% of budget from Channel A to Channel B..." 
- Use hypothetical numbers that preserve insights 
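A minimal pandas sketch of Techniques 1 and 2, using made-up spend and contribution figures purely for illustration:

```python
import pandas as pd

# Illustrative channel-level table (values are not real client data).
channels = pd.DataFrame({
    "channel": ["TV", "Digital", "Print", "Radio"],
    "spend": [50_000, 30_000, 20_000, 15_000],
    "attributed_revenue": [162_500, 65_400, 9_000, 1_800],
})

# Technique 1: report shares of total rather than absolute spend.
channels["spend_share_pct"] = (channels["spend"] / channels["spend"].sum() * 100).round(1)

# Technique 1 (indexing): express yearly totals relative to a base year = 100.
yearly_spend = pd.Series({2022: 1_250_000, 2023: 1_380_000, 2024: 1_500_000})
spend_index = (yearly_spend / yearly_spend.iloc[0] * 100).round(0)

# Technique 2: share ROI multiples and rankings, not revenue.
channels["roi_multiple"] = (channels["attributed_revenue"] / channels["spend"]).round(2)
channels["roi_rank"] = channels["roi_multiple"].rank(ascending=False).astype(int)

print(channels[["channel", "spend_share_pct", "roi_multiple", "roi_rank"]])
print(spend_index)
```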
Follow-Up and Action
Schedule Review Meeting
Within 1 Week: Present findings
Within 2 Weeks: Finalize recommendations
Within 1 Month: Begin implementation
Quarterly: Review results, update model
Document Decisions
Create Decision Log:
- What was recommended 
- What was decided 
- Rationale for any deviations 
- Expected vs actual results (track over time) 
Enable Self-Service
For Ongoing Questions:
- Share annotated Excel export 
- Provide one-page summary 
- Create FAQ document 
- Offer follow-up session 
Best Practices Summary
Tailor Content: Match detail level to audience technical sophistication
Lead with Insights: Business implications first, methodology second
Use Visuals: Charts communicate faster than tables
Tell a Story: Context → Findings → Insights → Recommendations → Impact
Be Transparent: Share limitations, assumptions, and uncertainty
Quantify Impact: Always translate findings into business metrics
Enable Action: Clear, specific, prioritized recommendations
Follow Through: Track implementation, measure results, iterate
Next Steps: Explore Model Reimport to reload models for future analysis, or review Excel Export Features for creating custom stakeholder reports.