Evaluation and Storytelling: Proving Worth and Sharing the Impact
You've planned, designed, and implemented. Now comes the question every funder, policymaker, and community member will ask: Did it work?
But proving worth isn't enough. You must also share the story—in ways that inspire continued support, inform policy decisions, and honor the communities you serve.
The Evaluation Imperative
Beyond Accountability
Evaluation serves multiple masters:
For Funders: Did the investment produce results?
For Policymakers: Should this be scaled or replicated?
For Practitioners: What worked and what didn't?
For Communities: Did this improve our lives?
For the Field: What can others learn?
Good evaluation answers all these questions. Great evaluation does so while building communities up rather than extracting from them.
The CDC Framework for Program Evaluation
A Systematic Approach
The CDC Framework provides structure without being overly rigid:
Six Steps:
1. Engage Stakeholders: Who needs to be involved? Who will use the findings?
2. Describe the Program: What is the program theory? (Connect to your logic model from Week 4.)
3. Focus the Evaluation: What questions matter most? What's feasible to measure?
4. Gather Credible Evidence: What data will convince skeptics?
5. Justify Conclusions: What standards determine success? How will you interpret findings?
6. Ensure Use and Share Lessons: How will findings be used? Who needs to know?
Evaluation Types
Process Evaluation:
- Were activities implemented as planned?
- Did we reach intended populations?
- What was the quality of implementation?
- What adaptations were made and why?
Outcome Evaluation:
- Did short-term outcomes change?
- Are changes attributable to the program?
- Were changes equitably distributed?
Impact Evaluation:
- Did long-term health outcomes improve?
- Were there unintended consequences?
- What was the return on investment?
Connecting to Logic Models
Your Week 4 logic model becomes your evaluation blueprint:
| Logic Model Component | Evaluation Question |
|-----------------------|---------------------|
| Inputs | Were resources sufficient and appropriate? |
| Activities | Were activities implemented with fidelity? |
| Outputs | Did we produce intended deliverables? |
| Short-term Outcomes | Did knowledge/skills change? |
| Medium-term Outcomes | Did behaviors change? |
| Long-term Outcomes | Did health status improve? |
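If your team tracks indicators programmatically, this crosswalk can also live as data, so every logic model component carries its evaluation question into your analysis plan. A minimal Python sketch (the mapping mirrors the table above; the checklist output is illustrative, not a prescribed format):

```python
# Encode the logic-model-to-evaluation crosswalk as a simple mapping.
# Keys and questions mirror the table above; printing a checklist is
# just one illustrative use.
evaluation_crosswalk = {
    "Inputs": "Were resources sufficient and appropriate?",
    "Activities": "Were activities implemented with fidelity?",
    "Outputs": "Did we produce intended deliverables?",
    "Short-term Outcomes": "Did knowledge/skills change?",
    "Medium-term Outcomes": "Did behaviors change?",
    "Long-term Outcomes": "Did health status improve?",
}

for component, question in evaluation_crosswalk.items():
    print(f"[ ] {component}: {question}")
```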
Data Visualization and Dashboards
Data Must Be Visible to Be Actionable
A 50-page evaluation report that no one reads fails its purpose. Visualization transforms data into insight.
Dashboard Design Principles
Clarity Over Cleverness:
- Choose the simplest chart that conveys the message
- Avoid 3D effects, unnecessary animation, and decoration
- Maximize the "data-ink ratio"—every pixel should convey information
Audience Awareness:
- Executives need dashboards, not spreadsheets
- Program staff need actionable detail
- Community members need accessible presentation
Real-Time When Possible:
- Monitoring dashboards for ongoing programs
- Outcome dashboards for periodic review
- Impact dashboards for summative evaluation
Common Visualization Mistakes
Chart Crimes to Avoid:
- Truncated axes that exaggerate differences
- Pie charts with too many slices
- Dual axes that confuse more than clarify
- Missing context (no baseline, no target)
- Color schemes inaccessible to colorblind viewers
Better Practices (see the sketch after this list):
- Start bar charts at zero
- Use small multiples instead of complex charts
- Provide context (what does "good" look like?)
- Test visualizations with intended audiences
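To make these practices concrete, here is a minimal matplotlib sketch using illustrative data: a zero-baseline bar chart in a colorblind-safe color, with baseline and target lines supplying the "compared to what?" context:

```python
import matplotlib.pyplot as plt

# Illustrative data: screening rates by site (percent).
sites = ["Site A", "Site B", "Site C", "Site D"]
rates = [42, 55, 61, 48]

fig, ax = plt.subplots(figsize=(6, 3.5))
ax.bar(sites, rates, color="#0072B2")      # Okabe-Ito blue: colorblind-safe
ax.set_ylim(0, 100)                        # bars start at zero, full scale shown
ax.axhline(35, color="#999999", linestyle="--", label="Baseline (35%)")
ax.axhline(70, color="#D55E00", linestyle=":", label="Target (70%)")
ax.set_ylabel("Screening rate (%)")
ax.set_title("Screening coverage by site")
ax.legend(frameon=False)
for side in ("top", "right"):              # strip non-data ink
    ax.spines[side].set_visible(False)
plt.tight_layout()
plt.show()
```

Note the design choice: the baseline and target lines answer "what does good look like?" before the audience has to ask.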
The One-Page Dashboard
For executive audiences, distill to essentials:
Top Row: Key metrics with trend indicators
Middle: Progress toward objectives (process, outcome)
Bottom: Action items and decisions needed
If they want more, they can ask. But they won't read a report.
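For teams who prototype in code rather than a BI tool, here is a hedged sketch of that three-row layout using matplotlib's GridSpec. Every metric, objective, and value below is a placeholder:

```python
import matplotlib.pyplot as plt
from matplotlib.gridspec import GridSpec

fig = plt.figure(figsize=(8.5, 11))        # one printable page
gs = GridSpec(3, 3, figure=fig, height_ratios=[1, 2, 1])

# Top row: key metrics with trend indicators (placeholder values).
metrics = [("Reach", "1,240", "up 8%"),
           ("Retention", "76%", "down 3%"),
           ("Screenings", "812", "up 12%")]
for col, (name, value, trend) in enumerate(metrics):
    ax = fig.add_subplot(gs[0, col])
    ax.text(0.5, 0.6, value, ha="center", fontsize=24, weight="bold")
    ax.text(0.5, 0.25, f"{name} ({trend})", ha="center", fontsize=11)
    ax.axis("off")

# Middle: progress toward objectives, process and outcome (placeholders).
ax_mid = fig.add_subplot(gs[1, :])
objectives = ["Enroll 500 participants", "Train 20 CHWs",
              "Raise screening rate to 70%"]
progress = [0.88, 1.00, 0.64]              # fraction of target met
ax_mid.barh(objectives, progress, color="#0072B2")
ax_mid.axvline(1.0, color="#999999", linestyle="--")  # target line
ax_mid.set_xlim(0, 1.2)
ax_mid.set_title("Progress toward objectives")

# Bottom: action items and decisions needed (plain text panel).
ax_bot = fig.add_subplot(gs[2, :])
ax_bot.text(0.02, 0.9, "Decisions needed:\n- Approve Q3 outreach budget\n"
            "- Select site for cohort 2", va="top", fontsize=11)
ax_bot.axis("off")

fig.savefig("one_page_dashboard.pdf")
```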
Digital Storytelling for Advocacy
Beyond the Evaluation Report
Evaluation reports serve funders. Stories reach communities, policymakers, and the public.
"Data makes people think. Stories make people feel. People act on feeling, then justify with thinking."
Elements of Effective Digital Stories
The Human Element:
- Individual stories that represent broader patterns
- Real voices, not paraphrased summaries
- Specific details that create connection
The Data Element:
- Statistics that establish scale and significance
- Visualizations that make patterns visible
- Comparisons that create context
The Multimedia Element:
- Voice (authenticity and emotion)
- Image (visual evidence and connection)
- Music (emotional tone, cultural resonance)
Story Structure for Impact
The Challenge: What problem exists? Who faces it? Why does it matter?
The Solution: What was tried? How did it work? What made it different?
The Result: What changed? For whom? How do we know?
The Call: What should happen next? What can the audience do?
Ethical Storytelling
Consent and Control:
- Participants own their stories
- Informed consent for any sharing
- Right to review before publication
Representation:
- Avoid poverty porn and deficit narratives
- Show agency, not just vulnerability
- Balance individual stories with structural analysis
Attribution:
- Credit community contributions
- Share findings back to communities first
- Don't extract stories for external audiences alone
Capstone Integration
Assembling the Whole
After eight weeks, you have:
- Needs assessment data and problem statement
- Personas and journey maps
- Theory of change and logic model
- Intervention design and prototype feedback
- Implementation plan with Agile structure
- Budget and sustainability plan
- Evaluation framework
The capstone integrates these into a coherent proposal.
Internal Consistency Check
Does everything align?
| Component | Alignment Question |
|-----------|--------------------|
| Problem Statement | Does it match your assessment findings? |
| Logic Model | Does theory of change support causal claims? |
| Intervention | Does it address persona pain points? |
| Implementation | Is the Agile plan realistic for available resources? |
| Budget | Do line items support planned activities? |
| Evaluation | Do indicators map to logic model? |
Misalignment reveals gaps in thinking. Fix them before presenting.
The Narrative Thread
Your proposal tells a story:
Act 1 (Weeks 1-3): The problem is real, urgent, and solvable. We understand it deeply.
Act 2 (Weeks 4-5): Our solution is evidence-based, theory-driven, and community-designed.
Act 3 (Weeks 6-8): We can implement effectively, sustain over time, and prove our impact.
Every section should advance this narrative.
The Final Pitch
Simulation of Reality
The capstone pitch simulates real-world pressure:
- Boards of Health deciding resource allocation
- Foundations choosing between competing proposals
- Legislative committees considering policy changes
The skills you develop serve you throughout your career.
Pitch Structure (10 minutes)
Opening Hook (30 seconds):
- Compelling statistic or story
- Why this matters, why now
The Problem (2 minutes):
- Assessment findings
- Affected populations
- Cost of inaction
The Solution (3 minutes):
- Intervention overview
- Theory of change
- Evidence base
The Plan (2 minutes):
- Implementation approach
- Team and partners
- Timeline
The Ask (1 minute):
- Specific resources needed
- What success looks like
- Call to action
Q&A (remaining time):
- Anticipate tough questions
- Bridge to your key messages
- Show depth of knowledge
Handling Tough Questions
Budget questions: Know your numbers cold. "Our total ask is $X, with personnel at $Y..."
Evidence questions: Acknowledge limitations honestly. "The evidence is strongest for... We're still learning about..."
Sustainability questions: Show long-term thinking. "We've identified three potential pathways to sustainability..."
Scale questions: Be realistic. "We're starting with [scope] because... If successful, we could expand to..."
Common Pitch Mistakes
Too Much Detail: You have 10 minutes, not 60. Hit highlights; they'll ask if they want more.
Reading Slides: The audience can read. Add value beyond what's on screen.
Defensive Responses: Questions aren't attacks. "That's a great question" is almost always appropriate.
No Clear Ask: End with a specific request. "We're asking for $X to accomplish Y."
The Transformation Complete
What You've Become
Eight weeks ago, you were learning to plan programs. Now you are:
A Systems Thinker: Seeing connections, feedback loops, and leverage points
An AI-Augmented Analyst: Using technology to deepen and accelerate assessment
A Human-Centered Designer: Starting with empathy, designing with communities
A Logic Model Architect: Building clear pathways from inputs to impact
An Agile Implementer: Adapting to reality while maintaining direction
A Resource Mobilizer: Understanding money, sustainability, and advocacy
An Evaluator and Storyteller: Proving worth and sharing impact
This is modern public health practice.
The Ongoing Practice
Program planning isn't a course—it's a career. You'll use these skills:
- Every grant proposal
- Every community partnership
- Every program you manage
- Every system you try to change
The frameworks become second nature. The thinking becomes reflex.
The Impact Imperative
Remember why this matters:
Communities are waiting for solutions to problems that harm them daily. Policymakers need evidence to make better decisions. Resources are limited and must be allocated wisely.
Your ability to plan, implement, and evaluate effective programs—then tell their story compellingly—directly affects whether health improves.
This is the privilege and responsibility of public health practice.
Now go make health happen.
Continue Your Learning
This article is part of an 8-week course on Adaptive Program Planning in the Digital Age. Learn systems thinking, AI-augmented assessment, Human-Centered Design, and Agile implementation for modern public health practice.