Intervention Design and Prototyping: Building Evidence-Based Solutions

Learn to select and adapt evidence-based interventions, integrate behavioral theory, create rapid prototypes, conduct user testing, and finalize implementation-ready intervention plans.

You've assessed needs, designed with empathy, and built a logic model. Now it's time to design the actual intervention—the services, products, or programs that will create change.

This week focuses on building solutions that work: grounded in evidence, informed by theory, and validated through testing.

Selecting Evidence-Based Interventions

Not Starting from Scratch

The best interventions build on what already works. Evidence-Based Interventions (EBIs) have been rigorously tested and shown to produce positive outcomes in controlled settings.

Benefits of EBIs:

  • Known effectiveness (under certain conditions)
  • Established implementation protocols
  • Training materials often available
  • Easier to justify to funders
  • Evaluation frameworks exist

Limitations of EBIs:

  • May not fit local context
  • Developed for different populations
  • Can be resource-intensive
  • Rigid protocols may not serve all users

Finding Evidence-Based Interventions

Key repositories include:

General Public Health:

  • Community Guide (thecommunityguide.org)
  • Evidence-Based Cancer Control Programs (ebccp.cancercontrol.cancer.gov)
  • Substance Abuse and Mental Health Services Administration (SAMHSA) Evidence-Based Practices Resource Center

Specific Topics:

  • Diabetes Prevention Program Registry
  • HIV Prevention Research Synthesis
  • What Works Clearinghouse (education)

Evaluating Evidence Strength

Not all "evidence-based" claims are equal:

Strong Evidence:

  • Multiple randomized controlled trials
  • Consistent findings across studies
  • Demonstrated in diverse populations
  • Long-term follow-up showing sustained effects

Moderate Evidence:

  • One or more controlled studies
  • Quasi-experimental designs
  • Consistent observational evidence
  • Theory-based with logical outcomes

Emerging/Promising:

  • Limited studies showing positive results
  • Strong theoretical foundation
  • Successful pilot projects
  • Expert consensus

Adaptation vs. Fidelity

The core tension in EBI implementation:

Fidelity: Implementing exactly as designed to preserve effectiveness

Adaptation: Modifying to fit local context and population

The Balance:

  • Preserve "core components" that drive effectiveness
  • Adapt "adaptable elements" for local fit
  • Document all modifications
  • Evaluate whether adaptations affect outcomes

Identifying Core Components

Core components typically include:

  • Theoretical mechanisms (what produces change)
  • Key content elements (essential information/skills)
  • Dosage (amount of intervention needed)
  • Delivery method (how core content reaches users)

Adaptable elements typically include:

  • Language and cultural references
  • Examples and scenarios
  • Visual design and materials
  • Scheduling and logistics
  • Supplementary activities
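One lightweight way to honor "document all modifications" is a structured adaptation log that flags changes touching core components. A minimal sketch in Python (the record fields and helper are illustrative, not from any standard adaptation framework):

```python
from dataclasses import dataclass

# Illustrative record for logging one EBI modification.
@dataclass
class Adaptation:
    element: str        # what was changed
    change: str         # how it was changed
    rationale: str      # why the change fits the local context
    touches_core: bool  # does it alter a core component?

def flag_risky(adaptations):
    """Return adaptations that alter core components and so need
    extra scrutiny when evaluating outcomes."""
    return [a for a in adaptations if a.touches_core]

log = [
    Adaptation("Examples and scenarios",
               "Swapped meal examples for local cuisine",
               "Cultural fit", touches_core=False),
    Adaptation("Dosage",
               "Cut sessions from 16 to 8",
               "Staffing limits", touches_core=True),
]
print([a.element for a in flag_risky(log)])  # only the dosage change
```

Keeping the log machine-readable makes it easy to report all modifications to funders and to check later whether flagged adaptations coincide with weaker outcomes.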

Integrating Behavioral Theory

Why Theory Matters

Theory explains why interventions work—the mechanisms that connect activities to outcomes. Understanding mechanisms allows for:

  • Smarter adaptations (preserve mechanisms)
  • Better troubleshooting (when things don't work)
  • More efficient evaluation (measure mediating variables)
  • Clearer communication (explain why, not just what)

Common Behavioral Theories

Health Belief Model. Key constructs: perceived susceptibility, severity, benefits, barriers, cues to action, self-efficacy

"If Maria believes she's at risk for diabetes (susceptibility), that diabetes is serious (severity), that prevention works (benefits), that she can overcome barriers (self-efficacy), and receives a trigger to act (cue), she'll participate in prevention."

Social Cognitive Theory. Key constructs: self-efficacy, outcome expectations, observational learning, behavioral capability, environment

"If Maria sees people like her succeeding at diabetes prevention (modeling), believes she can do it (self-efficacy), and has the skills and environment to support change (capability + environment), she'll adopt healthy behaviors."

Transtheoretical Model (Stages of Change). Stages: precontemplation, contemplation, preparation, action, maintenance

"Maria might be in contemplation stage—thinking about change but not ready to act. Intervention strategies should match her stage (increase motivation) rather than assume she's ready for action (provide skills training)."

Mapping Theory to Components

For each program component, identify:

| Component | Theory | Construct Targeted | Mechanism |
|-----------|--------|--------------------|-----------|
| Peer educator delivery | Social Cognitive | Observational learning | "People like me succeed" |
| Skills practice activities | Social Cognitive | Self-efficacy | Mastery experience |
| Family involvement | Social Cognitive | Environment | Social support |
| Goal-setting worksheets | Health Belief | Self-efficacy | Behavioral planning |

This mapping ensures program components have theoretical justification.
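A mapping like this can double as a planning check: any component absent from the map lacks a theoretical justification. A minimal sketch, with the map mirroring the table above and the extra component invented for illustration:

```python
# Component -> (theory, construct targeted), as in the mapping table.
theory_map = {
    "Peer educator delivery": ("Social Cognitive", "Observational learning"),
    "Skills practice activities": ("Social Cognitive", "Self-efficacy"),
    "Family involvement": ("Social Cognitive", "Environment"),
    "Goal-setting worksheets": ("Health Belief", "Self-efficacy"),
}

def unjustified(components, mapping):
    """List planned components with no recorded theoretical basis."""
    return [c for c in components if c not in mapping]

# "Weekly raffle prizes" is a hypothetical component with no mapping.
planned = list(theory_map) + ["Weekly raffle prizes"]
print(unjustified(planned, theory_map))
```

Components that surface here either need a construct assigned or a rethink about why they are in the program at all.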

Rapid Prototyping

From Abstract to Concrete

Rapid prototyping creates tangible representations of interventions quickly and cheaply. The goal isn't perfection—it's learning.

Prototype Purpose:

  • Test concepts before full development
  • Get user feedback early
  • Identify problems cheaply
  • Communicate ideas to stakeholders

Types of Prototypes

Paper Prototypes:

  • Sketched interfaces for apps
  • Draft brochures and materials
  • Storyboards for videos
  • Mock-ups of physical spaces

Role-Play Prototypes:

  • Scripted counseling sessions
  • Simulated service encounters
  • Walkthrough of program experiences

Service Blueprints:

  • Step-by-step service delivery maps
  • Backstage/frontstage activities
  • User journey through the service

Minimum Viable Product (MVP):

  • Simplest functional version
  • Just enough to test core hypothesis
  • Designed for learning, not scale

Prototype Principles

Low-fidelity first: Start rough, refine later

  • Paper before digital
  • Scripts before production
  • Sketches before designs

Test early, test often: Don't wait for perfection

  • Prototype → Test → Learn → Iterate
  • Multiple small tests beat one big launch

Embrace failure: Prototypes are meant to reveal problems

  • Problems found in testing are successes
  • Cheap failures now prevent expensive failures later

Example: Prototyping a Diabetes Prevention Class

Paper Prototype:

  • Sketched curriculum outline
  • Draft lesson plans with timing
  • Mock participant materials

Role-Play Prototype:

  • Facilitator walks through lesson
  • Volunteer participants react
  • Observers note issues

Service Blueprint:

  • Registration process mapped
  • Classroom setup documented
  • Follow-up touchpoints specified

MVP:

  • Single class session delivered
  • Real participants recruited
  • Full feedback collected

User Testing and Feedback Loops

"Think Aloud" Testing

Users interact with prototypes while narrating their thoughts:

Instructions:

"I'm going to show you a draft of our program materials. Please tell me what you're thinking as you look at them—what makes sense, what's confusing, what you like, what concerns you. There are no wrong answers."

What to Listen For:

  • Confusion points
  • Emotional reactions
  • Assumptions and interpretations
  • Suggested improvements
  • Enthusiasm or resistance

The Feedback Grid

Organize feedback into four categories:

| Likes (What worked) | Criticisms (What didn't work) |
|---------------------|-------------------------------|
| "I like that it shows people like me" | "The times don't work for my schedule" |
| "The language is clear" | "Too much reading" |

| Questions (What's unclear) | Ideas (What could be better) |
|----------------------------|------------------------------|
| "How long is each session?" | "Could we do this online?" |
| "What if I miss a class?" | "Add recipes we could try" |

Prioritizing Feedback

Not all feedback is equal. Prioritize based on:

Frequency: Do multiple users raise the same issue?

Severity: Does this prevent use entirely or just reduce satisfaction?

Alignment: Does this feedback align with other evidence?

Feasibility: Can we address this within constraints?
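The four criteria can be combined into a simple weighted score for ranking feedback items. A minimal sketch, assuming each criterion is rated 1-5; the weights and example items are illustrative, not evidence-based:

```python
def priority_score(item, weights=(0.4, 0.3, 0.2, 0.1)):
    """Weighted sum over frequency, severity, alignment, feasibility
    (each rated 1-5). Higher score = address first."""
    keys = ("frequency", "severity", "alignment", "feasibility")
    return sum(w * item[k] for w, k in zip(weights, keys))

# Hypothetical feedback items from prototype testing.
feedback = [
    {"issue": "Session times conflict with work", "frequency": 5,
     "severity": 4, "alignment": 4, "feasibility": 3},
    {"issue": "Prefer a different font", "frequency": 1,
     "severity": 1, "alignment": 2, "feasibility": 5},
]
ranked = sorted(feedback, key=priority_score, reverse=True)
print(ranked[0]["issue"])  # the scheduling conflict ranks first
```

The exact weights matter less than making the prioritization explicit, so the team can debate the weights rather than argue item by item.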

Iteration Cycles

After testing:

  1. Synthesize feedback
  2. Identify priority changes
  3. Modify prototype
  4. Test again

Repeat until:

  • Core issues are resolved
  • Diminishing returns on changes
  • Ready for pilot implementation
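The cycle and its stopping conditions can be sketched as a loop. This is a toy model, not a real testing harness: `test_round` stands in for an actual user-testing session and simply returns the issues found that round.

```python
def iterate(prototype, test_round, max_rounds=5):
    """Prototype -> test -> learn -> iterate, stopping when a round
    surfaces no new issues (core issues resolved / diminishing
    returns) or the round budget runs out."""
    for round_no in range(1, max_rounds + 1):
        issues = test_round(prototype, round_no)
        if not issues:
            break  # nothing new found: ready for pilot implementation
        # "Modify prototype": record a fix for each issue found.
        prototype = prototype + [f"fix:{i}" for i in issues]
    return prototype, round_no

# Simulated testing: one issue in rounds 1-2, none from round 3 on.
fake_test = lambda proto, r: ["confusing intro"] if r < 3 else []
final, rounds = iterate([], fake_test)
print(rounds)  # the loop stops at round 3
```

In practice `max_rounds` is set by the timeline and budget; the point is that the exit criteria are decided before testing starts, not improvised mid-cycle.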

Finalizing the Intervention Plan

Documentation Requirements

The final intervention plan includes:

Scope and Sequence:

  • What content is delivered
  • In what order
  • Over what timeframe

Delivery Specifications:

  • Who delivers (qualifications needed)
  • Where delivery occurs
  • How delivery is structured

Materials and Resources:

  • Participant materials
  • Facilitator guides
  • Equipment and supplies

Quality Assurance:

  • Fidelity monitoring protocols
  • Competency standards for deliverers
  • Participant assessment tools

Alignment Verification

Before finalizing, verify alignment:

  • [ ] Intervention addresses needs assessment findings
  • [ ] Activities match persona journey pain points
  • [ ] Components target theory constructs
  • [ ] Outputs connect to logic model outcomes
  • [ ] Resources are sufficient for planned activities
  • [ ] Prototype testing validated core assumptions

The Implementation Blueprint

The finalized plan serves as the blueprint for:

  • Training program staff
  • Procuring resources
  • Recruiting participants
  • Monitoring implementation
  • Evaluating outcomes

Next week: Taking this blueprint into Agile implementation.


Adaptive Program Planning in the Digital Age

This is Week 5 of an 8-week course. Learn systems thinking, AI-augmented assessment, Human-Centered Design, and Agile implementation for modern public health practice.
