6. The Research Design Ecosystem
Before you start
- Module 1: paradigms and ways of knowing
- Familiarity with database search basics
- Awareness that 'gray literature' exists outside peer-reviewed journals
By the end you'll be able to
- Reframe 'literature review' as boundary scanning and knowledge synthesis
- Search effectively across disciplinary silos
- Integrate gray literature and non-traditional sources
- Honor methodological diversity in synthesis
- Identify where disciplinary databases will fail your problem
From "literature review" to boundary scanning
The phrase "literature review" frames the task as bounded: find what's been published in your area, summarize it, identify a gap. For mono-disciplinary research this works. For boundary-crossing work it doesn't, because the relevant evidence lives in places that don't show up in "your area."
A better frame: boundary scanning — the active practice of looking across disciplinary, methodological, and epistemic boundaries to assemble the evidence base your problem actually needs. The work isn't reviewing; it's hunting.
This shift has practical consequences for how you search, what counts as a source, and how you synthesize.
Searching across disciplinary silos
Disciplinary databases reflect disciplinary boundaries, not problem boundaries. PubMed catches biomedical work and a slice of public health. PsycINFO catches psychology. ERIC catches education. Sociological Abstracts catches sociology. Web of Science and Scopus cast a broad net but still miss pockets of the literature.
For a transdisciplinary problem, your search strategy should map which databases your problem's evidence lives in, then search them all. A reasonable starting set for a public-health-and-social-policy problem might be PubMed + Web of Science + Sociological Abstracts + ERIC + a country-specific government repository.
The corollary: a paper that says "we searched PubMed" without justification is a paper that picked a discipline silently. Justification means naming why the chosen databases cover the problem space.
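Cross-silo searching also means the same phenomenon carries different names in different databases, so queries should be built from explicit synonym sets rather than a single discipline's vocabulary. A minimal sketch of that idea in Python — the helper function and the term lists are illustrative assumptions, not a vetted vocabulary or any database's actual query syntax:

```python
# Build a Boolean query from cross-discipline synonym sets.
# Synonyms within a concept are OR-ed; concepts are AND-ed.
# Term lists are illustrative examples only.
def boolean_query(concepts):
    def quote(term):
        # Multi-word terms are quoted so databases treat them as phrases.
        return f'"{term}"' if " " in term else term
    groups = ["(" + " OR ".join(quote(t) for t in terms) + ")"
              for terms in concepts]
    return " AND ".join(groups)

# The same phenomenon, named differently across disciplines:
query = boolean_query([
    ["youth violence", "juvenile delinquency", "adolescent aggression"],
    ["community intervention", "violence interruption", "prevention program"],
])
print(query)
```

The point is not the code but the discipline it enforces: every disciplinary name for the phenomenon appears explicitly, so the synonym list itself becomes an auditable part of the search record.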
Gray literature: the evidence that journals don't carry
A surprising amount of the most relevant evidence on applied problems lives outside peer-reviewed journals:
- Government reports — state-level health departments, federal agency analyses, regulatory dockets
- NGO and think-tank papers — RAND, Urban Institute, Mathematica, Brookings, local foundations
- Conference proceedings — often the most current methodological work
- Practice guidelines — clinical society guidelines, professional standards
- Theses and dissertations — frequently the only source for in-depth study of niche populations
- Community-produced reports — community needs assessments, advocacy organization white papers
Excluding gray literature is not rigor; it is a method choice with consequences. The standard transdisciplinary search includes at least one purposive sweep of gray sources, documented for audit.
Honoring methodological diversity in synthesis
The traditional systematic review treats studies as comparable units to pool. This works for narrow biomedical questions and breaks down quickly for transdisciplinary problems, where studies span paradigms, methods, and populations.
Two alternatives:
- Integrative review (Whittemore & Knafl, 2005): explicitly admits diverse study designs and synthesizes across them. Useful for emerging fields.
- Critical interpretive synthesis (Dixon-Woods et al., 2006): treats synthesis as an interpretive act, producing a theoretical contribution rather than a summary.
The key shift: synthesis is claim-making, not stacking. The product of a transdisciplinary synthesis is a position about how the evidence fits together, not a table of summaries.
When disciplinary databases will fail your problem
Some clues that a database-driven strategy won't be enough:
- The problem has a strong practice or community dimension that academic literature treats only thinly
- The relevant evidence is recent enough that peer review hasn't caught up (12–24 months for fast-moving fields)
- The phenomenon is named differently across disciplines and your search terms don't capture all the names
- The most relevant work has been done by graduate students whose dissertations weren't published in journals
- The community holding the knowledge has been historically excluded from academic study
When these signals appear, supplement database search with citation chaining, expert outreach, and community partnership. Treat the search itself as a research design step, not a preliminary.
Building the search as a research design move
A transdisciplinary search isn't a literature-review checklist; it's a design decision documented with the rest of the protocol:
- Problem map — what disciplines, populations, and evidence types are relevant?
- Database set — which databases are needed, and why?
- Search terms — including cross-discipline synonyms and idioms-of-distress equivalents
- Gray-literature plan — which non-academic sources, and how identified?
- Practitioner/community knowledge plan — what knowledge cannot be found in any database, and how will you elicit it?
- Synthesis approach — integrative review, critical interpretive synthesis, narrative synthesis, or other
Documenting these decisions makes the search auditable and treats it as part of the research design rather than a preliminary.
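The decisions above can be captured in a simple structured record so that each choice carries its justification. A sketch assuming a plain Python dictionary — the field names and contents are illustrative, not a standard protocol schema:

```python
# An auditable record of search-as-design decisions.
# Field names and contents are illustrative assumptions, not a standard schema.
search_protocol = {
    "problem_map": "community responses to youth violence; "
                   "public health, sociology, criminology, education",
    "databases": {
        # Each database choice names why it covers part of the problem space.
        "PubMed": "clinical and public-health literature",
        "Sociological Abstracts": "community sociology",
        "Criminal Justice Abstracts": "criminology",
        "ERIC": "school-based interventions",
    },
    "search_terms": ["youth violence", "juvenile delinquency",
                     "violence interruption"],
    "gray_literature": ["city violence-prevention office reports",
                        "community-led needs assessments"],
    "community_knowledge": "structured interviews with violence interrupters, "
                           "school counselors, faith-based program leaders",
    "synthesis": "critical interpretive synthesis",
}

# A database listed without a justification fails the audit.
assert all(search_protocol["databases"].values())
```

Whether it lives in a protocol document, a spreadsheet, or a data structure like this, the record is what makes the search auditable: a reader can check the published paper against the map.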
A worked example
A research team studying community responses to youth violence is preparing to launch an intervention study. Their boundary-scanning strategy:
- Database set: PubMed (clinical/public health), Sociological Abstracts (community sociology), Criminal Justice Abstracts (criminology), ERIC (school-based interventions)
- Gray literature: City Office of Violence Prevention reports, local foundation evaluations, two community-led needs assessments
- Practitioner knowledge: structured interviews with violence interrupters, school counselors, faith-based program leaders
- Synthesis: critical interpretive synthesis, producing a theoretical claim about which intervention features cross successfully across community contexts
Each of these is documented in their protocol. The published paper that emerges can be audited against this map.
Closing
"Literature review" is the wrong frame for boundary-crossing work. Boundary scanning treats search as a research design step, draws on multiple disciplinary databases, includes gray literature, and admits practitioner and community knowledge. Synthesis is claim-making, not summary-stacking.
Next: literature synthesis as boundary work — appraising across paradigms and writing reviews that take positions.
Common mistakes
These are the traps learners hit most often on this topic. Knowing them in advance is half the fix.
Searching one or two databases and calling it comprehensive
PubMed alone misses sociology. ERIC alone misses health policy. A transdisciplinary search starts with the question, maps which disciplinary databases the answer hides in, and searches all of them — plus gray literature.
Treating gray literature as a footnote
Government reports, NGO white papers, community newsletters, and conference proceedings often hold the most current evidence on applied questions. Excluding them is not rigor; it is a method choice with consequences.
Synthesizing by comparison only
Stacking findings side-by-side is comparison, not synthesis. Synthesis means proposing an integrative claim that no single source could make alone.
Practice problems
Try each on paper first. Click Show solution only after you've made a real attempt.
- Problem 1: For a problem you care about, list three disciplinary databases you'd search and one gray-literature source.
Show solution
Example for adolescent mental health policy: PsycINFO (psychology), Web of Science (broad), Sociological Abstracts (sociology), plus state-level Department of Education reports for gray literature. The gray-literature source is often where the policy variation actually shows up.
- Problem 2: Identify one piece of indigenous, community, or practitioner knowledge relevant to your problem that no database will surface.
Show solution
This exercise is about admitting that the formal record is partial. Knowledge that lives in practice is reached through interviews, observation, or community partnership — not through Boolean search. Naming it in your design is rigor, not concession.
Practice quiz
- Question 1: What is the strongest argument for searching beyond a single disciplinary database?
- Reflection 2: Give one example of a synthesis claim that integrates findings from two disciplines.
Lesson 6 recap
- Literature review is the wrong frame for boundary-crossing work
- Knowledge synthesis is integrative claim-making, not stacking summaries
- Multiple databases and gray literature are the floor, not the ceiling
- Practitioner and community knowledge require elicitation, not search
Coming next: Lesson 7 — Literature Synthesis as Boundary Work
- You'll see how to appraise across paradigms, not just within one
- We focus on writing reviews that build bridges rather than fortify silos