How to Build a Search Strategy for Your Literature Review
A search strategy is the engine of a literature review. If it is weak, no amount of downstream screening or synthesis will save the review. Fortunately, building a strong strategy is a teachable skill: it is roughly 20% creativity and 80% careful bookkeeping. This article walks through the method librarians use, pared down to what a student or first-time reviewer actually needs.
Start with a well-scoped question
Before you touch a database, write your question in a structured format. For intervention questions, use PICO (Population, Intervention, Comparator, Outcome). For qualitative questions, use PICo (Population, phenomenon of Interest, Context). For scoping reviews, use PCC (Population, Concept, Context). See our search strategy process page for worked examples.
A vague question produces a vague search. "How does stress affect students?" is not a searchable question. "Does mindfulness-based stress reduction, compared with usual care, reduce anxiety in undergraduate students?" is.
Map your concepts
Break the question into 2–4 concepts. For the mindfulness example:
- Population: undergraduate students
- Intervention: mindfulness-based stress reduction
- Outcome: anxiety
Under each concept, list every synonym, acronym, spelling variant, and related term you can think of. Pull terms from:
- Two or three known seminal papers (read their titles, abstracts, keywords)
- Database thesauri (MeSH for PubMed, Emtree for Embase, APA Thesaurus for PsycINFO)
- Existing reviews on nearby topics
Do not skimp here. Missing a synonym — say, "undergraduate" without "college student" — can drop 30% of relevant records.
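To make this concrete, a partial synonym map for the mindfulness example might look like the following (the terms are illustrative, not exhaustive; build your own map from the sources above):
- undergraduate students: undergraduate*, "college students", "university students", "higher education"
- mindfulness-based stress reduction: "mindfulness-based stress reduction", MBSR, mindfulness*, "mindfulness training", meditat*
- anxiety: anxiety, anxious*, "state anxiety", "anxiety symptoms"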
Combine with Boolean operators
Within a concept, combine synonyms with OR (union). Across concepts, combine with AND (intersection). Use NOT sparingly — it silently excludes studies that mention the excluded term incidentally.
Use quotation marks for exact phrases ("mindfulness-based stress reduction"), truncation for word endings (student* catches student, students, studentship), and adjacency operators (NEAR/3, W/3) where supported. See our Boolean operators guide for practical examples.
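Putting the pieces together, a first draft of the mindfulness strategy might look like this (generic syntax; the truncation symbol, phrase quoting, and adjacency operators all vary by platform, so adjust before running):

    ("mindfulness-based stress reduction" OR MBSR OR mindfulness*)
    AND (undergraduate* OR "college students" OR "university students")
    AND (anxiety OR anxious*)

Each parenthesised line is one concept: OR widens the net within a concept, AND narrows it across concepts.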
Combine free text with controlled vocabulary
Most major subject databases have a thesaurus:
- PubMed: MeSH (Medical Subject Headings)
- Embase: Emtree
- PsycINFO: APA Thesaurus of Psychological Index Terms
- CINAHL: CINAHL Subject Headings
Always search both the controlled vocabulary term and the free-text terms, combined with OR within the concept. Controlled vocabulary catches records indexed to the concept; free text catches recent or not-yet-indexed records and international terminology. See our MeSH terms guide for PubMed specifics and the PubMed database guide for platform mechanics.
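For example, in PubMed the anxiety concept could pair the MeSH heading with free-text terms restricted to titles and abstracts (a sketch; check the current MeSH entry and add further text words for your own topic):

    "Anxiety"[Mesh] OR anxiety[tiab] OR anxious*[tiab]

The [Mesh] term retrieves records indexed to the concept; the [tiab] terms pick up records that are too new to have been indexed or that use wording the indexers did not.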
Translate across databases
You will search at least two databases — usually three to five. Each database has its own syntax. MeSH does not exist in Scopus. Truncation symbols differ. Field tags differ.
Build the strategy in the database you know best, then translate. Tools like the Polyglot Search Translator (Bond University) can bootstrap translations, but always hand-check. Document every translation; reviewers will ask.
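As an illustration, here is one way the mindfulness concept might be written for three platforms (a sketch only; verify field codes and thesaurus terms against each platform's current documentation):

    PubMed: "Mindfulness"[Mesh] OR "mindfulness-based stress reduction"[tiab] OR MBSR[tiab]
    Embase: 'mindfulness'/exp OR 'mindfulness based stress reduction':ti,ab OR mbsr:ti,ab
    Scopus: TITLE-ABS-KEY("mindfulness-based stress reduction" OR mbsr OR mindfulness*)

Note what changes: MeSH versus Emtree versus no thesaurus at all, double versus single quotation marks, and different field tags for title and abstract searching.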
Apply limits cautiously
Limit by:
- Language only if you have no translation capacity (and justify it as a limitation)
- Date only if the intervention or concept did not exist before a certain year
- Publication type to exclude editorials, letters, and commentaries — but be careful with conference abstracts
Do not limit by study design using database filters alone; they are imperfect. Apply design criteria at screening.
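As an example of a defensible date limit: MBSR was developed in 1979, so restricting the mindfulness search to 1979 onward is justifiable, provided you report it. In PubMed this could be appended as a publication-date range (a sketch; #4 is assumed to be the search-history line that combines all three concepts):

    #4 AND 1979:3000[dp]

State the justification ("the intervention did not exist before 1979") in your methods section, not only in the search log.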
Document everything
For each database, record:
- Database name and platform (e.g., "Ovid MEDLINE")
- Date searched
- Full search string as run
- Filters applied
- Number of results
Use our search strategy documentation template or the PRISMA-S extension (Rethlefsen et al., 2021) as your template. Reporting the full search strategy for every database is item 7 of the PRISMA 2020 checklist (Page et al., 2021), and it is non-negotiable.
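An illustrative documentation entry (all values here are hypothetical placeholders):

    Database and platform: Ovid MEDLINE ALL (1946 to present)
    Date searched: 15 March 2025
    Search string: the full strategy, line by line, exactly as run
    Filters applied: English language; publication year 1979 to current
    Number of results: 1,248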
Get the search peer-reviewed
Use the PRESS (Peer Review of Electronic Search Strategies) checklist (McGowan et al., 2016) to have a second librarian or reviewer check:
- Are all concepts represented?
- Are synonyms comprehensive?
- Are operators and parentheses correct?
- Are limits appropriate?
A 30-minute peer review catches most common errors, and it is far cheaper than a day spent re-running a flawed search.
Run, de-duplicate, and export
After running all searches, export the records (RIS or EndNote format) into a reference manager or Covidence. De-duplicate with a published method; Bramer's EndNote method (Bramer et al., 2016) is the current gold standard. Expect 20–40% duplicates across databases; this is normal.
Then the real work begins: screening.