Review Process

Search Strategy

Build a reproducible literature search strategy with PICO framing, database selection, Boolean operators, MeSH terms, grey literature, and PRISMA-S documentation.

By Angel Reyes

Tools

  • PubMed
  • Scopus
  • CINAHL
  • Web of Science
  • PsycINFO
  • ERIC
  • Cochrane Library

Phase 1: Design a reproducible search strategy

A literature review is only as trustworthy as the search that feeds it. The search strategy phase is where you translate a research question into a reproducible protocol of databases, controlled vocabulary, free-text terms, Boolean logic, and documentation. Done well, another researcher — or a peer reviewer — can rerun your search and arrive at the same result set. Done poorly, your review is impossible to update and vulnerable to selection bias. This phase feeds directly into the systematic review guide and the scoping review guide, and its outputs drive every downstream decision in Phase 2 — Screening.

1. Frame the question with PICO or PEO

Before touching a database, express your question in a structured framework. Quantitative reviews typically use PICO (Population, Intervention, Comparator, Outcome). Qualitative reviews often use PEO (Population, Exposure, Outcome) or SPIDER (Sample, Phenomenon of Interest, Design, Evaluation, Research type). Scoping reviews use PCC (Population, Concept, Context).

The framework forces you to name each concept explicitly. Each concept becomes a "block" of synonyms in your search string, and each block is combined with OR; the blocks are then combined with AND. A well-framed question typically produces three to four concept blocks. More than five often indicates an over-specified question that will miss relevant studies.
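
The block logic can be sketched in a few lines of Python. This is an illustration of the OR-within, AND-between composition, not a real database API; the block names and terms are placeholders:

```python
# Sketch: OR the synonyms within each concept block, then AND the blocks.
# Block names and terms are illustrative placeholders.
blocks = {
    "population": ['"type 2 diabetes"', '"T2DM"'],
    "intervention": ["telehealth", "telemedicine", "mHealth"],
    "outcome": ['"glycemic control"', '"HbA1c"'],
}

def build_query(blocks: dict) -> str:
    grouped = ["(" + " OR ".join(terms) + ")" for terms in blocks.values()]
    return " AND ".join(grouped)

print(build_query(blocks))
# ("type 2 diabetes" OR "T2DM") AND (telehealth OR telemedicine OR mHealth)
#   AND ("glycemic control" OR "HbA1c")
```

Keeping the blocks in a structure like this also makes it easy to regenerate the string after adding a synonym during piloting.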

Document your framework in the Search Strategy Documentation Form (download from the templates library). Reviewers and librarians will expect to see it.

2. Select databases

No single database indexes the full biomedical, social-science, and grey literature. PRISMA 2020 and the Cochrane Handbook both recommend searching at least two subject-relevant bibliographic databases, plus trial registries and grey literature for intervention reviews. Common choices:

  • PubMed / MEDLINE — biomedical and life sciences; free; uses MeSH controlled vocabulary. See our PubMed search tutorial.
  • CINAHL — nursing and allied health; uses CINAHL subject headings. See our CINAHL search tutorial.
  • Scopus — multidisciplinary; strong citation tracking; no controlled vocabulary. See our Scopus search tutorial.
  • Web of Science — multidisciplinary; citation indexes back to 1900; strong for bibliometrics.
  • PsycINFO — psychology and behavioural sciences; uses APA Thesaurus.
  • ERIC — education research; uses ERIC Thesaurus.
  • Cochrane Library — controlled trials (CENTRAL), systematic reviews (CDSR), methods studies.

Match databases to your question. A review of school-based mental health interventions should cover ERIC, PsycINFO, and CINAHL; a drug efficacy review should cover PubMed, Embase, and CENTRAL.

3. Map controlled vocabulary

Each major database maintains a controlled vocabulary (a thesaurus) that indexers apply to every record. Using these terms captures conceptually related articles even when authors' keywords vary.

  • MeSH in PubMed — e.g., "Diabetes Mellitus, Type 2"[MeSH].
  • Emtree in Embase — often more granular than MeSH (e.g., drug names, devices).
  • CINAHL Subject Headings — similar to MeSH but adapted for nursing contexts.
  • APA Thesaurus of Psychological Index Terms in PsycINFO.
  • ERIC Thesaurus descriptors.

Check whether to explode a term (include narrower terms under it) and whether to restrict to major topic only. Explode by default for scoping reviews; restrict to major topic only when precision matters more than recall.

4. Combine controlled vocabulary with free text

Controlled vocabulary alone misses recent articles (not yet indexed), conference abstracts, and records from databases without a thesaurus. Pair each controlled term with a free-text (title/abstract) search of synonyms and spelling variants. Use truncation (*), phrase searching ("..."), and proximity operators (NEAR/n, ADJn) where the database supports them.
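
As an analogy only (databases implement truncation internally, not via regex), a truncated term like diabet* behaves like a word-prefix match:

```python
import re

# Approximate the truncation "diabet*" with a word-prefix regex.
pattern = re.compile(r"\bdiabet\w*", re.IGNORECASE)

titles = [
    "Diabetes management via telehealth",   # matches: "Diabetes"
    "Diabetic foot ulcer care",             # matches: "Diabetic"
    "Cardiac rehabilitation outcomes",      # no match
]
hits = [t for t in titles if pattern.search(t)]
print(hits)  # first two titles only
```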

5. Write Boolean search strings

PubMed: ("Diabetes Mellitus, Type 2"[MeSH] OR "type 2 diabetes"[tiab] OR "T2DM"[tiab]) AND ("Telemedicine"[MeSH] OR "telehealth"[tiab] OR "mHealth"[tiab])
Scopus: TITLE-ABS-KEY(("type 2 diabetes" OR "T2DM") AND (telehealth OR telemedicine OR mHealth))
CINAHL: ((MH "Diabetes Mellitus, Type 2") OR TI "type 2 diabet*" OR AB "type 2 diabet*") AND ((MH "Telehealth") OR TI telehealth OR AB telehealth)

Operator precedence is not consistent across platforms — PubMed, for example, processes operators strictly left to right, while other engines apply their own precedence rules — so never rely on it: always wrap each OR-joined synonym block in parentheses to make the grouping explicit.
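
A toy illustration of why the parentheses matter, using Python booleans as stand-ins for "this record mentions this term" (in Python, as in many query languages, AND binds tighter than OR):

```python
# A record that mentions only "diabetes", with no telehealth term.
mentions_diabetes, mentions_t2dm, mentions_telehealth = True, False, False

# Intended query: (diabetes OR t2dm) AND telehealth
grouped = (mentions_diabetes or mentions_t2dm) and mentions_telehealth

# Ungrouped query: AND binds first, so the engine reads it as
# diabetes OR (t2dm AND telehealth)
ungrouped = mentions_diabetes or (mentions_t2dm and mentions_telehealth)

print(grouped)    # False: record correctly excluded
print(ungrouped)  # True: off-topic record slips into the results
```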

6. Include grey literature

Peer-reviewed databases miss dissertations, agency reports, conference papers, and unpublished trials. Reviews that rely only on published literature are vulnerable to publication bias. Search:

  • Google Scholar — broad coverage; screen the first 200–300 results per query.
  • ProQuest Dissertations & Theses Global — doctoral dissertations worldwide.
  • OpenGrey (archived) and GreyNet — European grey literature.
  • Agency websites — WHO, CDC, NICE, AHRQ, Cochrane, Campbell Collaboration, and the relevant government ministries for your topic.
  • Trial registries — ClinicalTrials.gov, WHO ICTRP, ISRCTN.
  • Preprint servers — medRxiv, PsyArXiv, SSRN, where appropriate.

Document exactly how you searched each grey source (URL, date, terms, number of hits screened) because most grey sources cannot be queried with Boolean precision.

7. Document everything with PRISMA-S

The PRISMA-S checklist (Rethlefsen et al., 2021) extends PRISMA 2020 with 16 items covering search reporting: databases, platforms, vendors, dates, limits, peer review of the strategy, deduplication method, and the full search strings for every source. Completing PRISMA-S as you go — not retrospectively — is the single most effective way to make your review reproducible.

Record at minimum (example values shown):

  • Database: PubMed
  • Platform/vendor: NLM, via pubmed.ncbi.nlm.nih.gov
  • Date of last search: 2026-03-14
  • Date range: No limit – 2026-03-14
  • Language limits: None
  • Records retrieved: 842
  • Full strategy: See Appendix A

Store these in the Search Strategy Documentation Form template.
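
One lightweight way to keep these fields machine-readable as you go is a running CSV log. This is a sketch under assumptions — the filename and field names below are ours, not part of PRISMA-S:

```python
import csv

# One row per database per search date (field names mirror the list above;
# "search_log.csv" is an assumed filename).
entry = {
    "database": "PubMed",
    "platform": "NLM, via pubmed.ncbi.nlm.nih.gov",
    "date_of_last_search": "2026-03-14",
    "date_range": "No limit - 2026-03-14",
    "language_limits": "None",
    "records_retrieved": 842,
    "full_strategy": "See Appendix A",
}

with open("search_log.csv", "a", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=list(entry))
    if fh.tell() == 0:          # new file: write the header once
        writer.writeheader()
    writer.writerow(entry)
```

Appending a row every time you rerun a search gives you both the PRISMA-S audit trail and the counts for the flow diagram.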

8. Peer review and pilot your strategy

Before committing, have a librarian or second reviewer evaluate your strategy using the PRESS (Peer Review of Electronic Search Strategies) checklist. PRESS catches missing synonyms, misplaced parentheses, wrong field tags, and accidental NOT exclusions.

Pilot-test by:

  1. Running your strategy on a seed set of 5–10 known relevant articles. Every seed article should be retrievable.
  2. Checking whether recall (did we find the seed articles?) and precision (what fraction of hits are plausibly on topic?) are acceptable. Scoping reviews tolerate lower precision; targeted systematic reviews need higher precision.
  3. Iterating. Add a missing synonym, loosen a field tag, or split a concept block and rerun.
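
The seed-set check in step 1 reduces to set arithmetic over record IDs (the IDs below are hypothetical):

```python
# 5 known-relevant seed articles vs. the IDs your strategy retrieved.
seeds = {"11111", "22222", "33333", "44444", "55555"}
retrieved = {"11111", "22222", "33333", "44444", "88888", "99999"}

recall = len(seeds & retrieved) / len(seeds)
missed = seeds - retrieved

print(f"seed recall: {recall:.0%}")   # seed recall: 80%
print("missed seeds:", missed)        # inspect these to find the gap
```

Pulling up the indexing of each missed seed usually reveals the synonym or subject heading your strategy lacks.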

9. Export, deduplicate, and hand off

Export records in RIS or NBIB format and import them into your reference manager (Zotero, EndNote, Mendeley) or screening tool (Covidence, Rayyan). Deduplicate using the tool's algorithm and document the duplicate count for your PRISMA flow diagram. The deduplicated set becomes the input to Phase 2 — Screening & Selection.
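
Screening tools apply their own, more sophisticated matching, but the core idea can be sketched as keying on DOI with a normalized-title fallback. The record shape here is an assumption, not the RIS/NBIB format or any tool's actual algorithm:

```python
import re

# Toy records: the second is a DOI-less duplicate of the first.
records = [
    {"doi": "10.1000/xyz1", "title": "Telehealth for Type 2 Diabetes"},
    {"doi": "", "title": "Telehealth for type 2 diabetes."},
    {"doi": "10.1000/abc9", "title": "A different study"},
]

def norm_title(title: str) -> str:
    # Lowercase and strip everything but letters and digits.
    return re.sub(r"[^a-z0-9]", "", title.lower())

def deduplicate(records):
    seen, unique = set(), []
    for rec in records:
        key = rec["doi"].lower() or norm_title(rec["title"])
        # Also track the title key for records that have a DOI, so a
        # DOI-less duplicate of the same paper is still caught.
        alt = norm_title(rec["title"])
        if key in seen or alt in seen:
            continue
        seen.update({key, alt})
        unique.append(rec)
    return unique

unique = deduplicate(records)
print(len(records) - len(unique), "duplicates removed")  # 1 duplicates removed
```

Whatever tool you use, record the before/after counts: PRISMA 2020 requires the number of duplicates removed in the flow diagram.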

Next phase

With a deduplicated, documented corpus in hand, move to Phase 2 — Screening & Selection →