Systematic Reviews and Meta-Analysis in Ecology

Matt

Why are we talking about Evidence Synthesis in an OpenScience course?

  • Evidence synthesis relies on the openness of primary research
  • Evidence synthesis is built on many Open Science principles, e.g. protocols and reporting standards
  • Evidence syntheses themselves are often insufficiently open

Stages of Evidence Synthesis (systematic review/map)

Question formulation

Often an iterative process – starts off broad and needs to be narrowed to be answerable:

  • Identify a question that is of greatest interest (to stakeholders, decision-makers etc.)
  • Identify a question that maximises cost-effectiveness
  • Identify a question that minimises confusion (avoid vague phrasing)

Question formulation

What type of questions can be asked?

  • What is the state of the evidence? “What research evidence is there that humans are exposed to and affected by AMR in the environment?”

  • What is the effect of an intervention/exposure on a population? “How do changes in flow magnitude due to hydroelectric power production affect fish abundance and diversity in temperate regions?”

  • How can we generalise the best available evidence to a larger population/spatial extent? e.g. greenspaces and human-health benefits

Question frameworks

  • PICO – Population, Intervention, Comparator, Outcome

    • What is the effectiveness of conversion to organic farming on pollinator performance?
  • PECO – Population, Exposure, Comparator, Outcome

    • What is the effect of neonicotinoid pesticides on pollinator performance?
  • PO – Population, Outcome(s)

    • What is the prevalence (ppm) of neonicotinoid pesticides in fresh water?

Question frameworks – Your turn

  • How does the size and density of kelp forest affect fisheries in the Pacific?

  • Are reintroduction programs effective for increasing populations of African wild dogs in South Africa?

  • What is the impact of flooding on abundance of trout in Chilean rivers?


Your question?

Protocol

CEE author guidelines

Example of a systematic review protocol

Searching

Where to find evidence - tips and tricks

True or False

  • I would go to Google Scholar first when searching for evidence
  • Web of Science is more comprehensive than Google Scholar
  • Web of Science is a transparent, repeatable resource
  • Grey literature is defined as anything unpublished and not peer-reviewed

Searches

  • Comprehensive, transparent, objective
  • Multiple academic databases
  • Tried and tested search string
  • Multiple sources of grey literature
  • Languages
  • Documented in detail – all decisions described and explained (see the search-string sketch below)
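A minimal sketch (base R only) of how a search string might be built and documented so the exact query can be reported verbatim and re-run; the terms below are illustrative placeholders, not a tested string:

```r
# Synonyms for each question concept (illustrative terms only)
population <- c("pollinator*", "bee*", "bumblebee*")
exposure   <- c("neonicotinoid*", "imidacloprid", "clothianidin")
outcome    <- c("abundance", "diversity", "performance")

# Combine synonyms with OR within a concept, then concepts with AND
block <- function(terms) paste0("(", paste(terms, collapse = " OR "), ")")
search_string <- paste(block(population), block(exposure), block(outcome),
                       sep = " AND ")

search_string
#> "(pollinator* OR bee* OR bumblebee*) AND (neonicotinoid* OR ...) AND (...)"

# Save the string with a date stamp so the search is documented
writeLines(c(paste("Search run:", Sys.Date()), search_string),
           "search_strategy.txt")
```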

Bibliographic databases

  • Web of Science (WoS)
  • Scopus
  • Agricola
  • AGRIS (FAO)
  • Academic Search Premier
  • Biological Abstracts
  • CAB Abstracts
  • Lens.org

  • CEE systematic reviews (2012–2017) searched a mean of 9 databases (range 2–75); aim to search 8–15 databases, mixing broad and subject-specific sources

Data management

  • Search results are data
  • Search strategy is metadata
  • Use reproducible workflows
  • File format – RIS files are useful for exchange between software (see the import sketch below)
  • Keep track of where files come from (which databases)
  • Use a file management or review management tool (e.g. Rayyan)
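As an illustration, a sketch of a reproducible import step, assuming the synthesisr package (which reads RIS exports into data frames); the file names and the crude title-based de-duplication rule are hypothetical placeholders:

```r
library(synthesisr)

# Exports from each database, named by source (illustrative file names)
files <- c(wos = "wos_export.ris", scopus = "scopus_export.ris")

# Read each export and record which database it came from (metadata!)
# Assumes both exports share the same fields after import
refs <- do.call(rbind, lapply(names(files), function(db) {
  x <- read_refs(files[[db]])
  x$source_db <- db
  x
}))

# Crude de-duplication on lower-cased titles; real reviews use fuzzier matching
refs <- refs[!duplicated(tolower(refs$title)), ]

write.csv(refs, "combined_search_results.csv", row.names = FALSE)
```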

Some extra resources

Citation chasing with R

Data management with R

Review management

Different types of Evidence synthesis

What is the effect of logging on biodiversity?

Rapid reviews

  • No agreed definition of a “rapid review”
  • Which corners can be cut (not an exhaustive list)?
    • Span of the literature (e.g. post-2010 to present)
    • “Study quality” (e.g. only include systematic reviews?)
    • Number of reviewers (e.g. 1 person to review and 1 to check a percentage of records)

Critical appraisal

This is the hard part that no one does!!

  • Checking both internal and external validity
  • Internal – are the conclusions of the paper based on sound causal inference?
  • External – does the paper really fit my question?

How do we do it?

  • All possible sources of bias, error or uncertainty
  • Are the outcome measures used appropriate/accurate?
  • Is a suitable comparator present?
  • How were treatments assigned?
  • What is the level of replication?
  • What is the study design?
  • Are baseline measures included?
  • Are there systematic differences between groups?
  • Are there potential confounding variables?

How do we do it?

  • Lots of tools (none of them 100% reusable!)
  • Checklists
  • Questions about validity (“yes”, “no” or “unclear”)
  • Have in mind what the “optimum” study would be
  • What is the overall validity? “High”, “Low” or “Unclear” (see the sketch below)
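One way to keep such a checklist transparent is to store the answers as data and derive the overall judgement with an explicit rule. A sketch in base R; the studies, questions and decision rule are all hypothetical:

```r
# Each row is a study; each column a validity question ("yes"/"no"/"unclear")
ca <- data.frame(
  study      = c("Smith 2015", "Lopez 2018", "Chen 2020"),
  comparator = c("yes", "yes", "unclear"),
  randomised = c("yes", "no",  "unclear"),
  replicated = c("yes", "yes", "no")
)

# Hypothetical rule: any "no" -> Low; any "unclear" -> Unclear; else High
overall_validity <- function(answers) {
  if (any(answers == "no"))      return("Low")
  if (any(answers == "unclear")) return("Unclear")
  "High"
}

ca$validity <- apply(ca[, -1], 1, overall_validity)
ca  # Smith 2015 -> High, Lopez 2018 -> Low, Chen 2020 -> Low
```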

Things to consider

  • Adapt existing tools
  • Provide a logical rationale for the approach
  • Use critical appraisal (CA) to inform the results of the synthesis
  • Assess validity, not “quality”
  • Judge the conduct of the science, NOT the reporting of the science (which is annoying but…)
  • Try to think about all possible threats to validity
  • This is hard – do not just copy the previous review (as no doubt it can be improved)

Your turn

Critical appraisal questions

  1. Are control and intervention populations appropriately matched? i.e. did the authors use well-justified and sensible ways to select comparable control and intervention groups?

  2. Do control/before and intervention/after differ only in terms of the intervention? i.e. are there any other factors present that might have caused the differences between control and intervention groups?

  3. Were the measured samples selected in a systematic manner that aims to avoid bias?

  4. Has sufficient time passed after the intervention to allow impacts to be felt?

  5. Was the study design strong and appropriate? i.e. was it a randomised controlled trial, was there a temporal comparator (before) AND spatial comparator (control), or just temporal OR spatial?

  6. Was the sample size sufficient? i.e. was there true replication?

Synthesis

Using evidence to inform decisions

Giving decision-makers only a pooled effect size, or a biased assessment of the evidence, should not be tolerated (looking at you, journal editors and reviewers)

  • What are the typical textual outputs of a synthesis (and which are useful)?
  • What are the typical visual outputs of a synthesis?
  • Where can we do much better?
  • How can we help in the decision process itself (with quantitative evidence)?

Typical Evidence Synthesis outputs

  • Narrative synthesis

  • Quantitative synthesis

  • Qualitative synthesis

Typical Evidence Synthesis outputs

  • Transparent

  • Minimise bias

  • Avoid vote-counting

  • Account for varying reliability of the evidence

DON'T DO THIS – vote counting
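A small simulation (using the metafor package) shows why: with many underpowered studies of a real effect, most individual tests are non-significant, so vote counting suggests “no effect” while the pooled estimate recovers it. The numbers are illustrative:

```r
library(metafor)

set.seed(42)
k  <- 15                              # number of studies
yi <- rnorm(k, mean = 0.3, sd = 0.2)  # observed effect sizes, true mean 0.3
vi <- rep(0.2^2, k)                   # within-study sampling variances

# Vote counting: how many studies are individually "significant"?
sum(abs(yi / sqrt(vi)) > 1.96)  # typically a minority -> "no effect"?

# Meta-analysis: the pooled estimate is clearly positive
rma(yi, vi)
```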

Report the synthesis

Forest plots

Show me the data!

Meta-regression

Meta-analysis

Let’s have a look at the meta_analysis.qmd file
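For orientation, a minimal sketch of the kind of workflow such a file might contain, using the metafor package; the input file and its columns (group means, SDs, sample sizes, and a latitude moderator) are hypothetical:

```r
library(metafor)

dat <- read.csv("extracted_study_data.csv")  # hypothetical extraction sheet

# 1. Compute a standardised effect size (Hedges' g) per study
dat <- escalc(measure = "SMD",
              m1i = mean_trt,  sd1i = sd_trt,  n1i = n_trt,
              m2i = mean_ctrl, sd2i = sd_ctrl, n2i = n_ctrl,
              data = dat)

# 2. Random-effects meta-analysis (REML estimate of between-study variance)
res <- rma(yi, vi, data = dat, method = "REML")
summary(res)

# 3. Forest plot: show every study's effect and weight, not just the pooled one
forest(res)

# 4. Meta-regression: does the effect vary with a study-level moderator?
res_mod <- rma(yi, vi, mods = ~ latitude, data = dat)
summary(res_mod)
```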

Not all evidence synthesis is created equally

We need to critically assess anything that is called a systematic review or meta-analysis.

Examples of poor conduct

Landscape ecology

A challenge - find the problem(s) with this study