About methodology & statistics in research proposals --- an information session

Wilfried Cools, Interfaculty Center Data processing & Statistics (ICDS)
wilfried.cools@vub.be --- https://www.icds.be/

Why and What

  • convince referees (and yourself):
    the study should be successful, effective, and efficient
  • some reviewers are statisticians
  • a document (pdf/html): 'how to communicate methodology in research proposals'
    • more aptly: 'methodological and statistical issues to communicate in a research proposal'
  • this info session discusses that document


  • will be updated and refined in the future; you can help us
  • does not claim full coverage
  • reflects our own views, not necessarily the views of other reviewers
  • to be used for guidance only, not as an argument or proof

Key Ingredients

  • what is the aim of your study
  • how is your study designed to achieve that aim


Research Aim

  • ask questions your research can provide answers to
    • if general, also operationalize them
  • focus: specify the main questions
  • specify the minimum results required for a successful study
    • interesting & obtainable
  • describe the expected results

  • aim partly determines justification
    • confirmatory / exploratory / preparatory / techn(olog)ical
    • quantitative / qualitative
    • inferential / descriptive


CONFIRMATORY studies (purpose A)

  • goal:: confirm an expected difference, relation, ...
  • focus:: statistical test or accurate parameter estimate
  • requirement::
    • sample size calculation
    • discuss costs and availability of observations
    • link research design and primary aim
    • statistical analysis plan
  • minimum:: significant effect / sufficient accuracy for primary hypotheses
  • note:: superiority versus non-inferiority and equivalence


EXPLORATORY studies (purpose B)

  • goal:: to explore
  • focus A:: data description and/or parameter estimation
    • testing/accuracy could only be a secondary aim
  • focus B:: qualitative understanding
  • focus C:: predictive modeling or numerical techniques (e.g., cluster analysis)
  • requirement::
    • sample size -justification-: argue the balance between information and cost
    • justify on substantive grounds, NOT on significance
    • link research design and most important information of interest
    • statistical analysis plan
  • minimum::
    • could be many things, as long as you can sufficiently argue its merit
    • note: the study should be of interest even without significant / sufficiently accurate results!


PREPARATORY studies (purpose C)

  • goal:: prepare or justify a future study
  • focus:: small-scale set-up to show the potential and/or detect issues
    • phase I and II clinical designs
      • requires decision criteria to proceed or not
    • pilot study: implementation of future study
      • no statistical testing
      • not in itself of interest, not intended for publication
      • could be (partially) qualitative
      • could be simply monitoring procedures (not a mini-copy)
  • requirement::
    • justify the need for a preparatory study & the value of the -future- studies
    • minimal cost to get a rough idea (e.g., 3 observations per condition)
  • minimum:: ensures the information required for the future study

TECHN(OLOG)ICAL advancements (purpose D)

  • goal:: to design, engineer, create, ...
  • focus:: rarely any statistics involved
  • requirement:: methodology not related to quantitative research
    • justification based on expected contribution versus costs, not statistics
      • proof of concept: feasibility
      • proof of principle: functionality
      • development application
  • minimum:: particular state of advancement, improvement

Quantitative versus Qualitative

  • quantitative
    • focus on quantifiable empirical aspects
    • typically reduces complexity (operationalize to manage)
    • can be descriptive and/or inferential

  • qualitative
    • focus on understanding: reasons, opinions, motivations, ...
    • typically aims at embracing complexity
    • only descriptive, exploratory / hypothesis generating

Descriptive versus Inferential

  • infer: 'population'
    • focus on population, aim at generalization
    • uses (ideally) representative samples
    • requires estimation of uncertainty, or use of it

  • describe: 'sample'
    • focus on the observed data
    • present data as is
    • uses no estimates of uncertainty, nor p-values
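
To make the contrast concrete, the sketch below (with made-up data) first describes the sample, then adds the inferential step: a normal-approximation 95% confidence interval that generalizes to the population.

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

data = [10, 12, 9, 11, 13, 10, 12, 11]  # hypothetical sample

# descriptive: present the observed data as is
m, sd = mean(data), stdev(data)

# inferential: quantify uncertainty to generalize to the population
z = NormalDist().inv_cdf(0.975)   # two-sided 95%
se = sd / sqrt(len(data))         # standard error of the mean
ci = (m - z * se, m + z * se)

print(f"sample mean {m:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```

For small samples a t-based interval would be slightly wider; the normal approximation keeps the sketch stdlib-only.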

Research Aim & Design

  • design is the strategy to reach your aim:
    data collection, measurement, intended analysis
    • specifies how (potential) observations are related
    • influences type of elicited information

  • popular saying: garbage in... garbage out!
    • a poor design makes a study inefficient or even invalid
    • statistics cannot solve design problems

Research Design

  • sufficient sample size (quantity)
  • quality dependent on
    • conditions of observation, e.g., cross-over
    • method of observation, e.g., a quality of life score
  • generalization dependent on
    • selection of research units
    • missing observations
  • allowing appropriate statistical analysis

Quantity of Observations

  • observations provide information and involve a cost
    • as many observations as possible, but...
  • when confirmatory, sample size -calculation- requires information on:
    • statistical test in focus
    • effect size aimed for
      • effect: ideally specified substantively, or based on common practice / earlier research
      • uncertainty: ideally based on earlier data/research or pilot
    • operational characteristics (type I error α and type II error β)
    • issues:
      • only for -future- studies (not retrospective)
      • only for primary research questions (take highest when multiple)
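
As an illustrative sketch of such a calculation (not any specific ICDS tool), the common normal-approximation formula for a two-sided, two-sample comparison of means combines exactly the ingredients above; the effect size, α, and power values are hypothetical examples.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means (effect size d = difference / SD)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # type I error threshold
    z_beta = NormalDist().inv_cdf(power)           # power = 1 - type II error
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# e.g., a medium effect (d = 0.5) at alpha = .05 and power = .80
print(n_per_group(0.5))  # -> 63 per group (an exact t-test correction adds a few more)
```

Note how halving the effect size roughly quadruples the required sample size, which is why the effect aimed for must be argued, not guessed.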

Quality of Observations

  • depends on conditions & method

  • general principle
    • control confounding variables (influence not in the model)
    • maximize systematic variability (explained variance)
    • minimize non-systematic variability (unexplained variance)

  • often requires control
    • experimental study exerts control
      • necessary condition for causal conclusions
    • observational study does not exert control
      • retrospective, naturalistic, survey, ...

Quality of Observations --- Confounders

  • control on confounding variables
    • randomization (requires a large enough sample size)
    • repeated measures
    • blocking
    • matching
    • cross-over designs
    • (double-) blinding
    • and more ...
  • typically complicates statistical analysis
    • correlational structure
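
Combining two of the controls above, blocking with randomization can be sketched as randomly ordering a balanced set of conditions within each block; the conditions, block count, and seed are hypothetical examples.

```python
import random

def block_randomize(n_blocks: int, conditions: list[str], seed: int = 1) -> list[list[str]]:
    """Assign conditions within blocks: each block receives every
    condition equally often, in a randomly shuffled order."""
    rng = random.Random(seed)   # seeded for a reproducible allocation schedule
    schedule = []
    for _ in range(n_blocks):
        block = conditions.copy()
        rng.shuffle(block)      # randomization within the block
        schedule.append(block)
    return schedule

# e.g., 4 blocks of a treatment/control comparison, 2 units per block
print(block_randomize(4, ["treatment", "control"]))
```

Balance within blocks is what protects against the confounder the blocks represent; the shuffle protects against everything else.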

Quality of Observations --- (Non-) Systematic Variability

  • maximize systematic variability
    • use proper explanatory variables
    • use maximally differentiating conditions
      • beware: not too extreme

  • minimize non-systematic variability
    • maximize systematic variability
    • use proper measurement tools
      • high reliability / precision, maybe check first
      • combine measurement tools (estimate reliability)
    • use categories only when necessary; keep measurements continuous if possible
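
One common way to estimate reliability when combining measurement tools is Cronbach's alpha; a minimal stdlib sketch with made-up item scores (the items and respondents are hypothetical).

```python
from statistics import pvariance

def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha for k items scored by the same respondents:
    k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # total score per respondent
    item_var = sum(pvariance(it) for it in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# e.g., three parallel items, five respondents (made-up scores)
items = [[2, 4, 3, 5, 4],
         [3, 4, 3, 5, 5],
         [2, 5, 4, 4, 4]]
print(round(cronbach_alpha(items), 2))  # -> 0.87
```

Higher values indicate that the combined items measure the same construct consistently, i.e., less non-systematic variability.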


Generalization of Observations

  • generalization (inference) depends on successful sampling
    • method of sampling
    • missing data
      • avoid
      • remediate

  • types of sampling
    • probabilistic (e.g., random, stratified, multi-stage sampling, ...)
    • non-probabilistic (e.g., diversity or expert sampling, ...)
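
A minimal sketch of one probabilistic option, proportional stratified sampling; the strata names and sizes are hypothetical.

```python
import random

def stratified_sample(strata: dict[str, list], fraction: float, seed: int = 1) -> dict[str, list]:
    """Draw a simple random sample of the same fraction from every stratum,
    so the sample mirrors the population's strata proportions."""
    rng = random.Random(seed)   # seeded for reproducibility
    return {name: rng.sample(units, round(len(units) * fraction))
            for name, units in strata.items()}

# e.g., a population stratified by faculty (made-up identifiers)
population = {"medicine": list(range(100)), "engineering": list(range(50))}
sample = stratified_sample(population, fraction=0.10)
print({k: len(v) for k, v in sample.items()})  # -> {'medicine': 10, 'engineering': 5}
```

Sampling within strata guarantees each subgroup is represented, which simple random sampling only achieves on average.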


Statistical Analysis Plan

  • observations provide information that is used to
    • test (significance), estimate (confidence intervals), predict (cross-validation)

  • introduce the statistics that are planned
    • in agreement with design
    • able to answer the main research questions (highlight how)
    • promising to answer secondary research questions
    • while addressing foreseeable challenges
      • e.g., small sample sizes, difficult distributions, ...
  • notes:
    • sample size calculations are not a part of the statistical plan
    • give the resulting data file some serious thought
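
For the predictive case, the cross-validation mentioned above amounts to repeatedly holding out part of the data; a minimal k-fold index split, stdlib only, with hypothetical sizes.

```python
def kfold_indices(n: int, k: int) -> list[tuple[list[int], list[int]]]:
    """Split indices 0..n-1 into k folds; for each fold return
    (train_indices, test_indices), so every observation is tested once."""
    folds = [list(range(i, n, k)) for i in range(k)]   # interleaved folds
    splits = []
    for i, test in enumerate(folds):
        train = [j for other, f in enumerate(folds) if other != i for j in f]
        splits.append((sorted(train), sorted(test)))
    return splits

# e.g., 10 observations, 5 folds: fit on 8, evaluate on the held-out 2
for train, test in kfold_indices(10, 5):
    print(test)
```

In practice the folds would index rows of the planned data file, which is one more reason to give that file some serious thought.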

Practical Suggestions

  • isolate methodological and statistical issues from substantive reasoning
  • use consistent labeling
  • visualize wherever possible
    • the data collection process (time-line)
    • the categories of observations and their relation
    • the design, with tables:
      list all between conditions (rows) and within conditions (columns)


  • there are many methodological and statistical issues; only some may be relevant to you
  • convince yourself that your study is appropriate
  • be clear on your aim and how to be successful (research design)
  • convince the committee that your study is appropriate
  • for support, maybe turn to ICDS

Interfaculty Center Data processing & Statistics

Methodological and statistical support to help make a difference

  • ICDS provides complementary support in methodology and statistics to our research community, for both individual researchers and research groups, in order to get the best out of them

  • ICDS aims to address all questions related to quantitative research, and to further enhance the quality of both the research and how it is communicated

website: https://www.icds.be/
includes information on who we serve, and how
booking: https://www.icds.be/consulting.php
for individual consultations