
Spatio-Temporal Methods in Environmental Epidemiology

Chapter 13 - BETTER EXPOSURE MEASUREMENTS THROUGH
BETTER DESIGN

Summary

This chapter looks at the emergence of a central purpose: to explore or reduce uncertainty about aspects of the environmental
processes of interest. One form of uncertainty, aleatory, cannot by definition be reduced, whereas the other, epistemic, can
(see Chapter 3). However, reducing epistemic uncertainty does not stop the original network from becoming sub-optimal over
time, pointing to the need to regularly reassess its performance. From that perspective we see that the design criteria must
allow for the possibility of ‘gauging’ (adding monitors to) sites that:

  • Maximally reduce uncertainty at their space–time points (measuring their responses eliminates their uncertainty);
  • Best minimise uncertainty at other locations;
  • Best inform about process parameters;
  • Best detect non-compliers.
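The first two criteria can be made concrete with a standard Gaussian-process (kriging) argument: gauging a site fixes its value, and the conditional covariance of the remaining ungauged sites then quantifies the residual uncertainty. The sketch below is illustrative only, not the book's code (the book's examples are in R); the exponential covariance, the candidate set, and the trace-of-conditional-covariance score are all assumptions made for the example.

```python
import numpy as np

def exp_cov(coords, range_par=1.0, sill=1.0):
    """Exponential covariance matrix over a set of 2-D site coordinates."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sill * np.exp(-d / range_par)

def total_cond_var(K, gauged):
    """Total posterior variance (trace of the conditional covariance) at
    the ungauged sites, given observations at the gauged sites."""
    n = K.shape[0]
    g = list(gauged)
    u = [i for i in range(n) if i not in g]
    if not u:
        return 0.0
    Kgg = K[np.ix_(g, g)]
    Kug = K[np.ix_(u, g)]
    # Conditional covariance of ungauged given gauged (GP conditioning).
    C = K[np.ix_(u, u)] - Kug @ np.linalg.solve(Kgg, Kug.T)
    return float(np.trace(C))

def best_new_gauge(K, gauged, candidates):
    """Pick the candidate whose gauging most reduces the total posterior
    variance at the sites that remain ungauged."""
    scores = {c: total_cond_var(K, gauged + [c]) for c in candidates}
    return min(scores, key=scores.get)

# Hypothetical network: five sites on a transect, one already gauged.
coords = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [4.0, 0.0]])
K = exp_cov(coords, range_par=2.0)
pick = best_new_gauge(K, gauged=[0], candidates=[1, 2, 3, 4])
```

Because conditioning on an extra observation can only shrink the conditional covariance, any choice of `pick` reduces the score; the criterion simply selects the site that reduces it most. Entropy-based criteria (maximising the information gained at the new site) lead to a closely related computation on the same conditional covariance.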

From this chapter, the reader will have gained an understanding of many of the challenges that the network designer may
face. These involve the following topics:

  • A multiplicity of valid design objectives.
  • Unforeseen and changing objectives.
  • A multiplicity of responses at each site, i.e. which should be monitored.
  • A need to use prior knowledge and to characterise prior uncertainty.
  • A need to formulate realistic process models.
  • A requirement to take advantage of, and integrate with, existing networks.
  • The need to be realistic, meaning to contend with economic as well as administrative demands and constraints.

  R CODE

    Code for ggmapping

    Example 13.14

    Loading Metro Vancouver data

    Modelling the California temperature field

    Redesigning the California temperature monitoring network


    DATA

    Maximum Daily Temperature at California sites

    California Temperature sites MetaData

    • ©Gavin Shaddick and James V. Zidek 2015