bioRxiv preprint first posted online Jun. 27, 2018; doi: http://dx.doi.org/10.1101/356469. The copyright holder for this preprint (which was not peer-reviewed) is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.

Open Access RESEARCH ARTICLE

Recommendations to address uncertainties in environmental risk assessment using toxicokinetics-toxicodynamics models

Virgile Baudrot, Sandrine Charles

Cite as: Baudrot V and Charles S. (2018). Recommendations to address uncertainties in environmental risk assessment using toxicokinetics-toxicodynamics models. bioRxiv 356469, ver. 3 peer-reviewed and recommended by PCI Ecol. DOI: 10.1101/356469

Peer-reviewed and recommended by Peer Community in Ecology. Recommendation DOI: 10.24072/pci.ecology.100007

Recommender: Luís César Schiesari

Based on reviews by: Andreas Focks and two anonymous reviewers

Published: December 2018

Recommendations to address uncertainties in environmental risk assessment using toxicokinetics-toxicodynamics models

Virgile Baudrot¹ & Sandrine Charles¹

¹ Univ Lyon, Université Lyon 1, UMR CNRS 5558, Laboratoire de Biométrie et Biologie Évolutive, F-69100 Villeurbanne, France

This article has been peer-reviewed and recommended by:

Peer Community In Ecology (DOI: 10.24072/pci.ecology.100004)

Correspondence: [email protected], [email protected]

PEER COMMUNITY IN ECOLOGY | DOI: 10.24072/pci.ecology.100007


ABSTRACT

Providing reliable environmental quality standards (EQSs) is a challenging issue in environmental risk assessment (ERA). These EQSs are derived from toxicity endpoints estimated from dose-response models to identify and characterize the environmental hazard of chemical compounds such as those released by human activities. These toxicity endpoints include the classical x% effect/lethal concentrations at a specific time t (EC/LC(x,t)) and the new multiplication factors applied to environmental exposure profiles leading to x% effect reduction at a specific time t (MF(x,t), denoted LP(x,t) by the EFSA). However, classical dose-response models used to estimate toxicity endpoints have some weaknesses, such as their dependency on observation time points, which are likely to differ between species (e.g., in experiment duration). Furthermore, real-world exposure profiles are rarely constant over time, which makes the use of classical dose-response models difficult and compromises the derivation of MF(x,t). When dealing with survival or immobility toxicity test data, these issues can be overcome with the use of the general unified threshold model of survival (GUTS), a toxicokinetics-toxicodynamics (TKTD) model that provides an explicit framework to analyse both time- and concentration-dependent data sets, as well as to obtain a mechanistic derivation of EC/LC(x,t) and MF(x,t) regardless of x and at any time t of interest. In addition, the assessment of a risk is inherently built upon probability distributions, such that the next critical step for ERA is to characterize the uncertainties of toxicity endpoints and, consequently, those of EQSs. With this perspective, we investigated the use of a Bayesian framework to obtain the uncertainties from the calibration process and to propagate them to model predictions, including LC(x,t) and MF(x,t) derivations. We also explored the mathematical properties of LC(x,t) and MF(x,t), as well as the impact of different experimental designs, to provide some recommendations for a robust derivation of toxicity endpoints leading to reliable EQSs: avoid computing LC(x,t) and MF(x,t) for extreme x values (0 or 100%), where uncertainty is maximal; compute MF(x,t) after a long period of time to take depuration time into account; and test survival under a few pulses of the contaminant that are either correlated or uncorrelated in terms of depuration.

Keywords: Survival models; Dose-Response; GUTS; Lethal Concentration; Multiplication Factor; Lethal Profile; Margin of Safety; Environmental Risk Assessment

Introduction

Assessing the environmental risk of chemical compounds requires the definition of environmental quality standards (EQSs). EQSs are based on several calculations depending on the context and institution, such as predicted-no-effect concentrations (PNECs) [19] and specific


concentration limits (SCLs) [17]. Specifically, the derivation of EQSs results from a combination of assessment factors with toxicity endpoints mainly estimated from measured exposure responses of a set of target species to a certain chemical compound [17, 19, 27, 37]. Estimating reliable toxicity endpoints is challenging and very controversial [28, 32]. Currently, the first step of environmental risk assessment (ERA) is the hazard identification of acute effects, which consists of fitting classical dose-response models to quantitative toxicity test data. For acute effect assessment, such data are collected from standard toxicity tests, from which the 50% lethal or effective concentration (LC50 or EC50, respectively) is generally estimated at the end of the exposure period, meaning that not all observations are used. In addition, classical dose-response models implicitly assume that the exposure concentration remains constant throughout the experiment, which makes it difficult to extrapolate the results to more realistic scenarios with time-variable exposure profiles combining different heights, widths and frequencies of contaminant pulses [7, 13, 28, 35].

To overcome this limitation at the organism level, the use of mechanistic models, such as toxicokinetics-toxicodynamics (TKTD) models, is now promoted to describe the effects of a substance of interest by integrating the dynamics of the exposure [19, 26, 29]. Indeed, TKTD models appear highly advantageous in terms of gaining a mechanistic understanding of the chemical mode of action, deriving time-independent parameters, interpreting time-varying exposure and making predictions under untested conditions [7, 29]. Another advantage of TKTD models for ERA is the possible calculation of lethal concentrations for any x% of the population at any given exposure duration t, denoted LC(x,t). Furthermore, from time-variable concentration profiles observed in the environment, it is possible to estimate a margin of safety such as the exposure multiplication factor MF(x,t), leading to any x% effect reduction due to the contaminant at any time t [7, 20] (also called the lethal profile and denoted LP(x,t) by [20]).

When focusing on the survival rate of individuals, the general unified threshold model of survival (GUTS) has been proposed to unify the majority of TKTD survival models [29]. In the present paper, we consider the two most used derivations, namely, the stochastic death (GUTS-RED-SD) and individual tolerance (GUTS-RED-IT) models. The GUTS-RED-SD model assumes that all individuals are identically sensitive to the chemical substance by sharing a common internal threshold concentration and that mortality is a stochastic process once this threshold is reached. In contrast, the GUTS-RED-IT model is based on the critical body residue (CBR) approach, which assumes that individuals differ in their thresholds, following a probability distribution, and die as soon as the internal concentration reaches the individual-specific threshold [29]. The robustness of GUTS models in calibration and prediction has been widely demonstrated, with little difference between GUTS-RED-SD and GUTS-RED-IT models [7, 10, 30]. Sensitivity analysis of toxicity endpoints derived from GUTS models, such as LC(x,t) and MF(x,t), has also been investigated [7, 10], but the question of how uncertainties are propagated is still under-studied.

Quantifying uncertainties or levels of confidence associated with toxicity endpoints is


undoubtedly a way to improve trust in risk predictors and to avoid decisions that could increase rather than decrease the risk [12, 14, 24]. The Bayesian framework has many advantages for dealing with uncertainties, since the distributions of parameters, and thus their uncertainties, are embedded in the inference process [36]. While the construction of priors on model parameters can be seen as subjective [21], it provides added value by taking advantage of information from the experimental design [10, 15]. Consequently, coupling TKTD models with Bayesian inference allows one to estimate the probability distribution of toxicity endpoints and of any other predictions coming from the mechanistic (TKTD) model, by taking into account all the constraints resulting from the experimental design. Moreover, Bayesian inference, which is particularly efficient with GUTS models [10, 15], can also be used to optimize the experimental design by quantifying the gain in knowledge from priors to posteriors [1]. Finally, Bayesian inference is tailored for decision making as it provides assessors with a range of values rather than a single point, which is particularly valuable in risk assessment [21, 24].
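As a minimal illustration of this calibrate-then-propagate workflow (a sketch only, not the morse implementation, and with hypothetical control data), the code below samples the posterior of a single background-hazard parameter with a random-walk Metropolis sampler and then propagates the posterior sample to a survival prediction at an untested time, yielding a credible band rather than a point estimate:

```python
import math
import random

def log_post(hb, n0, surv, t):
    """Log-posterior of the background hazard rate hb, with a uniform
    prior on (0, 1) and a binomial likelihood in which each of the n0
    individuals survives to time t with probability exp(-hb * t)."""
    if not 0.0 < hb < 1.0:
        return -math.inf
    p = math.exp(-hb * t)
    return surv * math.log(p) + (n0 - surv) * math.log(1.0 - p)

def metropolis(n0, surv, t, n_iter=20000, step=0.02, seed=1):
    """Random-walk Metropolis sampler for hb; returns the chain after burn-in."""
    rng = random.Random(seed)
    hb = 0.05
    lp = log_post(hb, n0, surv, t)
    chain = []
    for _ in range(n_iter):
        prop = hb + rng.gauss(0.0, step)
        lp_prop = log_post(prop, n0, surv, t)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            hb, lp = prop, lp_prop
        chain.append(hb)
    return chain[5000:]

# Hypothetical control data: 19 of 20 individuals alive after 4 days.
chain = metropolis(n0=20, surv=19, t=4.0)
# Propagate parameter uncertainty to a prediction at an untested time (day 10):
preds = sorted(math.exp(-hb * 10.0) for hb in chain)
lo, med, hi = preds[len(preds) // 40], preds[len(preds) // 2], preds[-(len(preds) // 40)]
assert 0.0 < lo < med < hi < 1.0  # a credible band, not a single point
```

A full GUTS calibration proceeds in the same spirit, but over all TKTD parameters jointly and over all exposure profiles at once.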

In the present study, we explore how scrutinizing uncertainties helps provide recommendations for experimental design and for the characteristics of toxicity endpoints used in EQSs, while maximizing their reliability. We first give an overview of TKTD models, with a focus on GUTS [29], to derive explicit equations for EQS-related endpoints. We then illustrate how to handle GUTS models within the R package morse [8] with five example data sets. Then, we explore how a variety of experimental designs influence the uncertainties in the derived LC(x,t) and MF(x,t). Finally, we provide a set of recommendations on the use of TKTD models for ERA based on their added value and the way uncertainty may be handled under a Bayesian framework.

Material and methods

Data from experimental toxicity tests

We used experimental toxicity data sets described in [5] and [33], testing the effect of five chemical compounds (carbendazim, cypermethrin, dimethoate, malathion and propiconazole) on the survival rate of the amphipod crustacean Gammarus pulex. Two experiments were performed for each compound, one exposing G. pulex to constant concentrations and the other exposing G. pulex to time-variable concentrations (see Table 1). In the constant exposure experiments, G. pulex was exposed to eight concentrations for four days. In the time-variable exposure experiments, G. pulex was exposed to two different pulse profiles consisting of two one-day exposure pulses with either a short or long interval between them.

GUTS modelling

In this section, we detail the mathematical equations of GUTS models describing the survival rate over time of organisms exposed to a concentration profile of a single chemical product. All other possible derivations of GUTS models are fully described in [29, 30]. Here, we


Table 1. Characteristics of data sets used in the manuscript. "Profile type" is the type of exposure profile (constant or time-variable), "Data points" refers to the number of data points in the data set, "Nbr profiles" is the number of profiles in the data set, "Ninit" is the initial number of individuals in the profile, "Nbr days" is the number of days for each experiment, and "Time points per profile" is the number of observation time points for each time series (each constant profile consisted of 5 time points).

Product       | Profile type | Data points | Nbr profiles | Ninit | Nbr days | Time points per profile
carbendazim   | constant     | 40          | 8            | 20    | 4        | 5
cypermethrin  | constant     | 40          | 8            | 20    | 4        | 5
dimethoate    | constant     | 40          | 8            | 20    | 4        | 5
malathion     | constant     | 40          | 8            | 20    | 4        | 5
propiconazole | constant     | 40          | 8            | 20    | 4        | 5
carbendazim   | variable     | 51          | 4            | 80    | 10       | [8, 14, 16, 13]
cypermethrin  | variable     | 61          | 4            | 80    | 22       | [10, 18, 18, 15]
dimethoate    | variable     | 58          | 4            | 80    | 10       | [10, 16, 17, 15]
malathion     | variable     | 70          | 2            | 70    | 10       | [35, 35]
propiconazole | variable     | 74          | 4            | 70    | 10       | [11, 21, 21, 21]

provide a summary of the reduced GUTS-RED-SD and GUTS-RED-IT models to introduce the notations and equations relevant for the mathematical derivation of explicit formulations of the x% lethal concentration at time t, denoted LC(x,t), and of the multiplication factor leading to x% mortality at time t, denoted MF(x,t).

Toxicokinetics

We define Cw(t) as the external concentration of a chemical product, which can be variable over time. As there is no measure of the internal concentration, we use the scaled internal concentration, denoted Dw(t), which is therefore a latent variable described by the toxicokinetic part of the model as follows:

dDw(t)/dt = kd (Cw(t) − Dw(t))    (1)

where kd [time−1] is the dominant rate constant, corresponding to the slowest compensating process dominating the overall dynamics of toxicity.

As we assume that the internal concentration equals 0 at t = 0, the explicit formulation for constant concentration profiles is given by

Dw(t) = Cw (1 − e^(−kd t))    (2)


An explicit expression for time-variable exposure profiles is provided in the Supplementary Material, as it can be useful for implementation but not for the mathematical calculus presented below. The GUTS-RED-SD and GUTS-RED-IT models are based on the same model for the scaled internal concentration: they do not differ in the TK part but do differ in the TD part describing the death mechanism.
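To make equations (1) and (2) concrete, the small sketch below (with illustrative parameter values only, not fitted to the G. pulex data) integrates equation (1) with an explicit Euler scheme, which also works for time-variable Cw(t), and checks the result against the constant-exposure closed form (2):

```python
import math

def scaled_internal_conc(times, conc_profile, kd, dt=1e-3):
    """Euler integration of eq. (1), dDw/dt = kd * (Cw(t) - Dw(t)),
    with Dw(0) = 0; works for any (possibly time-variable) Cw(t)."""
    out, t, dw = [], 0.0, 0.0
    for t_target in sorted(times):
        while t < t_target:
            step = min(dt, t_target - t)
            dw += step * kd * (conc_profile(t) - dw)
            t += step
        out.append(dw)
    return out

def analytic_constant(t, cw, kd):
    """Closed form of eq. (2) for a constant exposure concentration Cw."""
    return cw * (1.0 - math.exp(-kd * t))

kd, cw = 0.8, 2.0                      # illustrative values
times = [0.5, 1.0, 2.0, 4.0]
numeric = scaled_internal_conc(times, lambda t: cw, kd)
exact = [analytic_constant(t, cw, kd) for t in times]
assert all(abs(n - e) < 1e-2 for n, e in zip(numeric, exact))
```

For a pulsed profile, one would simply pass a `conc_profile` function that returns the pulse height during a pulse and 0 otherwise.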

From the toxicokinetic equation (2), we can easily compute the x% depuration time DRTx, that is, the period of time after a pulse leading to an x% reduction in the scaled internal concentration:

DRTx = − log(x%) / kd    (3)

While the GUTS-RED-SD and GUTS-RED-IT models have the same toxicokinetic equation (1), DRTx is likely to differ between them, since the meaning of damage depends on the toxicodynamic equations, which are different.

Toxicodynamics
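Once a pulse ends (Cw = 0), equation (1) reduces to dDw/dt = −kd Dw, so the scaled internal concentration decays exponentially. The sketch below (illustrative kd value) computes the time needed to reach a given remaining fraction after a pulse, which is the decay time underlying equation (3):

```python
import math

def decay_time(frac_remaining, kd):
    """Time for the scaled internal concentration to fall to
    `frac_remaining` of its post-pulse value: once exposure stops,
    eq. (1) gives Dw(t) = Dw(0) * exp(-kd * t)."""
    return -math.log(frac_remaining) / kd

kd = 0.8  # illustrative dominant rate constant [1/day]
# A 95% reduction corresponds to 5% remaining:
drt95 = decay_time(0.05, kd)
assert abs(kd * drt95 - math.log(20.0)) < 1e-12
# The slower the compensating process (smaller kd), the longer the depuration:
assert decay_time(0.05, 0.4) > drt95
```

This makes explicit why compounds with a small kd call for testing survival long after the last pulse.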

The GUTS-RED-SD model supposes that all the organisms have the same internal threshold concentration, denoted z [mol.L−1], and that once this concentration threshold is exceeded, the instantaneous probability of death, denoted h(t), increases linearly with the internal concentration. The mathematical equation is

h(t) = bw max_{0 ≤ τ ≤ t} (Dw(τ) − z, 0) + hb    (4)

where bw [L.mol−1.time−1] is the killing rate and hb [time−1] is the background mortality rate.

Then, the survival probability over time under the GUTS-RED-SD model is given by

SSD(t) = exp( − ∫₀ᵗ h(τ) dτ )    (5)

The GUTS-RED-IT model supposes that the threshold concentration is distributed among the organisms and that death is immediate as soon as this threshold is reached. The survival probability over time, combining the cumulative threshold distribution F evaluated at the maximal scaled internal concentration with the background mortality hb, is given by

SIT(t) = exp(−hb t) (1 − F( max_{0 ≤ τ ≤ t} Dw(τ) ))
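A minimal numerical sketch of both survival models under constant exposure follows (illustrative parameters only, not fitted to the G. pulex data; for GUTS-RED-IT the threshold distribution F is assumed log-logistic here, with median mw and shape beta):

```python
import math

def dw_const(t, cw, kd):
    """Scaled internal concentration under constant exposure, eq. (2)."""
    return cw * (1.0 - math.exp(-kd * t))

def survival_sd(t, cw, kd, z, bw, hb, n=2000):
    """GUTS-RED-SD survival, eqs. (4)-(5): trapezoidal integration of the
    hazard. Under constant exposure Dw is increasing, so the running max
    in eq. (4) equals the current value Dw(tau)."""
    taus = [t * i / n for i in range(n + 1)]
    h = [bw * max(dw_const(tau, cw, kd) - z, 0.0) + hb for tau in taus]
    integral = sum((h[i] + h[i + 1]) / 2.0 * (t / n) for i in range(n))
    return math.exp(-integral)

def survival_it(t, cw, kd, mw, beta, hb):
    """GUTS-RED-IT survival: background survival times the fraction of
    individuals whose threshold exceeds the maximal Dw (= Dw(t) here).
    F is assumed log-logistic with median mw and shape beta."""
    dmax = dw_const(t, cw, kd)
    F = 1.0 / (1.0 + (dmax / mw) ** (-beta)) if dmax > 0.0 else 0.0
    return math.exp(-hb * t) * (1.0 - F)

# Illustrative parameters only:
s_sd = survival_sd(t=4.0, cw=2.0, kd=0.8, z=0.5, bw=0.3, hb=0.01)
s_it = survival_it(t=4.0, cw=2.0, kd=0.8, mw=1.0, beta=2.0, hb=0.01)
assert 0.0 < s_sd < 1.0 and 0.0 < s_it < 1.0
# Without exposure, only background mortality remains:
assert abs(survival_sd(4.0, 0.0, 0.8, 0.5, 0.3, 0.01) - math.exp(-0.04)) < 1e-9
```

For time-variable profiles, the same structure applies, with Dw(t) computed numerically and the running maximum tracked explicitly.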