Evidential Model Validation under Epistemic Uncertainty

Hindawi
Mathematical Problems in Engineering
Volume 2018, Article ID 6789635, 11 pages
https://doi.org/10.1155/2018/6789635

Research Article

Evidential Model Validation under Epistemic Uncertainty

Wei Deng,1,2 Xi Lu,2 and Yong Deng1,2,3

1 Institute of Fundamental and Frontier Science, University of Electronic Science and Technology of China, Chengdu 610054, China
2 School of Computer and Information Science, Southwest University, Chongqing 400715, China
3 Big Data Decision Institute, Jinan University, Guangzhou, Guangdong 510632, China

Correspondence should be addressed to Yong Deng; [email protected]

Received 25 September 2017; Accepted 28 January 2018; Published 22 February 2018

Academic Editor: Roman Wendner

Copyright © 2018 Wei Deng et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

This paper proposes evidence theory based methods both to quantify epistemic uncertainty and to validate computational models. Three types of epistemic uncertainty concerning input model data are considered: sparse points, intervals, and probability distributions with uncertain parameters. Through the proposed methods, the given data are described as corresponding probability distributions for uncertainty propagation in the computational model and, thus, for model validation. The proposed evidential model validation method is inspired by the ideas of Bayesian hypothesis testing and the Bayes factor: it compares the model predictions with the observed experimental data so as to assess the predictive capability of the model and support the decision on model acceptance. Building on the Bayes factor, the frame of discernment of Dempster-Shafer evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the hypothesis about the model test that is best supported by the evidence is favored by the BPA. The validity of the proposed methods is illustrated through a numerical example.

1. Introduction

Complex real-world phenomena are increasingly modeled by sophisticated computational models with few or no full-scale experiments. Owing to the rapid development of computer technology, model-based simulations now dominate the design and analysis of complex engineering systems, reducing the cost and time of engineering development that would otherwise depend on physical understanding of these phenomena and full-scale testing. The quality of the model prediction is influenced by various sources of uncertainty, such as model assumptions, solution approximations, variability in model inputs and parameters, and data uncertainty due to sparse and imprecise information. When the model is used for system risk assessment or certification of reliability and safety under actual use conditions, it is crucial to quantify the uncertainty and confidence in the model prediction in order to support risk-informed decision making; hence the model needs to be subjected to rigorous, quantitative verification and validation (V&V) before it can be applied

to practical problems with confidence [1]. Model validation is an important component of quantification of margins and uncertainties (QMU) analysis, which is intimately connected with the assessment and representation of uncertainty [2]. The process of model validation measures the extent of agreement between the model prediction and experimental observations [3]. Our work focuses on model validation under epistemic uncertainty in both the model inputs and the available observed experimental evidence. Modeling complex systems is complicated, since many factors interact with each other [4–8]. A key component of model validation is the specific, rigorous treatment of numerous sources and different types of uncertainty. Uncertainty can be roughly divided into two types: aleatory and epistemic. The term aleatory uncertainty describes the inherent variation of the physical system. Such variation is usually due to the random nature of the input data and can be mathematically represented by a probability distribution once enough experimental data are available. Epistemic uncertainty in nondeterministic systems arises from ignorance, lack of knowledge, or incomplete information.

These definitions are adopted from the papers by Oberkampf et al. [9–11]. Epistemic uncertainty regarding a variable can be of two types: a poorly known stochastic quantity [12] or a poorly known deterministic quantity [13]. In this paper, we are concerned only with the former type of epistemic uncertainty, where sparse and imprecise information (i.e., sparse point data and/or interval data) is available regarding a stochastic quantity; as a result, the distribution type and/or the distribution parameters are uncertain. Both the model inputs and the observed experimental evidence are sources of such uncertainty. Previous studies on the treatment of epistemic uncertainty utilize methods such as evidence theory [14, 15], fuzzy sets [16], entropy models [17, 18], and convex models of uncertainty [19], intended basically for uncertainty quantification; it is not clear how to implement model validation under epistemic uncertainty with them. The model validation method in this paper deals, by means of Dempster-Shafer (D-S) evidence theory, with the situation where the epistemic uncertainty arises from sparse and imprecise data concerning the model inputs, and where the input quantities as well as the observed experimental evidence come in both point form and interval form. Another issue to be addressed is that the final decision on model acceptance has to be made with discretion. Model uncertainty can be reduced through a deeper investigation or when some empirical information about the system behavior becomes available, after which the model can be assessed and a decision made regarding its acceptance or rejection. This paper investigates the latter issue under the assumption that the validity of a model is judged only by its output, the investigation of its mechanism being unavailable. In previous studies, Sankararaman et al. [20, 21] assess the validity of the model and make the decision using the Bayes factor, which is based on the idea of Bayesian hypothesis testing and is the ratio of the likelihoods of the model prediction and the experimentally observed data under two competing hypotheses: accept the model and reject the model. However, they decide whether to accept the model by a single Bayes factor calculation, which can be incidental and not precise enough. Bayesian information fusion techniques have evolved in many fields; nonetheless, effective performance can only be achieved if adequate and appropriate prior and conditional probabilities are available, or else the results of Bayesian methods can be imprecise, even far from the truth. As an extension of Bayesian theory, the Dempster-Shafer (D-S) evidence theory [14] uses the basic probability assignment (BPA) to quantify evidence and uncertainty [17, 22–26]. Compared with probability, the BPA has the advantage of efficiently modeling unknown information [27]. D-S evidence theory models how the uncertainty of a given hypothesis or discourse diminishes as groups of BPAs accumulate during the reasoning process [27–30]. The effectiveness of this method has been demonstrated in many fields. In this paper, first, a method based on the concepts of D-S evidence theory is proposed to eliminate the parameter uncertainty of probability distributions for random input variables whose available information is in the form of sparse point data and interval data. In the case of point data, each of

them is regarded as yielding a BPA over the alternative input variable probability distribution parameters, while in the case of interval data, the BPAs are obtained by sampling point data from the intervals. Through our method, each model input set comprising multiple point and/or interval data is described by a probability density function (PDF) before uncertainty propagation analysis and model validation. For faithfulness to the available data, for model data with little distribution parameter information, we also construct PDFs using a computational statistics technique, the bootstrapping method. Moreover, for the model validation part we also propose an evidential method, which involves the probability distribution of the experimental evidence as well as that of the model prediction derived from the uncertainty propagation technique. The proposed method takes as reference the idea of Bayesian hypothesis testing and the definition of the Bayes factor in [20, 31], from which groups of BPAs for the model acceptance decision are generated. The remainder of this paper is organized as follows. Section 2 provides an overview of the basic concepts and notations of Dempster-Shafer evidence theory. Section 3 describes how the evidence based method can eliminate the parameter uncertainty of the probability distribution (epistemic uncertainty) of model inputs. The validation method, which draws on the idea of Bayesian hypothesis testing and the Bayes factor, is presented in Section 4. The proposed methods are illustrated through a steady state heat transfer problem in Section 5. Finally, we conclude the paper in Section 6.

2. Dempster-Shafer Theory of Evidence

The real world is full of uncertainty, and how to deal with uncertain information is still an open issue [32–36]. Many mathematical tools, such as AHP [37–40], fuzzy sets [41–43], D numbers [44–47], evidential reasoning [30, 48–51], and rough sets [52], have been adopted with wide applications. In this section, the main concepts of the Dempster-Shafer (D-S) evidence theory are reviewed. The Dempster-Shafer evidence theory, introduced by Dempster [53] and extended by Shafer [14], is concerned with the question of belief in a proposition and systems of propositions. Compared with the Bayesian probability model, the merits of D-S evidence theory have been recognized in various fields. First, D-S evidence theory can handle uncertainty or imprecision embedded in the evidence [54]. In contrast to the Bayesian probability model, in which probability masses can be assigned only to singleton subsets, in D-S evidence theory probability masses can be assigned to both singletons and compound sets. Thus, more evidence can be expressed about the hypotheses or the distribution among the singletons, and the theory can be viewed as a generalization of classical probability theory. Second, in D-S evidence theory, no prior distribution is needed before the combination of BPAs from individual information sources. Third, D-S evidence theory allows one to specify a degree of ignorance rather than requiring that all probability masses be assigned precisely. Some notations used

in D-S evidence theory are introduced below. Finally, conflict management is improved with evidence theory [55–57].

Definition 1 (frame of discernment). Let U = {H1, H2, ..., HN} be a finite set of N exhaustive and mutually exclusive elements, each called a proposition or a hypothesis. The set U is called the frame of discernment. Denote the power set of U by 2^U = {∅, {H1}, ..., {HN}, {H1, H2}, {H1, H3}, ..., U}, where ∅ denotes the empty set.

Definition 2 (mass function). For a frame of discernment U, a mass function is a mapping m : 2^U → [0, 1], also called a basic probability assignment (BPA), satisfying

Figure 1: The analysis of model input data. (Diagram labels: aleatory uncertainty — single distribution; distribution parameter uncertainty — sparse point data, interval data; parametric approach — family of distributions; nonparametric approach; integration.)

m(∅) = 0,  Σ_{A∈2^U} m(A) = 1,  (1)

where A is any element of 2^U and the BPA m(A) expresses how strongly the evidence supports the particular proposition A.

Definition 3 (Dempster's rule of combination). Suppose m1 and m2 are two BPAs formed from the information of two different sources over the same frame of discernment U; Dempster's rule of combination, also called the orthogonal sum and denoted by m = m1 ⊕ m2, is defined as

m(A) = (1 / (1 − K)) Σ_{B∩C=A} m1(B) · m2(C),  (2)

with

K = Σ_{B∩C=∅} m1(B) · m2(C),  (3)

where A, B, and C are elements of 2^U and K is a normalization constant representing the basic probability mass associated with conflict among the sources of evidence, called the conflict coefficient of the two BPAs. The larger the value of K, the more conflicting the sources and the less informative their combination.

3. Probability Distributions of Model Input Data

The input of a mathematical or computational model is denoted by X and the model prediction (output) by Y; both X and Y are sets of the corresponding variables. Generally, there are three types of data in X, as mentioned in [20]: (a) sufficient data to construct a precise probability density function (aleatory uncertainty); (b) sparse point data and/or interval data (epistemic uncertainty); (c) a probability density function with uncertain parameters (epistemic uncertainty).

Here, the paper presents an evidential method to represent the model input data and reduce the epistemic uncertainty in them as far as possible, both for applying uncertainty propagation techniques to obtain the model prediction and for the purpose of model validation. Through our method, each input set X of the model can be described by a single probability density function (PDF), as depicted in Figure 1 [20, 58]. Since the scenario here concerns epistemic uncertainty, a parameter set P consisting of several alternative combinations of parameter statistics is provided by engineers and experts in addition to the inputs, as in case (c) above. Such epistemic uncertainty is handled by the proposed evidence theory based method, while the computational statistics method addresses case (b). The proposed method is discussed in detail below for sparse point data and interval data.
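As a concrete illustration of Dempster's rule in (2) and (3), which is used repeatedly in the BPA-fusion steps below, the following Python sketch combines two BPAs over a small frame; the masses are hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Orthogonal sum m1 (+) m2 per (2)-(3); focal elements are frozensets."""
    K = 0.0                                   # conflict coefficient, eq. (3)
    fused = {}
    for (B, vB), (C, vC) in product(m1.items(), m2.items()):
        inter = B & C
        if inter:                             # B n C nonempty: mass supports B n C
            fused[inter] = fused.get(inter, 0.0) + vB * vC
        else:                                 # B n C empty: conflicting mass
            K += vB * vC
    if K >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {A: v / (1.0 - K) for A, v in fused.items()}   # normalization, eq. (2)

# two BPAs over the frame U = {H0, H1} (hypothetical masses)
U = frozenset({"H0", "H1"})
m1 = {frozenset({"H0"}): 0.6, frozenset({"H1"}): 0.1, U: 0.3}
m2 = {frozenset({"H0"}): 0.5, frozenset({"H1"}): 0.2, U: 0.3}
m = dempster_combine(m1, m2)   # here K = 0.6*0.2 + 0.1*0.5 = 0.17
```

Note that mass assigned to the whole frame U (ignorance) flows to the singletons it intersects, which is how accumulated evidence sharpens the fused BPA.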

3.1. Evidential Probability Distribution of Data with Epistemic Uncertainty. The methods developed for model validation mostly address cases where the experimental evidence is in the form of point data. However, in some cases interval data, rather than point data, constitute the experimental evidence [59–61]. For convenience, the sparse model input data X obtained in this paper are expressed as point data x_i (i = 1 to m) and interval data [a_j, b_j] (j = 1 to n). As mentioned above, the alternative PDF parameter set P is expressed as P = {P1, P2, ..., Pk}, where each P_i is a parameter set of a known probability density function type, such as exponential, normal, lognormal, or Poisson, for example, the mean and the standard deviation of a normal distribution. The PDF of X is denoted by f_{X,P}(x). It is noteworthy that the PDF of the model input data is conditioned on the choice of P from the set P [62–65]; therefore, every element of P should be evaluated against the available evidence, that is, the obtained model input data. The basic probability assignment (BPA) of each element of P is derived according to the principle that the closer an alternative PDF is to the real PDF of the model inputs, the higher the value f_{X,P_i}(x) (i = 1 to k). A good and useful BPA should contain most of the information provided by the data source and should be suitable for the subsequent process (the combination of BPAs


and decision making). To this end, the proposed method combines the available model input data with the alternative PDFs so as to extract the BPA information sufficiently. The normal distribution is commonly encountered in practice and is used throughout statistics, the natural sciences, and the social sciences as a simple model for complex phenomena; for illustration, we use P_i = (μ_i, σ_i) (i = 1 to k). The details of obtaining BPAs from the available model input data are as follows.

3.1.1. Step 1. Given the normal PDF parameter set P = {P1, P2, ..., Pk} provided by experts and engineers, the corresponding k normal PDFs can be determined and their curves drawn. Each PDF is coded as a proposition of the frame of discernment in D-S evidence theory; that is, U = P.

3.1.2. Step 2. In this step, the BPAs are derived from both the model input data and the normal PDFs. For the point data x_i (i = 1 to m), we determine the BPA with the help of the mean value theorem and regard observing x_i as approximately equal to observing x ∈ (x_i − ε/2, x_i + ε/2), where ε is an infinitesimally small positive number. In this way, we obtain a BPA from the ith evidence source (m_i(P_j), j = 1 to k):

m_i(P_j) ∝ Prob(P_j),  (4)

Prob(P_j) = Prob(x ∈ (x_i − ε/2, x_i + ε/2)) = ∫_{x_i−ε/2}^{x_i+ε/2} f_X(x; μ_j, σ_j²) dx ≈ ε f_X(x_i; μ_j, σ_j²);  (5)

that is, we calculate the intersections of x_i with the k normal PDFs:

m_i(P_j) = f_X(x_i; μ_j, σ_j²).  (6)

Thus we get k intersections with the alternative PDFs with uncertain parameters. Besides,

m_i(U) = max_A(m_i(A)) − min_A(m_i(A)),  (7)

where A ∈ U, and the BPA values of the remaining propositions in the power set of U are set to 0. In the evidential setting, m_i(U) is the quantified expression of the global ignorance about the optimal model input PDF parameters. As for the interval data [a_j, b_j] (j = 1 to n), because |a_j − b_j| is usually rather small, we simply discretize each interval and sample a certain number of points from it to replace the interval. The BPA generating process is then the same as that described above for point data. In this way, we obtain another c · n groups of BPAs (c being the number of points sampled per interval), all of which are used to help make the final decision about the optimal model input PDF.
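Steps 1 and 2 can be sketched as follows; the candidate parameter pairs and the datum are hypothetical, and the masses are normalized to sum to 1 before combination, a step the paper leaves implicit.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bpa_from_point(x, params):
    """Eqs (6)-(7): each singleton P_j gets the pdf ordinate at x; the spread
    max - min becomes the ignorance mass m(U)."""
    raw = {j: normal_pdf(x, mu, sigma) for j, (mu, sigma) in enumerate(params, start=1)}
    m_U = max(raw.values()) - min(raw.values())          # eq. (7)
    total = sum(raw.values()) + m_U                      # normalization (assumed)
    m = {j: v / total for j, v in raw.items()}
    m["U"] = m_U / total
    return m

P = [(5.0, 1.0), (4.9, 1.1), (5.4, 1.8)]   # hypothetical candidate (mu, sigma) pairs
m = bpa_from_point(4.8, P)                 # BPA from a single observed point
```

Interval data would reuse `bpa_from_point` on each point sampled from the interval, yielding the additional c · n groups of BPAs described above.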

3.1.3. Step 3. Based on the work done in Step 2, we combine all the obtained BPAs (that is, multiple mass functions m) using Dempster's rule of combination as shown in (2) and (3). According to the fused result, the proposition P_i (i = 1 to k) whose final mass value m(P_i) is the largest is the evidence-supported optimal distribution parameter choice. Thus, the approximate real model input PDF is estimated as the normal distribution with parameters P_decision = (μ, σ). It is noteworthy that if m(U) turns out to be the largest, we would conclude that the alternative distributions are too similar to discriminate, or that the given data are too sparse to support a choice. Here, the engineers and experts who have provided P are assumed to be reliable.

3.2. Probability Distribution without Uncertain Parameters. Different from the situation in Section 3.1, assume a situation where only the sparse model input data, denoted by the set X, are available, and the PDF f_X(x) needs to be estimated. Similarly, the model input data set X includes both points x_i (i = 1 to m) and intervals [a_j, b_j] (j = 1 to n). Because the sample size is limited, an empirical distribution for X is difficult to construct. However, the basic idea of the bootstrapping method [66] is that inference about a population from sample data (sample → population) can be modeled by resampling the sample data (sampling with replacement) and performing inference on the resampled data (resample → sample). This technique allows estimation of the sampling distribution of almost any statistic using random sampling methods [67]. With prior knowledge about the distribution type, we can estimate the distribution by using the bootstrapping method to determine the parametric statistics. Given both the point and the interval data, the intervals are first discretized into a finite number of points and the bootstrapping method is applied; the statistics (the mean μ and standard deviation σ of the population) are then estimated (by the central limit theorem) and used to construct the PDF of the model input.
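The discretize-then-bootstrap procedure of Section 3.2 might look as follows; the discretization density, resample count, and seed are illustrative assumptions, and the resulting estimates depend on them. The point and interval data are those given for β in Section 5.

```python
import random
import statistics

def bootstrap_normal_params(points, intervals, n_disc=50, n_boot=2000, seed=1):
    """Discretize each interval into n_disc points, pool them with the point
    data, then bootstrap the pool (resampling with replacement) to estimate
    the mean and standard deviation of the underlying population."""
    random.seed(seed)
    pool = list(points)
    for a, b in intervals:
        pool.extend(a + (b - a) * t / (n_disc - 1) for t in range(n_disc))
    means, sds = [], []
    for _ in range(n_boot):
        resample = [random.choice(pool) for _ in pool]
        means.append(statistics.fmean(resample))
        sds.append(statistics.pstdev(resample))
    return statistics.fmean(means), statistics.fmean(sds)

# two points and two intervals, as given for beta in Section 5
mu, sigma = bootstrap_normal_params([0.58, 0.52], [(0.5, 0.55), (0.45, 0.48)])
```

The estimated (μ, σ) then parameterize the assumed distribution type; how finely the intervals are discretized directly weights them against the point data.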

4. Model Validation

Through the above sections, the model input is represented by a single probability distribution. The next task is to propagate the input uncertainty through the mathematical model and determine the PDF of the model output Y. Given the statistical distributions of the input variables, various methods are available to carry out probabilistic analysis and quantify the uncertainty in the model output, such as Monte Carlo simulation [68] and response surface methods [69, 70]. The choice of method depends on the nature of the model used for predicting the output and on the requirements of accuracy and efficiency. In this case, the model prediction Y takes the form of a probability distribution; therefore, model validation amounts to comparing the observed experimental evidence D with the probability distribution of the model output f_Y(y). This section implements the evidential model validation metric


inspired by the ideas of Bayesian hypothesis testing and the Bayes factor.

4.1. Bayes Factor. Bayesian hypothesis testing estimates the probability of a hypothesis given the observed experimental evidence D. Bayesian methods may use Bayes factors, introduced by Jeffreys [71], to compare hypotheses. A Bayes factor, B, is a Bayesian alternative to frequentist hypothesis testing that is most often used to compare multiple models, usually to determine which model better fits the experimental evidence; it is the posterior odds in favor of a hypothesis divided by the prior odds in favor of that hypothesis. Generally, there are two hypotheses H0 and H1; their probabilities can be updated using the Bayes theorem as [72]

P(H0 | D) / P(H1 | D) = [P(D | H0 is true) / P(D | H1 is true)] · [P(H0) / P(H1)].  (8)

The first term on the right-hand side of (8) is the Bayes factor B. In the context of model validation, the two hypotheses H0 and H1 may be defined as "the model is correct" and "the model is incorrect," respectively.

Figure 2: The illustration figure for the BPA generation. (Plot: the experimental-evidence-based PDF f_{Y,D}(y) and the simulation-generated PDF f_Y(y) over the desired variable y, intersecting the model prediction y_{O,i} at the points A(y_{O,i}, f_{Y,D}(y_{O,i})) and B(y_{O,i}, f_Y(y_{O,i})).)

4.2. Evidential Model Acceptance Decision Making. Consider a provided model input data set; the model predicts an output set Y_O = {y_{O,1}, y_{O,2}, ..., y_{O,h}} consisting of h single deterministic quantities, each corresponding to a specific input data circumstance and a measured experimental dataset D_i (i = 1 to h). Inspired by the developed Bayes factor in [20],

B = P(D_i | H0 is true) / P(D_i | H1 is true) = f(y | D_i) / f(y) |_{y = y_{O,i}},  (9)

where (9) takes this form because "the epistemic uncertainty in the model inputs and the validation data have already been converted into probabilistic information" by the authors [20]. In this paper, we use the bootstrapping method to estimate the statistics and thus infer the PDF of the observed experimental evidence, f_{Y,D_i}(y), which replaces the numerator f(y | D_i) in (9). From our redefined equation (9) we obtain two values: the numerator and the denominator. Accordingly, for the decision on model acceptance, we constitute the frame of discernment with two propositions, "the model is correct" (H0) and "the model is incorrect" (H1), so that U = {H0, H1}. As for the BPA value of each proposition, the two curves in Figure 2 are given for illustration, and the two intersection points, A and B, are what matter. In the background of D-S evidence theory, we define the BPA values of the propositions in 2^U as

m(A) = max(f_{Y,D_i}(y_{O,i}), f_Y(y_{O,i})),  (10)

m(B) = min(f_{Y,D_i}(y_{O,i}), f_Y(y_{O,i})),  (11)

m(U − A) = 0,  (12)

where

A = {H0, if max(f_{Y,D_i}(y_{O,i}), f_Y(y_{O,i})) = f_{Y,D_i}(y_{O,i}); H1, if max(f_{Y,D_i}(y_{O,i}), f_Y(y_{O,i})) = f_Y(y_{O,i})},  (13)

B = U = {H0, H1}.

Hence, one BPA is obtained, and m(U) is the quantification of the ignorance about the model test. After obtaining all h groups of BPAs, we combine them using Dempster's rule of combination as shown in (2) and (3), that is, m1 ⊕ m2 ⊕ ... ⊕ m_h. According to the fused result, whichever of the three propositions H0, H1, and {H0, H1} owns the largest BPA value determines the outcome of the model test; if it is H0 or H1, the result is self-evident, but if m({H0, H1}) turns out to be the largest, we conclude that the given data are too sparse for a decision and other methods need to be employed. It should be mentioned that the evidence distance may be another alternative to the Bayes factor [73], which is still under research. Dempster-Shafer evidence theory is concerned with belief in a proposition and systems of propositions and is capable of quickly converging most BPA mass to the dominant proposition. Dempster's rule of combination is robust when there are incidental disagreements between a model prediction and its corresponding measured evidence (due, e.g., to measurement error or sparse evidence). In such cases, the Bayes factor in (9) can be unreliable and imprecise, whether computed once or averaged over several results, while the proposed evidence theory based method handles these conditions properly. In particular, the BPAs reflect the partial lack of information available for decision making, which can be used to reject the model under consideration if the relative uncertainty is too high.
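The decision procedure of (10)-(13) can be sketched for two validation points as follows; the two densities are hypothetical stand-ins for f_{Y,D_i}(y) and f_Y(y), and the masses are normalized explicitly, a step the paper leaves implicit.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def validation_bpa(y_pred, f_evid, f_model):
    """Eqs (10)-(13): the higher curve at y_pred decides which hypothesis gets
    the larger mass; the lower ordinate becomes ignorance m(U)."""
    fe, fm = f_evid(y_pred), f_model(y_pred)
    hi, lo = max(fe, fm), min(fe, fm)
    A = "H0" if hi == fe else "H1"          # eq. (13)
    return {A: hi / (hi + lo), "U": lo / (hi + lo)}

def combine(m1, m2):
    """Dempster's rule specialized to U = {H0, H1} with focal sets H0, H1, U."""
    K = m1.get("H0", 0.0) * m2.get("H1", 0.0) + m1.get("H1", 0.0) * m2.get("H0", 0.0)
    m = {}
    for A in ("H0", "H1"):
        v = (m1.get(A, 0.0) * m2.get(A, 0.0)
             + m1.get(A, 0.0) * m2.get("U", 0.0)
             + m1.get("U", 0.0) * m2.get(A, 0.0))
        if v:
            m[A] = v / (1.0 - K)
    m["U"] = m1.get("U", 0.0) * m2.get("U", 0.0) / (1.0 - K)
    return m

f_model = lambda y: normal_pdf(y, 17.1, 3.2)   # hypothetical prediction PDF f_Y
f_evid = lambda y: normal_pdf(y, 16.8, 0.4)    # hypothetical evidence PDF f_{Y,D_i}
m_total = combine(validation_bpa(17.0, f_evid, f_model),
                  validation_bpa(16.5, f_evid, f_model))
```

With h validation points, `combine` would simply be folded over all h BPAs; a large fused m("U") signals that the data are too sparse for a decision.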

Table 1: The alternative parameters provided by experts and engineers.

Curve number    1        2        3        4        5
μ               4.9935   4.9407   5.3622   5.2113   4.8262
σ               1.0037   1.1251   1.8413   1.4508   1.3849
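As a consistency check of the reconstructed tables, the first row of Table 2 can be reproduced directly from Table 1 and (6)-(7), taking the leftmost datum in Figure 3(b) to be the interval endpoint 4.74 (an inference from the given data; the pdf ordinates then match row m1 numerically).

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# the five (mu, sigma) combinations of Table 1
P = [(4.9935, 1.0037), (4.9407, 1.1251), (5.3622, 1.8413),
     (5.2113, 1.4508), (4.8262, 1.3849)]

x = 4.74                                  # leftmost datum (endpoint of [4.74, 4.90])
raw = [normal_pdf(x, mu, sigma) for mu, sigma in P]
m_U = max(raw) - min(raw)                 # eq. (7)
# raw rounds to 0.3850, 0.3490, 0.2046, 0.2608, 0.2875 and m_U to ~0.1804,
# matching row m1 of Table 2 up to rounding.
```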

Table 2: The BPA generated from evidence.

Proposition   1        2        3        4        5        {1, 2, 3, 4, 5}
m1            0.3850   0.3490   0.2046   0.2608   0.2875   0.1804
m2            0.3886   0.3510   0.2061   0.2631   0.2879   0.1825
m3            0.3916   0.3525   0.2075   0.2652   0.2881   0.1841
m4            0.3940   0.3537   0.2088   0.2670   0.2880   0.1852
m5            0.3958   0.3544   0.2099   0.2687   0.2877   0.1858
m6            0.3974   0.3544   0.2120   0.2715   0.2863   0.1854
m7            0.3883   0.3446   0.2159   0.2750   0.2772   0.1724
m8            0.3973   0.3537   0.2130   0.2726   0.2853   0.1844
m9            0.3948   0.3506   0.2146   0.2743   0.2821   0.1802
m10           0.3914   0.3473   0.2155   0.2749   0.2793   0.1759
m11           0.3866   0.3431   0.2161   0.2750   0.2761   0.1705
m12           0.3805   0.3379   0.2165   0.2746   0.2724   0.1640
m13           0.3732   0.3319   0.2167   0.2737   0.2682   0.1565

5. Numerical Example

In this section, a numerical example illustrating the proposed model validation methods is presented. For the purpose of illustration, various types of epistemic uncertainty, that is, sparse point data, interval data, and distribution parameter uncertainty, are arranged to appear simultaneously in the model input data. Moreover, the experimental observations include both point data and interval data. In addition, the units of the following variables are omitted to simplify the illustration.

5.1. Problem Description and the Data. The steady state heat transfer in a thin wire of length L, with thermal conductivity k and convective heat coefficient β, is of interest. The temperature prediction at the midpoint of the wire is desired. The problem is assumed to be essentially one-dimensional, and it is assumed that the solution can be obtained from the boundary value problem [20, 21]:

−k ∂²T/∂x² + βT = Q(x).  (14)

Rebba et al. [21] assumed that the temperatures at the ends of the wire are both zero (i.e., T_0 = T_L = 0) and that Q(x) = 25(2x − L)² [74] is the heat source; the conditions used in this paper are those of an idealized scenario. The length of the wire is deterministic, L = 4. It is desired to predict T_{x=2.0}. For the sake of illustration, k and β are random variables here. The PDF of the conductivity k of the wire is assumed to be normal but with uncertain distribution parameters, for which experts and engineers provide several alternatives, as shown in Table 1, which displays a parameter set P consisting of five alternative combinations of parameter statistics. The input model data concerning k are two intervals, [4.74, 4.90] and [5.11, 5.35], and three data points, 4.98, 5.21, and 5.02. Besides, suppose that the distribution of the convective heat coefficient β is normal but not yet available; instead, it is described by two intervals, [0.5, 0.55] and [0.45, 0.48], and two data points, 0.58 and 0.52. Suppose that, for the given end temperatures of the wire and the model parameters k and β, the numerical model (14) predicts a set of temperatures T_{x=2.0} = {13.34, 16.72, 18.51} for T_0 = T_L = 0. A wire with properties k and β having the same measured values as those input to the numerical model is tested three times to measure the temperature at location x = 2.0, and the measured temperatures differ across experiments. Here, assume that the measurements take the following form: {13.10, [13.6, 13.8]}, {16.4, 17.0, [17.1, 17.2]}, and {18.8, 18.2, [18.9, 19.0]}. It is required to assess whether the observed experimental evidence supports the numerical model in (14). The various steps of the validation procedure are detailed below.

5.2. Probability Distribution of Model Input Data. As explained in Section 3, the first task is to describe the provided model input data by a proper PDF. For this numerical example, the PDF of the conductivity k of the wire is described by the experts and engineers as normal but with uncertain distribution parameters, that is, a normal distribution conditioned on the given set of alternative parameters. Thus there are five alternative PDF curves, depicted in Figure 3, against which the input data constitute groups of BPAs for eliminating the parametric uncertainty in the PDF of k. Figure 3(b) is the partially enlarged view of the intersections of the provided data with the curves, from which the BPA values are determined for information fusion and decision making. The generated BPAs for combination are displayed in Table 2 (m1 represents the

Table 3: The fused result of PDF choice for k.

Proposition   1        2        3        4        5        {1, 2, 3, 4, 5}
BPA           0.6710   0.2491   0.0051   0.0314   0.0434   1.8333 · 10⁻⁷

Figure 3: Illustration for evidential representation of epistemic model input data. (a) The five alternative PDF curves for k: μ = 5.0, σ = 1.0; μ = 4.9, σ = 1.1; μ = 5.4, σ = 1.8; μ = 5.2, σ = 1.5; μ = 4.8, σ = 1.4 (x-axis: thermal conductivity of the wire; y-axis: probability density function). (b) The partially enlarged view of (a) for the provided data.

5.3. Probability Distribution of Model Prediction. Since the random variables k and β are now both described in PDF form, an uncertainty propagation technique must be applied to estimate the distribution of T(x = 2.0), which is also a random variable. As discussed in Section 4, the uncertainty propagation technique used here is Monte Carlo simulation, with the input PDFs propagated through the computational model in (14). The model-predicted PDF of T(x = 2.0), shown as the red curve in Figure 6, is lognormal with mean 17.12 and variance 10.04²; this is the PDF to be evaluated in the evidential model validation below.
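For this idealized scenario the mid-point temperature of (14) admits a closed form, which can stand in for the full computational model in a Monte Carlo sketch of the propagation step. The sampling distributions N(4.99, 1.00) for k and N(0.503, 0.10) for β follow the text; the closed-form stand-in, the sample size, and the rejection of non-physical draws are assumptions of this sketch.

```python
import math, random

# Monte Carlo uncertainty propagation through (14), using the closed-form
# mid-point temperature as a stand-in for the computational model.
def t_mid(k, beta, L=4.0):
    """Exact T(L/2) for -k T'' + beta T = 25 (2x - L)^2 with T(0) = T(L) = 0."""
    A = 25.0 / beta                      # particular-solution coefficients
    C = 8.0 * k * A / beta
    lam = math.sqrt(beta / k)
    return C - (A * L**2 + C) / math.cosh(lam * L / 2.0)

random.seed(0)
samples = []
while len(samples) < 20000:
    k = random.gauss(4.99, 1.00)         # assumed PDF of k from Section 5.2
    beta = random.gauss(0.503, 0.10)     # assumed PDF of beta from Figure 4
    if k > 0 and beta > 0:               # discard non-physical draws
        samples.append(t_mid(k, beta))

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
print(mean, math.sqrt(var))              # sample statistics of T(x = 2.0)
```

The resulting sample statistics will not reproduce the paper's lognormal fit exactly, since the stand-in model and sampling details differ, but the structure of the propagation step is the same: sample the inputs, run the model, and collect the output distribution.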


BPA generated from the leftmost point on the x-axis, and the rest follow in a similar fashion). The fusion result is shown in Table 3, where the curves are numbered 1 to 5 as in Table 1. From it we conclude that the ranking of the five alternative PDFs is 1 ≻ 5 ≻ 2 ≻ 4 ≻ 3, so the approximate real PDF of the thermal conductivity of the wire k is the normal distribution with parameter combination P1, that is, N(μ = 4.99, σ = 1.00), the red curve in Figure 3(a). For the convective heat coefficient of the wire β there is little information about the distribution: the data consist of two intervals, [0.5, 0.55] and [0.45, 0.48], and two point values, 0.58 and 0.52, which is the situation discussed in Section 3.2, so discretization and the bootstrapping method are applied. We discretize the entire data set into 1000 points and estimate the parameters of the normal distribution of β; the resulting PDF curve is shown in Figure 4.
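The discretize-then-bootstrap treatment of β can be sketched as follows. The discretization density, the number of bootstrap resamples, and the uniform spreading of the intervals are assumptions of this sketch, so the resulting parameters will not match the paper's N(0.503, 0.10) exactly; only the procedure is illustrated.

```python
import random, statistics

# Discretization-plus-bootstrap treatment of beta (Section 3.2): spread the
# interval data into a dense pool of points, add the point data, then
# bootstrap the parameters of an assumed normal distribution.
random.seed(0)
intervals = [(0.50, 0.55), (0.45, 0.48)]
points = [0.58, 0.52]

pool = [lo + (hi - lo) * i / 499 for lo, hi in intervals for i in range(500)]
pool += points                      # roughly 1000 discretized values in total

boot_means, boot_stds = [], []
for _ in range(1000):               # bootstrap resampling with replacement
    resample = [random.choice(pool) for _ in pool]
    boot_means.append(statistics.mean(resample))
    boot_stds.append(statistics.stdev(resample))

mu = statistics.mean(boot_means)
sigma = statistics.mean(boot_stds)
print(mu, sigma)                    # bootstrap estimate of N(mu, sigma) for beta
```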

[Figure 4: The bootstrapping estimated PDF for β (μ = 0.503, σ = 0.10).]

5.4. Evidential Model Validation. Given the model predictions T(x = 2.0) = {13.34, 16.72, 18.51} and the three corresponding groups of measured experimental evidence, the next step is to estimate their PDFs (f_{Y,Di}(y), i = 1 to 3) through the bootstrapping method; the results are depicted in Figure 5. We then plot the f_{Y,Di}(y) (i = 1 to 3) and f_Y(y) in the same figure, shown in Figure 6. Guided by the red dotted lines there, three tuples of values can be derived to generate three groups

[Figure 5: Distributions of the measured experimental evidence. (a) For {13.10, [13.6, 13.8]}: μ = 13.45, σ = 0.64. (b) For {16.4, 17.0, [17.1, 17.2]}: μ = 16.80, σ = 0.73. (c) For {18.8, 18.2, [18.9, 19.0]}: μ = 18.60, σ = 0.73.]

Table 4: The generated BPA for model acceptance decision making.

Proposition | H0     | H1 | {H1, H0}
m1          | 0.6175 | 0  | 0.0736
m2          | 0.5413 | 0  | 0.1298
m3          | 0.5439 | 0  | 0.1023

of BPAs for information fusion and model acceptance decision making (shown in Table 4), with the calculation following (10)–(13). The fusion result is displayed in Table 5; it clearly shows that the experimental evidence agrees with the model, so the computational model in (14) is acceptable.
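The fusion step can be sketched with a generic implementation of Dempster's rule over the frame {H0, H1}. The input masses below are illustrative stand-ins normalized to sum to 1, not the exact Table 4 values, so the fused numbers differ from Table 5; the point is how repeated combination concentrates mass on the hypothesis the evidence supports.

```python
# Dempster's rule of combination on the frame {H0, H1} used in the
# model-acceptance step. Focal elements are represented as frozensets.
H0, H1 = frozenset({"H0"}), frozenset({"H1"})
THETA = H0 | H1

def dempster(m1, m2):
    """Combine two BPAs by Dempster's rule (conjunctive rule + normalization)."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb   # mass on the empty set
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Illustrative masses (NOT the published Table 4 values).
m1 = {H0: 0.62, H1: 0.0, THETA: 0.38}
m2 = {H0: 0.54, H1: 0.0, THETA: 0.46}
m3 = {H0: 0.54, H1: 0.0, THETA: 0.46}

fused = dempster(dempster(m1, m2), m3)
print({tuple(sorted(s)): round(v, 4) for s, v in fused.items()})
# the fused mass concentrates on H0, supporting model acceptance
```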

6. Conclusion This paper proposes an evidence theory based method to quantify the epistemic uncertainty, which includes three

Table 5: The fused result of model acceptance decision making.

Proposition | H0     | H1 | {H1, H0}
BPA         | 0.9967 | 0  | 0.0033

types of epistemic uncertainty concerning input model data, that is, sparse point data, interval data, and probability distributions with uncertain parameters, in the computational model. We also develop an evidential method for model validation, inspired by the ideas of Bayesian hypothesis testing and the Bayes factor, which compares the model predictions with the measured experimental data so as to assess the predictive capability of the model and support the decision on model acceptance. The bootstrapping technique from statistics is used to estimate the statistics needed to infer the probability density function (PDF) of the data involved in the model validation process. Through the proposed methods, the given data, both in point and interval form,


[Figure 6: The illustration figure for evidential model validation. Curves: simulation generated; experimental-evidence-based 1, 2, and 3.]

will be regarded as random variables and described by the corresponding probability distributions for uncertainty propagation in the computational model and, thus, for model validation. Building on the idea of the Bayes factor, the frame of discernment of D-S evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the BPA favors the hypothesis best supported by the evidence about the model under test, thus aiding the decision on model acceptance. A numerical example on predicting the temperature at the middle of a wire demonstrates that the proposed methods can effectively handle epistemic uncertainty in the context of model validation. Future work will extend the methods to more complex model validation scenarios in which both epistemic and aleatory uncertainty are present, together with further types of epistemic uncertainty such as qualitative model data and categorical variables in the model.

Conflicts of Interest The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors’ Contributions Wei Deng and Xi Lu contributed equally to this study.

Acknowledgments This work is partially supported by the National Natural Science Foundation of China (Grants nos. 61573290, 61503237).

References

[1] W. L. Oberkampf and M. F. Barone, "Measures of agreement between computation and experiment: validation metrics," Journal of Computational Physics, vol. 217, no. 1, pp. 5–36, 2006.
[2] J. C. Helton, "Conceptual and computational basis for the quantification of margins and uncertainty," Tech. Rep., Sandia National Laboratories, 2009.
[3] W. L. Oberkampf, M. Sindir, and A. Conlisk, Guide for the Verification and Validation of Computational Fluid Dynamics Simulations, American Institute of Aeronautics and Astronautics, 1998.
[4] J. Liu, F. Lian, and M. Mallick, "Distributed compressed sensing based joint detection and tracking for multistatic radar system," Information Sciences, vol. 369, pp. 100–118, 2016.
[5] F. Xiao, M. Aritsugi, Q. Wang, and R. Zhang, "Efficient processing of multiple nested event pattern queries over multi-dimensional event streams based on a triaxial hierarchical model," Artificial Intelligence in Medicine, vol. 72, pp. 56–71, 2016.
[6] Y. Hu, F. Du, and H. L. Zhang, "Investigation of unsteady aerodynamics effects in cycloidal rotor using RANS solver," The Aeronautical Journal, vol. 120, no. 1228, pp. 956–970, 2016.
[7] F. Xiao, C. Zhan, H. Lai, L. Tao, and Z. Qu, "New parallel processing strategies in complex event processing systems with data streams," International Journal of Distributed Sensor Networks, vol. 13, no. 8, pp. 1–15, 2017.
[8] Y. Dong, J. Wang, F. Chen, Y. Hu, and Y. Deng, "Location of facility based on simulated annealing and "ZKW" algorithms," Mathematical Problems in Engineering, vol. 2017, Article ID 4628501, 2017.
[9] W. L. Oberkampf, J. C. Helton, and K. Sentz, "Mathematical representation of uncertainty," in Proceedings of the 19th AIAA Applied Aerodynamics Conference, vol. 2001-1645, pp. 16–19, Anaheim, CA, USA, 2001.
[10] W. L. Oberkampf, S. M. DeLand, B. M. Rutherford, K. V. Diegert, and K. F. Alvin, "New methodology for the estimation of total uncertainty in computational simulation," in Proceedings of the 1999 AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference and Exhibit, pp. 3061–3083, April 1999.
[11] L. P. Swiler, T. L. Paez, and R. L. Mayes, "Epistemic uncertainty quantification tutorial," in Proceedings of the 27th International Modal Analysis Conference (IMAC '09), Society for Experimental Mechanics, Orlando, FL, USA, February 2009.
[12] C. Baudrit and D. Dubois, "Practical representations of incomplete probabilistic knowledge," Computational Statistics & Data Analysis, vol. 51, no. 1, pp. 86–108, 2006.
[13] J. C. Helton, J. D. Johnson, and W. L. Oberkampf, "An exploration of alternative approaches to the representation of uncertainty in model predictions," Reliability Engineering & System Safety, vol. 85, no. 1, pp. 39–71, 2004.
[14] G. Shafer, A Mathematical Theory of Evidence, Princeton University Press, Princeton, NJ, USA, 1976.
[15] H. Agarwal, J. E. Renaud, E. L. Preston, and D. Padmanabhan, "Uncertainty quantification using evidence theory in multidisciplinary design optimization," Reliability Engineering & System Safety, vol. 85, no. 1, pp. 281–294, 2004.
[16] S. S. Rao and K. K. Annamdas, "Evidence-based fuzzy approach for the safety analysis of uncertain systems," AIAA Journal, vol. 46, no. 9, pp. 2383–2387, 2008.

[17] Q. Zhang, M. Li, and Y. Deng, "Measure the structure similarity of nodes in complex networks based on relative entropy," Physica A: Statistical Mechanics and its Applications, vol. 491, pp. 749–763, 2018.
[18] L. Yin and Y. Deng, "Measuring transferring similarity via local information," Physica A: Statistical Mechanics and its Applications, vol. 498, pp. 102–115, 2018.
[19] Y. Ben-Haim and I. Elishakoff, Convex Models of Uncertainty in Applied Mechanics, Elsevier, 2013.
[20] S. Sankararaman and S. Mahadevan, "Model validation under epistemic uncertainty," Reliability Engineering & System Safety, vol. 96, no. 9, pp. 1232–1241, 2011.
[21] R. Rebba, S. Mahadevan, and S. Huang, "Validation and error estimation of computational models," Reliability Engineering & System Safety, vol. 91, no. 10, pp. 1390–1397, 2006.
[22] J. A. Barnett, "Computational methods for a mathematical theory of evidence," in Classic Works of the Dempster-Shafer Theory of Belief Functions, pp. 197–216, Springer, 2008.
[23] Y. Deng, "Deng entropy," Chaos, Solitons & Fractals, vol. 91, pp. 549–553, 2016.
[24] W. Jiang and J. Zhan, "A modified combination rule in generalized evidence theory," Applied Intelligence, vol. 46, no. 3, pp. 630–640, 2017.
[25] J. Abellán, "Analyzing properties of Deng entropy in the theory of evidence," Chaos, Solitons & Fractals, vol. 95, pp. 195–199, 2017.
[26] X. Zheng and Y. Deng, "Dependence assessment in human reliability analysis based on evidence credibility decay model and IOWA operator," Annals of Nuclear Energy, vol. 112, pp. 673–684, 2018.
[27] H. Zheng and Y. Deng, "Evaluation method based on fuzzy relations between Dempster-Shafer belief structure," International Journal of Intelligent Systems, 2017.
[28] F. Ye, J. Chen, Y. Li, and J. Kang, "Decision-making algorithm for multisensor fusion based on grey relation and DS evidence theory," Journal of Sensors, vol. 2016, pp. 1–11, 2016.
[29] H. Xu and Y. Deng, "Dependent evidence combination based on Shearman coefficient and Pearson coefficient," IEEE Access, 2018.
[30] T. Liu, Y. Deng, and F. Chan, "Evidential supplier selection based on DEMATEL and game theory," International Journal of Fuzzy Systems, 2017.
[31] R. Zhang and S. Mahadevan, "Bayesian methodology for reliability model acceptance," Reliability Engineering & System Safety, vol. 80, no. 1, pp. 95–103, 2003.
[32] C. Li, S. Mahadevan, Y. Ling, S. Choze, and L. Wang, "Dynamic Bayesian network for aircraft wing health monitoring digital twin," AIAA Journal, vol. 55, no. 3, pp. 930–941, 2017.
[33] S. Nath and B. Sarkar, "Performance evaluation of advanced manufacturing technologies: a De novo approach," Computers & Industrial Engineering, vol. 110, pp. 364–378, 2017.
[34] D. Meng, H. Zhang, and T. Huang, "A concurrent reliability optimization procedure in the earlier design phases of complex engineering systems under epistemic uncertainties," Advances in Mechanical Engineering, vol. 8, no. 10, pp. 1–8, 2016.
[35] B. Kang, G. Chhipi-Shrestha, Y. Deng, K. Hewage, and R. Sadiq, "Stable strategies analysis based on the utility of Z-number in the evolutionary games," Applied Mathematics and Computation, vol. 324, pp. 202–217, 2018.
[36] Y. Yang, G. Xie, and J. Xie, "Mining important nodes in directed weighted complex networks," Discrete Dynamics in Nature and Society, vol. 2017, Article ID 9741824, pp. 1–7, 2017.

[37] R. K. Goyal and S. Kaushal, "A constrained non-linear optimization model for fuzzy pairwise comparison matrices using teaching learning based optimization," Applied Intelligence, pp. 1–10, 2016.
[38] X. Zhou, X. Deng, Y. Deng, and S. Mahadevan, "Dependence assessment in human reliability analysis based on D numbers and AHP," Nuclear Engineering and Design, vol. 313, pp. 243–252, 2017.
[39] X. Zhou, Y. Hu, Y. Deng, F. T. Chan, and A. Ishizaka, "A DEMATEL-based completion method for incomplete pairwise comparison matrix in AHP," Annals of Operations Research, 2016.
[40] X. Deng and Y. Deng, "D-AHP method with different credibility of information," Soft Computing, 2018.
[41] R. Zhang, B. Ashuri, and Y. Deng, "A novel method for forecasting time series based on fuzzy logic and visibility graph," Advances in Data Analysis and Classification, vol. 11, no. 4, pp. 759–783, 2017.
[42] H. Zheng, Y. Deng, and Y. Hu, "Fuzzy evidential influence diagram and its evaluation algorithm," Knowledge-Based Systems, vol. 131, pp. 28–45, 2017.
[43] L. Wu, J.-F. Liu, J.-T. Wang, and Y.-M. Zhuang, "Pricing for a basket of LCDS under fuzzy environments," SpringerPlus, vol. 5, no. 1, article 1747, 2016.
[44] F. Xiao, "An intelligent complex event processing with D numbers under fuzzy environment," Mathematical Problems in Engineering, vol. 2016, Article ID 3713518, 2016.
[45] G. Fan, D. Zhong, F. Yan, and P. Yue, "A hybrid fuzzy evaluation method for curtain grouting efficiency assessment based on an AHP method extended by D numbers," Expert Systems with Applications, vol. 44, pp. 289–303, 2016.
[46] H. Mo and Y. Deng, "A new aggregating operator for linguistic information based on D numbers," International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, vol. 24, no. 6, pp. 831–846, 2016.
[47] T. Bian, H. Zheng, L. Yin, and Y. Deng, "Failure mode and effects analysis based on D numbers and TOPSIS," Quality and Reliability Engineering International, 2018.
[48] D.-L. Xu, "An introduction and survey of the evidential reasoning approach for multiple criteria decision analysis," Annals of Operations Research, vol. 195, pp. 163–187, 2012.
[49] C. Fu and S. Yang, "Conjunctive combination of belief functions from dependent sources using positive and negative weight functions," Expert Systems with Applications, vol. 41, no. 4, pp. 1964–1972, 2014.
[50] F. Xiao, "A novel evidence theory and fuzzy preference approach-based multi-sensor data fusion technique for fault diagnosis," Sensors, vol. 17, no. 11, p. 2504, 2017.
[51] C. Fu, J.-B. Yang, and S.-L. Yang, "A group evidential reasoning approach based on expert reliability," European Journal of Operational Research, vol. 246, no. 3, pp. 886–893, 2015.
[52] M. Aggarwal, "Rough information set and its applications in decision making," IEEE Transactions on Fuzzy Systems, vol. 25, no. 2, pp. 265–276, 2017.
[53] A. P. Dempster, "Upper and lower probabilities induced by a multivalued mapping," Annals of Mathematical Statistics, vol. 38, pp. 325–339, 1967.
[54] F. Li, X. Zhang, X. Chen, and Y. C. Tian, "Adaptive and robust evidence theory with applications in prediction of floor water inrush in coal mine," Transactions of the Institute of Measurement and Control, vol. 39, no. 1, 2017.

[55] Y. Zhao, R. Jia, and P. Shi, "A novel combination method for conflicting evidence based on inconsistent measurements," Information Sciences, vol. 367-368, pp. 125–142, 2016.
[56] F. Xiao, "An improved method for combining conflicting evidences based on the similarity measure and belief function entropy," International Journal of Fuzzy Systems, 2017.
[57] W. Bi, A. Zhang, and Y. Yuan, "Combination method of conflict evidences based on evidence similarity," Journal of Systems Engineering and Electronics, vol. 28, no. 3, pp. 503–513, 2017.
[58] C. Li and S. Mahadevan, "Relative contributions of aleatory and epistemic uncertainty sources in time series prediction," International Journal of Fatigue, vol. 82, pp. 474–486, 2016.
[59] S. Ferson, V. Kreinovich, J. Hajagos, W. Oberkampf, and L. Ginzburg, Experimental Uncertainty Estimation and Statistics for Data Having Interval Uncertainty, Sandia National Laboratories, Albuquerque, NM, USA, 2007.
[60] C. Li and S. Mahadevan, "Role of calibration, validation, and relevance in multi-level uncertainty integration," Reliability Engineering & System Safety, vol. 148, pp. 32–43, 2016.
[61] X. Du, A. Sudjianto, and B. Huang, "Reliability-based design with the mixture of random and interval variables," Journal of Mechanical Design, vol. 127, no. 6, pp. 1068–1076, 2005.
[62] O. P. Le Maître and O. M. Knio, Introduction: Uncertainty Quantification and Propagation, Springer, 2010.
[63] M. McDonald and S. Mahadevan, "Uncertainty quantification and propagation for multidisciplinary system analysis," in Proceedings of the 12th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference (MAO '08), Victoria, BC, Canada, September 2008.
[64] C. Li and S. Mahadevan, "An efficient modularized sample-based method to estimate the first-order Sobol index," Reliability Engineering & System Safety, vol. 153, pp. 110–121, 2016.
[65] S. Wojtkiewicz, M. Eldred, R. Field, A. Urbina, and J. Red-Horse, Uncertainty Quantification in Large Computational Engineering Models, American Institute of Aeronautics and Astronautics, 2001.
[66] B. Efron, "Bootstrap methods: another look at the jackknife," The Annals of Statistics, vol. 7, no. 1, pp. 1–26, 1979.
[67] H. Varian, "Bootstrap tutorial," Mathematica Journal, vol. 9, no. 4, pp. 768–775, 2005.
[68] R. L. Iman, "A distribution-free approach to inducing rank correlation among input variables," Communications in Statistics - Simulation and Computation, vol. 11, no. 3, pp. 311–334, 1982.
[69] S. S. Isukapalli and P. G. Georgopoulos, "Computational methods for efficient sensitivity and uncertainty analysis of models for environmental and biological systems," Computational Chemodynamics Laboratory, Environmental and Occupational Health Sciences Institute, 2001.
[70] G. I. Schuëller, C. G. Bucher, U. Bourgund, and W. Ouypornprasert, "On efficient computational schemes to calculate structural failure probabilities," in Stochastic Structural Mechanics, pp. 388–410, Springer, 1987.
[71] H. Jeffreys, Theory of Probability, 1961.
[72] T. Leonard and J. S. Hsu, Bayesian Methods: An Analysis for Statisticians and Interdisciplinary Researchers, vol. 5, Cambridge University Press, 1999.
[73] H. Mo, X. Lu, and Y. Deng, "A generalized evidence distance," Journal of Systems Engineering and Electronics, vol. 27, no. 2, pp. 470–476, 2016.

[74] A. Urbina, T. L. Paez, T. K. Hasselman, G. Wije Wathugala, and K. Yap, "Assessment of model accuracy relative to stochastic system behavior," in Proceedings of the 44th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, pp. 7–10, April 2003.
