by Staff Writers
Philadelphia, PA (SPX) Apr 03, 2014
A computer model is a representation of the functional relationship between one set of parameters, which forms the model input, and a corresponding set of target parameters, which forms the model output. A true model for a particular problem can rarely be defined with certainty. The most we can do to mitigate error is to quantify the uncertainty in the model. In a recent paper published in the SIAM/ASA Journal on Uncertainty Quantification, authors Mark Strong and Jeremy Oakley offer a method for incorporating judgments about the structural uncertainty that results from building an "incorrect" model.

"Given that 'all models are wrong,' it is important that we develop methods for quantifying our uncertainty in model structure such that we can know when our model is 'good enough'," author Mark Strong says. "Better models mean better decisions."

When making predictions using computer models, we encounter two sources of uncertainty: uncertainty in model inputs and uncertainty in model structure. Input uncertainty arises when we are not certain about the values of the input parameters used in model simulations. If we are uncertain about the true structural relationships within a model, that is, the relationship between the set of quantities that forms the model input and the set that represents the output, the model is said to display structural uncertainty. Such uncertainty would persist even if the model were run with input values estimated in a perfect study with an infinite sample size.

"Perhaps the hardest problem in assessing uncertainty in a computer model prediction is to quantify uncertainty about the model structure, particularly when models are used to predict in the absence of data," says author Jeremy Oakley. "The methodology in this paper can help model users prioritize where improvements are needed in a model to provide more robust support to decision making."

While methods for managing input uncertainty are well described in the literature, methods for quantifying structural uncertainty are far less developed. This is especially true in health economic decision making, the focus of this paper, where models are used to predict the future costs and health consequences of competing options in order to support resource allocation decisions.

"In health economics decision analysis, the use of 'law-based' computer models is common. Such models are used to support national health resource allocation decisions, and the stakes are therefore high," says Strong. "While it is usual in this setting to consider the uncertainty in model inputs, uncertainty in model structure is almost never formally assessed."

There are several approaches to managing model structural uncertainty. A primary approach is 'model averaging', in which the predictions of a number of plausible models are combined, with weights based on each model's likelihood or predictive ability. Another approach is 'model calibration', which assesses a model through its external discrepancies, that is, the differences between its output quantities and real, observed values. In the context of healthcare decisions, however, neither approach is feasible: typically only a single model is available, so there is nothing to average, and observations of the model outputs are not available for calibration. The authors therefore take a novel approach based on discrepancies within the model, or "internal discrepancies" (as opposed to the external discrepancies that are the focus of model calibration).
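For concreteness, the following Python sketch illustrates the two standard ingredients just described: input uncertainty handled by simulating from a distribution over the input, and structural uncertainty handled by drawing predictions from a weighted mixture of candidate models. The models, input distribution, and weights here are illustrative assumptions, not quantities from the paper.

```python
# Minimal sketch (illustrative assumptions throughout, not from the paper):
# propagate input uncertainty through two hypothetical candidate model
# structures, then treat structural uncertainty by model averaging.
import numpy as np

rng = np.random.default_rng(0)

def model_a(x):
    return 2.0 * x            # candidate structure 1: linear

def model_b(x):
    return x + 0.4 * x**2     # candidate structure 2: mildly nonlinear

# Input uncertainty: the true input value is unknown, so sample it.
x = rng.normal(loc=1.0, scale=0.2, size=10_000)

# Model averaging as a mixture: each draw uses model_b with probability
# w_b, otherwise model_a (weights would normally come from each model's
# likelihood or predictive performance).
w_b = 0.4
use_b = rng.random(x.size) < w_b
predictions = np.where(use_b, model_b(x), model_a(x))

print(f"mean prediction: {predictions.mean():.3f}")
print(f"predictive std (input + structural): {predictions.std():.3f}")
```

Drawing from the mixture, rather than averaging the two models' outputs point by point, keeps the spread of the combined predictive distribution honest: pointwise averaging would reproduce the mean but understate the variance wherever the models disagree.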
Internal discrepancies are analyzed by first decomposing the model into a series of subunits, or subfunctions, whose outputs are intermediate model parameters that are potentially observable in the real world. Next, each subfunction is judged for certainty: would its output equal the true value of the corresponding parameter if that parameter were observed in the real world? Wherever a structural error is anticipated, a discrepancy term is introduced, and beliefs about the size and direction of the error are then expressed. Since judgments about internal discrepancies are expected to be crude at best, the expressed uncertainty should be generous, covering a wide distribution of possible values. Finally, the authors determine the sensitivity of the model output to the internal discrepancies, which indicates the relative importance of structural uncertainty within each model subunit.

"Traditional statistical approaches to handling uncertainty in computer models have tended to treat the models as 'black boxes'. Our framework is based on 'opening' the black box and investigating the model's internal workings," says Oakley. "Developing and implementing this framework, particularly in more complex models, will need closer collaboration between statisticians and mathematical modelers."
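The following toy sketch shows the general shape of that workflow: a model split into two hypothetical subfunctions, an additive discrepancy term with a deliberately wide prior attached to each, and a crude variance-based measure of how much each discrepancy drives the output. The subfunctions, priors, and sensitivity measure are assumptions made for illustration; they are not the authors' implementation.

```python
# Toy sketch of the internal-discrepancy idea (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

def sub1(x):
    return 1.5 * x       # subunit 1: produces a potentially observable quantity

def sub2(z):
    return z**2 + 1.0    # subunit 2: maps that quantity to the model output

x = 2.0                                  # input fixed for illustration
d1 = rng.normal(0.0, 0.5, size=n)        # wide prior on subunit-1 error
d2 = rng.normal(0.0, 1.0, size=n)        # wide prior on subunit-2 error

# Output with discrepancy terms inserted at each subunit boundary.
y = sub2(sub1(x) + d1) + d2

# Crude one-at-a-time sensitivity: variance of the output when only one
# discrepancy varies and the other is held at its mean of zero. Because
# d2 enters additively, these match the first-order (main-effect) shares.
var_total = y.var()
var_d1 = (sub2(sub1(x) + d1)).var()
var_d2 = d2.var()
print(f"variance share from subunit-1 discrepancy: {var_d1 / var_total:.2f}")
print(f"variance share from subunit-2 discrepancy: {var_d2 / var_total:.2f}")
```

Run as-is, the output variance is dominated by the subunit-1 discrepancy (roughly a 0.9 share), suggesting that, in this toy setting, effort to reduce structural uncertainty would be best spent on the first subunit.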
Paper: "When Is a Model Good Enough? Deriving the Expected Value of Model Improvement via Specifying Internal Model Discrepancies" by Mark Strong and Jeremy E. Oakley, SIAM/ASA Journal on Uncertainty Quantification.

Related Links
Society for Industrial and Applied Mathematics
Space Technology News - Applications and Research