Download ISI English article No. 9682
Translated article title

Assessment of structural simulation models with respect to model selection and model simplification in uncertainty estimation

English title
Assessment of structural simulation models by estimating uncertainties due to model selection and model simplification
Article code: 9682 · Publication year: 2011 · Length: 9 pages (English PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Computers & Structures, Volume 89, Issues 17–18, September 2011, Pages 1664–1672

Keywords
Sensitivity analysis; Model complexity; Model uncertainty; Bayesian model selection

English abstract

In this paper several methods for model assessment under uncertainty are discussed. Sensitivity analysis is performed to quantify the influence of the individual model input parameters. In addition to the well-known analysis of a single model, a new procedure for quantifying the influence of the model choice on the uncertainty of the model prediction is proposed. Furthermore, a procedure is presented which can be used to estimate the model framework uncertainty and which enables the selection of the optimal model with the best compromise between model input and framework uncertainty. Finally, Bayesian methods for model selection are extended for model assessment without measurements, using model averaging as a reference.
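As an illustration of treating the model choice itself as an uncertain quantity, the following Python sketch estimates a first-order variance contribution of the model choice by Monte Carlo. The three surrogate models, the input distribution, and all names are hypothetical; this is a minimal sketch of the idea, not the procedure from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surrogate models standing in for structural simulations
# of increasing complexity (invented for illustration).
def model_a(x):
    return 2.0 * x[0] + x[1]                     # simple linear response

def model_b(x):
    return 2.0 * x[0] + x[1] + 0.1 * x[0] * x[1]  # adds a weak interaction

def model_c(x):
    return 2.0 * x[0] + x[1] + 0.05 * x[1] ** 2   # adds mild nonlinearity

models = [model_a, model_b, model_c]

n = 20_000
X = rng.normal(1.0, 0.2, size=(n, 2))           # uncertain input parameters
choice = rng.integers(0, len(models), size=n)   # uncertain model choice
y = np.array([models[c](x) for c, x in zip(choice, X)])

# First-order variance contribution of the model choice:
# Var(E[Y | choice]) / Var(Y), a standard Sobol'-type index.
cond_means = np.array([y[choice == c].mean() for c in range(len(models))])
s_choice = cond_means.var() / y.var()
print(f"model-choice sensitivity index ~ {s_choice:.3f}")
```

A value near zero indicates that the input-parameter uncertainty dominates the prediction; a large value indicates that the choice of model matters.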

English introduction

In structural design, the structural response is often predicted using numerical or analytical models. Every prediction is subject to a certain uncertainty, which can be interpreted as a measure of the quality of the prediction. Following [1], uncertainty is the term used to describe incomplete knowledge about models, parameters, constants, data, and beliefs. There are many sources of uncertainty, including the science underlying a model, uncertainty in model parameters and input data, observation error, and code uncertainty. Generally, the uncertainties that affect model quality are categorised as:

• Model framework uncertainty: the uncertainty in the underlying science and algorithms of a model. Model framework uncertainty is the result of incomplete scientific data or lack of knowledge about the factors that control the behaviour of the system being modelled. It can also result from the simplifications necessary to translate the conceptual model into mathematical terms.

• Model niche uncertainty: resulting from the use of a model outside the system for which it was originally developed and/or from developing a larger model from several existing models with different spatial or temporal scales.

• Model input uncertainty: resulting from data measurement errors, inconsistencies between measured values and those used by the model (e.g., in their level of aggregation/averaging), and parameter value uncertainty. One can distinguish between data uncertainty, caused by measurement errors, analytical imprecision, and limited sample sizes during the collection and treatment of data, and stochasticity, i.e., fluctuations in ecological processes due to natural variability and inherent randomness.
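To make the distinction between model input and model framework uncertainty concrete, the hedged sketch below propagates an uncertain input through one model by Monte Carlo (input uncertainty) and compares two model variants at a fixed, exactly known input (a rough proxy for framework uncertainty). The beam-like models and the distribution of the Young's modulus are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical simplifications of the same structure (invented):
# a tip-deflection-like response scaling inversely with Young's modulus E.
def coarse_model(E):
    return 1000.0 / E             # bending contribution only

def refined_model(E):
    return 1000.0 / E + 0.2 / E   # adds a small assumed shear contribution

E_samples = rng.normal(210.0, 10.0, size=10_000)  # uncertain E [GPa]

# Model input uncertainty: scatter of one model's output caused by the
# uncertain input alone.
u_input = coarse_model(E_samples).std()

# Rough proxy for model framework uncertainty: disagreement between the
# simplified and the refined model at the nominal, exactly known input.
u_framework = abs(coarse_model(210.0) - refined_model(210.0))

print(f"input-induced scatter (std): {u_input:.4f}")
print(f"model disagreement:          {u_framework:.4f}")
```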

English conclusion

In this paper several methods for model assessment and selection are discussed. By extending the standard sensitivity analysis of single models to a set of models, it was shown that, from the sensitivity of the model choice, the designer can judge whether the model choice influences the prediction quality or whether the uncertainty of the input parameters dominates. In addition, two other methods are proposed to assess the model prediction quality. The first estimates the model framework uncertainty from a reference, which could be the average of all models. The second uses Bayesian methodology to quantify model probabilities. Both procedures gave similar results for the investigated examples and could be used to select the most appropriate model with the most accurate prediction. Nevertheless, both procedures require an assumption regarding a reference model, and this assumption may significantly influence the estimated model uncertainties and model probabilities. The reference model should be the model that best represents the expected physical phenomena when all required input parameters are assumed to be exactly known. If no such model can be preferred a priori, the average of all investigated models can be used. In the general case, this reference model could be the most complex model available. In a deterministic approach the designer would tend to use only this most complex model; the presented procedures, in contrast, account for the model input uncertainty that grows with model complexity. They may help the designer judge the best compromise in terms of model prediction quality: model input uncertainty increases with model complexity, while model framework uncertainty decreases. The procedures may also answer the question of which model complexity is appropriate for a given simulation problem and a given state of knowledge about the model parameters: is a simpler model with a higher abstraction level but fewer input parameters more suitable than a more complex model with more input parameters?
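As a rough illustration of the Bayesian selection idea with the model average as reference, the sketch below assigns posterior probabilities to a set of candidate models via Gaussian likelihoods against the averaged prediction. The predictions, the noise scale sigma, and the uniform prior are assumptions for illustration and do not reproduce the paper's formulation.

```python
import numpy as np

# Hypothetical predictions of three candidate models at three design points.
predictions = np.array([
    [4.8, 9.6, 14.3],   # model 1 (coarse)
    [5.0, 10.1, 15.0],  # model 2 (intermediate)
    [5.3, 10.4, 15.6],  # model 3 (refined)
])

# Reference: the average of all models, used here in place of measurements.
reference = predictions.mean(axis=0)

sigma = 0.3                                       # assumed error scale
prior = np.full(len(predictions), 1.0 / len(predictions))  # uniform prior

# Gaussian log-likelihood of each model against the reference prediction.
log_like = -0.5 * ((predictions - reference) ** 2 / sigma**2).sum(axis=1)

# Posterior model probabilities via Bayes' theorem (normalised; the max
# is subtracted before exponentiating for numerical stability).
post = prior * np.exp(log_like - log_like.max())
post /= post.sum()
print("posterior model probabilities:", np.round(post, 3))
```

Under this toy setup the model closest to the average receives the highest posterior probability, mirroring the role of the reference-model assumption discussed above.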