Download English ISI Article No. 27850
Article Title (Translated from Persian)

Life-cycle management of information and decision-making for system analysis

English Title
LIFE-CYCLE MANAGEMENT OF INFORMATION AND DECISIONS FOR SYSTEM ANALYSES
Article Code: 27850
Publication Year: 2001
Length: 15 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Mechanical Systems and Signal Processing, Volume 15, Issue 3, May 2001, Pages 513–527

Translated Keywords
System analysis; information management
English Keywords
SYSTEM ANALYSES, MANAGEMENT OF INFORMATION

Abstract

Quality products require quality analysis: analysis that delivers the necessary behavioural information using the assigned resources in the prescribed time frame. Using analysis tools effectively requires a deep understanding of theory and practice. Practical analysis could be supported by managing information about analysis tools and their past uses. If managed properly, this information would be beneficial for present and future tool reuse. Following an examination of analysis activities, a model of analysis activities is developed, including the various aspects that influence it. The model includes all the contextual information of an analysis. With this information, the model could improve analysis as well as prevent serious analysis errors. The model, called AML (analysis modelling language), is described precisely with the object-oriented formalism Unified Modelling Language (UML). The most important part of AML, the static structure diagram, is discussed in this paper. AML could be transformed with reasonable effort into an analysis process management system.

Introduction

Product life cycle refers to whatever happens to the product throughout its life, starting before inception in market analysis and running through its design, manufacture, usage (including maintenance), and recycling. Any system and its associated information evolve through these stages from abstract to concrete. To illustrate, at the end of design the product definition becomes more precise than initially envisioned, for example by specifying dimensional tolerances, while the information about the product's manufacture is still incomplete; for example, dimensional variations are unknown and manufacturing flaws are not yet detected. Similarly, the information about the actual product behaviour is uncertain. For instance, uncertain material structure and properties, imprecise dimensions, an uncertain environment, and unknown or uncertain interactions between physical phenomena could contribute to variability in anticipated system behaviour. When the system is being realised in physical objects, some of its associated information may become more precise, such as the final product dimensions as built, while other information, such as material structure and properties, remains uncertain. Some parameters, such as friction between components, also change in time. The evolution of information as product development stages unfold mandates special attention from systems that support design processes.

This paper concentrates on system analysis: the process of estimating system behaviour. The behaviour of a system deals with its performance under various conditions, including handling and operation. Behaviour is determined using analysis methods applied to conceptual models, or from measurements on scaled physical products or prototypes. Analysis could be performed with simple equations for deriving some values, or with complex ones such as 3D time-dependent multidisciplinary analysis. The system behaviour as estimated by analysis determines the system's fulfilment of its desired functionality, which is the initial driver of the product design. Consequently, the importance of product behaviour and functionality makes analysis information and processes prime targets for life-cycle management.

Analysis is also a much less formal concept. When designers interact in product design, their exchange involves a particular aspect of the design that is modelled in their discussion. A question posed by one designer constitutes modelling, and the response an analysis. Often, the focus of the discussion or negotiation drifts, marking the use of several models which, while possibly loosely connected, are nevertheless invaluable for the negotiation. Therefore, to benefit from past models arising in collaborative processes, the information derived from previous negotiations between designers needs to be maintained [1].

Information not only evolves through development stages; any piece of information related to product behaviour evolves throughout the product life cycle. Initially, market analysis determines whether a product with the expected functions is viable. Subsequently, in product design, various analyses determine conceptually whether the product meets its intended specifications. In manufacturing, quality assurance determines through measurements whether the manufactured system meets its physical requirements (e.g. tolerances) and how they impact the intended behaviour. Finally, during system operation, sensors can provide data regarding actual system behaviour.
All these calculations or measurements are different, evolving views of the same system behaviour. They must be integrated and managed together. Not only does the system evolve through its life-cycle stages, but so also does its entire context, including analysis methods and computer tools, the engineers performing analyses, customer requirements, design standards (e.g. environmental), parts availability, and technology. Thus, life-cycle management of analysis information means the management of all these aspects.

The importance of life-cycle management of analysis information cannot be taken lightly; simply excelling in analytical skills does not guarantee that an analysis will be reliable. For example, an analysis could fail or be unreliable if the software used is incorrect due to improper testing. Roache [2] describes an example from his own work where he introduced a gross error of a factor of 2 into an analysis code. The error was not detected due to partial testing. Following proper testing, the error was corrected and the code produced reliable results without failing on a previously failed problem. Even if analysis tools are verified and validated, there is still ample room for failures. There could be serious consequences to misuse of analysis or simulation tools. A product could fail completely due to erroneous analysis or simulation arising from mismanagement of the analysis process. As it turns out, organisations that lead technology progress are not immune to such failures. Consider the following examples of space-related system failures.

Misuse of analysis input. The failure (and waste of $193M) of the NASA Mars Lander in December 1999 was due to a wrong use of units, the most basic aspect of doing any engineering calculation. The unit mismatch led to an erroneous manoeuvre required to place the Lander in the correct Mars orbit. With proper management and tools, such a failure could easily have been avoided.

Mismanagement of product development, including analysis. The failure of the Ariane 5 rocket in June 1996 was caused by a conversion of a 64-bit floating-point number to a 16-bit signed integer, which failed because the number was greater than a 16-bit signed integer can represent. The code in that particular location was not protected from such a failure, although it was protected in other, similar places in the code. The decision not to protect it probably came from using analysis scenario data from the Ariane 4 model that were invalid for the Ariane 5 project. Note that such failures could easily occur in any software. Rarely are expert analysts aware of such failure possibilities.

Mismanagement of simulation. The loss (and waste of $65M) of the NASA Lewis Spacecraft in August 1997 is attributed to a flawed design that was checked using a flawed simulation. As in the Ariane 5 failure, the simulation was based on a simpler product, and the new product's complexities were neither modelled nor tested.

Clearly, practical analysis involves many more issues than using formal analysis theories or even commercial analysis code. The crux of the matter is that we do analysis on models, which are by definition abstractions of reality assumed to capture the essential features of the problem we are investigating while ignoring irrelevant information. The analysis we exercise uses tools that model physical theories or computational algorithms, assuming again that the gap between the theory and its implementation is negligible. These assumptions are never completely true.
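The two failure modes described above, an unchecked unit mismatch and a silent numeric overflow, can both be made detectable with a few lines of defensive code. The following Python sketch is ours, not the paper's; the function names and guard logic are illustrative assumptions about what such protections might look like.

```python
# Illustrative sketch, not from the paper: guards against the two failure
# modes discussed above. All names here are our own assumptions.

INT16_MIN, INT16_MAX = -32768, 32767

def to_int16_checked(x: float) -> int:
    """Convert a float to a 16-bit signed integer, raising instead of
    silently overflowing (the Ariane 5 conversion had no such guard)."""
    if not INT16_MIN <= x <= INT16_MAX:
        raise OverflowError(f"{x} does not fit in a signed 16-bit integer")
    return int(x)

# Unit mismatch: tagging every quantity with its unit makes a pound-force
# versus newton confusion an explicit error rather than a silent one.
LBF_TO_N = 4.448222  # newtons per pound-force

def thrust_in_newtons(value: float, unit: str) -> float:
    """Normalise a thrust value to SI units, rejecting unknown units."""
    if unit == "N":
        return value
    if unit == "lbf":
        return value * LBF_TO_N
    raise ValueError(f"unknown thrust unit {unit!r}")
```

A checked conversion of this kind would have turned the overflow into an error caught during testing rather than a silent corruption in flight, which is the paper's point: such protections are management decisions, not just coding details.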
Managing practical analysis involves the ability to explain the seriousness of these assumptions, create robust tools, and explain examples where the assumptions fail yet analysis succeeds, as well as situations where they fail and analysis fails. Subsequently, we show how the assumptions underlying all modelling decisions could be captured and managed throughout the life cycle of the subject of analysis. This paper deals with managing analysis processes by addressing several key questions:

- What needs to be managed for supporting design activities and future information reuse? Following the Ariane 5 and NASA Lewis Spacecraft cases, the answer is all-embracing: everything. The complete analysis scenario, including contextual information, assumptions, and aspects such as units and precision, must be recorded and managed for future reference. It is never clear what might be needed or be critical for future use.

- How (in what form and in which process) should the complete analysis process be managed for improving its quality, relevance, and reliability? All data must be linked to their source in the system, whose configuration and versions must be managed over time. The activities carried out with analysis software must be recorded at the lowest level of selecting operational parameters and maintaining links between data and the results generated. The decisions made by experts regarding analysis results must also be managed tightly with other engineering decisions regarding the system. Finally, engineering decisions and their effect on the real world must be recorded to support assigning credit and blame.

- How could managed analysis information be used for the benefit of performing better future analyses and creating quality future systems? This is a tough question that also involves technical issues. Consider that the NASA Lewis Spacecraft failed due to general problems similar to those of the Ariane 5, namely, the use of previous versions of simulations without proper account of the differences between systems.

An infrastructure in which various analysis tools could be integrated, and within which users could model and record complete analysis processes, could offer a foundation for addressing these problems. Analysis processes would be stored as cases with all their relevant information, including the product, customers, technology, and other contextual information. If managed properly, recorded cases could answer questions such as:

1. How can a given problem be simplified to be amenable to the application of existing analysis tools?
2. How can the accuracy of solutions be assessed in a particular problem context?
3. Is a particular analysis tool suitable for solving a current problem?
4. Which resources would a particular analysis require?

The analysis information model described in Section 3 includes answers to these questions. The remainder of this paper is structured as follows. Section 2 reviews a common analysis process model and explains critical characteristics of practical analysis activities. Section 3 describes in detail an information model designed to manage analysis knowledge. The model is described in UML, an emerging standard for modelling object-oriented systems [3]. This model is called the analysis modelling language (AML). AML could represent different analysis models and could be used to develop an analysis support system with moderate effort. As such, AML can be considered a meta-model of analysis processes suitable for integrating complex analysis activities [4].
Section 4 discusses the model with two examples, one of which is seldom appreciated yet might have serious effects on analysis success or failure. Section 5 concludes the paper.
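As a concrete reading of the case-based storage the introduction calls for, the sketch below renders one analysis case as a small data structure. This is a minimal sketch of ours, with hypothetical class and attribute names; the paper's actual AML static structure (Section 3) is far richer.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical rendering of an analysis case record; names are illustrative
# assumptions, not the paper's AML vocabulary.

@dataclass
class Assumption:
    statement: str       # e.g. "linear elastic material behaviour"
    justification: str   # why the assumption was judged acceptable

@dataclass
class AnalysisTool:
    name: str
    version: str         # the exact tool version used must be recorded
    validated: bool      # whether the tool passed verification and validation

@dataclass
class AnalysisCase:
    problem: str                      # the industrial problem being solved
    subject: str                      # physical or conceptual object analysed
    tool: AnalysisTool
    inputs: dict                      # input parameters, with explicit units
    assumptions: List[Assumption] = field(default_factory=list)
    results: dict = field(default_factory=dict)
    rationale: str = ""               # expert decisions kept with the case
```

Stored this way, a case carries enough context to address the four questions above: the tool and its version bear on suitability, the inputs and assumptions record how the problem was simplified, and the results and rationale support assessing accuracy and resource needs.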

Conclusion

The necessity of AML. Quality analysis is much more than proficiency in formal mathematical models. Excelling in analysis requires life-cycle management of, and introspection upon, analysis decisions and accumulated knowledge. Without proper analysis knowledge management, no progress in practical analysis is possible; analysis failures could occur, resulting in the loss of products. An information model for analysis knowledge management, and the issues involved in this management, have been discussed.

The scope of AML. Present views about analysis processes are quite limited. Our view of industrial problems led us to specify a UML model of analysis process information called AML, and we have presented its static structure diagram. AML captures a refined process and information objects, with opportunities for user interaction, the use of analysis knowledge and tools, and the incorporation of rationale. The model explicitly specifies the connection between an analysis, the industrial problem being solved, and the object (physical or conceptual) that is the subject of the problem. The decision-making activity could be further elaborated by describing the internal structure of the rationale Ra. This structure would specify exactly how the different inputs to the process are manipulated, as well as how the uncertainty in analysis decision-making is maintained. We contend that AML could be used to describe most system analysis processes with the necessary detail. It could also be tailored to model analysis information in other domains, such as data analysis.

The dynamic status of AML. Since all aspects influencing analysis processes might change, all managed items must be version controlled, and all structured items such as Pr or Ob should be configuration controlled. These facilities handle the dynamic nature of the information modelled with AML. However, AML itself could evolve based on experience using the managed information in future projects. This evolution, and the reasons behind it, could further our understanding of analysis and its effectiveness in addressing problems. Several additional models in the framework of UML must also be detailed to complete the process management model. Of course, these models will also evolve in time.

Several conceptual problems related to managing analysis processes are still open. Four of these problems, related to the complexity, dynamics, and uncertainty of the environment, are central:

- The life-cycle management of knowledge extracted from multiple analysis scenarios (e.g. residing in different cases), including their cooperative use in decision-making, is unsolved. This management is critical for saving resources while solving future IPs.

- Coordinating analysis decision-making with the evolution of objects and the evolution of AML is unsolved.

- The structuring and management of decision-making rationale can improve the reusability of cases but will never be completely formalised. The decisions regarding where to stop formalising, and how to formalise, remain unsolved.

- All engineering decision-making is uncertain. In the complex processes that AML could represent and manage, it is impossible to account for these uncertainties without proper management. Techniques not based on probability theory [12] might provide a foundation for such management.
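As a concrete reading of the version- and configuration-control requirement above, here is a minimal sketch, again with hypothetical names, of an append-only versioned item: every change produces a new immutable revision carrying its rationale, so a stored analysis case can always refer back to the exact data it used.

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch with hypothetical names: an append-only versioned item.
# Changes never overwrite; they add immutable revisions with a recorded reason.

@dataclass(frozen=True)
class Revision:
    number: int
    payload: dict     # the item's content at this revision
    reason: str       # why the change was made (part of the rationale)

@dataclass
class VersionedItem:
    name: str
    revisions: List[Revision] = field(default_factory=list)

    def commit(self, payload: dict, reason: str) -> Revision:
        """Record a new revision instead of mutating the current state."""
        rev = Revision(len(self.revisions) + 1, dict(payload), reason)
        self.revisions.append(rev)
        return rev

    def at(self, number: int) -> Revision:
        """Retrieve the item exactly as it was at a given revision."""
        return self.revisions[number - 1]
```

Under this discipline, reusing an old simulation for a new system (the Ariane 4 to Ariane 5 and Lewis Spacecraft pattern) would at least surface explicitly which revision of which item the reused case depended on.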