A tool for minimizing update errors in workflow applications: the CARD model
|Article code||Publication year||English article||Persian translation||Word count|
|21777||2005||21-page PDF||Available to order||Not counted|
Publisher : Elsevier - Science Direct (الزویر - ساینس دایرکت)
Journal : Computers & Industrial Engineering, Volume 49, Issue 2, September 2005, Pages 199–220
Product Data Management, the ASP (application service provider) model of e-business, and other information enterprises involve complex database-driven processes that change frequently. Workflow management for these enterprises requires analysis of the causality of updates on both the control logic of the processes and the semantics of the databases involved. Previous workflow analysis tends to focus on the former and leaves the details of database control to application developers. This practice can lead to inconsistent applications when the processes evolve and/or the underlying data and rules change. The field still needs an integrated causality analysis tool to address the update problem. In this research, we develop a Control-Activity-Rule-Data (CARD) model of a decision support tool that helps workflow administrators analyze the effects of updates in processes and/or databases on both control flow and data/rules flow. The CARD model contributes a new Workflow Information Resources Dictionary, which represents data, rules, activities, and control in a mutually independent but collectively integrated way to achieve this goal. An extended Workflow Causality Graph, capable of representing workflow integrity rules, makes the CARD model implementable. An empirical validation of the model using three representative Product Data Management workflow cases at Samsung Electronics Corporation shows its correctness and relevance for practical applications.
Workflow management has its origin in office automation (Bracchi & Pernici, 1984). The attention paid in the early 1990s to business process re-engineering further developed workflow management systems into an enterprise integration technology built on a database engine (see, e.g. Derungs, Volga, & Österle, 1997). Today, the trend has encompassed other information enterprises including ERP (Enterprise Resource Planning), PDM (Product Data Management), Supply Chain Management, Customer Relationship Management, and the emerging practice of ASP (application service provider) in e-business (an example of a similar observation is given in Sheth, Aalst, & Arpinar, 1999). Workflow management technology has become a major tool for the rapid development and evolution of complex business processes (Cho, 2002). In essence, workflow management systems first recognize the fundamental workflow tasks (i.e. activities) that constitute a business process, and then define the sequencing of these activities (i.e. control) for particular instances of the process. Information flow (i.e. data) and workflow control knowledge (i.e. rules) are encapsulated in the logic of both activities and control and stored in a database. Thus, Activity, Control, Data, and Rule are the four basic elements of workflow we recognize formally in this research (Workflow, 1996 and Workflow, 1999). The problem of workflow updates concerns changes initiated on any of these elements and their immediate cascading effects on the other elements. This definition is narrower than some previous work, especially that of Ellis, Keddara, and Rozenberg (1995); however, it represents a common and frequent problem in practice, and its solution could facilitate the investigation of the larger problems. We refer to this definition throughout the paper when we mention dynamic changes, update effects, and the like.
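The four basic elements named above (Activity, Control, Data, and Rule) can be sketched as a minimal data model. The class and field names below are illustrative, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """A fundamental workflow task (e.g. 'review' in a PDM change process)."""
    name: str
    reads: set = field(default_factory=set)   # data items the activity reads
    writes: set = field(default_factory=set)  # data items the activity writes

@dataclass
class Workflow:
    """A process: activities plus the control that sequences them,
    with rules (workflow control knowledge) kept as a first-class element."""
    activities: dict = field(default_factory=dict)  # name -> Activity
    control: set = field(default_factory=set)       # (from, to) ordering edges
    rules: list = field(default_factory=list)       # rule descriptions

    def add_sequence(self, a: str, b: str) -> None:
        """Record that activity a precedes activity b."""
        self.control.add((a, b))
```

Keeping the data and rules explicit, rather than buried inside activity code, is exactly what makes the update analysis discussed below possible.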
Despite the advancements in the field (Aalst et al., 1998, Aalst et al., 2003, Bi and Zhao, 2003, Casati et al., 1996, Kamath and Ramamritham, 1996 and Nutt, 1996), system administrators still lack sufficient tools to analyze update effects and deal with the problem. The reason is simple: while traditional design methods tend to ignore data semantics and leave database structuring to implementation, large-scale and fluid systems such as the above examples nevertheless feature frequent workflow updates that are constrained by the database. In these cases, administrators need to analyze the effects of updates on both processes and databases in order to minimize errors, not just to analyze processes. The workflow update problem has two basic technical concerns: how to achieve or retain the soundness of the control of processes (i.e. initiation, termination, and consistent flow logic), and how to maintain the integrity of the database objects and their inter-relationships (i.e. unique primary keys or object identifications, consistent foreign key values or class inheritances, and consistent data values). We refer to the achievement of both soundness and integrity as the causality analysis of workflow updates. The reason that updates need integrated causality analysis while the analysis for new system design may not is two-fold. First, updates take place during operation and must minimize disruption to the operation. Second, updates can take many forms and affect any element of workflow. In contrast, workflow designers could and would follow a hierarchical structure and analyze workflow elements only in sequence: they focus on processes (analyzing activities and control) and leave the detailed analysis of their database implementation (concerning data and rules) to workflow application developers to perform at a later stage.
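The soundness criterion described above (proper initiation, termination, and consistent flow logic) can be approximated on a control graph with a simple reachability check. This is a sketch of one necessary condition, not the paper's algorithm:

```python
from collections import defaultdict

def is_sound(edges, start, end):
    """Necessary condition for soundness: every activity lies on some
    path from the start node to the end node, so no activity is
    unreachable (initiation) and none is a dead end (termination)."""
    succ, pred = defaultdict(set), defaultdict(set)
    nodes = {start, end}
    for a, b in edges:
        succ[a].add(b)
        pred[b].add(a)
        nodes |= {a, b}

    def reachable(adj, root):
        # depth-first search collecting all nodes reachable from root
        seen, stack = set(), [root]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(adj[n])
        return seen

    fwd = reachable(succ, start)  # reachable from the start
    bwd = reachable(pred, end)    # can still reach the end
    return nodes <= (fwd & bwd)
```

A full soundness check (e.g. on Petri-net semantics, as in the van der Aalst line of work cited above) is stronger; this sketch only catches unreachable and non-terminating activities.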
The latter, in turn, would typically hard-code the details into the applications (embedding data and rules in individual activities and control software) without also considering the global interactions of data flow and control flow at the database level. Update analysis at run time, on the other hand, does not afford the same separation of stages: control flow and data flow must be analyzed together in the same short time frame. Changes could be initiated on data and data flow rules contained in the database, and then cascade to activities and control in a bottom-up manner. In practice, such as in PDM, the center of integrated analysis is often the database, where a single update of database elements can trigger multiple iterations of analysis with an increasingly broader range of redesign of the process elements. Thus, the solution of the integrated causality analysis problem hinges on integrating the database elements into the traditional focus on process elements. The literature provides good foundations for comprehensively considering the inter-dependencies among the four basic workflow elements. Some works analyzed the soundness of control flow definitions (i.e. activities and control flows) using high-level control graphs (e.g. Aalst and ter Hofstede, 2000, Joosten, 1994, Sadiq and Orlowska, 2000, Voorhoeve, 2000 and White, 1998). Some studied workflow data and rules for data initialization and inconsistency problems at design time and for consistency problems between workflow rules and event history (e.g. Alonso et al., 1997, Choi and Zhao, 2002 and Kumar and Zhao, 1996). Some considered both control flow analysis and data flow (e.g. Adam et al., 1998 and Reichert and Dadam, 1998). They all contribute to developing a new integrated framework. Problems still remain, especially in the integration of database elements with process elements to support simultaneous analysis of update effects on both processes and databases.
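The bottom-up cascading described above can be pictured as a transitive closure over a dependency graph: starting from a changed data item, every rule, activity, or control element that depends on it, directly or indirectly, must be re-analyzed. The following sketch uses hypothetical element names to illustrate the idea:

```python
def affected(depends_on, changed):
    """Return every element that transitively depends on the changed one.
    depends_on maps an element to the set of elements it depends on,
    e.g. a rule depends on a data item, an activity depends on a rule."""
    # Invert the map: element -> set of its direct dependents
    dependents = {}
    for elem, deps in depends_on.items():
        for d in deps:
            dependents.setdefault(d, set()).add(elem)

    result, frontier = set(), [changed]
    while frontier:
        e = frontier.pop()
        for dep in dependents.get(e, ()):
            if dep not in result:
                result.add(dep)       # this element must be re-analyzed
                frontier.append(dep)  # and its own dependents, in turn
    return result
```

For example, if a rule `r1` depends on the data item `part_no` and the activity `release` depends on `r1`, then changing `part_no` flags both `r1` and `release` for re-analysis; this mirrors the bottom-up cascade from database elements to process elements.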
Workflow practitioners would manually ‘integrate’ some of the previous tools when they need causality analysis of updates. Here lies a gap between what the field needs and what the scientific results provide. We contend that a tool helping practitioners apply these otherwise disjoint scientific results in a decision support manner is a necessary solution, and represents a significant contribution to the betterment of workflow management. In this research, we develop such a new tool to help workflow administrators perform integrated causality analysis to minimize workflow update errors on both processes and databases. The tool has to be grounded in a sound conceptual model; therefore, the bulk of the work is the development of the conceptual model, to which we refer as the Control-Activity-Rule-Data (CARD) model of workflow, because it recognizes data and rules as basic workflow elements at the same level as the traditional activities and control. The model, consistent with the Workflow Management Coalition's definitions, provides causality analysis algorithms based on the new formulation of these four workflow elements. The cornerstone, and a major contribution, of the new model is an integrated representation method for database and process elements, called in this paper the Workflow Information Resources Dictionary (WIRD). The WIRD representation method extends the Metadatabase model (Hsu, Bouziane, Ratner, & Yee, 1991) to support extended workflow causality graphs and analysis algorithms for performing integrated analysis of control flow and data flow. The WIRD, therefore, is simultaneously a meta-model (workflow data and process definitions) and a repository of workflow metadata (data and process models). The CARD model, based on these results, promises to enable a tool capable of addressing the update analysis problem discussed above. We substantiate this claim in the rest of the paper.
We develop the CARD model (the WIRD method and an extended workflow causality graph) in Section 2, followed by a discussion of how the model may be applied to previous workflow analysis algorithms and integrate them for use in practice, in Section 3. Some of the algorithms add conspicuously to database integrity analysis for workflow updates. Section 4 discusses the prototyping of the CARD model and its empirical testing at Samsung Electronics Corporation, Korea, in the context of the manufacturer's Product Data Management enterprise. We conclude the discussion and suggest some future research in Section 5.
English Conclusion
Contemporary workflow management systems are built on both control technology and information technology; however, previous causality analysis methods do not sufficiently consider the information side of workflow updates. We combine results from both fields to bring about a new model integrating all four basic workflow elements (Activity, Control, Data, and Rule) to minimize update errors on both processes and databases. This model enables an integrated causality analysis tool for workflow administrators, who until now had to rely on manual effort to perform this task when updating workflow logic. The main contribution of the paper is the new integrated method for representing workflow elements, which connects processes with databases and thereby facilitates integrated causality analysis. This integrated method makes it more efficient and effective to analyze update effects due to changes in database objects as well as in activities and control. The promise of the CARD model is observable in empirical testing on Product Data Management workflow systems at Samsung, one of the largest conglomerates in the industrial world. The WIRD method results in a particular meta-model of workflow elements optimized for causality analysis and a repository of workflow metadata that reveals the impact of updates on all workflow elements. The repository provides meta-information for simultaneous analysis of the soundness of processes (control and activities) and the integrity of the database (data and rules pertaining to control and activities). Its integration of data and rules with activities and control recognizes the interrelationship between processes and databases. It uniquely supports update causality analysis, since updates on any part of workflow activities, control, data, and rules at run time require that the analyses on the other parts be performed simultaneously in order to review the cascading effects on the entire workflow system.
Therefore, this model adds to the literature the ability to reduce workflow update errors due to changes in data and data flow rules, as well as changes in other workflow elements. It also has the potential to serve as a reference model for workflow elements, in a way similar to a precedent discussed in Hsu, Cho, Yee, & Rattner (1995). The new model also includes a Workflow Causality Graph and database integrity rules to analyze workflow soundness and integrity; both results are rich in data semantics. The WCG, therefore, is an extension of previous results in its incorporation of data flow for update analysis. The prototype system adopted algorithms from the literature that were adequate for the testing but do not represent the state of the art in the field. This can be improved in production systems, since we have shown how the WIRD could be applied to many analysis algorithms. The algorithms so adopted address data and rules directly. For example, the confluence analysis algorithm allows analyzing concurrency problems between the data accessible by an activity and the data accessible by a rule. The integrity analysis algorithm verifies the consistency between workflow data and rules, as well as the soundness of control flow definitions. Together, the integrated causality analysis developed here is immediately beneficial to updating workflow processes; it would also benefit the design of a new workflow system that changes frequently. Although the initial investment in implementing the CARD model could be non-trivial, the formal representation of rules and data in the WIRD promises to make the investment pay off in the long term when updates are frequent. The present result could benefit from some extensions in the future. First, its analysis algorithms could be enhanced by adapting the best results from the literature in a way that takes advantage of the WIRD.
Second, it could benefit from a methodology for creating and maintaining the WIRD for particular application domains. The prototype used existing workflow databases to derive the necessary meta-information about data and rules to construct the WIRD. This reverse engineering would be unnecessary if the developers had followed a methodology to create a WIRD in the first place. Third, the results could expand into analysis for distributed workflow updates; the following issues would need to be addressed: correctness criteria for distributed workflow updates and concurrency control on distributed workflow specification updates. Finally, we expect the proposed approach to be beneficial to managing workflow for service operations, where business processes tend to be less rigid than in the production systems targeted by this research. In particular, we believe the emerging practice of ASP (Application Service Provider) in e-business presents a vital need for something similar to the CARD model, with sufficient extensions.
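The confluence analysis mentioned in the conclusion concerns concurrency problems between the data accessible by an activity and the data accessible by a rule. A minimal illustration of such an overlap check, assuming read/write sets are known for each element (the names and the conflict criterion here are illustrative, not the paper's algorithm):

```python
def confluence_conflicts(activity_access, rule_access):
    """Report (activity, rule) pairs whose data accesses may interfere.
    Each map entry is name -> (read_set, write_set); a conflict exists
    when one party writes a data item the other reads or writes
    (read/read overlap alone is harmless)."""
    conflicts = []
    for act, (a_reads, a_writes) in activity_access.items():
        for rule, (r_reads, r_writes) in rule_access.items():
            if (a_writes & (r_reads | r_writes)) or (r_writes & a_reads):
                conflicts.append((act, rule))
    return conflicts
```

For instance, an activity that writes `status` conflicts with a rule triggered by reading `status`, while two parties that only read the same item do not; meta-information of this kind is what a WIRD-style repository would supply.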