Statistical Models for Operational Risk Management
Publisher : Elsevier - Science Direct
Journal : Physica A: Statistical Mechanics and its Applications, Volume 338, Issues 1–2, 1 July 2004, Pages 166–172
The Basel Committee on Banking Supervision has released, in the last few years, recommendations for the correct determination of the risks to which a banking organization is subject. This concerns, in particular, operational risks: all those management events that may cause unexpected losses. It is necessary to develop valid statistical models to measure and, consequently, predict such operational risks. In this paper we present the possible approaches, including our own proposal, which is based on Bayesian networks.
The aim of this paper is to provide a brief review of some models for managing operational risk (OR) and measuring the capital requirement, compliant with the recommendations of the Basel Committee on Banking Supervision (Basel II) for any type of bank, especially internationally active banks (see e.g. Ref. ). Different reviews are provided in  and . The rising interest of supervisors and the banking industry in OR in recent years is due to the growth of e-commerce, large-scale mergers and acquisitions, and the use of more highly automated technology, which tests integrated systems and creates a number of situations that increase OR. In "The New Basel Capital Accord" (Basel II), published by the Basel Committee, OR is defined as "the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events", including legal risk but not strategic and reputational risk (see Ref. ). The Accord requires a minimum capital requirement (K) to be held against credit risk, market risk and OR; it also allows different calculation approaches for the regulatory capital, increasing in complexity and decreasing in capital requirements. In particular, the Committee has stated a figure of 12% of the minimum capital requirement for OR, which would produce a capital amount in line with the OR actually faced by large and complex banking organizations.

2. Risk management: statistical and Basel's view

In general, the objective is to estimate a loss distribution and to derive functions of interest from it (such as the Value at Risk, VaR); in particular, losses in market risk are realizations of a continuous stochastic process, while losses in credit risk are realizations of a convolution between a binary process (default or not) and a continuous one.
Differently, losses in OR are realizations of a convolution between a counting process (frequency) and a number of continuous ones (severities). Because of the complexity of the events that generate operational losses and the heterogeneity of their causes, the Committee proposes three approaches, which can generally be clustered into two main strategies: top–down and bottom–up.

• Top–down methods: ORs are measured and covered at a central level, so local business units (e.g. bank branches) are not involved in the measurement and allocation process. The calculation of the capital requirement is performed using variables that are strongly correlated with risk exposure. The Basic Indicator Approach (BIA) proposed by the Committee is an example.

• Bottom–up methods: differently from the preceding methodologies, OR exposures and losses are broken into a series of standardized business units (called business lines) and into groups of OR losses according to the nature of the underlying OR event (called event types); ORs are measured at the level of each business line and then aggregated. Although capital coverage is decided centrally, the contribution of each business line is visible and can be monitored. These methods are more expensive to implement, but they allow much better management control and planning in a particular business line. We can include the Standardized and Advanced Measurement Approaches (AMA) in this class of methods.

As indicated above, Basel II allows three different calculation approaches for the regulatory capital, increasing in complexity and decreasing in capital requirements. (1) The BIA links the capital charge to a single indicator that serves as a proxy for the bank's overall risk exposure. If the Gross Income (GI) is the indicator, the capital charge required for the bank, over a fixed holding period (usually one year), is calculated as K_BIA = α · GI, where α is a fixed percentage set by the Committee (15% in the final Accord).
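The BIA charge can be sketched in a few lines. This is a minimal illustration, not the paper's own code; the three-year averaging over positive Gross Income years follows the final Accord, and the figures are hypothetical.

```python
# Minimal sketch of the Basic Indicator Approach (BIA): K_BIA = alpha * GI,
# where GI is the bank's annual Gross Income and alpha is the fixed
# percentage set by the Committee (15% in the final Accord).

def bia_capital_charge(gross_incomes, alpha=0.15):
    """Average the positive annual Gross Income figures and apply alpha."""
    positive = [gi for gi in gross_incomes if gi > 0]
    if not positive:
        return 0.0
    return alpha * sum(positive) / len(positive)

# Hypothetical example: three years of Gross Income (millions of Euros).
charge = bia_capital_charge([100.0, 120.0, 110.0])
print(charge)  # 0.15 * 110 = 16.5 million Euros
```

The single-indicator nature of the approach is visible here: nothing about the bank's business lines or loss history enters the calculation, only an aggregate proxy for risk exposure.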
Conclusions
Still, the lack of an appropriate historical database makes it difficult to apply statistical inference techniques or to "squeeze" information in a correct manner so as to check the tail of the loss distribution. Bayesian networks offer a solution to banks seeking to combine qualitative and quantitative data and to meet the requirements of the AMA for measuring OR. The use of Bayesian networks for operational risk management has been set forth in Ref. . In fact, the Bayesian statistical approach (see e.g. Ref. ) makes it possible to integrate, via Bayes' theorem, different sources of information: loss data collection, self-assessment, external industry loss data (pooled data) and the opinions of risk managers. This yields a unified body of knowledge that supports the management of OR (i.e., identification, assessment, monitoring and control/mitigation) and, at the same time, determines a minimum capital requirement that is more accurate and more risk-sensitive, through the operational VaR (Op VaR). It thus becomes possible to combine backward-looking historical data with forward-looking expectations and opinions; at the same time, through Bayesian networks we can also consider the correlation between losses of different business lines and risk types and can evaluate the impact of "causal" factors.

A Bayesian network is a set of nodes representing random variables and a set of arrows connecting these nodes in an acyclic manner (for a more precise definition see e.g. Ref.  or ). It is identified by a probability distribution P on a random vector (X1,…,Xn), defined by the pair (G,q), where G is a directed acyclic graph whose nodes correspond to the variables X1,…,Xn. The topology of the graph defines the (probabilistic) dependences between the variables; q is the set of (local) conditional distributions, P(X|ΠX), for each variable X given its parents ΠX.
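As a minimal illustration of this Bayesian integration of expert opinion and loss data (this is a standard conjugate update, not the paper's actual model), a Gamma–Poisson update merges an expert's prior on the daily loss frequency with observed loss counts:

```python
# Illustrative (not from the paper): conjugate Bayesian update of a loss
# frequency. Assume daily loss counts are Poisson(lambda) and the expert's
# opinion is encoded as a Gamma(a, b) prior on lambda; Bayes' theorem then
# gives a Gamma posterior combining both sources of information.

def update_frequency(a_prior, b_prior, loss_counts):
    """Gamma-Poisson conjugate update: posterior is Gamma(a + sum, b + n)."""
    a_post = a_prior + sum(loss_counts)
    b_post = b_prior + len(loss_counts)
    return a_post, b_post

# Expert prior: mean frequency a/b = 2 losses per day, fairly uncertain.
a, b = update_frequency(2.0, 1.0, [1, 0, 3, 2, 1])  # 5 days of loss data
print(a / b)  # posterior mean frequency: 1.5 losses per day
```

The posterior mean sits between the expert's prior guess and the empirical average, which is exactly the backward-looking/forward-looking combination described above.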
For our purpose we exemplify a network characterized by three types of nodes (others can be added), all discrete random variables, for each crossing Firm/Business line/Event type/Process: the losses taken from self-assessment and loss data collection (the random variables are discretized). Each loss variable has the same number of states, corresponding to increasing loss levels, obtained on the basis of the different size or business volume of each firm and using self-assessment data. The effectiveness of internal and external controls (CI and CE) in each process is formulated on the basis of expert opinion, which gives information about the quality of the organization's internal and external control systems and makes it possible to foresee its operational risks. Each such variable has states that describe the existence and effectiveness of the controls.

The network is initialized only with the expert opinions (qualitative learning) coming from the risk and control self-assessment; that is, we learn the network's topology from the expert opinions. Since the process owner does not know the links between nodes or their direction, the network learns them from the opinions. An opinion on severity is counted a number of times equal to the frequency opinion. Structural learning consists in determining the graphical description of the dependencies (the links between nodes) induced by the data or by the expert opinions; in other words, the aim is to obtain the edges (arrows) that link child nodes to their parents whenever there is conditional dependence on them. Once the network is initialized, loss data are inserted daily; each day contributes one data point, corresponding to the observed severities of the losses that occurred. This has so far been implemented with the software Hugin, version 6.3. From the Bayesian network thus built, we learned the conditional probability distribution, or local distribution, P(X|ΠX), for each variable X given its parents ΠX, by quantitative learning.
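A toy version of such a network can be sketched as follows. The node names, states and probabilities are hypothetical, standing in for one control node and one loss node of a single Firm/Business line/Event type/Process crossing:

```python
# Illustrative two-node Bayesian network (all names and numbers
# hypothetical). Parent node C = effectiveness of internal controls;
# child node X = discretized loss level. The dictionary p_x_given_c
# plays the role of q, the set of local conditional distributions.

p_c = {"effective": 0.7, "weak": 0.3}             # prior on controls
p_x_given_c = {                                    # local distributions P(X | C)
    "effective": {"low": 0.8, "medium": 0.15, "high": 0.05},
    "weak":      {"low": 0.4, "medium": 0.35, "high": 0.25},
}

# Marginal distribution of the loss level: P(x) = sum over c of P(c) * P(x | c)
p_x = {
    x: sum(p_c[c] * p_x_given_c[c][x] for c in p_c)
    for x in ["low", "medium", "high"]
}
print(p_x)  # approximately {'low': 0.68, 'medium': 0.21, 'high': 0.11}
```

Inserting evidence (e.g. observing that controls are weak) amounts to replacing p_c with the posterior over C and recomputing the marginal, which is the propagation step a tool such as Hugin performs on the full network.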
Marginal distributions (discrete probability distributions) for each node of the network are calculated from the local distributions, i.e., for each variable X given each combination of states of its parents ΠX. Basel II requires an annual holding period, while the previous results are based on daily data; therefore, the Bayesian network determines a daily Op VaR. In order to calculate the required capital coverage for the correct holding period, it is necessary to simulate total losses from each marginal loss distribution (by summing over days). For instance, the result of the simulation for the node 1/3/7/91 (Firm/Retail Banking/Execution, delivery and process management/Process) gives a VaR between 1,324,612 and 1,360,500 Euros, according to the different simulation sizes. The conditional probability distributions, and hence the posterior calculations, can be updated via Bayes' theorem as internal loss data (or external data, if available) are inserted into the network and propagated. The propagation of evidence produces a new network structure and, consequently, a new Op VaR. The overall Op VaR, determined through each crossing Firm/Business line/Event type/Process, represents the capital charge required to satisfy Basel II (First Pillar).
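The holding-period adjustment described above can be sketched as a simple Monte Carlo simulation. The daily loss distribution and all figures below are hypothetical, not the paper's; the point is only the mechanism of summing daily draws into annual totals and reading the Op VaR off a high quantile:

```python
# Sketch of the daily-to-annual holding-period adjustment (illustrative
# numbers): draw independent daily losses from a discrete marginal loss
# distribution, sum them over a business year, and take the Op VaR as a
# high quantile of the simulated annual totals.
import random

random.seed(42)

# Hypothetical daily marginal loss distribution (loss levels in Euros).
levels = [0.0, 1000.0, 5000.0, 20000.0]
probs = [0.90, 0.07, 0.025, 0.005]

def simulate_annual_loss(days=250):
    """One annual total: sum of independent daily losses."""
    return sum(random.choices(levels, weights=probs, k=days))

def op_var(n_sims=10_000, confidence=0.999, days=250):
    """Op VaR as the empirical quantile of simulated annual totals."""
    totals = sorted(simulate_annual_loss(days) for _ in range(n_sims))
    return totals[min(int(confidence * n_sims), n_sims - 1)]

print(op_var())  # annual Op VaR at the 99.9% confidence level, in Euros
```

As in the paper's example, rerunning with different simulation sizes (n_sims) yields slightly different VaR figures, which is why the result is reported as a range rather than a single number.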