M's Opinion Vol.45 Cutting Edge of Risk Management in Financial Institutions and its Pitfalls -Toward Use of Quantitative Methods Considering their Theoretical Limits-

Mar. 23, 2015

Naoki Matsuyama
Professor
Department of Mathematical Sciences Based on Modeling and Analysis
School of Interdisciplinary Mathematical Sciences, Meiji University
 
Enterprise Risk Management (ERM), Cutting Edge of Risk Management

I have many years’ experience working as an actuary at a life insurer. The main responsibility of an actuary is to help ensure the financial soundness and fair operation of the insurance system by using quantitative methods from probability and statistics to solve various issues related to insurance and pensions. One of these tasks is determining reasonable insurance premiums; another is playing a role in risk management for the insurer, as seen in Asset Liability Management (ALM) activities. ALM is a technique that attempts to control serious financial risks that could lead to bankruptcy by harmonizing the quantitative properties of liabilities with those of assets. Attention turned to the importance of ALM in Japan after the nation saw the successive bankruptcies of seven life insurers over the years 1997 through 2001. I too worked in ALM at a life insurer for a long time, and even after coming to the university my main subject of research has been risk-management techniques in this field of actuarial science.
This field includes Enterprise Risk Management (ERM), a risk-management concept that has attracted considerable attention in recent years. In the past, corporate risk was managed separately by type of risk, based on the view that risk management serves as a brake. In contrast, ERM adopts a comprehensive framework that includes ALM, and as its name implies it is a top-down process conducted by the entire enterprise, from the board of directors to employees. Doing so makes it possible to identify key risks throughout the enterprise and allocate management resources efficiently. In other words, ERM can be described as a method of risk management that aims to maximize corporate value: it can serve as an accelerator in addition to a brake.

The Adoption of Solvency II and ERM
In recent years ERM has been attracting attention from both the public and private sectors. One reason is the new insurance regulation called Solvency II, planned for adoption by the European Union (EU). Solvency refers to an enterprise’s capacity to meet its financial commitments. Insurers accumulate reserves in preparation for the payment of future insurance claims and other obligations. Their solvency margin is the funds available, beyond these reserves, to absorb risks exceeding what is normally anticipated. The solvency margin ratio, calculated by dividing this margin by total risk, is one important indicator of the financial soundness of an insurer. Regulations require this ratio to be at least a certain level, and the higher it is, the more sound the insurer is considered to be.
Solvency II is intended to modernize the existing solvency regulations on EU insurers, which have relied on a simple factor-based approach applied to accounting figures. It adopts a three-pillar structure like that of the international capital requirements that apply to banks (the Basel regulations). While the third pillar calls for disclosure of a wide range of information to the markets, the first and second pillars each have a major distinguishing feature. The first pillar sets quantitative requirements: assets and liabilities are valued at their economic value using methods conforming to market principles, and solvency margin regulations are based on that valuation. This is quite similar to the advanced ALM method known as surplus management, or economic-value-based ALM. The second pillar sets qualitative requirements for advanced risk management and enhanced governance. It includes the Own Risk and Solvency Assessment (ORSA), a self-assessment of risk and solvency extending beyond the quantitative framework of the regulations, and is thus highly compatible with ERM. In response to the movement toward Solvency II, Japan’s Financial Services Agency has also begun asking Japanese insurers for tentative calculations of economic-value-based risk assessments and trial ORSA exercises. It must be noted, however, that ERM is essentially something to be conducted autonomously; when it is employed solely because regulations require it, it probably cannot be described as true ERM.


Problems with the ERM Framework

ERM can be considered the direction in which risk management should progress. However, we must not forget a curious incident from the financial crisis triggered by the collapse of Lehman Brothers in 2008: several global financial institutions that a rating agency had assessed as top-class ERM practitioners fell into deep financial difficulty and had to be bailed out. There must have been some problem with the ERM assessment criteria advocated by the rating agency, yet these criteria have not been reviewed objectively, which seems dangerous. It is possible that ERM, considered cutting edge, involves certain pitfalls. I am focusing on the optimization of Risk-Adjusted Performance Measures (RAPM), which is essential to earning a high evaluation under the ERM assessment criteria. RAPM is an indicator calculated by internally assessing the amount of risk (risk capital) and dividing expected return by that risk capital. As an indicator that combines into one number the brake (denominator) and accelerator (numerator) functions claimed for ERM, the optimization of RAPM would appear ideal. However, I believe this concept may involve various problems. Traditionally, it has been considered standard practice to base decision-making on a two-dimensional, two-parameter view of risk and return. Yet given how far even that view diverges from the theoretical ideal of maximizing expected utility, it should be used with care. RAPM goes further: it is a one-dimensional indicator, so viewed from the perspective of the original two-dimensional vectors it clearly discards some information, and it should come as no surprise that optimizing it can cause problems.
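A hypothetical sketch of the information loss just described (all figures invented for this illustration): RAPM assigns the same score to positions of very different scale, which a decision maker who maximizes expected utility would generally distinguish.

```python
# Hypothetical figures: two positions with identical RAPM but very different scale.
positions = {
    "A": {"expected_return": 10.0, "risk_capital": 100.0},
    "B": {"expected_return": 1.0,  "risk_capital": 10.0},
}

for name, pos in positions.items():
    rapm = pos["expected_return"] / pos["risk_capital"]
    print(name, rapm)   # both print 0.1

# RAPM collapses the two-dimensional (risk, return) pair into a single ratio,
# so it cannot distinguish A from B, although an expected-utility maximizer
# with finite capital generally would.
```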

The Limits of Quantitative Methods and Risk Management
The risk measures used to calculate the denominator of RAPM also involve potential problems. It is well known that Value at Risk (VaR), the most widespread risk measure in risk management worldwide, which indicates the maximum loss at a specified confidence level, has the drawback of not satisfying subadditivity (it can fail to reflect the diversification effect of combining risks). Expected shortfall (ES) has appeared as one way of overcoming this drawback of VaR, but it too has imperfections relating to time consistency, which matters when measuring risk over multiple periods. In addition, general risk measures assume positive homogeneity, under which the amount of risk is proportional to the volume of investment; but because market capacity is limited, when the volume of investment is extremely large, risk can grow far faster than the investment itself. LTCM, the well-known hedge fund whose founding partners included Nobel laureates in economics, collapsed partly because of this effect. Going further still, there are fundamental questions about whether the stochastic models on which measurements of expected return and risk are based are reliable to begin with. Unlike physical phenomena, which are reproducible, financial phenomena are essentially irreproducible. In light of such facts, it is clear that, at the very least, we should be cautious about a ritualistic approach to optimization in financial risk management. Still, the use of quantitative methods in risk management remains very useful in the sense of expanding the frontiers of cognition. The point is that quantitative methods need to be used with an understanding of their theoretical limits. Rather than simply proposing new models, I believe one of the social responsibilities of academia is to make those theoretical limits clear as well.
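The failure of subadditivity mentioned above can be seen in a standard textbook counterexample (not drawn from this article): two independent loans, each of which loses 100 with probability 0.04 and nothing otherwise, measured at the 95% confidence level.

```python
# Textbook counterexample showing that VaR violates subadditivity.
p = 0.04       # default probability of each loan
alpha = 0.95   # confidence level

# Single loan: P(loss > 0) = 0.04 <= 1 - alpha = 0.05,
# so the 95% VaR of each loan on its own is 0.
var_single = 0.0 if p <= 1 - alpha else 100.0

# Portfolio of both loans: P(loss >= 100) = 1 - (1 - p)**2 = 0.0784 > 0.05,
# so the 95% VaR of the combined portfolio is 100.
prob_some_default = 1 - (1 - p) ** 2
var_portfolio = 100.0 if prob_some_default > 1 - alpha else 0.0

# Subadditivity would require VaR(X + Y) <= VaR(X) + VaR(Y); here it fails:
print(var_single, var_portfolio)   # 0.0 100.0
assert var_portfolio > var_single + var_single
```

Diversifying across the two loans thus *increases* measured VaR from 0 to 100, the opposite of what a coherent measure of diversification should show; ES, by contrast, satisfies subadditivity, at the cost of the multi-period imperfections noted above.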

Profile

Naoki Matsuyama
Professor, Department of Mathematical Sciences Based on Modeling and Analysis, School of Interdisciplinary Mathematical Sciences, Meiji University

Research Fields:
Mathematics (probability, statistics), Actuarial Science

Research Topics:
Economic value based ALM, ERM

Degree:
Ph.D. (science)

Major Books and Papers:
Financial Enterprise Risk Management (chief translator, Asakura Publishing, 2014)
Selected Papers on Probability and Statistics (coauthor, AMS, 2009)

And others


The information contained herein is current as of September 2014.
