Risk Measurement: From Quantitative Measures to Management Decisions [1st ed.] 978-3-030-02679-0, 978-3-030-02680-6

This book combines theory and practice to analyze risk measurement from different points of view.


Table of contents :
Front Matter ....Pages i-xiv
Introduction (Dominique Guégan, Bertrand K. Hassani)....Pages 1-15
Financial Institutions: A Regulation Review Through the Risk Measurement Prism (Dominique Guégan, Bertrand K. Hassani)....Pages 17-36
The Traditional Risk Measures (Dominique Guégan, Bertrand K. Hassani)....Pages 37-67
Univariate and Multivariate Distributions (Dominique Guégan, Bertrand K. Hassani)....Pages 69-114
Extensions for Risk Measures: Univariate and Multivariate Approaches (Dominique Guégan, Bertrand K. Hassani)....Pages 115-142
Linear Dynamics (Dominique Guégan, Bertrand K. Hassani)....Pages 143-166
Risks and Non-Linear Dynamics (Dominique Guégan, Bertrand K. Hassani)....Pages 167-215


Dominique Guégan · Bertrand K. Hassani

Risk Measurement From Quantitative Measures to Management Decisions


Dominique Guégan
Université Paris 1 Panthéon-Sorbonne, LabEx ReFi and IPAG, Paris, France
University Ca’ Foscari, Venezia, Italy

Bertrand K. Hassani
Department of Computer Science, University College London, London, UK
Department of Finance, Université Paris 1 Panthéon-Sorbonne, Paris, France
Department of Financial Regulation, LabEx ReFi (ESCP - ENA - Paris 1), Paris, France

ISBN 978-3-030-02679-0
ISBN 978-3-030-02680-6 (eBook)
https://doi.org/10.1007/978-3-030-02680-6
Library of Congress Control Number: 2018959149

© Springer Nature Switzerland AG 2019

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland

A modest contribution to the world of risk measurement, opening the way for possible alternatives in the field of financial regulation and other areas. I hope my former students will find in this book a source of reflection. Dominique Guégan February 2019

To my sunshines, Lila, Liam and Jihane. To my parents, my brother, my family and my friends. I am grateful for your continuous support and encouragement. With Love, Bertrand Kian Hassani February 2019

Preface

The purpose of this book is to discuss how risk measures can be leveraged for risk management purposes and, in the meantime, address the regulatory requirements concerning the evaluation of the risk associated with any financial institution. Even if regulation is seen as a necessity for financial institutions, financial institutions should be interested in risk control as part of their core business and should not only address the issue because of regulatory requirements. It is important to remember at this stage that financial institutions generate profit through a risk transformation mechanism, and as such risk measurement is key. This book addresses the different risk measures through this angle. The objective is to provide an adequate framework for risk managers to reach the objectives set by regulatory guidelines while controlling internal risks in the best possible way, in particular when instability is present in the markets, or when specific events generate turmoil that could engender large losses, illiquidity or bankruptcy.

In order to meet that goal, the process is broken down into several compulsory elements required to obtain an appropriate value associated with all the risks faced by financial institutions, i.e. a value deemed representative of the exposure. The measures associated with all these risks depend on various mathematical components that risk managers need to be aware of: the importance and the influence of these tools on the results are illustrated in this book. At the same time, we also demonstrate that the computation of the risks needs to rely on the reality of the risks to be measured, and not on a priori assumptions concerning the choice of the mathematical tools used to obtain this measure. Our points are illustrated with examples allowing us to understand the links between the concepts discussed within the book and the reality of risk management.

Indeed, risk measurement raises a number of questions which we address point by point in the following chapters. After introducing a discussion on the Basel guidelines around the question of risk measures, we recall the classic models used in finance (variance, CAPM, APT, etc.) for measuring risks and discuss their limits. Then, we address the characterisation of a risk factor using a unimodal distribution and develop the risk measures associated with this approach (classical VaR, TVaR and ES). We extend the discussion to the risk measures computed using multi-modal distributions (distortion, combination, etc.). We then extend the discussion to multivariate approaches, introducing the Kendall risk measure among others. Finally, new routes are explored concerning spectral measures, the spectrum of risk measures and spatial risk measures. We conclude this book by showing how risk measures, through the stress approach, can be used as alert procedures in case of "fire" in banks or insurance companies. But before going into the technical aspects of the risk measures, we focus on the evolution of the regulation since 1968.

Paris, France    Dominique Guégan
Paris, France    Bertrand K. Hassani

Contents

1 Introduction  1
1.1 1968–1988: Premises  1
1.1.1 The Triggering Elements  1
1.2 Basel I  4
1.3 Basel II  4
1.3.1 Basel II.5  6
1.4 Basel III  7
1.5 Solvency II Directives  11
1.6 Successive Criticisms  12
1.6.1 Basel I  12
1.6.2 Basel II  13
1.6.3 Basel III  13
1.6.4 Solvency II  14
References  15

2 Financial Institutions: A Regulation Review Through the Risk Measurement Prism  17
2.1 Credit Risk  17
2.1.1 Standardised Approach  18
2.1.2 Internal Ratings-Based Approach  19
2.2 Credit Value Adjustment  21
2.3 Operational Risk  22
2.3.1 Basic Indicator Approach  23
2.3.2 Standardised Approach  23
2.3.3 The Advanced Measurement Approach (AMA)  24
2.3.4 Standardised Measurement Approach or New Standardized Approach  25
2.4 Market Risk  27
2.4.1 The Fundamental Review of the Trading Book  27
2.4.2 Standardised Approach  28
2.4.3 Internal Models Approach  29
2.5 IFRS9 as a Risk Regulation  31
2.6 The Stress-Testing Framework  33
References  35

3 The Traditional Risk Measures  37
3.1 Measures of Dispersion  37
3.1.1 Distance Between Representative Values  38
3.1.2 Deviation from a Central Value  39
3.1.3 Mean Absolute Difference  45
3.2 Risk Measurement: Portfolio Theory Philosophy  46
3.2.1 Modern Portfolio Theory  46
3.2.2 Efficient Frontier with No Risk-Free Asset: Risk Management in Essence  48
3.2.3 Risk-Free Asset and the Capital Allocation Line  49
3.2.4 Risk Management Through MPT: Systematic Risk and Specific Risk  49
3.2.5 Capital Asset Pricing Model  50
3.2.6 Arbitrage Pricing Theory  51
3.2.7 Downside Risk Measures  53
3.2.8 Downside Deviation  55
3.2.9 Sortino Ratio  56
3.3 Quantile Risk Measure and Affiliated  56
3.3.1 VaR  56
3.3.2 Expected Shortfall  58
3.3.3 Tail Value at Risk  59
3.3.4 CoVaR  60
3.3.5 Entropic Risk Measures: ρerm and EVaR  61
3.4 Distortion Risk Measure  64
References  65

4 Univariate and Multivariate Distributions  69
4.1 Univariate Distributions  69
4.1.1 Some Recalls on the Probabilistic Characteristic of a Risk Factor  69
4.1.2 Examples of Parametric Distributions  70
4.1.3 Non-parametric Modelling for a Distribution  77
4.1.4 Distorted Distributions  80
4.2 Multivariate Approach  92
4.2.1 Definition of a Copula  93
4.2.2 Properties  94
4.2.3 Examples of Copulas  98
4.2.4 Tail Dependence Concept  103
References  112

5 Extensions for Risk Measures: Univariate and Multivariate Approaches  115
5.1 New Risk Measure Computed for One Risk Factor  115
5.1.1 Introduction  115
5.1.2 The Spectral Risk Measure  115
5.1.3 The Spectrum as a Risk Measure  117
5.1.4 A Spatial Risk Measure  118
5.1.5 Risk Measure Associated with a Multimodal Distribution  123
5.1.6 Some Remarks  128
5.2 VaR in High Dimension  131
5.2.1 Archimedean and Extreme Value Copulas in High Dimension  131
5.2.2 Lower- and Upper-Orthant VaR, and Multivariate Quantiles  135
References  142

6 Linear Dynamics  143
6.1 Introduction  143
6.2 Stochastic Processes  145
6.2.1 Definition of a Stochastic Process  145
6.2.2 Stationarity Framework  146
6.2.3 Examples of Particular Stochastic Processes: The Related Random Walk Models  147
6.3 Models Used in the Classical Portfolio Theory  150
6.3.1 Introduction  150
6.3.2 Regression Model  152
6.4 ARMA Processes  155
6.4.1 Stationary Solution for ARMA(p, q) Model  156
6.4.2 Second Order Properties of an ARMA(p, q) Process  157
6.4.3 Estimation  161
6.4.4 Model's Selection  165
6.4.5 Forecasting  165
References  166

7 Risks and Non-Linear Dynamics  167
7.1 GARCH Modelling  168
7.1.1 Introduction  168
7.1.2 Description of Some Heteroskedastic Processes  169
7.1.3 Properties of Gaussian GARCH Processes  173
7.1.4 GARCH Models with Non-Gaussian Distributions  180
7.1.5 Inference for GARCH Processes  181
7.2 Markov Switching Modelling  183
7.2.1 Model Formalisation  183
7.2.2 Some Properties  185
7.2.3 Simulations Studies with Gaussian Noises  189
7.2.4 Simulations Studies with Non-Gaussian Noises  198
7.2.5 Estimation Procedures  200
7.3 Dynamical VaR and Expected Shortfall Measures: Illustration  204
7.3.1 VaR and ES Measures Computed Using Parametric Models  205
7.3.2 The RiskMetrics Approach  206
7.3.3 The GARCH Approach  206
7.3.4 Markov-Switching Approach  207
7.3.5 VaR and ES Computed Using the Empirical Histogram  208
7.3.6 VaR and ES Computed Using Copulas  208
7.4 References on Dynamical Modelling for Financial Applications  209
References  211

Chapter 1

Introduction

In what follows, we discuss the origins of modern risk management in financial institutions before introducing the methodological aspects of risk measurement. Indeed, in the financial industry, while risk measurement is naturally associated with risk management, risk management cannot be discussed without addressing the regulatory impact on the latter. Therefore, in the following we present what in our opinion triggered the first Basel Accord and how the accords evolved into their current version.

1.1 1968–1988: Premises In this section we present the situation prior to the first piece of financial regulation, as well as the specific incident that led to the Basel I accord.

1.1.1 The Triggering Elements In the following, we will recall the economic situation prior to the creation of the Basel Committee.

1.1.1.1 A US Default At the end of the 1960s, a negative balance of payments, a large and growing public debt incurred by the Vietnam War as well as the Great Society programs, and monetary inflation by the Federal Reserve led to an over-valuation of the dollar. This situation drained US gold reserves, resulting in the collapse of the London Gold Pool in March 1968. By 1970, the USA had seen its gold coverage decrease by more than 30%. According to neoclassical economists, this was the tipping point at which dollar holders lost faith in the ability of the USA to cut its budget and trade deficits. In 1971 an increasing amount of dollars was being printed in Washington and used overseas to pay for government expenditure on both military and social programs. In the first 6 months of 1971, $22 billion worth of assets fled the USA. In response, on 15 August 1971, Nixon issued Executive Order 11615 pursuant to the Economic Stabilisation Act of 1970, unilaterally imposing 90-day wage and price controls and a 10% import surcharge, and, most importantly, "closing the gold window", making the dollar inconvertible to gold except on the open market. Unusually, this decision was made without consulting the members of the international monetary system or even the State Department, and it was soon referred to as the Nixon Shock (Eichengreen 2011).

The August shock was followed by efforts under US leadership to reform the international monetary system. Throughout autumn 1971, a series of multilateral and bilateral negotiations between the Group of Ten countries (i.e. the G-10, which was actually a gathering of representatives of eleven countries: Belgium, Canada, France, West Germany, Italy, Japan, the Netherlands, Sweden, Switzerland, the United Kingdom, and the USA) took place in order to redesign the exchange rate regime. Meeting in December 1971 in Washington D.C., the G-10 signed the Smithsonian Agreement, stating that the USA would peg the dollar at $38/ounce with 2.25% trading bands, while the other countries would appreciate their currency values with respect to the dollar. The group also planned to balance the world financial system using special drawing rights alone. Unfortunately, the agreement failed to encourage discipline by the Federal Reserve or the US government. The Federal Reserve was concerned about an increase in the domestic unemployment rate due to the devaluation of the dollar.1 Therefore, undermining the efforts of the Smithsonian Agreement, the Federal Reserve lowered interest rates in pursuit of the objective of full national employment established prior to the agreement. Though member countries expected dollars to flow back to the USA, the reduced interest rates within the USA caused dollars to continue flowing out of the USA and into foreign central banks. The inflow of dollars into foreign banks continued the monetisation process of the dollar overseas, defeating the aims of the Smithsonian Agreement. As a result, the dollar price on the free gold market continued to put pressure on its official rate; soon after a 10% devaluation was announced in February 1973, Japan and the European Economic Community countries decided to let their currencies float. This was the beginning of the collapse of the Bretton Woods System.

1 Note that, contrary to the European Central Bank, the American Federal Reserve does not only have to deal with inflation but also with the unemployment rate.


The end of Bretton Woods was formally ratified by the Jamaica Accords in 1976. By the early 1980s, all industrialised nations were using floating currencies, and the collapse of the Bretton Woods System thus led to the creation of a fully integrated, interdependent system (Mastanduno 2009).

1.1.1.2 Herstatt Bank As mentioned in the previous section, the collapse of the Bretton Woods System led to the implementation of a floating exchange rate system. This new system was at the root of the incident leading to the creation of the Basel Committee on Banking Supervision (Schenk 2014). Indeed, on 26 June 1974, German regulators decided to force the liquidation of the troubled Herstatt Bank. Unfortunately, that day, a number of banks had released payments of Deutsche Marks to Herstatt in Frankfurt in exchange for US Dollars that were to be delivered in New York. The bank was closed at 4:30 p.m. German time, which was equivalent to 10:30 a.m. in New York. Because of the time zone difference, Herstatt ceased operations between the times of the respective payments, and as a consequence the counterparty banks did not receive their USD payments.2 Responding to the cross-jurisdictional implications of the Herstatt issue, the G-10 countries introduced above, joined by Luxembourg and Spain, formed a standing committee under the auspices of the Bank for International Settlements located in Basel, nowadays referred to as the Basel Committee on Banking Supervision. This committee comprises representatives from the central banks and regulatory authorities of the aforementioned countries. The failure of the Herstatt Bank (Mourlon-Druol 2015) was the triggering factor that led to the worldwide implementation of real-time gross settlement systems, designed to ensure that payments between banks are executed in real time and considered final; this work was coordinated by the Basel Committee on Banking Supervision. The continuous linked settlement platform was released 30 years later, in 2002. This payment-versus-payment process enables member banks to trade foreign currencies without assuming the settlement risk associated with the process, whereby a counterparty could fail before delivering its leg of the transaction. Once the Basel Committee had been created to tackle such issues, it was only a matter of time before a first set of rules was released.

2 This type of settlement risk, in which one party in a foreign exchange trade pays out the currency it sold but does not receive the currency it bought, is sometimes called Herstatt risk.


1.2 Basel I In 1988, the Basel Committee on Banking Supervision (BCBS) in Basel, Switzerland, published a set of minimum capital requirements for banks, referred to as the 1988 Basel Accord or Basel I (BCBS 1988), which was enforced by law in the G-10 member countries in 1992. The accord primarily focused on credit risk and the appropriate risk-weighting of assets. Banks' assets were grouped into five categories according to credit risk, respectively carrying risk weights of 0% (for example, cash, bullion, home country debt like Treasuries), 20% (securitisations such as mortgage-backed securities (MBS) with the highest AAA rating), 50% (municipal revenue bonds, residential mortgages), 100% (for example, most corporate debt), with some assets given no rating. Banks with an international presence were required to hold capital equal to 8% of their risk-weighted assets (RWA), the so-called Cooke ratio. The elements that banks have to take into account are: (1) the tier 1 capital ratio = tier 1 capital / all RWA; (2) the total capital ratio = (tier 1 + tier 2 + tier 3 capital) / all RWA; (3) the leverage ratio = total capital / average total assets. It is necessary here to define tier 1, 2, and 3 capital. Tier 1 capital represents the core capital, i.e., common stock and disclosed reserves (or retained earnings), and non-redeemable non-cumulative preferred stock. Tier 2 capital represents the "supplementary capital", i.e., undisclosed reserves, revaluation reserves, general loan-loss reserves, hybrid capital instruments, and subordinated debt. Tier 3 capital mainly consists of short-term subordinated debt. Furthermore, banks were also required to report off-balance sheet items such as letters of credit, unused commitments, and derivatives, all of which were supposed to factor into the risk-weighted assets. The reports were then submitted to the pertaining regulatory body for supervisory purposes. Since 1988 this framework has been progressively introduced in the G-10 countries, comprising 13 countries as of 2013: Belgium, Canada, France, Germany, Italy, Japan, Luxembourg, Netherlands, Spain, Sweden, Switzerland, United Kingdom, and the USA. Eventually, most countries adopted the principles prescribed under Basel I, though the level of enforcement of these principles varied from one implementing country to another. During the period of Basel I enforcement, various market and operational risk incidents led stakeholders to question the relevance of the accord. The resulting reflection led to the release of a subsequent accord, usually referred to as Basel II.
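To make the arithmetic of the Cooke ratio concrete, the following sketch computes risk-weighted assets and the three Basel I ratios for a toy balance sheet. It is illustrative only: the exposure amounts, the helper names and the 8% check are our own assumptions, not taken from the book or the accord text.

```python
# Illustrative sketch of the Basel I (Cooke ratio) arithmetic.
# Exposure amounts and asset names are hypothetical.

BASEL_I_RISK_WEIGHTS = {
    "cash": 0.00,                   # cash, bullion, home-country debt
    "mbs_aaa": 0.20,                # highest-rated securitisations
    "residential_mortgage": 0.50,   # residential mortgages, municipal revenue bonds
    "corporate_loan": 1.00,         # most corporate debt
}

def risk_weighted_assets(exposures):
    """Sum of exposure * risk weight over the asset classes defined above."""
    return sum(amount * BASEL_I_RISK_WEIGHTS[asset] for asset, amount in exposures.items())

def basel_i_ratios(exposures, tier1, tier2, tier3, total_assets):
    rwa = risk_weighted_assets(exposures)
    total_capital = tier1 + tier2 + tier3
    return {
        "tier1_ratio": tier1 / rwa,
        "total_capital_ratio": total_capital / rwa,
        "leverage_ratio": total_capital / total_assets,   # total capital / average total assets
        "meets_cooke_8pct": total_capital / rwa >= 0.08,
    }

if __name__ == "__main__":
    exposures = {"cash": 100.0, "mbs_aaa": 200.0,
                 "residential_mortgage": 300.0, "corporate_loan": 400.0}
    # RWA = 0 + 40 + 150 + 400 = 590; total capital = 50, i.e. about 8.5% of RWA.
    print(basel_i_ratios(exposures, tier1=30.0, tier2=15.0, tier3=5.0, total_assets=1000.0))
```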

1.3 Basel II Basel II is the second of the Basel Accords, which are recommendations on financial regulation issued by the Basel Committee on Banking Supervision. Initially released in June 2004 (Decamps et al. 2004), Basel II intended to amend the international standards that controlled how much capital banks were required to hold to survive the occurrence of financial and operational risks. These rules were supposed to ensure that the amount of capital banks needed to hold was consistent with the exposures faced by the regulated financial institutions, in order to safeguard their solvency and economic stability. Therefore, Basel II established risk management and capital requirements to ensure that banks had appropriate risk controls and an adequate amount of capital for the risks to which they expose themselves through their lending, investment, and trading activities. The renewed accords were also supposed to bring some consistency between the various pieces of regulation in order to limit competitive inequality and regulatory arbitrage among and between internationally active banks. Basel II was supposed to be implemented in the years prior to 2008, though the large number of measures required to be compliant with the regulation delayed the roll-out, and consequently the subprime crisis hit the banks before Basel II could become fully effective. The objectives of the Basel II accord are as follows:

• Ensuring that capital allocation is more risk sensitive;
• Enhancing disclosure requirements which would allow market participants to assess the capital adequacy of an institution;
• Ensuring that credit risk, operational risk, and market risk are quantified based on data and formal techniques;
• Attempting to align economic and regulatory capital more closely to reduce the scope for regulatory arbitrage.3

Basel II uses a "three pillars" approach: (1) minimum capital requirements (addressing risk), (2) supervisory review, and (3) market discipline. This contrasts with the Basel I accord, which only partially dealt with these pillars: in Basel I only credit risk was dealt with, in a simple manner, while market risk was an afterthought and operational risk was not addressed at all.

The first pillar addressed the level of regulatory capital calculated for the three major components of risk that banks were facing, namely credit, operational, and market risk. Other risks were not considered properly quantifiable at this stage. The credit risk component could be calculated according to rules of varying degrees of sophistication: the standardised approach, the Foundation IRB and Advanced IRB approaches, and the General IB2 Restriction, where IRB stands for "Internal Ratings-Based approach". For operational risk, there were also three different approaches, namely the basic indicator approach (BIA), the standardised approach (TSA), and the advanced measurement approach (AMA). For market risk the preferred approach was the VaR (value at risk), a notion we will discuss in detail in this book.

3 Despite the measures taken, in many areas regulatory capital requirements were diverging from the economic capital.


As the Basel II recommendations were phased in by the banking industry, moving from standardised requirements to more refined and specific requirements developed for each risk category by each individual bank, banks were expected to gradually move from simple approaches to internal models. Banks were incentivised to develop internal risk measurement systems that could lead to lower risk capital requirements.

The second pillar was a regulatory response to the first pillar, giving regulators more appropriate "tools" to supervise the banks. It was also supposed to provide a framework for dealing with systemic, pension, concentration, strategic, reputational, liquidity, and legal risks, which the accord combined under the title of residual risk. The Internal Capital Adequacy Assessment Process (ICAAP) is a result of Pillar 2 of the Basel II accords.

Market discipline was addressed in the third pillar. This pillar was supposed to complement both the minimum capital requirements and the supervisory review process by developing a set of disclosure requirements meant to assure market participants that they were all following the same rules, disclosing details on capital, risk exposures, risk assessment processes, and the capital adequacy of the institution, consistently with how the senior management assesses and manages the risks of the institution. Indeed, market discipline supplements regulation, as sharing information facilitates the assessment of financial institutions by others, such as investors, analysts, customers, competitors, or rating agencies, thereby supporting good corporate governance. These disclosures were required to be made at least twice a year, except for qualitative disclosures providing a summary of the general risk management objectives and policies, which could be made annually. Financial institutions were also required to create a formal policy regarding the disclosed elements in terms of exposures, the controls around them, as well as the validation procedures and the frequency of these disclosures.4

1.3.1 Basel II.5 Basel II has been regularly updated and enhanced, though the global framework remained in place. These updates represent what is sometimes referred to as Basel II.5 (Pepe 2013). Basel II.5 was essentially a revision of the Basel II norms, as the existing norms often failed to correctly address the market risks that banks took on their trading books. Basel II.5's main aim was to strengthen the capital base, and thereby the banks' ability to withstand risk, by increasing the banks' capital requirements.

• An additional charge, the incremental risk charge (IRC), was introduced to estimate and capture default and credit migration risk, i.e. the risk of losses caused by a deterioration in a counterparty's credit rating.

4 The disclosures under Pillar 3 were usually applied to the top consolidated level of the target financial group.


• An additional charge for the comprehensive risk measure was introduced, in order to correctly measure how one risk relates to other risks. Often, a rise in one risk also leads to a rise in another risk, although the effects may show later. In finance, this is also known as correlation risk.
• Basel II.5 introduced stressed value at risk (SVaR) as an additional requirement when calculating capital requirements. The idea behind SVaR is that under stressed conditions banks may require more capital, and such capital requirements are not fully captured by normal value-at-risk calculations. To include capital requirements under stressed conditions, the stressed value at risk was therefore added.
• Basel II.5 also introduced standardised charges for securitisation and re-securitisation positions. Securitisation and re-securitisation were problems for all regulators, as the popularity of these instruments was leading to loans being incorrectly classified.

Basel II.5 tried to deal with market risk in an investment banking framework. The banking industry was evolving at a rapid pace, so there was a need for a completely new look at regulations; for this, Basel III was introduced, as we shall see in the next part of this chapter. For public consultation, a series of proposals to enhance the Basel II framework was announced by the Basel Committee. It released a consultative package that included the revisions to the Basel II market risk framework, the guidelines for computing capital for incremental risk in the trading book, and the proposed enhancements to the Basel II framework (BCBS 2016). A final package of measures to enhance the three pillars of the Basel II framework and to strengthen the 1996 rules governing trading book capital was issued by the newly expanded Basel Committee. These measures include the enhancements to the Basel II framework, the revisions to the Basel II market-risk framework, and the guidelines for computing capital for incremental risk in the trading book (BCBS 2004). One of the most difficult aspects of implementing an international agreement is the need to accommodate different cultures, varying structural models, the complexities of public policy, and existing regulation. Banks' senior management determines corporate strategy, as well as the country in which to base a particular type of business, based in part on how Basel II is ultimately interpreted by various countries' legislatures and regulators.
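To illustrate the distinction between VaR and stressed VaR discussed above, the following sketch computes a historical 99% VaR on a recent window of returns and on a window meant to represent a stressed period. The data are simulated and the function and window choices are our own assumptions, not a regulatory prescription.

```python
# Illustrative comparison of historical VaR and stressed VaR (SVaR).
# Returns are simulated; in practice the stressed window would be a
# 12-month period of significant market stress (e.g. 2008).
import numpy as np

def historical_var(returns, alpha=0.99):
    """Historical VaR at level alpha: the loss exceeded with probability 1 - alpha."""
    return -np.quantile(returns, 1.0 - alpha)

rng = np.random.default_rng(42)
recent_returns = rng.normal(loc=0.0005, scale=0.01, size=250)    # calm year
stressed_returns = rng.normal(loc=-0.001, scale=0.03, size=250)  # turbulent year

var_99 = historical_var(recent_returns)
svar_99 = historical_var(stressed_returns)
print(f"99% VaR (recent window):    {var_99:.4f}")
print(f"99% SVaR (stressed window): {svar_99:.4f}")
# Under Basel II.5, the market-risk charge adds a multiple of SVaR to a multiple of VaR,
# so the stressed component typically dominates the capital requirement.
```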

1.4 Basel III Basel III (BCBS 2017) is a global, voluntary regulatory framework on bank capital adequacy, stress testing, and market liquidity risk. It was issued in 2010 and scheduled to be introduced from 2013; however, the implementation date has been postponed to 31 March 2019.5 The third instalment of the Basel Accords was developed in response to the deficiencies in financial regulation revealed by the financial crisis of 2007–2008. Basel III intends to strengthen the bank capital requirements originally set in previous versions of the accords, to increase bank liquidity by introducing requirements on liquid asset holdings and funding stability, and to reduce bank leverage.

The original Basel III rule required banks to hold common equity equal to 4.5% of risk-weighted assets (RWAs), up from 2% in Basel II. Since 2015, a minimum Common Equity Tier 1 (CET1) ratio of 4.5% must be maintained at all times by the bank. This ratio is calculated as follows:

CET1 / RWAs ≥ 4.5%    (1.4.1)

The minimum Tier 1 capital increases from 4% in Basel II to 6% since 2015, over RWAs. This 6% is composed of 4.5% of CET1, to which an extra 1.5% of additional Tier 1 capital is added. Furthermore, Basel III introduced two additional capital buffers:

• A mandatory "capital conservation buffer", equivalent to 2.5% of risk-weighted assets. Considering the 4.5% CET1 capital ratio required, banks have to hold a total CET1 capital ratio of 7% from 2019 onwards.
• A "discretionary counter-cyclical buffer", allowing national regulators to require up to an additional 2.5% of capital during periods of high credit growth. The level of this buffer ranges between 0 and 2.5% of RWA and must be met by CET1 capital.

Besides requiring larger capital charges, Basel III introduced a minimum "leverage ratio", obtained by dividing Tier 1 capital by the bank's average total consolidated assets (the sum of the exposures of all assets and off-balance sheet items) (BCBS 2014). Banks are expected to maintain a leverage ratio in excess of 3% under Basel III.6

Tier 1 capital / Total exposure ≥ 3%    (1.4.2)

5 The implementation date has actually been postponed multiple times over the past years.
6 In July 2013, the US Federal Reserve announced that the minimum Basel III leverage ratio would be 6% for eight systemically important financial institution (SIFI) banks and 5% for their insured bank holding companies.

As mentioned previously, Basel III also introduced two liquidity ratios. The first is the "Liquidity Coverage Ratio" (LCR), which requires banks to hold sufficient high-quality liquid assets to cover their total net cash outflows over 30 days. This ratio can be formalised as follows:

LCR = High-quality liquid assets / Total net liquidity outflows over 30 days ≥ 100%    (1.4.3)
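A minimal sketch, under illustrative balance-sheet figures of our own, of how the Basel III thresholds quoted above could be checked together: the 4.5% CET1 minimum plus conservation and counter-cyclical buffers, the 6% Tier 1 minimum, the 3% leverage ratio of (1.4.2), and the 100% LCR of (1.4.3).

```python
# Illustrative check of the Basel III ratios (1.4.1)-(1.4.3).
# All balance-sheet figures below are hypothetical.

def basel_iii_checks(cet1, additional_tier1, rwa, total_exposure,
                     hqla, net_outflows_30d, countercyclical_buffer=0.0):
    tier1 = cet1 + additional_tier1
    # Required CET1 = 4.5% minimum + 2.5% conservation buffer + 0-2.5% counter-cyclical buffer
    required_cet1 = 0.045 + 0.025 + countercyclical_buffer
    return {
        "cet1_ok": cet1 / rwa >= required_cet1,          # (1.4.1) plus buffers
        "tier1_ok": tier1 / rwa >= 0.06,                 # minimum Tier 1 ratio
        "leverage_ok": tier1 / total_exposure >= 0.03,   # (1.4.2)
        "lcr_ok": hqla / net_outflows_30d >= 1.0,        # (1.4.3)
    }

print(basel_iii_checks(cet1=80.0, additional_tier1=20.0, rwa=1000.0,
                       total_exposure=2500.0, hqla=120.0, net_outflows_30d=100.0,
                       countercyclical_buffer=0.015))
# CET1 ratio 8.0% < 8.5% required with a 1.5% counter-cyclical buffer, so cet1_ok is False;
# Tier 1 ratio 10% >= 6%, leverage 4% >= 3%, LCR 120% >= 100%, so the other checks pass.
```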

The second is the net stable funding ratio, which requires the available amount of stable funding to exceed the required amount of stable funding over a 1-year period of extended stress. As of September 2010, the proposed Basel III norms called for the following ratios: 7–9.5% (4.5% + 2.5% conservation buffer + 0–2.5% counter-cyclical buffer) for common equity, 8.5–11% for Tier 1 capital, and 10.5–13% for total capital. The Basel III proposal can be summarised as follows:

• First, the quality, consistency, and transparency of the capital base will be raised.
  – Tier 1 capital: the predominant form of Tier 1 capital must be common shares and retained earnings.
  – Tier 2 capital: supplementary capital; however, the instruments will be harmonised.
  – Tier 3 capital will be eliminated.
• Second, the risk coverage of the capital framework will be strengthened.
  – Promote more integrated management of market and counterparty credit risk.
  – Add the credit valuation adjustment, i.e. the risk due to deterioration in a counterparty's credit rating.
  – Strengthen the capital requirements for counterparty credit exposures arising from banks' derivatives, repo and securities financing transactions.
  – Raise the capital buffers backing these exposures.
  – Reduce procyclicality.
  – Provide additional incentives to move OTC derivative contracts to qualifying central counterparties (probably clearing houses). Currently, the BCBS has stated that derivatives cleared with a QCCP will be risk-weighted at 2%.
  – Provide incentives to strengthen the risk management of counterparty credit exposures.
  – Raise counterparty credit risk management standards by including wrong-way risk.
• Third, a leverage ratio will be introduced as a supplementary measure to the Basel II risk-based framework, intended to achieve the following objectives:
  – Put a floor under the buildup of leverage in the banking sector.


  – Introduce additional safeguards against model risk and measurement error by supplementing the risk-based measure with a simpler measure that is based on gross exposures.
• Fourth, a series of measures is introduced to promote the buildup of capital buffers in good times that can be drawn upon in periods of stress ("Reducing procyclicality and promoting countercyclical buffers").
  – Measures to address procyclicality:
    * Dampen excess cyclicality of the minimum capital requirement;
    * Promote more forward-looking provisions;
    * Conserve capital to build buffers at individual banks and in the banking sector that can be used in stress; and
  – Achieve the broader macroprudential goal of protecting the banking sector from periods of excess credit growth.
    * Requirement to use long-term data horizons to estimate probabilities of default;
    * Downturn loss-given-default estimates, recommended in Basel II, to become mandatory;
    * Improved calibration of the risk functions, which convert loss estimates into regulatory capital requirements;
    * Banks must conduct stress tests that include widening credit spreads in recessionary scenarios.
  – Promoting stronger provisioning practices (forward-looking provisioning):
    * Advocating a change in the accounting standards towards an expected loss (EL) approach (usually, EL = PD * LGD * EAD; see the sketch after this list).
• Fifth, a global minimum liquidity standard for internationally active banks is introduced that includes a 30-day liquidity coverage ratio requirement underpinned by a longer-term structural liquidity ratio called the net stable funding ratio. (In January 2012, the oversight panel of the Basel Committee on Banking Supervision issued a statement saying that regulators will allow banks to dip below their required liquidity levels, the liquidity coverage ratio, during periods of stress.)
• The Committee is also reviewing the need for additional capital, liquidity, or other supervisory measures to reduce the externalities created by systemically important institutions.
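A minimal sketch of the expected loss formula mentioned in the list above, EL = PD * LGD * EAD, applied to a small hypothetical loan portfolio; the exposures and risk parameters are invented for illustration.

```python
# Illustrative expected loss (EL) calculation: EL = PD * LGD * EAD.
# The loan data below are hypothetical.
from dataclasses import dataclass

@dataclass
class Loan:
    pd: float   # probability of default over one year
    lgd: float  # loss given default (share of exposure lost)
    ead: float  # exposure at default

    @property
    def expected_loss(self) -> float:
        return self.pd * self.lgd * self.ead

portfolio = [
    Loan(pd=0.01, lgd=0.45, ead=1_000_000),  # investment-grade corporate
    Loan(pd=0.05, lgd=0.60, ead=250_000),    # SME loan
    Loan(pd=0.20, lgd=0.75, ead=50_000),     # distressed exposure
]

total_el = sum(loan.expected_loss for loan in portfolio)
print(f"Portfolio expected loss: {total_el:,.0f}")
# 0.01*0.45*1,000,000 + 0.05*0.60*250,000 + 0.20*0.75*50,000 = 4,500 + 7,500 + 7,500 = 19,500
```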


1.5 Solvency II Directives EU insurance legislation aims to create a unified single EU insurance market and to enhance consumer protection. The third-generation Insurance Directives established an "EU passport" (i.e. a single licence, similar to the financial passport banks need to operate in the European Union) for insurers to operate in all member states if EU conditions are met. Several member states concluded that the EU minima were not sufficient and enhanced the requirements with their own reforms, which unfortunately led to differing regulations, hampering the harmonisation goal. Since the initial Directive 73/239/EEC was introduced in 1973, more elaborate risk management systems have developed. While the "Solvency I" Directive aimed at revising and updating the then-current EU solvency regime, Solvency II has a much wider scope. The Solvency II Directive (2009/138/EC (Eling et al. 2007)) is a Directive in European Union law that codifies and harmonises EU insurance regulation. This directive primarily concerns the amount of capital that EU insurance companies must hold to reduce the risk of insolvency. Following an EU Parliament vote on the Omnibus II Directive on 11 March 2014, Solvency II came into effect on 1 January 2016. Solvency II reflects new risk management practices used to define required capital and manage risk. A solvency capital requirement has the following purposes:

• To reduce the risk that an insurer would be unable to meet claims;
• To reduce the losses suffered by policyholders in the event that a firm is unable to meet all claims fully;
• To provide early warning to supervisors so that they can intervene promptly if capital falls below the required level; and
• To promote confidence in the financial stability of the insurance sector.

Solvency II is somewhat similar to the banking regulations of Basel II. For example, the proposed Solvency II framework has three main areas (pillars):

• Pillar 1 consists of the quantitative requirements (for example, the amount of capital an insurer should hold).
• Pillar 2 sets out requirements for the governance and risk management of insurers, as well as for the effective supervision of insurers.
• Pillar 3 focuses on disclosure and transparency requirements.

The Pillar 1 framework sets out qualitative and quantitative requirements for the calculation of technical provisions and the solvency capital requirement (SCR), using either a standard formula given by the regulators or an internal model developed by the (re)insurance company. Technical provisions comprise two components: the best estimate of the liabilities (i.e. the central actuarial estimate) plus a risk margin. Technical provisions are intended to represent the current amount the (re)insurance company would have to pay for an immediate transfer of its obligations to a third party.


The SCR is the capital required to ensure that the (re)insurance company will be able to meet its obligations over the next 12 months with a probability of at least 99.5%. In addition to the SCR capital a minimum capital requirement (MCR) must be calculated which represents the threshold below which the national supervisor (regulator) would intervene. The MCR is intended to correspond to an 85% probability of adequacy over a 1 year period and is bounded between 25 and 45% of the SCR. For supervisory purposes, the SCR and MCR can be regarded as “soft” and “hard” floors, respectively. That is, a regulatory ladder of intervention applies once the capital holding of the (re)insurance undertaking falls below the SCR, with the intervention becoming progressively more intense as the capital holding approaches the MCR. The Solvency II Directive provides regional supervisors with a number of discretions to address breaches of the MCR, including the withdrawal of authorisation from selling new business and the winding up of the company.
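As an illustration of the SCR/MCR ladder of intervention described above, the sketch below classifies an insurer's capital position against the SCR and against an MCR bounded between 25% and 45% of the SCR. The capital figures and the classification labels are our own assumptions, not taken from the Directive.

```python
# Illustrative Solvency II supervisory ladder: the SCR acts as the "soft" floor,
# the MCR as the "hard" floor, with the MCR bounded between 25% and 45% of the SCR.
# Figures and labels are hypothetical.

def solvency_position(own_funds, scr, mcr):
    # Enforce the MCR corridor described in the text: 25% <= MCR/SCR <= 45%.
    mcr = min(max(mcr, 0.25 * scr), 0.45 * scr)
    if own_funds >= scr:
        return "compliant: own funds cover the SCR"
    if own_funds >= mcr:
        return "breach of SCR: supervisory intervention, intensifying as capital nears the MCR"
    return "breach of MCR: ultimate intervention (e.g. withdrawal of authorisation, wind-up)"

print(solvency_position(own_funds=120.0, scr=100.0, mcr=35.0))  # compliant
print(solvency_position(own_funds=80.0,  scr=100.0, mcr=35.0))  # SCR breached
print(solvency_position(own_funds=30.0,  scr=100.0, mcr=35.0))  # MCR breached
```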

1.6 Successive Criticisms 1.6.1 Basel I A number of issues related to Basel I have been highlighted over time. These are summarised in the following:

• Basel I's lack of risk sensitivity was the first criticism. For instance, a corporate loan to a small company with high leverage consumed the same amount of regulatory capital as a loan to an AAA-rated large corporate company, i.e. 8%, as both were risk weighted at 100%.
• The incomplete coverage of risk sources was the second main criticism. Indeed, Basel I only focused on credit risk. An amendment made in 1996, the so-called market risk amendment, filled an important gap, though it was deemed insufficient, and other risk types, such as operational, reputational, liquidity, and strategic risks, were still not addressed by regulatory requirements.
• The limited collateral recognition was also an issue: the list of eligible collateral and guarantors was rather limited compared to those effectively used by banks to mitigate their risks.
• The "one-size-fits-all" approach was also highly criticised. Indeed, regulatory requirements were identical across the industry and did not take into account banks' risk level, sophistication, or activity type.
• The 8% ratio was arbitrary and not explicitly based on solvency targets.
• The lack of diversification benefit: for instance, credit-risk requirements were all additive, and diversification through granting loans to various industries and geographical areas was not recognised.


Although Basel I was beneficial to bank supervision, it was only a first step towards a more sophisticated regulatory framework, and Basel II was supposed to be the answer to the shortcomings listed above.

1.6.2 Basel II The role of Basel II, both before and after the global financial crisis, has been widely discussed. While some argued that the crisis demonstrated weaknesses in the framework, others criticised its procyclicality. Nout Wellink, former Chairman of the BCBS, outlined in September 2009 some of the strategic responses which the Committee should take in response to the crisis. He suggested a stronger regulatory framework comprising five key components: (a) better quality of regulatory capital, (b) better liquidity management and supervision, (c) better risk management and supervision, including enhanced Pillar 2 guidelines, (d) enhanced Pillar 3 disclosures related to securitisation, off-balance sheet exposures, and trading activities, which would promote transparency, and (e) cross-border supervisory cooperation. Given that one of the major factors which drove the crisis was the evaporation of liquidity in the financial markets (Wellink et al. 2009), the BCBS also published principles for better liquidity management and supervision in September 2008 (BCBS 2008). An OECD study (Slovik 2012) suggested that bank regulation based on the Basel Accords encourages unconventional business practices and contributed to, or even reinforced, the adverse systemic shocks that materialised during the financial crisis. According to Slovik's analysis, capital regulation based on risk-weighted assets encourages innovation designed to circumvent regulatory requirements, and the tighter capital requirements based on risk-weighted assets introduced more recently may further contribute to these skewed incentives. New liquidity regulation, notwithstanding its good intentions, is another likely candidate to increase bank incentives to exploit regulation. The so-called regulatory arbitrage consequence has also been widely discussed. The direct transposition of Basel II through the Capital Requirements Directive was also considered a main issue, as private banks, central banks, and local regulators were forced to rely more on assessments of credit risk by private rating agencies, which in essence was as if regulatory authorities had abdicated their powers in favour of private rating agencies (Hunt 2009). It has been pointed out that one of the drivers of the global financial crisis was the system's dependence on a few rating agencies.

1.6.3 Basel III Though many people have criticised Basel III, in reality Basel III has not been fully enforced yet, and it is therefore complicated to address it without falling into the trap of a trial by public opinion. The following summary of criticisms has to be considered bearing in mind the previous statement.

Firstly, the leading criticism comes from the fear that the increased regulatory capital required under Basel III may raise barriers to entry into the sector, and this may benefit existing players by preventing further competition. On the other hand, the capital requirements increase for systemically important banks may hinder their growth. It is worth mentioning that the mission of the regulator is to protect the system, and as such this is the purpose of regulatory requirements. One may advocate that a wealthy financial sector might also be a healthy one; however, Basel III came as a response to the financial crisis, which tends to suggest the contrary.

Another criticism stands in the fact that the new regulation does not change the risk-weighting method, which was one of the main issues leading to the subprime crisis. The rating systems have shown their limitations, in particular by not properly capturing systemic issues and spillover effects, so relying on the same system may appear relatively strange. Indeed, Basel III still (partially) relies on rating agencies, as weights are assigned based on the rating.

Basel III aims at harmonising banking regulations across the world, which may be a mistake. Indeed, countries have different legal systems, cultures, and commercial practices, and a single regulatory framework might not be suitable as it might be incompatible with their domestic environments. Consequently, the single regulatory framework would have to be set on the smallest common denominator, which in essence may lead to incomplete and therefore worse regulations. It has also been suggested by economists that Basel III may threaten countries' economic growth by keeping scarce capital tied up, and that might be even truer for developing countries. However, this effect might be compensated by the structural effect of the regulation, in the sense that practitioners have a tendency to adapt.

Finally, it seems that, except for market risk, internal models have been targeted for termination, and we believe this is not a good signal, as it suggests that models were in essence inappropriate and that simpler, or simplistic, methodological and risk measurement approaches were sufficient to ensure the safety of the system. On the contrary, for market risk, more advanced models (some may say overly complicated ones) have been proposed, though one of the main lessons of the latest financial crisis was actually that credit risk was consuming the largest part of regulatory capital, operational risk was consuming as much, while market risk consumption was much smaller.

1.6.4 Solvency II

A number of the large life insurers in the UK were not satisfied with the way the legislation had been developed. Doubts about the enforcement of a market-consistent valuation approach have also been expressed by American subsidiaries of UK parents, in particular regarding the fact that the legislation could lead overseas subsidiaries to become relatively uncompetitive with local peers (Al-Darwish et al. 2011). The demanding nature of the Solvency II legislation compared to previous regulations has attracted various criticisms. Complying with Solvency II imposed a complex and significant burden on many European financial organisations, with 75% of firms in 2011 reporting that they were not in a position to comply with Pillar III reporting requirements (Devineau and Loisel 2009).

References

Al-Darwish, Ahmed, et al. 2011. Possible unintended consequences of Basel III and Solvency II. Washington, D.C.: International Monetary Fund.
BCBS. 1988. Basel Committee: International convergence of capital measurement and capital standards. Basel: Bank for International Settlements.
–. 2004. International convergence of capital measurement and capital standards. Basel: Bank for International Settlements.
–. 2008. Principles for sound liquidity risk management and supervision - final document. Basel: Bank for International Settlements.
–. 2014. Basel III leverage ratio framework and disclosure requirements. Basel: Bank for International Settlements.
–. 2016. Revisions to the Basel II market risk framework. Basel: Bank for International Settlements.
–. 2017. Basel III: A global regulatory framework for more resilient banks and banking systems. Basel: Bank for International Settlements.
Decamps, Jean-Paul, Jean-Charles Rochet, and Benoit Roger. 2004. "The three pillars of Basel II: Optimizing the mix". Journal of Financial Intermediation 13, no. 2: 132–155.
Devineau, Laurent, and Stéphane Loisel. 2009. "Risk aggregation in Solvency II: How to converge the approaches of the internal models and those of the standard formula?". Bulletin Français d'Actuariat 9, no. 18: 107–145.
Eichengreen, Barry. 2011. Exorbitant privilege: The rise and fall of the dollar and the future of the international monetary system. Oxford: Oxford University Press.
Eling, Martin, Hato Schmeiser, and Joan T. Schmit. 2007. "The Solvency II process: Overview and critical analysis". Risk Management and Insurance Review 10, no. 1: 69–85.
Hunt, John Patrick. 2009. "Credit rating agencies and the worldwide credit crisis: The limits of reputation, the insufficiency of reform, and a proposal for improvement". Columbia Business Law Review 2009, no. 1: 109, 112–149.
Mastanduno, Michael. 2009. "System maker and privilege taker: US power and the international political economy". World Politics 61, no. 1: 121–154.
Mourlon-Druol, Emmanuel. 2015. "'Trust is good, control is better': The 1974 Herstatt bank crisis and its implications for international regulatory reform". Business History 57, no. 2: 311–334.
Pepe, Giovanni. 2013. "Basel 2.5: Potential benefits and unintended consequences". Bank of Italy Occasional Paper No. 159.
Schenk, Catherine R. 2014. "Summer in the city: Banking failures of 1974 and the development of international banking supervision". The English Historical Review 129, no. 540: 1129–1156.
Slovik, P. 2012. "Systemically important banks and capital regulation challenges". OECD Publishing, No. 916.
Wellink, Nout, et al. 2009. "Beyond the crisis: The Basel Committee's strategic response". Financial Stability Review 13: 123–132.

Chapter 2

Financial Institutions: A Regulation Review Through the Risk Measurement Prism

In this chapter, considering the Basel framework (BCBS 2006), we offer a brief overview of financial regulation regarding both risk measures and risk measurement and their impact on risk management. The regulation is presented per risk type. The chapter aims at putting risk measurement into perspective and at providing concrete elements regarding the limitations of the measures once applied.

2.1 Credit Risk

Credit risk is the risk that counterparties default on their obligations, i.e., the risk that a debtor cannot fulfill his repayment obligations. Credit events include the loss of the principal, the default on interest, cash flow disruptions, or cost escalation. The losses related to credit events vary in amounts and causes. For example, a company which fails to pay one of its employees on the due date for a particular reason is mechanically considered to be in default. Therefore, the occurrence of a credit event is not necessarily evidence of the riskiness of the investment. In order to limit the risk of losing the money lent to borrowers, sovereigns, companies, etc., a bank undertakes various verifications and evaluates the potential loss engendered by potential credit events. Losses can arise in a number of circumstances, for example:
• A consumer fails to make a payment on a mortgage loan, a credit card, or any other loan.
• A company is unable to repay asset-secured fixed or floating charge debt.
• A company does not pay one of its invoices when due.
• A government bond issuer does not make a coupon or principal payment when due.


• An insolvent insurance company does not pay a policy obligation.
• A bank becoming insolvent will not pay back funds to a depositor.
• Bankruptcy protection is granted by a government to an insolvent consumer or business (Chapter 11 of Title 11 of the United States Bankruptcy Code).
It is noteworthy that these events may happen due to the materialisation of natural, economic, and human risks. Credit risk arises when borrowers are unable or unwilling to pay back their debt. Therefore, as mentioned before, to reduce their credit risk exposure, lenders usually perform a credit check on the prospective borrower, may require borrowers to take out appropriate insurance, such as mortgage insurance, or may seek security over some assets of the borrower or a guarantee from a third party. The lender can also either securitise the debt or sell the created assets to other companies. In general, the higher the risk, the higher the interest rate associated with the debt. From a capital charge point of view, the exposure is modelled and measured as described in the following.

2.1.1 Standardised Approach

The standardised approach refers to a set of credit risk measurement techniques proposed under Basel II. Under this approach the banks are required to use ratings from External Credit Rating Agencies to quantify the required capital for credit risk. The risk weights associated with these ratings are summarised below.
• Claims on sovereigns:

  Credit assessment: AAA to AA−  A+ to A−  BBB+ to BBB−  BB+ to B−  Below B−  Unrated
  Risk weight:       0%          20%       50%           100%       150%      100%

• Claims on the BIS, the IMF, the ECB, the EC, and the MDBs—Risk weight: 0%
• Claims on banks and securities companies—related to the assessment of the sovereign, as banks and securities companies are regulated:

  Credit assessment: AAA to AA−  A+ to A−  BBB+ to BBB−  BB+ to B−  Below B−  Unrated
  Risk weight:       20%         50%       100%          100%       150%      100%

• Claims on corporates—risk weights are given in the table at the end of this list.
• Claims on retail products: this includes credit cards, overdrafts, auto loans, personal finance, and small business lending—Risk weight: 75%


• Claims secured by residential property—Risk weight: 35%
• Claims secured by commercial real estate—Risk weight: 100%
• Overdue loans (more than 90 days overdue, other than residential mortgage loans)—Risk weight:
  – 150% when provisions are less than 20% of the outstanding amount
  – 100% when provisions are between 20 and 49% of the outstanding amount
  – 100% when provisions are no less than 50% of the outstanding amount, which may be reduced to 50% at supervisory discretion
• Other assets—Risk weight: 100%
• Cash—Risk weight: 0%
Claims on corporates:

  Credit assessment: AAA to AA−  A+ to A−  BBB+ to BB−  Below BB−  Unrated
  Risk weight:       20%         50%       100%         150%       100%
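To make the mechanics of the standardised approach concrete, the short sketch below maps external ratings to the corporate risk weights listed above and computes risk-weighted assets for a few hypothetical exposures; the portfolio figures and the helper names are illustrative assumptions, not part of the Basel text.

# Minimal sketch of the Basel II standardised approach for corporate claims.
# The rating buckets and weights follow the table above; exposures are invented.

CORPORATE_RISK_WEIGHTS = {
    "AAA to AA-": 0.20,
    "A+ to A-": 0.50,
    "BBB+ to BB-": 1.00,
    "Below BB-": 1.50,
    "Unrated": 1.00,
}

def corporate_rwa(exposures):
    """exposures: list of (exposure_amount, rating_bucket) tuples."""
    return sum(amount * CORPORATE_RISK_WEIGHTS[bucket] for amount, bucket in exposures)

if __name__ == "__main__":
    portfolio = [
        (1_000_000, "AAA to AA-"),   # high-grade corporate loan
        (500_000, "BBB+ to BB-"),    # mid-grade corporate loan
        (250_000, "Unrated"),        # unrated counterparty
    ]
    rwa = corporate_rwa(portfolio)
    print(f"Risk-weighted assets: {rwa:,.0f}")
    print(f"Capital at 8% of RWA: {0.08 * rwa:,.0f}")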

2.1.2 Internal Ratings-Based Approach

Both the foundation internal ratings-based approach (FIRB) and the advanced internal ratings-based approach (AIRB) refer to sets of credit risk measurement approaches under the Basel II/III capital assessment rules, under which banks are allowed to develop their own empirical models to evaluate the capital required to cover their credit risk exposure. Financial institutions are only allowed to use these approaches following a thorough review and approval from the relevant regulatory and/or supervisory authority. FIRB banks are allowed to develop their own model to estimate the probability of default (PD) for individual clients or groups of clients, while the other parameters are provided by the regulators. FIRB banks are required to use the regulator's prescribed loss given default (LGD) and the other parameters required for calculating the risk-weighted assets (RWA) for non-retail portfolios, while for retail exposures banks are required to use their own IRB parameters, for instance the probability of default, the loss given default, and the credit conversion factor. The total required capital is then calculated as a fixed percentage of the estimated RWA. Banks can use this approach after they have received approval from their local regulators. AIRB banks are entitled to use their own quantitative models to estimate the PD, the exposure at default (EAD), and the LGD, as well as any other parameter required for calculating the RWA. The regulatory capital is then calculated as a fixed percentage of the estimated RWA. Credit risk being a core element of banking, it follows that banks are expected to be capable of adopting more sophisticated techniques in credit risk measurement and management.


For public companies, default probabilities are commonly estimated using either a structural model of credit risk such as the Merton model (Tudela and Young 2005) or the Jarrow-Turnbull model (Frühwirth and Sögner 2006). For retail and unlisted company exposures, default probabilities are traditionally estimated using logistic regression (Addo et al. 2018). The objective is to define risk weights by determining the cut-off point between the expected loss (EL) and the unexpected loss (UL); indeed, the regulatory capital should be held for the latter. The risk weights for individual exposures are then calculated based on the functions provided by the Basel Committee. Below, the formulas for some of banks' major products are given: corporate, small and medium enterprise (SME), residential mortgage, and qualifying revolving retail exposures. In the formulas below, N(x) denotes the cumulative distribution function of a standard normal random variable, G(z) denotes the inverse cumulative distribution function of a standard normal random variable, PD is the probability of default, LGD is the loss given default, EAD is the exposure at default, and M is the effective maturity.

1. Corporate exposure: The exposure for corporate loans is calculated as follows (BCBS 2005). The correlation is

   R = AVC \cdot \left[ 0.12 \cdot \frac{1 - e^{-50 \cdot PD}}{1 - e^{-50}} + 0.24 \cdot \left( 1 - \frac{1 - e^{-50 \cdot PD}}{1 - e^{-50}} \right) \right]   (2.1.1)

   The asset value correlation (AVC) multiplier was introduced by the Basel III framework and is applied as follows: AVC = 1.25 if the company is a large regulated financial institution (total assets equal to or greater than US$100 billion) or an unregulated financial institution regardless of size, and AVC = 1 otherwise. The maturity adjustment equals

   b = (0.11852 - 0.05478 \cdot \ln(PD))^2   (2.1.2)

   The capital requirement equals

   K = LGD \cdot \left[ N\!\left( \sqrt{\tfrac{1}{1-R}}\, G(PD) + \sqrt{\tfrac{R}{1-R}}\, G(0.999) \right) - PD \right] \cdot \frac{1 + (M - 2.5)\, b}{1 - 1.5\, b}   (2.1.3)

   and the risk-weighted assets equal

   RWA = K \cdot 12.5 \cdot EAD   (2.1.4)


2. Corporate exposure adjustment for SMEs: For small and medium enterprises with an annual sales turnover below 50 million euro, the correlation may be adjusted as follows:

   R = 0.12 \cdot \frac{1 - e^{-50 \cdot PD}}{1 - e^{-50}} + 0.24 \cdot \left( 1 - \frac{1 - e^{-50 \cdot PD}}{1 - e^{-50}} \right) - 0.04 \cdot \left( 1 - \frac{\max(S - 5, 0)}{45} \right)   (2.1.5)

   where S is the enterprise's annual sales turnover in millions of euro.

3. Residential mortgage exposure: The exposure related to residential mortgages can be calculated as follows, where the correlation R equals 0.15. The capital requirement is obtained as

   K = LGD \cdot \left[ N\!\left( \sqrt{\tfrac{1}{1-R}}\, G(PD) + \sqrt{\tfrac{R}{1-R}}\, G(0.999) \right) - PD \right]   (2.1.6)

   and the risk-weighted assets equal

   RWA = K \cdot 12.5 \cdot EAD   (2.1.7)

4. Qualifying revolving retail exposure: The exposure related to unsecured retail credit products can be calculated as follows, where the correlation R equals 0.04. The capital requirement is obtained as

   K = LGD \cdot \left[ N\!\left( \sqrt{\tfrac{1}{1-R}}\, G(PD) + \sqrt{\tfrac{R}{1-R}}\, G(0.999) \right) - PD \right]   (2.1.8)

   and the risk-weighted assets equal

   RWA = K \cdot 12.5 \cdot EAD   (2.1.9)
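As an illustration of how Eqs. (2.1.1)–(2.1.4) fit together, the sketch below computes the corporate capital requirement K and the corresponding RWA for assumed inputs (the PD, LGD, EAD, maturity and AVC values are made up for the example); it is a minimal reading of the formulas above, not a supervisory-grade implementation.

# Sketch of the Basel IRB corporate capital formula (Eqs. 2.1.1-2.1.4).
# Inputs are hypothetical; scipy's normal CDF/quantile play the roles of N(.) and G(.).
from math import exp, log, sqrt
from scipy.stats import norm

def corporate_capital(pd_, lgd, ead, maturity, avc=1.0):
    # Correlation (2.1.1), including the Basel III asset value correlation multiplier
    ratio = (1 - exp(-50 * pd_)) / (1 - exp(-50))
    r = avc * (0.12 * ratio + 0.24 * (1 - ratio))
    # Maturity adjustment (2.1.2)
    b = (0.11852 - 0.05478 * log(pd_)) ** 2
    # Capital requirement (2.1.3)
    k = lgd * (norm.cdf(sqrt(1 / (1 - r)) * norm.ppf(pd_)
                        + sqrt(r / (1 - r)) * norm.ppf(0.999)) - pd_)
    k *= (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)
    # Risk-weighted assets (2.1.4)
    rwa = k * 12.5 * ead
    return k, rwa

if __name__ == "__main__":
    k, rwa = corporate_capital(pd_=0.01, lgd=0.45, ead=1_000_000, maturity=2.5)
    print(f"K (capital requirement rate): {k:.4f}")
    print(f"RWA: {rwa:,.0f}")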

2.2 Credit Value Adjustment The credit valuation adjustment (CVA) (BCBS 2015; Gregory 2012) is the difference between the risk-free portfolio value and the true portfolio value that takes into account the possibility of a counterparty’s default. In other words, CVA is the market value of counterparty credit risk. This price depends on counterparty credit spreads as well as on the market-risk factors that drive derivatives’ values and, therefore, exposure. CVA belongs to the family of related valuation adjustments, collectively


known as XVA for X-value adjustment. Unilateral CVA is given by the risk-neutral expectation of the discounted loss. The risk-neutral expectation can be written as

CVA(T) = E^{Q}[L^{*}] = (1 - R) \int_0^T E^{Q}\!\left[ \frac{B_0}{B_t}\, E(t) \,\middle|\, \tau = t \right] dPD(0, t)   (2.2.1)

where T is the maturity of the longest transaction in the portfolio, B_t is the future value of one unit of the base currency invested today at the prevailing interest rate for maturity t, R is the fraction of the portfolio value that can be recovered in case of a default, τ is the time of default, E(t) is the exposure at time t, and PD(s, t) is the risk-neutral probability of counterparty default between times s and t. These probabilities can be obtained from the term structure of credit default swap (CDS) spreads. More generally, CVA can refer to a few different concepts:
• the mathematical concept as defined above;
• a part of the regulatory capital and RWA (risk-weighted assets) calculation introduced under Basel III;
• the CVA desk of an investment bank, whose purpose is to hedge possible losses due to counterparty default and to reduce the amount of capital required under the Basel III CVA calculation;
• the "CVA charge": the hedging performed by the CVA desk has a cost associated with it, i.e., the bank has to buy the hedging instruments. This cost is then allocated to each business line of the investment bank, and this allocated cost is called the "CVA charge".
Assuming independence between the exposure and the counterparty's credit quality greatly simplifies the analysis. Under this assumption, Eq. (2.2.1) simplifies to

CVA = (1 - R) \int_0^T EE^{*}(t) \, dPD(0, t)   (2.2.2)

where EE^{*} is the risk-neutral discounted expected exposure (EE).
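A minimal numerical reading of Eq. (2.2.2) is sketched below: the integral is discretised over a time grid, with made-up expected-exposure and default-probability profiles standing in for the market-implied inputs; the profile shapes and parameter values are assumptions for illustration only.

# Discretised version of Eq. (2.2.2): CVA = (1 - R) * sum of EE*(t_i) * dPD(t_i).
# Exposure and hazard-rate inputs are invented for the example.
import numpy as np

def cva_unilateral(times, discounted_ee, default_probs, recovery=0.4):
    """times: grid in years; discounted_ee: EE*(t) on the grid;
    default_probs: cumulative PD(0, t) on the grid."""
    d_pd = np.diff(default_probs, prepend=0.0)      # incremental default probability
    return (1 - recovery) * np.sum(discounted_ee * d_pd)

if __name__ == "__main__":
    t = np.linspace(0.25, 5.0, 20)                          # quarterly grid to 5 years
    ee_star = 1_000_000 * np.sqrt(t) * np.exp(-0.05 * t)    # toy discounted exposure profile
    hazard = 0.02                                           # flat 2% hazard rate assumption
    pd_cum = 1 - np.exp(-hazard * t)                        # cumulative default probability
    print(f"CVA = {cva_unilateral(t, ee_star, pd_cum):,.0f}")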

2.3 Operational Risk The Basel II Committee defines operational risk as: “The risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. This definition includes legal risk, but excludes strategic and reputational risk”. However, the Basel Committee recognises that operational risk is a term that has a variety of meanings and therefore, for internal purposes, banks are permitted to adopt their own definitions of operational risk, provided that the minimum elements in the Committee’s definition are included. From a regulatory capital calculation


standpoint, and once again assuming that capital calculations are somehow a risk measurement approach, operational risk currently has four ways of calculating capital including the future standardised measurement approach (SMA). These are presented in the following subsections.

2.3.1 Basic Indicator Approach

Operational risk capital allocation is done using a single indicator: the gross income. The allocation is a fixed percentage (denoted α in what follows) multiplied by the bank's individual amount of gross income. This approach is easy to implement and universally applicable. Nevertheless, its simplicity limits responsiveness to firm-specific needs and characteristics. While the basic indicator approach might be suitable for smaller banks with a simple range of business activities, the Basel Committee expects internationally active banks and banks with significant operational risk to use a more sophisticated approach within the overall framework. The Basel Committee provides incentives to move towards more sophisticated approaches: it actually proposed to set α at a higher level, to use the second pillar, or to make the standardised approach the entry point for internationally active banks. It is also worth noticing that a sample of internationally active banks has formed the basis of this calibration. As it is anticipated that the basic indicator approach will mainly be used by smaller, domestic banks, a wider sample base may be more appropriate. Formally, the capital allocation (CA) is given by

CA = α × GI,   (2.3.1)

where GI represents the gross income.

2.3.2 Standardised Approach

The standardised approach represents a further refinement along the evolutionary spectrum of approaches for operational risk capital. The capital allocation is no longer a basic percentage of the overall gross income; banks' activities are divided into a number of standardised business units and business lines. Thus, the standardised approach is more capable of reflecting the different risk profiles across banks as reflected by their broad business activities. The proposed business units and business lines of the standardised approach mirror those developed by an industry initiative to collect internal loss data in a consistent manner. To each business line corresponds a specific capital allocation computed on a particular indicator. Table 2.1 presents them.


Table 2.1 Operational risk capital allocation per business line

Business units       Business lines           Indicator
Investment banking   Corporate finance        Gross income
                     Trading and sales        Gross income
Banking              Retail banking           Annual average assets
                     Commercial banking       Annual average assets
                     Payment and settlement   Annual settlement throughput
Other                Retail brokerage         Gross income
                     Asset management         Total funds under management

The capital charge (CA) is now, for each business line, a portion of the chosen indicator; formally,

CA_i = θ_i × Indicator_i,   (2.3.2)

where θ_i is a different percentage for each business line i. The main objective of this approach is to lay the foundation of internal databases, and therefore to enable the evolution to a more sophisticated approach. A small sketch of the basic indicator and standardised capital calculations follows.
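The sketch below implements the two allocation rules just described, Eqs. (2.3.1) and (2.3.2); the α and θ values and the business-line figures are placeholders chosen for illustration, not the calibrated regulatory coefficients.

# Basic indicator approach (2.3.1) and standardised approach (2.3.2) sketches.
# Alpha, the theta percentages and the indicator amounts are illustrative only.

def basic_indicator_capital(gross_income, alpha=0.15):
    """CA = alpha * GI."""
    return alpha * gross_income

def standardised_capital(business_lines):
    """business_lines: list of (indicator_amount, theta) per business line.
    CA = sum of theta_i * Indicator_i."""
    return sum(indicator * theta for indicator, theta in business_lines)

if __name__ == "__main__":
    print(f"BIA capital: {basic_indicator_capital(200_000_000):,.0f}")
    lines = [
        (80_000_000, 0.18),   # corporate finance, gross income based
        (60_000_000, 0.18),   # trading and sales, gross income based
        (40_000_000, 0.12),   # retail banking, indicator based
    ]
    print(f"TSA capital: {standardised_capital(lines):,.0f}")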

2.3.3 The Advanced Measurement Approach (AMA)

The advanced measurement approach (AMA) is a set of operational risk measurement techniques proposed under the Basel II capital adequacy rules for banking institutions. Under this approach, banks are allowed to develop their own empirical model to quantify the capital required to face operational risk. The use of this approach is subject to approval from banks' local regulators. Besides, according to section 664 of the original Basel Accord, in order to approve the AMA model, a bank must at least satisfy the following requirements:
• Its board of directors and senior management, as appropriate, should be actively involved in the oversight of the operational risk management framework;
• It requires an operational risk management system that is conceptually sound and is implemented with integrity; and
• It must have sufficient resources in the use of the approach in the major business lines as well as the control and audit areas.
The AMA requires using the following items:
1. Internal data
2. External data
3. Scenario analysis
4. Qualitative indicators, the so-called business environment and internal control factors (BEICFs)


The following subsections provide further explanations on the previous items. The advanced measurement approach (AMA) is one of the three possible operational risk methods that can be used under Basel II by a bank or other financial institution. The other two are the basic indicator approach and the standardised approach. The methods increase in sophistication and risk sensitivity, with the AMA being the most advanced of the three. Under the AMA, banks are entitled to develop an internal model to evaluate the capital charge pertaining to operational risk. Once again, banks have to follow a strict governance process before being allowed to use this approach. Once a bank has been approved to adopt the AMA, it cannot revert to a simpler approach without supervisory approval, though some banks, such as Lloyds Banking Group, have reverted to the standardised approach. Furthermore, the standardised measurement approach has replaced the AMA for Pillar 1 (Basel III), while AMA standards have been pushed down into Pillar 2. Also, according to section 664 of the original Basel Accord, in order to qualify for use of the AMA a bank must satisfy its supervisor that, at a minimum:
• Its board of directors and senior management, as appropriate, are actively involved in the oversight of the operational risk management framework;
• It has an operational risk management system that is conceptually sound and is implemented with integrity; and
• It has sufficient resources in the use of the approach in the major business lines as well as the control and audit areas.

2.3.4 Standardised Measurement Approach or New Standardised Approach

As stated in BCBS (2016b) and BCBS (2017b), the three approaches presented before are supposed to be replaced by a new standardised approach (usually referred to as the SMA or new SA). This SMA or new SA combines the business indicator component (BIC), a simple financial statement proxy of operational risk exposure, with bank-specific operational loss data referred to as the internal loss multiplier (ILM). Since the October 2014 consultation, the structure of the BI has been revised to avoid penalising certain business models, such as those based on the distribution of products bought from third parties, and those based on high interest margins. Adjustments have also been made to address issues related to the treatment of financial and operating leases (in this section, financial values are assumed to be in euros). Before obtaining the BIC, a business indicator (BI), made up of almost the same profit and loss (P&L) items that are found in the composition of gross income (GI), is calculated. The main difference relates to how the items are combined. The BI uses positive values of its components, thereby avoiding counterintuitive

negative contributions from some of the bank's businesses to the capital charge (e.g. negative P&L on the trading book), which is possible under the GI. In addition, the BI includes income statement items related to activities that produce operational risk that are omitted (e.g. P&L on the banking book) or netted (e.g. fee expenses, other operating expenses) in the GI. In particular, changing the impact of other operating expenses on capital requirements from negative (in the GI) to positive (in the BI) is necessary to improve the coherence of the BI as a proxy indicator for operational loss exposure, as other operating expenses typically include operational losses, and thus an increase in other operating expenses should not result in a decrease in operational risk capital requirements. Three components, calculated from P&L positions as well as balance sheet positions, are added up to give the business indicator value: the interest, lease and dividend component (ILDC), the services component (SC), and the financial component (FC). Therefore, BI = ILDC + SC + FC, where the overbars below denote averages of the corresponding items (three-year averages in the Basel text):

ILDC = \min\left[\,\overline{|\text{Interest Income} - \text{Interest Expense}|}\,;\; 2.25\% \cdot \overline{\text{Interest Earning Assets}}\,\right] + \overline{\text{Dividend Income}},   (2.3.3)

SC = \max\left[\,\overline{\text{Other Operating Income}}\,;\; \overline{\text{Other Operating Expense}}\,\right] + \max\left[\,\overline{\text{Fee Income}}\,;\; \overline{\text{Fee Expense}}\,\right],   (2.3.4)

FC = \overline{|\text{Net P\&L Trading Book}|} + \overline{|\text{Net P\&L Banking Book}|}.   (2.3.5)

Then:
1. if BI ≤ 1 billion, the BIC is equal to BI ∗ 12%;
2. if 1 billion < BI ≤ 30 billion, the BIC is equal to BI ∗ 15%;
3. if BI > 30 billion, the BIC is equal to BI ∗ 18%.

A bank's internal operational risk loss experience affects the calculation of operational risk capital through the internal loss multiplier (ILM). The ILM is defined as

ILM = \ln\!\left( \exp(1) - 1 + \left( \frac{LC}{BIC} \right)^{0.8} \right),   (2.3.6)

where the loss component (LC) is equal to 15 times the average annual operational risk losses incurred over the previous 10 years. The ILM is equal to one when the loss and business indicator components are equal. When the LC is greater than the BIC, the ILM is greater than one; that is, a bank with losses that are high relative to its BIC is required to hold higher capital due to the incorporation of internal losses into the calculation methodology. Conversely, when the LC is lower than the BIC, the ILM is less than one; that is, a bank with losses that are low relative to its BIC is required to hold lower capital.


The minimum operational risk capital (ORC) requirement is then determined as follows:

ORC = BIC ∗ ILM.   (2.3.7)

For banks in the first bucket (BI ≤ 1 billion), internal loss data do not affect the capital calculation, i.e. the ILM is set equal to 1, so that the operational risk capital is equal to the BIC.
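The following sketch strings Eqs. (2.3.3)–(2.3.7) together for a hypothetical bank; the financial-statement figures, the flat bucket coefficients as described above, and the loss history are all illustrative assumptions.

# Sketch of the SMA/new SA operational risk capital (Eqs. 2.3.3-2.3.7).
# All figures are in billions of euros and are invented for the example.
from math import e, log

def business_indicator(ii, ie, iea, div, ooi, ooe, fee_inc, fee_exp, tb_pnl, bb_pnl):
    ildc = min(abs(ii - ie), 0.0225 * iea) + div          # (2.3.3)
    sc = max(ooi, ooe) + max(fee_inc, fee_exp)            # (2.3.4)
    fc = abs(tb_pnl) + abs(bb_pnl)                        # (2.3.5)
    return ildc + sc + fc

def bic(bi):
    # Flat coefficients per bucket, as described in the text above.
    if bi <= 1:
        return 0.12 * bi
    if bi <= 30:
        return 0.15 * bi
    return 0.18 * bi

def orc(bi, avg_annual_losses):
    bic_value = bic(bi)
    lc = 15 * avg_annual_losses                           # loss component
    if bi <= 1:
        ilm = 1.0                                         # first bucket: losses ignored
    else:
        ilm = log(e - 1 + (lc / bic_value) ** 0.8)        # (2.3.6)
    return bic_value * ilm                                # (2.3.7)

if __name__ == "__main__":
    bi = business_indicator(ii=2.0, ie=1.2, iea=40.0, div=0.1,
                            ooi=0.3, ooe=0.4, fee_inc=0.8, fee_exp=0.5,
                            tb_pnl=0.2, bb_pnl=0.1)
    print(f"BI = {bi:.2f} bn, ORC = {orc(bi, avg_annual_losses=0.05):.3f} bn")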

2.4 Market Risk

Market risk is the risk that the value of an investment decreases due to changes in market factors. These factors have an impact on the overall performance of the financial markets and can only be reduced by diversification into assets that are not correlated with the market—such as certain alternative asset classes. Market risk is sometimes called "systematic risk" because it relates to factors, such as a recession, that impact the entire market. Several different risk factors make up market risk, for instance:
• Currency risk: the risk that exchange rates will go up or down
• Equity risk: the risk that share prices will go up or down
• Inflation risk: the potential for inflation to increase the price of all goods and services such that it undermines the value of money
• Commodity risk: the possibility that commodity prices, such as those of metals, change value dramatically
• Interest rate risk: the risk that comes from an increase or decrease in interest rates
In the next subsection, we present the fundamental review of the trading book that led to the capital calculations presented in the subsequent subsections.

2.4.1 The Fundamental Review of the Trading Book The fundamental review of the trading book or FRTB regulations (BCBS 2014) is a response to a pre-crisis framework that has been deemed inadequate and weak in many areas. This is particularly true for the definition of the boundary of the trading book. Indeed, internal model approach was not sufficient and many issues were to be dealt with for a better regulatory capital framework. For instance, tail risk was something that the VaR approach did not capture adequately along with illiquidity. Most IMA-based approaches also allow for generous diversification effects as they are based on historic parameters which definitely do not hold in a crisis situation (correlation largely become relevant in very stressed markets).


The current standardised approach is highly inadequate as the linkage between the internal model and the standardised approach is inappropriate. Besides, the current standardised approach lacks risk sensitivity. This issue needs to be dealt with along with constraining the diversification benefits and hedging. The FRTB addresses the boundary issue between the banking and the trading book in order to reduce regulatory arbitrage between the two books limiting the will to transfer from one book to the other and introducing reporting guidelines and regulatory oversight that should allow for a much better framework that governs the boundary between the two books. The FRTB also aims at capturing the effect of tail risk more effectively as well as capturing liquidity effects. Tail risk is captured moving from a VaR to an expected shortfall approach for various horizon depending on asset/risk classes. Under FRTB internal models have to be approved at the desk level. If desks are not approved, these will be moved back to the standardised approach. Trading desks will have to show that their models are compliant by showing that they have adequate P&L attribution and backtesting procedures in place. It is important to note that P&L attribution (i.e. model-based P&L by opposition to risk-based theoretical P&L) will be under scrutiny to ensure that risk models properly capture the risk associated with the models themselves. Besides, hedging and diversification benefits will be constrained and an additional charge will come to cover non-modellable risk factors. The revised standardised approach (RSA) will be considered for banks willing to use simple approaches. This approach will also be the fallback for banks not gaining approval for there internal models. The main methodological modification is that the approach is now based on risk sensitivities across asset classes. The RSA aims at providing a consistent way to measure risks across geographic areas, giving authorities a better way to compare IMA and SA banks as the two approaches are sharing a common framework. Furthermore, a standardised default risk charge will be added along an add-on for residual risk, clearly harder to model. Therefore, following the FRTB and from a capital calculations standpoint, two possibilities are offered to banks to perform them. These are presented in the following.

2.4.2 Standardised Approach As presented in the standardised approach capital requirement (BCBS 2016a) is the simple sum of three components: the risk charges under the sensitivities-based method, the default risk charge, and the residual risk add-on. The risk charge under the sensitivities-based method must be calculated by aggregating the following risk measures: • Delta: A risk measure based on sensitivities of a bank’s trading book to regulatory delta risk factors. Delta sensitivities are to be used as inputs into the aggregation formula which delivers the capital requirement for the sensitivities-based method.


• Vega: A risk measure that is also based on sensitivities to regulatory vega risk factors to be used as inputs to a similar aggregation formula as for delta risks. • Curvature: A risk measure which captures the incremental risk not captured by the delta risk of price changes in the value of an option. Curvature risk is based on two stress scenarii involving an upward shock and a downward shock to a given risk factor. The worst loss of the two scenarii is the risk position to be used as an input into the aggregation formula which delivers the capital charge. In order to address the risk that correlations may increase or decrease in periods of financial stress, three risk charge figures must be calculated for each risk class defined under the sensitivities-based method, based on three different scenarios on the specified values for the correlation parameter ρkl (i.e. correlation between risk factors within a bucket) and γbc (i.e. correlation across buckets within a risk class). There must be no diversification benefit recognised between individual risk classes. We refer to BCBS (2016a) for more details on the parameters. The bank must determine each delta and vega sensitivity and curvature scenario based on instrument prices or pricing models that an independent risk control unit within a bank uses to report market risks or actual profits and losses to senior management. The default risk charge captures the jump-to -default risk in three independent capital charge computations for default risk of non-securitisations, securitisations (non-correlation trading portfolio), and securitisation correlation trading portfolio. It is calibrated based on the credit risk treatment in the banking book in order to reduce the potential discrepancy in capital requirements for similar risk exposures across the bank. Some hedging recognition is allowed within a risk weight bucket. There must be no diversification benefit recognised between different buckets. Additionally, the Committee acknowledges that not all market risks can be captured in the standardised approach, as this might necessitate an unduly complex regime. A residual risk add-on is thus introduced to ensure sufficient coverage of market risks. Supervisory authorities will be able to insist on a period of initial monitoring and live testing of a bank’s internal model before it is used for supervisory capital purposes. In addition to these general criteria, banks using internal models for capital purposes will be subject to the additional requirements detailed below.

2.4.3 Internal Models Approach The use of an internal model for the purposes of regulatory capital determination will be conditional upon the explicit approval of the bank’s supervisory authority. Home and host country supervisory authorities of banks that carry out material trading activities in multiple jurisdictions intend to work cooperatively to ensure an efficient approval process.


Table 2.2 Liquidity horizons for expected shortfall calculation purposes

j      1    2    3    4    5
LH_j   10   20   40   60   120

The supervisory authority will only give its approval if at a minimum: 1. It is satisfied that the bank’s risk management system is conceptually sound and is implemented with integrity; 2. The bank has, in the supervisory authority’s view, sufficient numbers of staff skilled in the use of sophisticated models not only in the trading area but also in the risk control, audit and, if necessary, back office areas; 3. The bank’s models have, in the supervisory authority’s judgement, a proven track record of reasonable accuracy in measuring risk; 4. The bank regularly conducts stress tests along the lines discussed in BCBS (2016a); and 5. The positions included in the internal model for regulatory capital determination are held in approved trading desks that have passed the required tests. From a quantitative standpoint, the document states the following: Banks will have flexibility in devising the precise nature of their models, but the following minimum standards will apply for the purpose of calculating their capital charge. Individual banks or their supervisory authorities will have discretion to apply stricter standards. “Expected shortfall2” must be computed on a daily basis for the bankwide internal model for regulatory capital purposes. Expected shortfall must also be computed on a daily basis for each trading desk that a bank wishes to include within the scope for the internal model for regulatory capital purposes. In calculating the expected shortfall, a 97.5th percentile, one-tailed confidence level is to be used. In calculating the expected shortfall, the liquidity horizons described in BCBS (2016a) (see Table 2.2) must be reflected by scaling an expected shortfall calculated on a base horizon. The expected shortfall for a liquidity horizon must be calculated from an expected shortfall at a base liquidity horizon of 10 days with scaling applied to this base horizon result as follows:

ES = \sqrt{ \big(ES_T(P)\big)^2 + \sum_{j \geq 2} \left( ES_T(P, j)\, \sqrt{\frac{LH_j - LH_{j-1}}{T}} \right)^2 }   (2.4.1)

where:
• ES is the regulatory liquidity-adjusted expected shortfall;
• T is the length of the base horizon, i.e., 10 days;
• ES_T(P) is the expected shortfall at horizon T of a portfolio with positions P = (p_i) with respect to shocks to all risk factors that the positions P are exposed to;
• ES_T(P, j) is the expected shortfall at horizon T of a portfolio with positions P = (p_i) with respect to shocks for each position p_i in the subset of risk factors Q(p_i, j), with all other risk factors held constant;
• the ES at horizon T, ES_T(P), must be calculated for changes in the risk factors, and ES_T(P, j) must be calculated for changes in the relevant subset Q(p_i, j) of risk factors, over the time interval T without scaling from a shorter horizon;
• Q(p_i, j) is the subset of risk factors whose liquidity horizons, as specified in BCBS (2016a, section 181(k)), for the desk where p_i is booked are at least as long as LH_j according to Table 2.2. For example, Q(p_i, 4) is the set of risk factors with a 60-day or a 120-day liquidity horizon. Note that Q(p_i, j) is a subset of Q(p_i, j − 1);
• the time series of changes in risk factors over the base time interval T may be determined by overlapping observations; and
• LH_j is the liquidity horizon j, with lengths given in Table 2.2.

² The expected shortfall will be formally introduced in the next chapter, but for clarity purposes, we will state that the expected shortfall represents the expected loss above a threshold, usually the VaR.
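To illustrate the scaling rule in Eq. (2.4.1), the sketch below combines base-horizon expected shortfalls into the liquidity-adjusted figure; the ES_T(P) and ES_T(P, j) inputs are invented numbers standing in for desk-level model output.

# Liquidity-adjusted expected shortfall per Eq. (2.4.1).
# es_base plays the role of ES_T(P); es_subsets[j] of ES_T(P, j) for j = 2..5.
from math import sqrt

LH = {1: 10, 2: 20, 3: 40, 4: 60, 5: 120}   # liquidity horizons from Table 2.2 (days)
T = 10                                      # base horizon in days

def liquidity_adjusted_es(es_base, es_subsets):
    """es_subsets: dict {j: ES_T(P, j)} for j >= 2."""
    total = es_base ** 2
    for j, es_j in es_subsets.items():
        total += (es_j * sqrt((LH[j] - LH[j - 1]) / T)) ** 2
    return sqrt(total)

if __name__ == "__main__":
    es_p = 5.0e6                                        # hypothetical 10-day ES of the portfolio
    es_j = {2: 2.0e6, 3: 1.0e6, 4: 0.5e6, 5: 0.2e6}     # hypothetical subset ESs
    print(f"Liquidity-adjusted ES: {liquidity_adjusted_es(es_p, es_j):,.0f}")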

2.5 IFRS9 as a Risk Regulation IFRS 9 is an International Financial Reporting Standard (IFRS) promulgated by the International Accounting Standards Board (IASB) (IASB 2018). It specifies the accounting for financial instruments. Three main topics are being addressed: classification and measurement of financial instruments, impairment of financial assets, and hedge accounting. IFRS 9 requires an entity to recognise a financial asset or a financial liability in its statement of financial position when it becomes party to the contractual provisions of the instrument. At initial recognition, an entity measures a financial asset or a financial liability at its fair value plus or minus, in the case of a financial asset or a financial liability not at fair value through profit or loss, transaction costs that are directly attributable to the acquisition or issue of the financial asset or the financial liability. When an entity first recognises a financial asset, it classifies it based on the entity’s business model for managing the asset and the asset’s contractual cash flow characteristics, as follows: • Amortised cost—a financial asset is measured at amortised cost if both of the following conditions are met: – the asset is held within a business model whose objective is to hold assets in order to collect contractual cash flows; and – the contractual terms of the financial asset give rise on specified dates to cash flows that are solely payments of principal and interest on the principal amount outstanding.


• Fair value through other comprehensive income—financial assets are classified and measured at fair value through other comprehensive income if they are held in a business model whose objective is achieved by both collecting contractual cash flows and selling financial assets.
• Fair value through profit or loss—any financial assets that are not held in one of the two business models mentioned are measured at fair value through profit or loss.
When, and only when, an entity changes its business model for managing financial assets it must reclassify all affected financial assets. All financial liabilities are measured at amortised cost, except for financial liabilities at fair value through profit or loss. Such liabilities include derivatives (other than derivatives that are financial guarantee contracts or are designated and effective hedging instruments), other liabilities held for trading, and liabilities that an entity designates to be measured at fair value through profit or loss (see "fair value option" below). After initial recognition, an entity cannot reclassify any financial liability. An entity may, at initial recognition, irrevocably designate a financial asset or liability that would otherwise have to be measured at amortised cost or fair value through other comprehensive income to be measured at fair value through profit or loss if doing so would eliminate or significantly reduce a measurement or recognition inconsistency (sometimes referred to as an "accounting mismatch") or otherwise results in more relevant information. Impairment of financial assets is recognised in stages:
• Stage 1—as soon as a financial instrument is originated or purchased, 12-month expected credit losses are recognised in profit or loss and a loss allowance is established. This serves as a proxy for the initial expectations of credit losses. For financial assets, interest revenue is calculated on the gross carrying amount (i.e. without deduction for expected credit losses).
• Stage 2—if the credit risk increases significantly and is not considered low, full lifetime expected credit losses are recognised in profit or loss. The calculation of interest revenue is the same as for stage 1.
• Stage 3—if the credit risk of a financial asset increases to the point that it is considered credit-impaired, interest revenue is calculated based on the amortised cost (i.e. the gross carrying amount less the loss allowance). Financial assets in this stage will generally be assessed individually. Lifetime expected credit losses are recognised on these financial assets.
The objective of hedge accounting is to represent, in the financial statements, the effect of an entity's risk management activities that use financial instruments to manage exposures arising from particular risks that could affect profit or loss or other comprehensive income. Hedge accounting is optional. An entity applying hedge accounting designates a hedging relationship between a hedging instrument and a hedged item. For hedging relationships that meet the qualifying criteria in IFRS 9, an entity accounts for the


gain or loss on the hedging instrument and the hedged item in accordance with the special hedge accounting provisions of IFRS 9. IFRS 9 identifies three types of hedging relationships and prescribes special accounting provisions for each:
• fair value hedge: a hedge of the exposure to changes in the fair value of a recognised asset or liability or an unrecognised firm commitment, or a component of any such item, that is attributable to a particular risk and could affect profit or loss.
• cash flow hedge: a hedge of the exposure to variability in cash flows that is attributable to a particular risk associated with all, or a component, of a recognised asset or liability (such as all or some future interest payments on variable-rate debt) or a highly probable forecast transaction, and could affect profit or loss.
• hedge of a net investment in a foreign operation as defined in IAS 21.
When an entity first applies IFRS 9, it may choose to continue to apply the hedge accounting requirements of IAS 39, instead of the requirements in IFRS 9, to all of its hedging relationships.

2.6 The Stress-Testing Framework

A stress-testing exercise (BCBS 2017a) means choosing scenarii that are costly and rare and which can lead to the failure of the financial institution; these scenarii then need to be integrated into a model in order to measure their impact. The integration process may be a simple linear increase of parameters to enlarge the confidence interval of the outcomes, a switch to a more advanced model predicting the potential loss due to an extreme event or a succession of extreme events by implementing various methodologies allowing the capture of multiple behaviours, or the addition of exogenous variables. The objective of this exercise is to strengthen the risk management framework by understanding extreme exposures, i.e., exposures that may fall beyond the "business as usual" capture domain of a model. We define the capture domain of a model by its capability to be resilient to the occurrence of an extreme event, i.e., the pertaining risk measure would not fluctuate, or only within a narrow range of values, and would not breach the selected confidence intervals too many times. By stressing a value, the financial institution naturally acknowledges the fact that its models and the resulting measurements reflect its exposure only up to a certain extent. As a matter of fact, this is due to the data sets on which the models are calibrated, which do not contain the entire information or are not adapted to the evolution of the real world. Indeed, the data set only contains past incidents, and even if crisis outcomes are integrated it does not integrate future extreme events, which are by definition unknown. In other words, this holds even if the models are


conservative enough to consider eventual Black Swans (stress testing enables envisaging Black Swans with blue eyes and white teeth). Selecting the appropriate scenario is equivalent to selecting the factors that may have an impact on the models (e.g. covariates) and to defining the level of stress. These scenarii are supposed to characterise shocks likely to occur beyond what historical observations say: shocks that have never occurred (stress expected loss), shocks reflecting circumstantial breakdowns, and shocks reflecting future structural breaks. Mathematically, all new categories of shocks entail drawing from some new factor distribution f* which is not equal to the original distribution f characterising the original data set. Every type of shock has to include correlations, comovements, and specific events, such as crashes, bankruptcies, and systemic failures. When scenarii are assessed, practitioners have to check the various outcomes. Are they relevant for the goal? Are they internally consistent? Are they archetypal? Do they represent relatively stable outcome situations? The risk managers should identify the extremes of the possible outcomes of the driving forces and check the dimensions for consistency and plausibility. Three key points should be addressed:
1. The time frame: are the "new" trends compatible with the time frame in question?
2. The internal consistency: do the forces describe uncertainties that can construct probable scenarii?
3. The stakeholder influence: is it possible to create reliable scenarii considering a potential negative influence from the stakeholders?
In order to properly stress market, credit, operational, and liquidity risks, the information considered in the process should be continuously updated in order to avoid missing a spot. Applied to market risk, stress-testing procedures have traditionally been applied to banks' trading portfolios by considering multiple states-of-nature scenarii impacting various risk factors. Traditionally, three kinds of approaches are used: standard scenarii, historical scenarii, and worst-case scenarii. This type of stress testing is probably the simplest and therefore suffers from the limitations related to over-simplicity. Credit being the fundamental activity of a bank, credit risk stress testing is much wider and has even been integrated by financial institutions into the capital calculation formulas through the loss given default (LGD). But approaches should not be limited to capital calculations, as stress testing is also interesting for more traditional credit risk measures. Besides, counterparty credit exposure may be represented by the "current" exposure, the "expected" exposure, or the "expected positive" exposure; consequently, stressing the exposure distributions would naturally impact the measures based on them, for instance the credit value adjustment (CVA) via the expected exposure, or the expected loss via the expected positive exposure. By using extreme scenarii, operational risk measurements are naturally stressed. Besides, the methodologies implemented are expected by the regulator to be conservative, i.e., to provide risk measures larger than what empirical data or


traditional approaches would give to practitioners; however, these are not sufficient to provide an accurate representation of the risks over time, and alternative strategies need to be developed, such as those presented in the next section. Liquidity risk arises from situations in which an entity interested in trading an asset cannot do so because nobody wants to buy or sell it under the prevailing market conditions. Liquidity risk becomes particularly important to entities which currently hold an asset (or want to hold it) since it affects their capability to trade. Manifestations of liquidity risk are very different from those of a price dropping to zero. In the case of an asset's price falling to zero, the market is saying that the asset is worthless. However, if one bank cannot find a counterparty interested in trading the asset, this may only be a problem of market equilibrium, i.e., the participants have trouble finding each other. This is why liquidity risk is usually found to be higher in emerging or low-volume markets. Accordingly, liquidity risk has to be managed in addition to market, credit, and operational risks. Because of its tendency to compound other exposures, it is difficult or impossible to isolate liquidity risk. Some ALM techniques can be applied to assess liquidity risk. A simple test for liquidity risk is to look at future net cash flows on a day-by-day basis, where any day that has a sizeable negative net cash flow is of concern. Such an analysis can be supplemented with stress testing. In this section we partially discuss the "micro" liquidity risk, i.e., the liquidity of an asset, by opposition to the "macro" liquidity exposure of a financial institution, which is an aggregated measure assuming that it is included in market prices dropping up to a certain extent, which is captured in the market-risk measurement. The liquidity position of a financial institution is measured by the quantity of assets that can be sold immediately to face the liquidity requirements, even considering a haircut, while an asset on the market is illiquid if there is no demand and its price is actually equal to 0; consequently, the measure should be forward looking. As a result, and considering the previous statement, in the next section we focus on methodologies to measure and stress the solvency of financial institutions in relation to market, credit, and operational risks.

References

Addo, Peter Martey, Dominique Guegan, and Bertrand Hassani. 2018. "Credit risk analysis using machine and deep learning models". Risks 6, no. 2: 38.
BCBS. 2005. International convergence of capital measurement and capital standards: A revised framework. Basel: Bank for International Settlements.
–. 2006. International convergence of capital measurement and capital standards: A revised framework, comprehensive version. http://www.bis.org/publ/bcbs128.
–. 2014. Fundamental review of the trading book: A revised market risk framework. Basel: Bank for International Settlements.
–. 2015. Review of the Credit Valuation Adjustment (CVA) risk framework. Basel: Bank for International Settlements.
–. 2016a. Minimum capital requirements for market risk. Basel: Bank for International Settlements.
–. 2016b. Standardised measurement approach for operational risk. Basel: Bank for International Settlements. https://www.bis.org/bcbs/publ/d355.pdf.
–. 2017a. Stress testing principles. Basel: Bank for International Settlements.
–. 2017b. Basel III: Finalising post-crisis reforms. Basel: Bank for International Settlements.
Frühwirth, Manfred, and Leopold Sögner. 2006. "The Jarrow/Turnbull default risk model: Evidence from the German market". The European Journal of Finance 12, no. 2: 107–135.
Gregory, Jon. 2012. Counterparty credit risk and credit value adjustment, vol. 881, 882. Hoboken: Wiley.
IASB. 2018. IFRS 9 financial instruments. https://www.ifrs.org/issued-standards/list-of-standards/ifrs-9-financial-instruments/.
Tudela, Merxe, and Garry Young. 2005. "A Merton-Model approach to assessing the default risk of UK public companies". International Journal of Theoretical and Applied Finance 8, no. 06: 737–761.

Chapter 3

The Traditional Risk Measures

Traditional literature enumerates risk measures without any attempt to classify them. Nevertheless, several taxonomies can be used to distinguish between them. We choose to distinguish between measures of dispersion and downside risk measures (sometimes classified as safety risk measures, see, for example, Giacometti and Ortobelli 2004). This decomposition presents the advantage of integrating almost all of the existing risk measures. In the following, the space below is considered unless stated otherwise. Let (Ω, F, P) be a probability space with Ω a set of simple events, F a σ-algebra of subsets of Ω, and P a probability measure on F. Let X be a continuous random variable with probability density function f(x); f is an L² function, therefore the square of the function is Lebesgue integrable. F(x) denotes the related cumulative distribution function, and F⁻¹ its inverse, sometimes referred to as the quantile function. All generic random variables (r.v.) X used in the following are defined on that setting. When a sequence of r.v. X_1, . . . , X_n is considered, their respective values x_1, . . . , x_n belong to Ω.

3.1 Measures of Dispersion Probability-weighted dispersion constitutes the first way of dealing with risk measurement. In this case, those measures of dispersion could be divided into three classes. The first class groups together measures of the distance between some representative values. The second one gathers the measures obtained from deviation of each data from a reference point (also called symmetric measure of risk). The last class is made up of any measures obtained from the deviation of all the data among themselves (Yamane 1973, Bickel and Lehmann 2012).



3.1.1 Distance Between Representative Values

Three main measures constitute this group:
• In statistics, the range is simply the difference between the highest and lowest value taken by the variable under consideration, but it might have a more complex meaning (see below).
1. For n independent and identically distributed continuous random variables X_1, X_2, . . . , X_n with cumulative distribution function F(x) and probability density function f(x), let t denote the range of a sample of size n from a population with distribution function F(x). The range has cumulative distribution function (Gumbel 1947)

G(t) = n \int_{-\infty}^{\infty} f(x)\,[F(x + t) - F(x)]^{\,n-1}\, dx.   (3.1.1)

The mean range is given as follows (Hartley and David 1954):

n \int_{0}^{1} F^{-1}\,[F^{\,n-1} - (1 - F)^{\,n-1}]\, dF.   (3.1.2)

2. For n non-identically distributed independent continuous random variables X_1, X_2, . . . , X_n with cumulative distribution functions F_1(x), F_2(x), . . . , F_n(x) and probability density functions f_1(x), f_2(x), . . . , f_n(x), the range has cumulative distribution function (Tsimashenka et al. 2012)

G(t) = \sum_{i=1}^{n} \int_{-\infty}^{\infty} f_i(x) \prod_{j=1,\, j \neq i}^{n} [F_j(x + t) - F_j(x)]\, dx.   (3.1.3)

3. For n independent and identically distributed discrete random variables X_1, X_2, . . . , X_n with cumulative distribution function F(x) and probability mass function f(x), the range of the X_i is the range of a sample of size n from a population with distribution function F(x). The range has probability mass function (Evans et al. 2006; Burr 1955; Abdel-Aty 1954; Siotani 1956)

g(t) =
\begin{cases}
\sum_{x=1}^{N} [f(x)]^{n} & t = 0 \\
\sum_{x=1}^{N-t} \Big( [F(x+t) - F(x-1)]^{n} - [F(x+t) - F(x)]^{n} - [F(x+t-1) - F(x-1)]^{n} + [F(x+t-1) - F(x)]^{n} \Big) & t = 1, 2, 3, \ldots, N - 1.
\end{cases}   (3.1.4)


• The interquartile range is a trimmed estimator which gives the difference between the highest and lowest quartiles (i.e. between the 75th and 25th percentiles) and therefore contains one-half of the total population. It is a basic and widely used robust measure of scale. The previous definition requires the introduction of quantiles. These are cut points dividing the range of a probability distribution into contiguous intervals with equal probabilities; consequently, quartiles are the three cut points dividing the range of a probability distribution into four groups of identical size. Let q denote the number of contiguous intervals; the k-th quantile is the data value where the cumulative distribution function crosses k/q. That is, x is the k-th quantile for a random variable X if P[X < x] ≤ k/q (or, equivalently, P[X ≥ x] ≥ 1 − k/q) and P[X ≤ x] ≥ k/q (or, equivalently, P[X > x] ≤ 1 − k/q).
• The maximum loss corresponds to the largest loss obtained from a sample. Formally, and assuming that losses can be represented by a real-valued function f defined on a domain X, this function has a global maximum point at χ if f(χ) ≥ f(x) for all x in X. Similarly, the function has a global minimum point at χ if f(χ) ≤ f(x) for all x in X. If the domain X is a metric space, then f is said to have a local maximum point at χ if there exists some ε > 0 such that f(χ) ≥ f(x) for all x in X within distance ε of χ. Similarly, the function has a local minimum point at χ if f(χ) ≤ f(x) for all x in X within distance ε of χ.
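As a numerical complement to item 1 above, the following minimal sketch compares the closed-form range distribution (3.1.1) and mean range (3.1.2) with a Monte Carlo estimate, assuming i.i.d. standard normal observations and the availability of NumPy and SciPy. The function names (range_cdf, mean_range) and the chosen sample size are ours, purely for illustration.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def range_cdf(t, n, dist=norm):
    """G(t) from (3.1.1) for n i.i.d. draws from `dist` (illustrative helper)."""
    if t <= 0:
        return 0.0
    integrand = lambda x: dist.pdf(x) * (dist.cdf(x + t) - dist.cdf(x)) ** (n - 1)
    value, _ = quad(integrand, -np.inf, np.inf)
    return n * value

def mean_range(n, dist=norm, eps=1e-12):
    """Mean range from (3.1.2), written with the change of variable u = F(x)."""
    integrand = lambda u: dist.ppf(u) * (u ** (n - 1) - (1.0 - u) ** (n - 1))
    value, _ = quad(integrand, eps, 1.0 - eps)   # trim endpoints where F^{-1} diverges
    return n * value

n = 10
rng = np.random.default_rng(0)
sample_ranges = np.ptp(rng.standard_normal((100_000, n)), axis=1)   # max - min per sample

for t in (2.0, 3.0, 4.0):
    print(t, range_cdf(t, n), np.mean(sample_ranges <= t))   # analytic vs empirical CDF
print(mean_range(n), sample_ranges.mean())                   # analytic vs empirical mean
```

The two columns of the comparison should agree up to Monte Carlo noise, which is a quick way to validate an implementation of (3.1.1)–(3.1.2) before using another parent distribution.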

Unfortunately, these measures give no information about the dispersion inside the range and, as a consequence, are of very limited use for risk management: they only provide an intuitive idea of the spread of the distribution.
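To fix ideas, the short sketch below computes the three measures just discussed on a simulated loss sample. The log-normal sample and all names are ours and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)   # toy loss sample

sample_range = losses.max() - losses.min()                 # range
q25, q75 = np.quantile(losses, [0.25, 0.75])               # quartiles: k/q = 1/4 and 3/4
iqr = q75 - q25                                            # interquartile range
max_loss = losses.max()                                    # maximum loss

print(sample_range, iqr, max_loss)
```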

3.1.2 Deviation from a Central Value

By construction, the sum of all the deviations from the mean is equal to zero. In order to avoid the negative and positive deviations offsetting each other, two alternative approaches may be considered. The first is to sum the squared deviations. The second consists in summing the absolute deviations from the mean.
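A minimal numerical check of this remark, on an arbitrary sample of our own choosing:

```python
import numpy as np

x = np.array([3.0, -1.0, 4.0, 1.0, 5.0, -9.0])
dev = x - x.mean()
print(dev.sum())           # ~0: deviations from the mean cancel out
print((dev ** 2).sum())    # sum of squared deviations (the route leading to the variance)
print(np.abs(dev).sum())   # sum of absolute deviations (the absolute-deviation route)
```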

3.1.2.1 The Variance

The variance and its square root, i.e., the standard deviation, constitute the most widely employed measures. The variance is defined as the expected value of the squared deviations of the data values from the mean, and thus simply measures the dispersion of the estimates around their mean value. Let X be a random variable defined on the probability space previously introduced; then the expected value of X, denoted by E[X], is defined as the Lebesgue integral

$$ E[X] = \int_{\Omega} X(\omega)\, dP(\omega). \qquad (3.1.5) $$

In our case, the expected value corresponds to the mean. Formally, the variance of a random variable X is the expected value of the squared deviation from its mean μ = E[X]:

$$ \operatorname{Var}(X) = E\!\left[(X - \mu)^2\right]. \qquad (3.1.6) $$

The variance is also the second central moment, or second cumulant, of the probability distribution that generates X. It is typically denoted Var(X), σ_X^2 or σ^2. The expression for the variance can be expanded as follows:

$$ \begin{aligned} \operatorname{Var}(X) &= E\!\left[(X - E[X])^2\right] \\ &= E\!\left[X^2 - 2X\,E[X] + E[X]^2\right] \\ &= E[X^2] - 2\,E[X]\,E[X] + E[X]^2 \\ &= E[X^2] - E[X]^2. \end{aligned} \qquad (3.1.7) $$
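The identity (3.1.7) is easy to verify empirically. The sketch below, with an exponential sample and names chosen by us for illustration only, computes both sides on simulated data.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)   # theoretical variance: scale^2 = 4

central = np.mean((x - x.mean()) ** 2)           # E[(X - mu)^2]
shortcut = np.mean(x ** 2) - x.mean() ** 2       # E[X^2] - E[X]^2
print(central, shortcut, np.var(x))              # all three coincide
```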

If the random variable X follows a continuous distribution with probability density function f(x), then the variance of X is given by

$$ \begin{aligned} \operatorname{Var}(X) = \sigma^2 &= \int (x - \mu)^2 f(x)\, dx \\ &= \int x^2 f(x)\, dx - 2\mu \int x f(x)\, dx + \int \mu^2 f(x)\, dx \\ &= \int x^2 f(x)\, dx - \mu^2, \end{aligned} \qquad (3.1.8) $$

where μ is the expected value of X, given by

$$ \mu = \int x f(x)\, dx, \qquad (3.1.9) $$

and where the integrals are taken over the range of X.
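Equations (3.1.8)–(3.1.9) can be evaluated by numerical integration for any density. The sketch below uses a Gamma(3, 2) density chosen purely for illustration (true mean 6, true variance 12); it assumes SciPy is available and the names are ours.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma

pdf = lambda x: gamma.pdf(x, a=3.0, scale=2.0)                         # illustrative density f(x)

mu, _ = quad(lambda x: x * pdf(x), 0.0, np.inf)                        # (3.1.9)
var_central, _ = quad(lambda x: (x - mu) ** 2 * pdf(x), 0.0, np.inf)   # (3.1.8), first line
second_moment, _ = quad(lambda x: x ** 2 * pdf(x), 0.0, np.inf)
var_shortcut = second_moment - mu ** 2                                 # (3.1.8), last line
print(mu, var_central, var_shortcut)                                   # ~6.0, ~12.0, ~12.0
```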


If X is a discrete random variable taking the value x_i with probability p_i, i = 1, . . . , n, then

$$ \operatorname{Var}(X) = \sum_{i=1}^{n} p_i\,(x_i - \mu)^2, \qquad (3.1.10) $$

or equivalently

$$ \operatorname{Var}(X) = \left(\sum_{i=1}^{n} p_i x_i^2\right) - \mu^2, \qquad (3.1.11) $$

where μ is the average value, i.e.,

$$ \mu = \sum_{i=1}^{n} p_i x_i. \qquad (3.1.12) $$
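A direct transcription of (3.1.10)–(3.1.12) for a small hand-made discrete distribution; the values and probabilities below are ours, for illustration only.

```python
import numpy as np

x = np.array([-10.0, 0.0, 5.0, 20.0])    # values x_i
p = np.array([0.1, 0.5, 0.3, 0.1])       # probabilities p_i, summing to one

mu = np.sum(p * x)                       # (3.1.12)
var = np.sum(p * (x - mu) ** 2)          # (3.1.10)
var_alt = np.sum(p * x ** 2) - mu ** 2   # (3.1.11)
print(mu, var, var_alt)                  # the two variance formulas agree
```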

The variance of a set of n equiprobable values can be written as

$$ \operatorname{Var}(X) = \frac{1}{n} \sum_{i=1}^{n} (x_i - \mu)^2, \qquad (3.1.13) $$

where μ is the expected value, i.e.,

$$ \mu = \frac{1}{n} \sum_{i=1}^{n} x_i. \qquad (3.1.14) $$

The variance of a set of n equiprobable values can also be expressed in terms of the squared deviations of all points from each other:

$$ \operatorname{Var}(X) = \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} \frac{1}{2}(x_i - x_j)^2 = \frac{1}{n^2} \sum_{i} \sum_{j>i} (x_i - x_j)^2. \qquad (3.1.15) $$
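The less familiar pairwise form (3.1.15) can be checked against the usual definition (3.1.13) on any equiprobable sample; the values below are arbitrary and ours.

```python
import numpy as np

x = np.array([2.0, 7.0, 1.0, 9.0, 4.0])
n = len(x)

var_central = np.mean((x - x.mean()) ** 2)                         # (3.1.13)

diffs = x[:, None] - x[None, :]                                    # matrix of x_i - x_j
var_double = np.sum(0.5 * diffs ** 2) / n ** 2                     # (3.1.15), full double sum
var_upper = np.sum(diffs[np.triu_indices(n, k=1)] ** 2) / n ** 2   # (3.1.15), j > i terms only
print(var_central, var_double, var_upper)                          # identical
```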

The basic properties of the variance are the following:

$$ \operatorname{Var}(X) \geq 0, \qquad (3.1.16) $$

$$ P(X = a) = 1 \iff \operatorname{Var}(X) = 0, \qquad (3.1.17) $$

$$ \operatorname{Var}(X + a) = \operatorname{Var}(X), \qquad (3.1.18) $$

$$ \operatorname{Var}(aX) = a^2 \operatorname{Var}(X). \qquad (3.1.19) $$


$$ \operatorname{Var}(aX + bY) = a^2 \operatorname{Var}(X) + b^2 \operatorname{Var}(Y) + 2ab \operatorname{Cov}(X, Y), \qquad (3.1.20) $$

$$ \operatorname{Var}(aX - bY) = a^2 \operatorname{Var}(X) + b^2 \operatorname{Var}(Y) - 2ab \operatorname{Cov}(X, Y), \qquad (3.1.21) $$

where Cov(X, Y) is the covariance between X and Y. In general, for the sum of N random variables {X_1, . . . , X_N} we have

$$ \operatorname{Var}\!\left(\sum_{i=1}^{N} X_i\right) = \sum_{i,j=1}^{N} \operatorname{Cov}(X_i, X_j) = \sum_{i=1}^{N} \operatorname{Var}(X_i) + \sum_{i \neq j} \operatorname{Cov}(X_i, X_j). \qquad (3.1.22) $$

These results lead to the variance of a linear combination:

$$ \begin{aligned} \operatorname{Var}\!\left(\sum_{i=1}^{N} a_i X_i\right) &= \sum_{i,j=1}^{N} a_i a_j \operatorname{Cov}(X_i, X_j) \\ &= \sum_{i=1}^{N} a_i^2 \operatorname{Var}(X_i) + \sum_{i \neq j} a_i a_j \operatorname{Cov}(X_i, X_j) \\ &= \sum_{i=1}^{N} a_i^2 \operatorname{Var}(X_i) + 2 \sum_{1 \leq i < j \leq N} a_i a_j \operatorname{Cov}(X_i, X_j). \end{aligned} \qquad (3.1.23) $$
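In matrix form, (3.1.23) is the quadratic form a'Σa, where Σ is the covariance matrix of (X_1, . . . , X_N); the two-variable properties (3.1.20)–(3.1.21) are the special case N = 2. The toy three-variable sketch below, whose weights and covariance matrix are ours and purely illustrative, compares the formula with a Monte Carlo estimate.

```python
import numpy as np

a = np.array([0.5, 0.3, 0.2])          # weights a_i
Sigma = np.array([[0.04, 0.01, 0.00],  # covariance matrix Cov(X_i, X_j)
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])

var_formula = a @ Sigma @ a            # sum_{i,j} a_i a_j Cov(X_i, X_j), eq. (3.1.23)

rng = np.random.default_rng(3)
X = rng.multivariate_normal(np.zeros(3), Sigma, size=500_000)
var_mc = np.var(X @ a)                 # simulated variance of the linear combination
print(var_formula, var_mc)             # agree up to sampling error
```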

. . . > 0, E[(dQ/dP) log(dQ/dP)] is the relative entropy of Q