Comment
NBER Macroeconomics Annual, 2012; University of Chicago Press; Volume 26, Issue 1
DOI: 10.1086/664002
ISSN: 1537-2642
Hyun Song Shin
Princeton University and NBER

The proposals in this paper are path-breaking, but one of their many advantages is that they are also practical. Most of the systemic financial institutions that are to bear the brunt of the reporting requirements already collect the necessary information. Estimating the sensitivity (the "delta") to key risk factors (such as commercial and residential property prices, interest rates, credit spreads, and so on) is a staple of the risk management functions of large banks and other financial institutions.

However, private sector risk management practices failed comprehensively in the run-up to the recent global financial crisis, and so one natural concern is that the proposals in the current paper would fall short in the same way. The answer to this concern is that although the measurement exercise bears a superficial resemblance to existing private sector risk management practices, the information will be put to a very different use, pursuing very different objectives and based on a very different philosophy.

Take the paper's proposal to construct the Liquidity Mismatch Index (LMI). The index builds on a measure of the discrepancy in realizable value between an individual financial intermediary's assets and liabilities, but the objective is to aggregate the information across firms to come up with an overall picture of the extent of maturity transformation in the financial system, and of how such maturity transformation interweaves with other vulnerabilities.

The perspective here is very different from the thinking in a private sector institution.
There, the objective is to lay off one's risks to others by hedging and to remain agile and flexible so that the institution can "cut and run" (that is, flee from exposures) when risks begin to materialize. Although cutting and running, letting the devil take the hindmost, is a feasible strategy for an individual institution, the system as a whole cannot do it. There is a limit to how much risk can be shed when viewed in the aggregate. A thought experiment due to Hellwig (1995) expresses the limits well:

[C]onsider an institution that finances itself by issuing fixed-interest securities with a maturity of n months and that invests in fixed-interest rate securities with a maturity of n + 1 months. On the face of it, maturity transformation is small, and interest risk exposure is minimal. Suppose, however, that we have 479 such institutions. These institutions may be transforming a one-month deposit into a forty year fixed interest rate mortgage, with significant interest rate risk exposure of the system as a whole. The interest rate risk exposure of the system as a whole is not visible to the individual institution unless it knows that it is but an element of a cascade and that credit risks in the cascade are correlated. (p. 730)

Cutting and running is not only a theoretical possibility. It happens in practice, especially among the most sophisticated institutions with the best risk management systems. Figure 1 plots two weighted average measures of Value-at-Risk for the (initially five, then down to two) Wall Street investment banks from 2001 to 2010, taken from Adrian and Shin (2008) and updated to include the crisis period.

Fig. 1. Value-at-Risk and leverage for Wall Street investment banks. Source: Adrian and Shin (2008, updated 2011).

Value-at-Risk (VaR) is a quantile measure that gives an approximate worst-case loss, in the sense that the probability of the realized loss exceeding the Value-at-Risk is smaller than some fixed, small probability set by the risk manager. The dotted line (unit VaR) is the Value-at-Risk per dollar of assets, while the solid line is the VaR normalized by equity. All series are measured in units of precrisis standard deviations.

The unit VaR series shoots up at the height of the crisis, reflecting the increase in measured risks such as the VIX index, credit and CDS (credit default swap) spreads, and so on. But the increased measured risks are met by the shedding of those risks, as can be seen in the sharp decline in leverage (the dashed line). VaR / Equity actually drops even as unit VaR explodes at the height of the crisis. Importantly, the decline in leverage is achieved not by the raising of new equity, but by the shedding of assets (Adrian and Shin 2008).

The shedding of assets by an intermediary means a contraction of lending to the customers of that intermediary. The prudent shedding of risks by the lenders to Bear Stearns or Lehman Brothers will feel like a run from the point of view of Bear Stearns or Lehman Brothers. It is this fallacy of composition that is the weakness of existing private sector risk management practices. A close cousin of private sector risk management practices is the microprudential approach to financial regulation that builds on such private sector "best practice."1 The strength of the proposal by Brunnermeier, Gorton, and Krishnamurthy is that their approach starts with an explicitly system-wide perspective.

However, it would be important not to underestimate the difficulty of the task the authors have set themselves.
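Since VaR is defined as a quantile of the loss distribution, the per-dollar measure can be sketched in a few lines of code. The following is a minimal illustration in Python on simulated daily returns; the return process and parameters are hypothetical, not the banks' actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily P&L per dollar of assets: 1,000 days of simulated returns
pnl = rng.normal(loc=0.0005, scale=0.01, size=1000)

# 99% Value-at-Risk: the loss threshold exceeded with probability at most 1%.
# Losses are negative P&L, so take the 1st percentile of P&L and flip the sign.
alpha = 0.01
unit_var = -np.quantile(pnl, alpha)

# By construction, roughly 1% of observed losses lie beyond the VaR estimate
exceedances = np.mean(-pnl > unit_var)
print(f"99% VaR per dollar of assets: {unit_var:.4f}")
print(f"fraction of days with loss beyond VaR: {exceedances:.3f}")
```

Unit VaR in figure 1 corresponds to this per-dollar measure; dividing the portfolio-level VaR by equity instead gives the VaR / Equity series.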
In line with the changed focus toward system-wide risks, much of the weight will be borne by the model that is imposed on the data. The exercise will only be as useful as the model that underpins the aggregation of the inputs.

To see the size of the task, consider how the authors propose to use the disclosed inputs. The individual deltas and liquidity mismatches will be aggregated taking account of intermediation chains and weeding out mutually inconsistent plans, so as to solve the mapping from the measured risks to the true realized risks. That is, the mapping envisaged is of the form:

  {Δ_{i,t}} = f({Δ̂_{i,t}}; s_{t−1}, θ_{i,t−1}, x_{t−1})   (1)

where Δ_{i,t} is the profile of deltas of firm i at date t, Δ̂_{i,t} denotes the deltas as measured and reported by firm i, s_{t−1} are shocks to the deltas, θ_{i,t−1} are firm i's characteristics, and x_{t−1} is the macro state. The objective is to derive the true deltas {Δ_{i,t}} from the individual measured deltas inside the brackets on the right-hand side of (1).

Getting the right mapping f(·) in (1) will determine the success or otherwise of such an exercise. This will be a new venture. Existing private sector risk management exercises do not rest on calculating (1), since their objective does not extend to incorporating system-wide externalities.

The challenge in constructing (and updating) the correct model that underpins the function f(·) will be formidable. Theoretical modeling will need to be employed much more than is customary in regulatory settings. The reason is that a purely empirical exercise relying on historical data will fail to capture the pent-up risks in the system.

Take the example of the period before the global financial crisis. Figures 2 and 3 plot the CDS spreads of Bear Stearns and Lehman Brothers, with figure 2 giving the long perspective and illustrating how the spreads increase sharply with the onset of the crisis. What is remarkable, however, is how tranquil the CDS measure is before the crisis. There is barely a ripple in the series in the period 2004–2006, when the vulnerability was building up fastest.
Figure 3, which plots the CDS series for the precrisis period of January 2004 to January 2007, shows that CDS spreads were actually falling over the period when the worst excesses were building up in the financial system.

Fig. 2. CDS spreads of Bear Stearns and Lehman Brothers (2004–2008)

Fig. 3. CDS spreads of Bear Stearns and Lehman Brothers (2004–2006)

The challenge for the proposals in Brunnermeier, Gorton, and Krishnamurthy is to come up with a theoretical construct (the theory that underpins the mapping in (1)) that will be able to flag impending problems while they are building up, rather than waiting for the signs of trouble to materialize. As the former BIS (Bank for International Settlements) head Andrew Crockett (2000) has put it:

The received wisdom is that risk increases in recessions and falls in booms. In contrast, it may be more helpful to think of risk as increasing during upswings, as financial imbalances build up, and materialising in recessions.

The authors are skeptical that balance sheet aggregates such as the liabilities of financial intermediaries will be able to capture the overall risks to the system, owing to the neglect of off-balance sheet exposures such as over-the-counter derivatives. However, it would be important to maintain the system-wide perspective rather than slipping back into the thinking of individual financial institutions. For individual banks and intermediaries, the off-balance sheet exposures are as important as the on-balance sheet ones, but from the perspective of the system as a whole, the off-balance sheet items are of interest only through their contributions to aggregate risks. If the bet between two institutions is zero-sum, then the net impact of this exposure will be less than if both held long positions in the same risky asset.
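The netting point can be made concrete with a toy calculation. The positions below are hypothetical, and the bookkeeping is a deliberately stripped-down sketch of delta aggregation:

```python
# Hypothetical exposures of two banks to a single risk factor (say, house
# prices), expressed as deltas: dollar change in value per unit move.

# Case 1: a zero-sum bet. Bank A is long a derivative on the factor and
# bank B has written the same contract, so their deltas cancel.
zero_sum = {"bank_A": +100.0, "bank_B": -100.0}

# Case 2: both banks hold long positions in the same risky asset.
same_side = {"bank_A": +100.0, "bank_B": +100.0}

def aggregate_delta(positions):
    """System-wide exposure is the signed sum of individual deltas:
    offsetting bets net out, while common exposures add up."""
    return sum(positions.values())

print(aggregate_delta(zero_sum))   # 0.0: no aggregate exposure
print(aggregate_delta(same_side))  # 200.0: risk concentrated on one side
```

A gross measure (summing absolute deltas) would treat the two cases identically; it is the signed aggregate that distinguishes a zero-sum bet between intermediaries from a common exposure of the system.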
To this extent, balance sheet aggregates may capture a good part of the overall risks to the system, and tracking such aggregates may be a good first step.

Another way to express this principle is to note that the behavior of individual institutions is the result of solving constrained optimization problems of the form:

  max expected payoff   subject to   VaR ≤ equity   (2)

When the environment is tranquil, more exposure can be taken on without breaching the risk constraint. The solution to the optimization problem in tranquil times will therefore result in larger balance sheets. As with any nontrivial constraint in an optimization problem, the risk constraint binds all the time. In tranquil times, the binding of the constraint means that the notional exposures have to be that much larger. Tracking aggregate notional exposures, which is what measuring balance sheet aggregates involves, will therefore be a useful first step in gauging the potentially highly nonlinear crisis dynamics around the corner.

Endnotes

For acknowledgments, sources of research support, and disclosure of the author's material financial relationships, if any, please see http://www.nber.org/chapters/c12414.ack.

1. Morris and Shin (2008) discuss the various dimensions of the fallacy of composition in financial regulation.

References

Adrian, Tobias, and Hyun Song Shin. 2008. "Procyclical Leverage and Value-at-Risk." FRBNY Staff Report 338. http://www.newyorkfed.org/research/staff_reports/sr338.html.

Crockett, Andrew. 2000. "Marrying the Micro- and Macro-Prudential Dimensions of Financial Stability." Bank for International Settlements. http://www.bis.org/review/rr000921b.pdf.

Hellwig, Martin. 1995. "Systemic Aspects of Risk Management in Banking and Finance." Schweizerische Zeitschrift für Volkswirtschaft und Statistik 131 (4/2): 723–37.

Morris, Stephen, and Hyun Song Shin. 2008. "Financial Regulation in a System Context." Brookings Papers on Economic Activity (Fall): 229–74.

NBER Macroeconomics Annual, Volume 26 (2011). Sponsored by the National Bureau of Economic Research (NBER). © 2012 by The National Bureau of Economic Research. All rights reserved.