In December 2013, the New York Jets played the Oakland Raiders at MetLife Stadium in New Jersey in temperatures close to freezing. Those of us at the game dressed for deep winter and watched the fourth quarter in the falling snow. Two weeks later, the Jets played their final home game of the season in the same stadium, in 71°F, with the crowd wearing t-shirts and tailgating like it was late summer.
The Super Bowl will kick off at MetLife Stadium on February 2, 2014, which presents a preparation problem for the two teams contesting the finale. Though it may seem far removed from risk management, this major American football event offers an interesting and timely analogy for data, tail events and the use of data calibration to predict and prepare for extremes, much as a risk management function does on Wall Street.
In the context of the Super Bowl, New York's weather statistics illustrate the challenge. The lowest February temperature ever recorded is -15°F, against an average low of 24°F. Snow makes the game all the more challenging: New York averages 22 inches over a winter season, the record is a huge 76 inches, set in 1995/96, and the snowiest single month was February 2010, with 36 inches. Some frozen food for thought for the head coaches and teams playing to win this year.
Risk management teams across the financial world get snowed under by similar issues every day. The relatively standard technique of time-weighting histories, giving greater weight to more recently observed data, may seem reasonable. However, it pushes extremely rare but highly damaging events further out of the analysis. Whether or not this is the right way to manage risk depends on what the risk management function is for. If the role is seen as lowering capital charges (by presenting the most favorable view of the risks being taken), then dampening the effect of rare extremes would be an expected component. If, however, the role is more integrated into the daily workings of the institution, then statistical methods are better used to determine the likely risks while simultaneously running extreme loss simulations to see how well the firm could withstand such events, in terms of both market losses and capital planning. Back on the field, this is like designing plays for extreme inclemency, which the head coach hopes never to need but must have as a contingency for the biggest game of the year.
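To make the data-weighting point concrete, here is a minimal sketch of exponentially time-weighted historical-simulation VaR. The `ewma_weights` helper, the decay factor of 0.94 and the synthetic returns are all illustrative assumptions, not a reference to any particular bank's methodology:

```python
import numpy as np

def ewma_weights(n, lam=0.94):
    """Exponential weights over n observations, oldest first; newest weighted most."""
    w = lam ** np.arange(n)[::-1]   # newest observation gets lam**0 = 1
    return w / w.sum()              # normalize so the weights sum to 1

def weighted_var(returns, lam=0.94, level=0.99):
    """Time-weighted historical-simulation VaR: sort returns worst-first and
    find where the cumulative weight in the tail reaches (1 - level)."""
    returns = np.asarray(returns)
    w = ewma_weights(len(returns), lam)
    order = np.argsort(returns)          # ascending: worst losses first
    cum = np.cumsum(w[order])
    idx = np.searchsorted(cum, 1.0 - level)
    return -returns[order[idx]]          # report the loss as a positive number

# Illustrative use on two years of synthetic daily returns
rng = np.random.default_rng(0)
var_99 = weighted_var(rng.normal(0.0, 0.01, 500))   # 99% one-day VaR estimate
```

With a decay of 0.94, an observation 250 trading days old carries roughly 0.94^250 ≈ 2e-7 of the newest observation's weight, which is exactly how a large loss from a year ago quietly disappears from the number.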
In the financial world, recent regulatory changes are forcing these approaches on the market. The two regulations of note are Stressed Value-at-Risk (VaR) and the Comprehensive Capital Analysis and Review (CCAR) stress tests. Stressed VaR is a calculation in which the data set used for scenario generation is specifically the worst 12-month period for the portfolio. The measure cannot really be hedged, since any attempt to do so simply shifts the data set to a different 12-month period. The idea is to show possible losses and have those drive an additional capital buffer. The CCAR stress tests, set by the Federal Reserve, require a series of systematic deteriorations in market conditions to be modelled over nine quarters, with the bank's books analyzed under those scenarios. Banks must also submit their capital plans, which are assessed against the stressed conditions. Banks, like football teams, either win or lose (in the eyes of the regulator), based on the reported results.
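The Stressed VaR window selection described above can be sketched as follows: scan every rolling 12-month (here, 252 trading-day) window of portfolio P&L and keep the one that produces the worst VaR. The function names, the synthetic history and the embedded shock are all illustrative assumptions:

```python
import numpy as np

def window_var(pnl, level=0.99):
    """Plain historical-simulation VaR of one window of daily P&L."""
    return -np.quantile(pnl, 1.0 - level)

def worst_12m_window(pnl, window=252, level=0.99):
    """Scan all rolling windows; return (start index, VaR) of the worst one."""
    vars_ = [window_var(pnl[i:i + window], level)
             for i in range(len(pnl) - window + 1)]
    worst = int(np.argmax(vars_))
    return worst, vars_[worst]

# Synthetic history with a stressed episode embedded around day 600
rng = np.random.default_rng(1)
history = rng.normal(0.0, 0.01, 1500)
history[600:700] -= 0.03                # the crisis the regulator wants captured
start, stressed_var = worst_12m_window(history)
```

Because the window is re-selected per portfolio, any hedge that improves the current worst window simply promotes a different 12-month period to "worst"; that is the anti-gaming property the measure relies on.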
This industry-wide regulatory expansion has been treated with a large degree of apprehension, and certainly some of the criticism is valid. It does, however, seem like solid best practice to work extremes into the risk management mix and to subject core bank planning to the results of that analysis. The key is not to eliminate risk but to embrace it and understand exactly which risks are being taken. That risk assessment is based on statistical analysis of the most recent data and the most likely events, on sensitivities to observable market risk factors, and on standardized scenarios, which are typically re-runs of large loss days from history (Black Monday, the Russian debt crisis, etc). It is less common to apply longer-running deterioration scenarios and to extend the analysis to internal capital planning.
It would be useful to test any strategic plan against long-term scenarios and worst-case analytics such as Stressed VaR. Again, this does not mean that such events and analysis should lead to 'no action' risk elimination, but it does mean that those in charge of the planning should be aware of what could be at stake in the worst case. It is for that reason that these regulatory changes should be embraced and adopted into the core risk management of financial institutions.
So, as we prepare to enjoy the potentially snowy Super Bowl, we should spare a thought for coaches faced with short-term temperature swings from above 60°F to below freezing, and season-to-season snowfall deviations of 50 inches. It is an unlikely but apt analogy, and one that highlights the value of playing a strategic risk management game: reassessing our own use of data and the risk analytics we run against it.
A version of this blog post was originally published on GARP.