The recent financial meltdown has left many risk managers questioning the very essence of risk modelling, used since the 1990s to measure their firm’s financial risk. Investment firms have traditionally relied on fantastically complex mathematical models to measure the risk in their various portfolios, primarily to reassure investors that all is well. However, the worldwide events of the past 18 months have, arguably, left the reputation of risk modelling in tatters. Here, Xavier Bellouard, co-founder of Quartet FS, explains what the future holds for risk modelling and how the use of Value at Risk (VaR) in particular needs to evolve to enable more accurate risk management.
The painful lessons of recent times have shown that the risks taken by the largest banks and investment firms in much of the Western world were so excessive and foolhardy that they threatened to bring down the financial system itself. The system had relied on many mathematical models, but by far the most widely used is VaR. Built around statistical ideas and probability theories that have been around for centuries, VaR was developed and popularised in the early 1990s by a handful of scientists and mathematicians, or “quants”. VaR’s great appeal is that it expresses risk as a single number, a pound figure, no less. Plus, it is the only commonly used risk measure that can be applied to just about any asset class.
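That single number is, at heart, just a quantile of a profit-and-loss distribution. As a rough illustration only, here is a minimal historical-simulation sketch in Python; the function name and the daily P&L figures are invented for the example, not drawn from any real system.

```python
import math

def historical_var(pnl_history, confidence=0.99):
    """One-day VaR by historical simulation: the loss level exceeded on
    only (1 - confidence) of the observed days, as a positive number."""
    losses = sorted(-p for p in pnl_history)        # losses, ascending
    idx = math.ceil(confidence * len(losses)) - 1   # index of the quantile
    return losses[idx]

# Twenty illustrative daily P&L observations, in pounds (made up).
pnl = [120, -80, 45, -200, 60, -150, 30, -95, 10, -310,
       75, -40, 55, -180, 90, -60, 25, -130, 15, -220]

print(historical_var(pnl, confidence=0.95))  # the 95% one-day VaR
```

The same quantile idea underlies the parametric and Monte Carlo variants of VaR; only the way the P&L distribution is constructed differs.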
However, many would argue that the recent financial crisis was ultimately a crisis of modern metrics-based risk management. While risk modelling as we know it undoubtedly needs to evolve to cope better with market risk, one could argue that it is more banks’ and traders’ approach to, and use of, VaR that needs to change, rather than the actual model itself.
Historically, investment firms used VaR primarily as a reporting tool to keep regulators and shareholders happy, providing ‘after the fact’ analysis. Today, however, VaR must be viewed as an operational, rather than a reporting, metric. Firstly, VaR needs to be broken down and analysed by traders. Instead of relying on a single number, traders need to look beyond the top line, delve into the complex mathematical calculations and gain a better understanding of the type of risk they are taking and how it can best be mitigated. In doing so, they will turn VaR into a valuable management tool, alongside other factors such as Profit and Loss.
Secondly, as well as better analysis, traders need to receive VaR calculations in a timely manner. When VaR is used simply as a reporting tool, receiving the calculations within 24 or 48 hours is adequate. However, if traders are to act on the information a VaR calculation provides, they need it much more quickly. While real-time VaR might not be necessary, delivery within the trading day is crucial.
Of course, many of these considerations are currently being mandated by the regulators, and the emerging regulations will require not only a change of mindset but also a review of banks’ systems. As it stands, many financial institutions simply do not have the right technology in place to deliver in-depth VaR analysis in a timely fashion.
This is particularly true when you consider Marginal VaR. Marginal VaR is important because it analyses the impact, in terms of risk, of a particular asset class or country, for example, on the business. The Marginal VaR of a position with respect to a portfolio can therefore be thought of as the amount of risk that the position is adding to the portfolio. Calculating Marginal VaR is enormously beneficial as it allows traders to understand where the largest risk sits within the business. However, few traders use it at present, as it is challenging to compute Marginal VaR rapidly without the right technology in place.
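Reading Marginal VaR as “the risk a position adds” suggests a simple before-and-after computation: portfolio VaR with the position minus portfolio VaR without it. (Strictly, practitioners often call this difference incremental VaR, with marginal VaR defined as a derivative; the two behave similarly for small positions.) A sketch under those assumptions, with made-up P&L histories:

```python
import math

def historical_var(pnl_history, confidence=0.99):
    """One-day historical-simulation VaR, as a positive loss number."""
    losses = sorted(-p for p in pnl_history)
    return losses[math.ceil(confidence * len(losses)) - 1]

def var_added(position_pnl, rest_pnl, confidence=0.99):
    """Risk the position adds: VaR of the full portfolio minus VaR of
    the portfolio without the position."""
    combined = [a + b for a, b in zip(position_pnl, rest_pnl)]
    return historical_var(combined, confidence) - historical_var(rest_pnl, confidence)

# Illustrative figures: the rest of the book, and a position that tends
# to make money on the book's bad days (i.e. a hedge).
rest  = [50, -60, 40, -80, 30, -70, 20, -90, 10, -100]
hedge = [-30, 40, -25, 55, -20, 45, -15, 60, -5, 70]

print(var_added(hedge, rest, confidence=0.9))  # negative: the hedge reduces VaR
```

A negative value here means the position is reducing the book’s risk; scanning this number across desks, asset classes or countries is exactly the “where is the largest risk sitting” question the article describes, and is why it is expensive to compute quickly at scale.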
As investment firms get to grips with the new regulatory requirements, however, it is clear that a technology refresh will be on the cards. At the end of the day, the widespread institutional reliance on VaR is only a gamble if traders do not have the right technology solutions in place to help them analyse and break down VaR in near real-time.

Most of Europe’s biggest banks comfortably passed the Committee of European Banking Supervisors’ recent stress tests, and the sector has breathed a visible sigh of relief. However, plenty of uncertainty remains over banks that either squeaked by with just enough capital or passed but did not fully disclose the data that went into their calculations. The spectre of counterparty risk, last seen in dramatic form in the wake of the Lehman collapse, is not far from regulators’ and investors’ minds amid the continuing eurozone sovereign debt crisis. So, with regulation and stress tests to one side, what else can financial institutions do to convince the market that they are appropriately managing credit risk, and thereby encourage more lending back into the market?
Risk is an acceptable, even a welcome, part of trading. Every trade comes with some risk attached; it is just a case of knowing what the potential problems are and being comfortable with them. But as the financial crisis showed, pinpointing where the risk lies is not always so straightforward: even big institutions did not always understand the full extent of their exposures. However, better understanding and management of risk, especially counterparty risk, will be key to the ongoing financial recovery and to re-injecting confidence into the interbank lending market.
A large part of better managing credit risk will come from giving risk managers increased access to data, so that they can gain a thorough view of counterparty risk exposures and calculations. The evolving regulatory landscape, and the greater need to monitor, measure and report, mean that understanding and realising these risk exposures in real-time has become crucial to a financial institution’s ongoing stability and success. Risk managers need to be able to provide aggregated figures quickly and accurately in order to truly establish the position the business finds itself in.
However, with a multitude of other risk facets to consider, including liquidity, market and operational risk, many risk managers are struggling to adequately analyse and establish their positions. Nowhere is this truer than in the front office, where, 21 months after the collapse of Lehman Brothers, financial institutions are still struggling to properly calculate counterparty risk. Due to its interconnected nature, counterparty risk analysis must not only take into account various data, including VaR, P&L and sensitivities, but must also integrate a number of asset classes and draw together an understanding of their collective impact.
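The aggregation step itself is conceptually simple once the data from the different asset-class silos is in one place. As a toy sketch only, assuming trades arrive as records with invented counterparty names, asset classes and mark-to-market values, netting current exposure per counterparty might look like this:

```python
from collections import defaultdict

# Illustrative trade records -- the counterparty names, asset classes
# and mark-to-market values are made up for the sketch.
trades = [
    {"counterparty": "Bank A", "asset_class": "rates",  "mtm": 1_200_000},
    {"counterparty": "Bank A", "asset_class": "credit", "mtm": -400_000},
    {"counterparty": "Bank B", "asset_class": "fx",     "mtm": 650_000},
    {"counterparty": "Bank A", "asset_class": "fx",     "mtm": 300_000},
]

def net_exposure_by_counterparty(trades):
    """Net current exposure per counterparty: sum of mark-to-market across
    asset classes, floored at zero (a negative net MTM is owed, not at risk)."""
    net = defaultdict(float)
    for trade in trades:
        net[trade["counterparty"]] += trade["mtm"]
    return {cp: max(value, 0.0) for cp, value in net.items()}

print(net_exposure_by_counterparty(trades))
```

Real systems net per legal netting agreement rather than per counterparty name, and layer potential future exposure on top of current mark-to-market; the hard part the article points to is not the arithmetic but feeding such a calculation across silos in real-time.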
Once counterparty risk is accurately established, risk managers need better access to, and analysis of, this data so they can more effectively bridge the gap between ‘quants’ and decision makers. For example, business decision makers need information that goes beyond a single indicator or figure, so that they can fit credit risk into the broader perspective for the senior management team and board.
Banks’ traditionally siloed approach, however, has meant that to date such a scenario is more of a pipe dream than a reality. Yet with the right technology solutions, it is now possible to aggregate high volumes of data from multiple streams to produce both snapshots and the ability to drill down into the data in real-time.
Whilst technology is by no means a panacea for all the issues associated with risk management, and indeed counterparty risk, if implemented and deployed appropriately it can provide detailed analysis and ensure that decisions that need to be made quickly are not only well informed but also support the business.
In short, the quicker a firm can realise its true counterparty positions and potential exposures in the context of the overall risk picture, the greater its overall market competitiveness and confidence will be. Of course, implementing this kind of change through a technology refresh will take time. In the short term, it will be the interbank lending markets that hold the answer as to whether confidence is returning to the European banks. In the longer term, however, understanding where a financial institution’s risk lies, and what would happen in a worst-case scenario, through better access to, understanding of, and analysis of data will reduce the fear of counterparty risk that is currently stifling the market’s fledgling rebuilding of confidence.
This article can be read at http://www.dofonline.co.uk/content/view/4787/115/