July-August 2013


A lack of faith in standard risk measures is pushing portfolio managers to look externally for benchmarking of their internal risk management. Nicholas Pratt investigates.

It was inevitable that, following the Lehman Brothers default and the consequent market chaos, standard risk management practices would suffer a crisis of credibility. Value at Risk (VaR), the most widely used risk measure, which estimates the loss a portfolio is unlikely to exceed on a given day at a given confidence level, came in for most criticism. The criticism was most succinctly put by hedge fund trader David Einhorn, who compared VaR with “an airbag that works fine except when you have a car accident”.

Since then, the credibility of risk management has continued to suffer through a string of incidents: the Madoff case highlighted poor levels of due diligence; the Libor scandal suggested a weakness in the supposed market standards for rates; and the $2 billion loss (€1.6 billion) suffered by JP Morgan in 2012 as a result of a VaR miscalculation showed that even the most highly rated financial institutions can make costly mistakes within their own risk departments.

As Andries Terblanche, former head of financial services at KPMG Australia and now adjunct professor in risk and actuarial studies at the University of New South Wales, says: “Many a brilliant mind endowed us with maths and data processing techniques to use as the prognosis for making capital allocation decisions. So why did we end up here? Why have we lost confidence in techniques that were previously widely accepted?”

One cause, says Terblanche, is the historical data basis for modern risk theory: the decades between 1950 and 1971 produced an unprecedented period in terms of the absence of macroeconomic shocks, allowing risk theories to reach a high-water mark in confidence levels.
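The VaR measure described above can be sketched as a minimal historical-simulation calculation. This is an illustrative example only, not any firm's production methodology; the return series and confidence level are invented:

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-day Value at Risk by historical simulation.

    Returns the loss threshold that daily losses are expected
    to exceed only (1 - confidence) of the time, based purely
    on the empirical distribution of past returns.
    """
    # Losses are negated returns; VaR is the high quantile of losses.
    losses = -np.asarray(returns)
    return np.quantile(losses, confidence)

# Hypothetical daily portfolio returns (roughly two years of history).
rng = np.random.default_rng(0)
returns = rng.normal(loc=0.0005, scale=0.01, size=500)

var_99 = historical_var(returns, confidence=0.99)
print(f"1-day 99% VaR: {var_99:.2%} of portfolio value")
```

Note that the estimate is only as good as the history fed into it, which is precisely the backward-looking dependence Terblanche warns about in what follows.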
It resulted in unrealistically high expectations of VaR when applied in a macroeconomic environment with vastly different characteristics, as is the case today, and has left a gap in risk management and prognostic tools that requires urgent attention.

“If someone is still using VaR as an absolute measure of risk exposure, I would have grave reservations,” says Terblanche. “Our economic future is so vastly different to our economic past – the complexity of the derivatives market, the levels of globalisation, the extent of peacetime fiscal debts, to name a few. None of these evolving issues has fully played out, let alone in historical data, so great care must be exercised in using any technique that draws on the past to determine future risk scenarios.”

Another limitation is the fact that VaR is essentially a market risk measure, designed to forecast the maximum a portfolio manager might lose in one day should the market move against them. But, says Fred Ponzo, managing partner at consultant GreySpark Partners, there are counterparty and liquidity risks that can either mitigate or compound those market risks. “It is not about exaggerating risk so that investors become much more cautious. It is about creating some assurance and ensuring that you do not take too much risk when you do not have the full picture. It is about having the right view of risk. The main problem with VaR is that people thought they were safe when they were not,” says Ponzo.

DEVIL IS IN THE DETAIL
For portfolio managers looking to actively hedge, a more detailed measure of risk is needed and they will look beyond VaR, says Rohan Douglas, chief executive at risk analytics firm Quantifi. Now a similar level of detail is demanded at the firm-wide level, from stress testing to scenario analysis and incremental improvements on VaR that employ expected shortfall and extreme value theory. “Any time you try to summarise risk in a portfolio to a single number, you will be missing a lot of detail. So there has been a step away from that single-number approach to something that is more detailed in terms of the underlying risks.”

There has also been a step away from relying purely on internally generated numbers and, instead, a demand for externally provided numbers to act as an independent benchmark, especially in the hedge fund world. The Alternative Investment Fund Managers Directive in the EU requires separation of the risk process from the investment process. And there is the Open Protocol initiative, under which hedge funds voluntarily provide risk information in order to develop standardised reporting procedures for collecting, collating and conveying hedge fund risk data. “Internally generated numbers are no longer enough,” says Jim Ramenda, senior vice president, enterprise risk, at risk software vendor SS&C.

“We have moved into the regime of the trusted advisor,” says Ponzo. “It is not that there is a distrust of internal risk management, but there is a wish to benchmark everything that is coming out of the risk department.”

There is also a lot of focus on computing power and a shift from sub-optimal internal models to large, powerful, external platforms. The internal systems of four years ago did not anticipate the range and granularity of risk requirements demanded today, says Ramenda, and many are simply not up to it. By contrast, a number of risk systems vendors have spotted an opportunity to brand themselves as full-service risk providers and fill a void as the industry moves from a reliance on internal risk models to an independent, industry-wide standard of risk measures.
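The expected shortfall refinement mentioned above can be illustrated alongside VaR: a minimal sketch on invented return data, not any vendor's methodology. Where VaR only marks the edge of the tail, expected shortfall averages the losses beyond that edge, so it reflects how bad the tail actually is:

```python
import numpy as np

def var_and_expected_shortfall(returns, confidence=0.99):
    """Historical one-day VaR and expected shortfall (ES).

    VaR is the loss quantile at the given confidence level;
    ES is the average of the losses at or beyond that quantile,
    capturing the severity of the tail rather than just its edge.
    """
    losses = -np.asarray(returns)
    var = np.quantile(losses, confidence)
    es = losses[losses >= var].mean()
    return var, es

# Two invented return histories: one thin-tailed,
# one mostly calm but with rare large crashes.
rng = np.random.default_rng(1)
calm = rng.normal(0.0, 0.01, 2000)
crashy = np.concatenate([rng.normal(0.0, 0.008, 1980),
                         rng.uniform(-0.15, -0.05, 20)])

for name, r in [("calm", calm), ("crashy", crashy)]:
    var, es = var_and_expected_shortfall(r)
    print(f"{name:>6}: 99% VaR {var:.2%}, 99% ES {es:.2%}")
```

On the crashy series the expected shortfall sits far beyond the VaR threshold: exactly the tail detail that a single VaR figure hides.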
BlackRock and Thomson Reuters have formed a partnership to provide fixed income analytics for institutional investment managers to use in their risk management. “The proposition is that the analytics feed into their existing processes and allow practitioners to focus on value-added activities rather than dealing with data and system challenges,” says Dennis Kirincich, managing director at BlackRock Solutions. “It provides more transparency into underlying risk exposures and can act as an external benchmarking and model validation service where clients continue to use in-house solutions.

“Investors are looking for more yield and are taking on more risk by investing in broader asset classes across global markets, so it is even more important today to get good-quality analytics. The institutional investors that we are targeting will be able to use our analytics and save on the work and expense associated with building in-house, quality-controlled risk analytics solutions.”

The partial outsourcing of risk may hold some concerns, though. Does it signal a change in mindset, away from the use of risk as a competitive advantage to a world where the over-riding concern is not to make a mistake? And is there a danger that, if everyone is working from the same standard set of risk measures, a risk monoculture will develop?

For his part, Ramenda says continued investor scrutiny will drive further standardisation in risk reporting. “The best and logical outcome would be to have some kind of independent risk audit, similar to the international accounting rules that exist for financial reporting.” However, we are a long way from this point, and the use of external risk management services will not necessarily lead to a wholesale exodus of internal risk managers. “They may use our infrastructure and a pricing model that matches the market, because there is no competitive advantage there,” says Douglas. “But it is important that they take on the running of that model, setting up the scenarios and deciding on the assumptions.”

CONCERNS
And while there is a distinction between internal models and external platforms, it is not about moving from one to the other but about combining the two, says Ponzo. It is about leaving the heavy lifting and number crunching to the large platforms so that the numbers fed into the internal models are correct in the first instance.

If there is one concern, it is that the notion of “correct” numbers will become yet another way for a false sense of confidence in risk to manifest, setting a path for another market crisis. As the celebrated statistician George E P Box once said, “all models are wrong, but some are useful”. That could prove true whether the models are internally or externally developed.

©2013 funds europe
