
June 2010

Real-time Risk Intelligence
Leading Practices from Quartet FS


About Chartis Research
Chartis is the leading provider of research and analysis on the global market for risk technology. Our goal is to support enterprises as they drive business performance through better risk management, corporate governance and compliance. We help clients make informed technology and business decisions by providing in-depth analysis and actionable advice on virtually all aspects of risk technology.

This includes technology solutions for managing:

• Credit risk
• Operational risk and GRC
• Market risk
• ALM and liquidity risk
• Financial crime risk
• Insurance risk
• Regulatory requirements including Basel II and Solvency II

Chartis’s total focus on risk technology gives it a significant advantage over generic market analysts.

Chartis has brought together a leading team of analysts and advisors from the risk management and financial services industries. This team has hands-on experience of implementing and developing risk management systems and programmes for Fortune 500 firms and leading consulting firms.

www.chartis-research.com

© Copyright Chartis Research Ltd 2010. All Rights Reserved.

No part of this publication may be reproduced, adapted, stored in a retrieval system or transmitted in any form by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of Chartis Research Ltd.

The facts of this report are believed to be correct at the time of publication but cannot be guaranteed. Please note that the findings, conclusions and recommendations that Chartis Research delivers will be based on information gathered in good faith, whose accuracy we cannot guarantee. Chartis Research accepts no liability whatever for actions taken based on any information that may subsequently prove to be incorrect or errors in our analysis.

See Chartis “Terms of Use” on www.chartis-research.com

RiskTech100™ is a Registered Trade Mark of Chartis Research Limited (US Trade Mark Registration No. 3454398).


Table of Contents
1- Executive Summary
2- The need for improved risk intelligence
3- Recent technological developments
4- The Convergence of CEP and BI
5- Leading practices from Quartet FS
6- From theory to practice
7- Conclusion
8- Further Reading


List of Figures and Tables
Figure 1: Time Savings
Figure 2: Example Deployment
Figure 3: ActivePivot Architecture


1- Executive Summary
A focus on the delivery of pertinent information to the right people in a timely manner is a key part of financial institutions’ review of risk management policies and procedures. Firms that are the most complex in terms of size and of the instruments and asset classes they trade are striving for real-time risk intelligence. Many consider navigating the volatility of today’s markets without the ability to understand and manage risk intraday to be very dangerous. There is a desire amongst them for tools that permit close monitoring of risk positions at any time in live market conditions.

Conventional risk management technology architectures already employ generic business intelligence (BI) applications that are tailored to furnish financial institutions with risk intelligence and perform analysis. Most of these architectures are more than a decade old, and their embedded BI tools were not designed to aggregate enterprise-wide risk data, let alone to do so in real time.

Furthermore, technology vendors marketing integrated enterprise-wide solutions have so far failed to deliver on their promise of providing a real-time “single view of the world” when it comes to firms’ risk positions. Instead, financial institutions are seeking component solutions that can link, aggregate and visualize data from many systems.

Fortunately, technological advancements, including faster and cheaper infrastructures, in-memory data servers and complex event processing (CEP) are making real-time risk insight a reality. A new generation of specialized BI vendors is using in-memory technology and CEP to make sub-second processing possible for large data volumes. Financial institutions no longer have to wait hours for a risk report but rather can now have these reports available in seconds.

This new breed of BI tools is deployed as components, an approach proving to be quicker to implement and more flexible than standard packaged solutions. Firms are also seeking to reuse component tools for a number of different applications. For example, a firm might implement a BI solution for position keeping and then further customise it for real-time value at risk (VaR).

This paper investigates the factors driving the development of real-time BI tools. It discusses the improved online analytical processing (OLAP) technology and in-memory solutions that together make real-time risk intelligence possible. The paper features real-life examples of how real-time BI tools have been implemented successfully for different purposes.


2- The need for improved risk intelligence
Risk intelligence refers to the ability to make informed decisions about risks based on historical, current and future views of a business. It involves identification, extraction and analysis of internal and external data such as transactional, financial, operational, counterparty and market data.

Traditionally, financial institutions (FIs) approached the need for risk information like any other business information need, adopting business intelligence (BI) tools developed internally or purchased from generic BI vendors such as Business Objects and Cognos. Most of these solutions are based on technology architectures now over a decade old. At that time, the challenge for system architects was to conserve expensive memory and accommodate slow processing. Since then, technological improvements including faster processors, cheaper memory and zero-footprint web deployment have allowed vendors to improve their BI solutions.

Implementation has been a serious problem with traditional BI technologies used for risk management. Projects can take months, sometimes years, to complete and often morph into multi-million-dollar systems-integration initiatives. Ongoing frustrations with data quality, along with the work of accommodating business users’ and regulators’ requests for new reports, add to the IT resources needed to maintain systems. These issues push up the total cost of ownership, and managing those growing costs is itself an obstacle to a successful outcome.

The implementation problem was compounded further because many first- and second-generation risk application providers based their reporting and BI layers on the same traditional architectures. They also embedded third-party BI tools into their applications via original equipment manufacturer (OEM) agreements. For example, a bank might have its own standard BI tool for generic business intelligence/analytics and reporting (e.g. Business Objects). Its credit risk system would have a different BI tool (e.g. SAS) embedded in it, and its operational risk system would have still another BI tool embedded in it (e.g. Cognos). The operational risk function would likely have some in-house-developed tools as well as a number of Excel- or MS Access-based tools, resulting in fragmented BI practices.

Surveys conducted by Chartis Research between 2006 and 2009 show the average total implementation time for a risk management system to be 14 months, with five months to deploy the first usable analytic application. These surveys also revealed that, on average, 70% of project effort (time and resources) was spent on data management tasks, for example extract, transform and load (ETL), data quality and data mapping. These activities are often underestimated in terms of complexity and cost, which adds to overall implementation time. The surveys revealed a number of key lessons:

• Building a risk-data warehouse and positioning generic BI tools on top does not lead to better risk-based decision making. It just provides more technology. FIs should involve the end-users (i.e. decision makers) in the development process to ensure that the technology will deliver timely risk insight to the right place.

• Data reporting and visualization has been an afterthought in many risk technology projects. Too much focus has been given to the implementation of specific silo applications. The consequence is risk data aggregation challenges throughout the financial services industry.

• Risk technology vendors have consistently failed to deliver on their promise of an enterprise-wide risk management system integrating the spectrum of risk applications (i.e. credit risk, operational risk, market risk, ALM, liquidity risk) onto a single platform that delivers a “single version of the truth” across all business lines, asset classes and entities. Increasingly, FIs are seeking component tools to help them link, aggregate and visualize data from disparate risk and trading systems by providing an umbrella layer over all their internal and external risk applications. This component approach provides significantly more flexibility and greater control than the packaged solution approach.

• The traditional static and reactive online analytical processing (OLAP) functionality used by standard BI tools is not sufficient for monitoring risk within FIs’ fast-paced and dynamic environments. Many FIs now view real-time risk intelligence as a mission-critical capability, particularly within the trading and capital markets arena. The need for real-time risk intelligence was highlighted during the financial crisis when it became clear conventional technology architectures could not deliver real-time data aggregation and reporting. A lack of vital and updated risk information such as counterparty risk exposures, liquidity numbers and risk-based performance metrics for intraday trading made informed decision-making very difficult. This lack resulted in wrong decisions, and in many cases no decisions, being taken.


Real-time risk intelligence may be the goal for many FIs, but to date only a handful have the capability, and even then it is limited. The situation is changing rapidly as these firms roll out real-time analytics across operations such as foreign exchange, interest-rate, credit, equities and fixed-income trading. They are the vanguard, however; most FIs are still struggling with the fundamentals of risk management. FIs’ abilities in this area vary greatly and are influenced by their size, location, strategy, risk management experience and approach to IT. Some FIs have experienced chief technology officers with clear ideas of what is available on the market and how they can build in-house solutions. Others are more dependent on vendors for ideas and solutions.

Chartis has identified five stages of maturity that characterize any FI seeking to implement risk management solutions: silos, risk data aggregation, advanced analytics, stress and scenario testing, and real-time risk insight.

2.1 Silos

Silos are still widely used by FIs seeking to segregate and manage individual risk types. They house separate risk management systems and address business-specific risk exposures. Silos developed because, in most cases, FIs do not take an enterprise-wide approach to risk management. Each business line tends to build its own systems independently, reflecting the specific risks it seeks to manage, and these systems are maintained by specialist IT staff. Silos have been criticized for as long as they have been recognized as a way to implement risk management solutions because of the costly duplication of technology and effort. However, in the absence of a successful replacement they have remained standard practice at many FIs.

One of the biggest criticisms of the silo approach is that it leads to FIs having many single-purpose or duplicated risk solutions. One consequence is that firms pay to have the same solution implemented many times over. Even where FIs do need point solutions to achieve a certain outcome, they end up duplicating the provisioning of information for risk analysis, particularly where they are using independent and proprietary information sources.

FIs using the silo-approach for risk management are not able to share or aggregate information across organizational boundaries. That means it’s very difficult to create a view of enterprise-wide risks, often called a “single version of the truth”. Information within silos may be relevant and intelligent, but its usefulness is limited or even hidden, because it is often concealed in application-specific formats or databases. For a risk manager looking at enterprise-wide exposures, siloed information is often opaque or invisible.

Silos also cause problems for those charged with generating risk reports for internal and external consumption. Duplication makes it harder to create reporting templates to distribute risk reports and analysis. With increased demand from regulators for one-off or ongoing risk reports, FIs trying to aggregate information across many silos may struggle to adapt to increased and evolving regulatory requirements. The resource constraints created by the duplicate efforts could hinder new investment in productivity tools to reduce implementation costs and deploy solutions for new risk types.

2.2 Risk data aggregation

FIs seeking an enterprise-wide view of risk exposures take the next step of interconnecting risk data and measurements previously housed in business-specific silos. This process is called aggregation, and it creates a centrally managed repository of trusted information accessible to all end-users. Aggregation is a complex process because it is difficult to capture risk information in real time and assess it intraday.

This task collates information from internal and external sources and allows FIs to take a more sophisticated approach to risk management by establishing new ways of defining and analysing risk. At this level, risk analysts begin to simplify access to risk information and its analysis by creating enterprise models for risk data.

At this stage, FIs can use their risk information to build an infrastructure that will provide them with an enterprise-wide view of risk. This view focuses on business, risk, and customer intelligence and gives FIs the ability to deliver relevant and timely risk analysis to risk managers, senior executives and regulators.
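The aggregation step described above can be sketched as collating records from business-specific silos into one in-memory repository that can be queried along any dimension. The silos, field names and figures below are hypothetical; a real programme would also handle schema mapping and data quality.

```python
# Illustrative sketch of risk data aggregation: records from separate
# source systems are collated into one structure that can be queried
# along any dimension. All records and field names are hypothetical.

from collections import defaultdict

# Feeds from two business-specific silos, already mapped to a common schema
credit_silo = [
    {"desk": "loans", "ccy": "EUR", "exposure": 10.0},
    {"desk": "loans", "ccy": "USD", "exposure": 7.5},
]
market_silo = [
    {"desk": "fx", "ccy": "EUR", "exposure": 3.0},
]

repository = credit_silo + market_silo   # the central repository

def aggregate(rows, dimension, measure="exposure"):
    """Sum a measure along any chosen dimension."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dimension]] += row[measure]
    return dict(totals)

by_ccy = aggregate(repository, "ccy")    # enterprise-wide view by currency
by_desk = aggregate(repository, "desk")  # the same data, sliced by desk
```

The point of the sketch is that once the data sits in one repository with a common schema, any enterprise-wide view is a query rather than a new integration project.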


2.3 Advanced analytics

Once a FI can aggregate and store its risk information in a central repository, it can begin to use it to gain insight into its enterprise-wide exposures and align risk with strategy and execution. The next step it takes is to refine and tailor the quantitative analysis it performs on risk data using statistical and mathematical modelling. The use of advanced analytics for optimization allows a FI to put to work what it has learned from aggregating its risk information to better understand its business and maximize revenues and profit. At this stage, FIs are starting to integrate risk analytics into financial decisions, business modelling, planning, budgeting and forecasting.

A FI can use analytics and modelling tools to factor risk information into parts of the business where decisions are being made, such as algorithmic trading, loan approvals or flagging fraudulent transactions. These tools include:

• Quantitative and qualitative modelling of financial risk and operational risk measurements, using the analyses performed to generate risk insight;

• Models that predict the outcome of various risk mitigation actions on the risk position of the enterprise;

• Additional models, or extensions to existing models, to predict and analyze the consequences of probable and improbable events.
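As an illustration of the kind of statistical modelling described above, a minimal historical-simulation Value at Risk calculation might look like the following sketch; the P&L history and confidence level are hypothetical.

```python
# Illustrative historical-simulation Value at Risk (VaR). The P&L
# observations below are hypothetical; a real model would use a long
# history of actual portfolio P&L.

def historical_var(pnl_history, confidence=0.95):
    """VaR as the loss threshold exceeded in (1 - confidence) of scenarios."""
    losses = sorted(pnl_history)           # worst outcomes first
    index = int((1 - confidence) * len(losses))
    return -losses[index]                  # report VaR as a positive loss

# Hypothetical daily P&L observations (profit positive, loss negative)
pnl = [-120, -85, -60, -30, -10, 5, 15, 25, 40, 55,
       60, 70, 80, 90, 100, 110, 120, 130, 140, 150]
var_95 = historical_var(pnl, confidence=0.95)
```

The calculation itself is simple; the operational challenge the report describes is feeding it a complete, current P&L history aggregated across the enterprise.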

Risk modelling is considered a sign of greater maturity. These applications can be sourced from vendors and do not always need to be developed in-house, which could otherwise be a barrier for some smaller FIs.

2.4 Stress testing and scenario modelling

The economic instability caused by the Credit Crunch showed the importance of strong corporate governance and of stress testing and scenario modelling. FIs that had robust enterprise-wide risk assessment processes, including stress testing and scenario modelling, were generally better positioned to cope with stressful economic times. Broadly, stress-testing programmes include a description of enterprise risk detailed enough to forecast how a FI’s risk will behave in a volatile or distressed environment.

When stress testing, a FI must consider risks that are often not well understood such as the behaviour of complex products under stressed liquidity conditions, basis risk in relation to hedging strategies, warehousing risk, contingent risks and liquidity risk.
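A stress-testing step of this kind can be sketched as applying deterministic shocks to position sensitivities. The books, sensitivities and shock sizes below are hypothetical, and the linear approximation shown would not capture the complex behaviours (stressed liquidity, basis risk) the report mentions.

```python
# Illustrative stress test: apply deterministic scenario shocks to
# position sensitivities. Positions and shock sizes are hypothetical.

positions = {
    "rates_book":  {"delta": 2.0e6},   # P&L per unit shock, hypothetical
    "equity_book": {"delta": -1.5e6},
}

scenarios = {
    "rates_up_100bp": {"rates_book": 1.0,  "equity_book": -0.2},
    "equity_crash":   {"rates_book": -0.3, "equity_book": -0.4},
}

def stressed_pnl(positions, shocks):
    """Linear approximation: P&L = sum of delta * shock per book."""
    return sum(positions[book]["delta"] * shock
               for book, shock in shocks.items())

results = {name: stressed_pnl(positions, shocks)
           for name, shocks in scenarios.items()}
```

Note how a scenario can net to near zero across books even when individual books move sharply, which is exactly the enterprise-wide view a siloed setup cannot produce.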

2.5 Real-time risk insight

At this stage, FIs are using real-time risk insight and controls to manage performance across their business lines. These FIs are able to respond to capital markets’ price movements at sub-millisecond latencies and perform billions of calculations to support complex scenario and stress-testing processes.

The real-time risk insight stage includes the ability to analyze unstructured information from newswires, emails and other sources and to factor it into decision-making processes. A FI with real-time risk insight can react immediately and perform stochastic risk calculations, adjust predictive analytics, and act on the results. FIs are able to react to and manage unforeseen circumstances. The models used for risk calculations can be recalibrated and tuned in real time to react to emerging intraday market conditions.

For sophisticated FIs dealing in complex derivative instruments and large trading volumes across many asset classes, the ability to view risk positions in real time has become a goal for those charged with developing risk management IT. Waiting overnight for risk numbers, or hours for calculations to complete, is no longer acceptable. FIs want to be able to monitor and act on risk measurements intraday. Globally, only a small number of FIs have begun to aggregate their exposures in real time, and they are slowly starting to tackle new challenges like real-time value at risk (VaR).


3- Recent technological developments
The last decade has seen a number of key technological developments that are now converging to address the problems with conventional risk intelligence. Chartis believes that, in the medium term, these developments will bring about a paradigm change in enterprise risk management and lead to real-time risk intelligence becoming a standard capability for most FIs globally. This quiet revolution has already started. Chartis interviewed some of the early adopters and determined the key developments to be:

• Faster and cheaper technology infrastructures: Today there is a significantly different technology infrastructure upon which to build risk intelligence. The mainstream availability of 64-bit processors has raised the amount of memory (RAM) a computer can address far beyond the 4GB limit of the old 32-bit processors, in theory by a factor of roughly four billion. Processors are much faster than they were ten years ago and memory is significantly less expensive; in some cases, the price-to-performance ratio of processors is 1,000 times better than a decade ago. Recent developments in grid infrastructures and computing farms to handle very large volumes of data and to provide on-demand computing (utilizing grid and caching middleware such as Jini, GigaSpaces, Coherence and DataSynapse) are providing significant performance gains and allowing end-user organizations to maximize processing resources.

• The rise of in-memory BI: Given the problems inherent in conventional BI, particularly managing large volumes of intraday trading and risk data in real time, a number of generic and specialized BI vendors have invested in developing in-memory capabilities associated with low-latency event-processing environments. Most traditional databases are built upon a relational model, an approach that creates a need to pre-aggregate data, predefine complex dimensional hierarchies and generate cubes. The new generation of BI tools is designed so that the entire application, including the data model, is held in RAM, creating an in-memory data model as data is loaded from a source. This construction enables such a tool to access millions of cells of data and still respond to queries in less than a second.

• The rise of complex event processing (CEP): CEP is the analysis of event data, such as price changes in the securities market or financial transactions, in real time to generate immediate support for decision making. CEP is still considered an immature market, but it has already delivered some outstanding results for FIs and brokers active in the global capital markets. In reality, CEP, or more specifically event processing (EP), is not that new. For years, firms operating in the global capital markets have been building in-house transaction management systems embedding CEP capabilities. One example is the tools firms use to monitor price changes and make real-time arbitrage decisions (algorithmic trading). Here the information complexity is low, but decisions must be made extremely quickly.
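A minimal event-processing rule of the sort described in the last bullet might look like the following sketch; the feed, symbols and threshold are hypothetical, and a production CEP engine would evaluate many such rules continuously over live streams.

```python
# Illustrative event-processing sketch: flag any instrument whose price
# moves by more than a threshold between consecutive ticks. The feed
# and threshold are hypothetical.

def detect_moves(ticks, threshold=0.01):
    """Emit (symbol, old_price, new_price) when a tick jumps beyond threshold."""
    last = {}
    alerts = []
    for symbol, price in ticks:
        if symbol in last and abs(price - last[symbol]) / last[symbol] > threshold:
            alerts.append((symbol, last[symbol], price))
        last[symbol] = price
    return alerts

feed = [("EURUSD", 1.2000), ("EURUSD", 1.2005), ("EURUSD", 1.2150)]
alerts = detect_moves(feed, threshold=0.01)   # only the second jump fires
```

The information per event is trivial; the value comes from evaluating the rule the instant each event arrives rather than in a batch report.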


4- The Convergence of CEP and BI
Business users, traders and risk managers are generally indifferent to the underlying technology and data processing; their focus is on timely access to information that supports their business decisions. However, Chartis believes that risk technology professionals should look into the benefits of combining the best of CEP and BI technologies to meet the requirements for real-time risk intelligence.

The linkage between CEP and BI is most likely to take place within the OLAP component of BI tools. OLAP technology makes it possible to take decisions based on large amounts of data where decision-making is not continuous but requires context or foresight for strategic planning. The standard application of OLAP technology assumes a latent decision-making timeline (i.e. it assumes decisions will be made daily, weekly or monthly). For many use cases this is effective and meets end-users’ information needs (e.g. monthly or daily P&L analysis, daily risk metrics, monthly compliance and financial reporting). However, in dynamic, intraday and real-time environments such as securities trading or online fraud monitoring, the inability to process data in real time leads to increased risk or a loss of potential revenue.

The ability to marry CEP and OLAP technologies has the potential to realize the promise of real-time risk intelligence. CEP can deliver current and predictive analytics that round out the historical or near-real-time views delivered by most BI solutions. This convergence can be achieved in several different but complementary ways:

• Use CEP for in-memory analytics for real-time risk intelligence requirements involving data sets too large for storage by BI solutions;

• Use BI to help specify CEP processing logic, for example to provide reality checks that the right events are being pinpointed by the CEP engine and that the rules or policies for responding to these events are still on target;

• Use CEP to drive the forward-looking risk analytics visualized within dynamic BI-based dashboards.
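The convergence described in the bullets above can be sketched as a single component in which each incoming event both refreshes an in-memory aggregate (the BI side) and is evaluated against a rule on the live value (the CEP side). The counterparties, amounts and limit below are hypothetical.

```python
# Illustrative CEP + BI convergence: every trade event updates a live
# aggregate and is simultaneously checked against a limit rule on that
# aggregate. Names, amounts and the limit are hypothetical.

class ExposureMonitor:
    def __init__(self, limit):
        self.limit = limit
        self.exposure = {}           # counterparty -> running total (BI view)
        self.breaches = []           # rule firings (CEP view)

    def on_trade(self, counterparty, amount):
        total = self.exposure.get(counterparty, 0.0) + amount
        self.exposure[counterparty] = total   # BI: keep the aggregate current
        if total > self.limit:                # CEP: rule over the live value
            self.breaches.append((counterparty, total))

monitor = ExposureMonitor(limit=100.0)
for cpty, amt in [("bank_a", 60.0), ("bank_b", 30.0), ("bank_a", 50.0)]:
    monitor.on_trade(cpty, amt)
# bank_a's second trade takes its exposure to 110 and breaches the limit
```

A dashboard polling `monitor.exposure` sees a continuously current aggregate, while the breach list shows the same event stream driving forward-looking alerts.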

The convergence of CEP and BI has already started, and there are real-life examples where the two technologies have been brought together. These include embedding advanced BI analytics into real-time credit scoring and loan approval processes, and using CEP for best-price analysis across multiple feeds in foreign exchange trading.

The convergence of CEP and BI has also taken effect in the vendor landscape. It is most apparent at IBM. The firm has made a string of strategic acquisitions including: AptSoft (business event reporting), Micromuse (network services event processing tool, NetCool), Ilog (business rules management), Cognos (BI), and SPSS (predictive analytics). Together with its own InfoSphere Streams offering (event processing), these companies provide IBM with a broad set of capabilities in CEP and BI. IBM’s challenge is that its solutions are quite generic and require significant investment by clients to be tailored to their specific risk-intelligence requirements.

Other players that have both BI and CEP capabilities include SQLstream, Informatica, Streambase, Tibco and Oracle.

In the short to medium term, Chartis believes the winners in this space will be vertically focused vendors with innovative CEP and BI applications embedding specific domain knowledge. These vendors are more likely to provide an accelerated path to the goal of enterprise risk intelligence, and to do so cost-effectively. One such firm is Quartet FS. The firm is currently focused on the financial services industry, with deep domain knowledge in the treasury and capital markets sector. A number of FIs have already implemented Quartet FS’s real-time BI solution and are benefiting from its high-performance risk-intelligence capabilities.


5- Leading practices from Quartet FS
One vendor emerging as an innovator in real-time BI solutions is Quartet FS. This privately funded company was founded in 2005 by a group of entrepreneurs experienced in capital markets software solutions. Many of the senior team came from Summit Systems. Over the past five years, the company has developed ActivePivot, a real-time, in-memory, object-based generic aggregation OLAP engine. Its ease of integration with in-house and third-party systems as well as its ability to deliver real-time CEP makes ActivePivot a powerful point solution.

FIs are using ActivePivot for real-time aggregation, position keeping, P&L, sensitivities, and VaR. It can also be used for limits monitoring, real-time margining, potential exposure analysis, liquidity management, securities lending and as an event dashboard. Those FIs using ActivePivot have found it to be an adaptable solution. Once implemented for one use, for example position keeping, FIs have started to use ActivePivot as a solution for other projects requiring real-time capabilities, such as real-time VaR.

Today, FIs’ decision-making is constant and happens under the pressure of fast-moving market conditions. Traditionally, organizing and correlating the mass of data needed to take decisions often took hours, or even days. In the wake of the Credit Crunch, however, new risk management priorities mean that FIs are seeking solutions that enable real-time decision-making. The challenge is handling large amounts of complex data at speed; until recently this has been very difficult to achieve, leading to increased risk or a loss of potential revenue.

Quartet FS’s ActivePivot application brings together continuous decision-making processes and complex data to deliver real-time business intelligence. ActivePivot aggregates and applies business methods in real time to any object data, without restriction on source, form or representation. It allows IT departments to establish a common technology for building and deploying trade blotters, position keeping, cash flow and securities inventories, and online risk and hedging analytics in the context of a proliferation of complex instruments, growing trading volumes and market volatility.

5.1 Improvements on standard OLAP technology

ActivePivot can provide real-time BI because it takes standard OLAP technology to another level, combining it with real-time reporting and alerts. Most conventional OLAP technology supports decision-making for future or strategic planning because it assumes decisions will be made daily or monthly, not continuously. Like many OLAP databases, ActivePivot builds n-dimensional sets of data in memory with efficient compression algorithms. However, a few differences allow ActivePivot to deliver results in real time.

ActivePivot features a transactional real-time aggregation engine that works on objects, allowing efficient memory management and fast performance. Its compression algorithm enables a low memory footprint and incremental updates. This design makes it possible to have multiple dimensions and levels as well as real-time updates. The engine also performs non-linear aggregation, which allows it to handle complex measures such as real-time VaR or credit exposure.
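Non-linear aggregation can be illustrated as follows: because a measure like VaR is not additive, a parent node keeps the scenario P&L vectors of its children and recomputes the measure on the combined vector rather than summing child values. The figures are hypothetical, and the measure shown (worst scenario loss) is a deliberately simplified stand-in for VaR; this is a sketch of the principle, not ActivePivot's implementation.

```python
# Illustrative non-linear aggregation: a risk measure cannot be computed
# by summing child measures, so each node aggregates the underlying
# scenario P&L vectors and recomputes. Figures are hypothetical.

def worst_loss(pnl_vector):
    """Simplified stand-in for VaR: the worst scenario loss."""
    return -min(pnl_vector)

def combine(*vectors):
    """Aggregate children by summing P&L scenario by scenario."""
    return [sum(scenario) for scenario in zip(*vectors)]

desk_a = [-50.0, 10.0, 20.0, 30.0]   # P&L per scenario
desk_b = [40.0, -60.0, 5.0, 15.0]

# Summing child measures overstates risk because it ignores diversification:
naive = worst_loss(desk_a) + worst_loss(desk_b)       # 50 + 60 = 110
# Recomputing on the combined vector is the correct aggregation:
correct = worst_loss(combine(desk_a, desk_b))         # worst combined loss
```

The gap between `naive` and `correct` is why an engine that only sums numbers up a hierarchy cannot produce a firm-level VaR.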

Updates to the OLAP cube are done incrementally and only those nodes of the cube that are impacted by a data change (e.g. trade or market data) are updated. ActivePivot can process hundreds of thousands of updates per second allowing the FI to implement true real-time applications.
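Incremental updating of this kind can be sketched as propagating a delta up the ancestors of the changed leaf, leaving the rest of the cube untouched. The hierarchy and values below are hypothetical, and the sketch only covers additive measures.

```python
# Illustrative incremental cube update: when one trade changes, only the
# aggregates on its path (e.g. book -> desk -> firm) are adjusted by the
# delta; the rest of the cube is never recomputed. Hierarchy is hypothetical.

class IncrementalCube:
    def __init__(self):
        self.totals = {}                 # node path (tuple) -> running total

    def apply(self, path, delta):
        """Propagate a change to the leaf and every one of its ancestors."""
        for depth in range(1, len(path) + 1):
            node = path[:depth]
            self.totals[node] = self.totals.get(node, 0.0) + delta

cube = IncrementalCube()
cube.apply(("firm", "fx_desk", "spot_book"), 100.0)
cube.apply(("firm", "fx_desk", "fwd_book"), 50.0)
cube.apply(("firm", "fx_desk", "spot_book"), -20.0)   # one trade amended

# The amendment touched only three nodes, yet every level stays consistent.
```

Because the cost of an update is proportional to the depth of the hierarchy rather than the size of the data set, very high update rates become feasible.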

Another important difference from standard OLAP tools is that ActivePivot takes feeds directly from source systems or through a distributed cache (i.e. it does not require a relational or other type of database). It accepts multiple, heterogeneous real-time data sources, enabling faster data aggregation and query performance.

In addition, unlike classic OLAP tools, ActivePivot was designed to work with any object. It makes no assumptions about data sources or formats and is not limited to a database or Excel cells. Working on objects allows ActivePivot to react to real-time data services, as opposed to snapshots persisted to a database or table, thus removing the latency associated with classic data-mining tools.

ActivePivot allows clients to plug in their own business logic at every step of the aggregation process and to manipulate complex objects. Conventional OLAP tools work on numbers represented as columns in a relational database or file sources. This approach offers only limited scripted extension to data.

The object-oriented design of ActivePivot means it avoids many of the limitations of an Excel pivot table. This is possible because the axes within the pivot can be driven by the source objects, by the data held in those objects, and/or by the results of any calculations performed on them. Furthermore, there are no constraints on the number of axes or subcategories that can be defined or aggregated within the pivot.
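
As an illustration of object-driven axes (a hypothetical sketch, not ActivePivot’s API), a pivot can be built where each axis is an arbitrary function of the source objects, so a derived value is as valid a dimension as a stored attribute:

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// Sketch of an object-driven pivot: axes are arbitrary functions over the
// source objects, not fixed columns in a table.
public class ObjectPivot {
    public record Trade(String desk, String ccy, double notional) {}

    // Group by any two axis functions and sum a measure.
    public static Map<Object, Map<Object, Double>> pivot(
            List<Trade> trades,
            Function<Trade, Object> rowAxis,
            Function<Trade, Object> colAxis) {
        return trades.stream().collect(Collectors.groupingBy(
                rowAxis,
                Collectors.groupingBy(colAxis,
                        Collectors.summingDouble(Trade::notional))));
    }

    public static void main(String[] args) {
        List<Trade> trades = List.of(
                new Trade("Rates", "EUR", 10.0),
                new Trade("Rates", "USD", 5.0),
                new Trade("FX",    "EUR", 7.0));
        // Row axis: a stored attribute; column axis: a value derived on the fly
        var result = pivot(trades, Trade::desk,
                t -> t.notional() > 6 ? "LARGE" : "SMALL");
        System.out.println(result);
    }
}
```

Because the axes are functions rather than database columns, any number of them can be composed without changing the underlying data model.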

ActivePivot does not impose a graphical user interface (GUI) and allows integration with a number of services or clients. Depending on the type of GUI selected for visualization, data can be either pushed or pulled to the front end. Real-time alerts can be plugged in and these will react to updates of the aggregated data and other external sources like market data.

As a component, ActivePivot can fit in with any system and is cost effective, because the server licence cores can be used in any combination on clients’ production, development and recovery machines.

Compared to packaged solutions, ActivePivot, as a component, allows for more flexibility and client control. It does not carry the high maintenance costs that characterize many financial risk management systems, which are tough to renegotiate. ActivePivot works across many application domains, reducing research, development and support costs. Once implemented for one purpose, it can be reused by IT and end-users for other projects.

5.2 Faster reporting

ActivePivot shortens the time it takes to produce end-of-day reports. In a conventional architecture the data warehouse is located between the calculation engine and the reporting application, which requires all calculations to be stored in the database before reporting and aggregation can start.

ActivePivot eliminates this time-consuming step. As Figure 1 illustrates, the aggregation process begins as soon as the first results are produced, so results are ready to be visualized and analyzed by end-users immediately after the last calculation completes. This approach also facilitates the calculation of incremental VaR when figures from various geographic locations are produced in different time zones, and makes it easier to adjust data and perform what-if analysis.

ActivePivot uses significantly less memory than traditional relational OLAP, which enables faster drill-down into dimensions, drill-through to detailed P&L vectors, interactive changes of parameters such as confidence intervals, and on-the-fly calculation of marginal VaR from one level to the next.
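
Why VaR needs non-linear aggregation can be seen in a short sketch (illustrative historical-simulation code with made-up numbers): P&L vectors aggregate linearly, but VaR is a quantile of the aggregated vector, so portfolio VaR is not the sum of child VaRs, and marginal VaR falls out as a difference of two quantiles.

```java
import java.util.Arrays;

// Sketch of non-linear aggregation: P&L vectors add linearly across nodes,
// but VaR is a quantile of the aggregated vector, so it cannot be computed
// by summing child VaRs. Historical-simulation style, hypothetical data.
public class VarAggregation {

    public static double[] addVectors(double[] a, double[] b) {
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++) out[i] = a[i] + b[i];
        return out;
    }

    // VaR as the loss at the given confidence level over the sorted P&L
    // scenarios; simple empirical quantile, no interpolation.
    public static double var(double[] pnl, double confidence) {
        double[] sorted = pnl.clone();
        Arrays.sort(sorted);
        int idx = (int) Math.round((1.0 - confidence) * sorted.length);
        return -sorted[idx];   // loss is the negated P&L quantile
    }

    public static void main(String[] args) {
        double[] deskA = {-5, 1, 2, -1, 3, -2, 4, 0, 1, -3};
        double[] deskB = { 4, -1, -2, 2, -3, 1, -4, 1, 0, 2};
        double total = var(addVectors(deskA, deskB), 0.9);
        // Marginal VaR of desk B: portfolio VaR with B minus VaR without B
        double marginal = total - var(deskA, 0.9);
        // prints: portfolio VaR = 1.0, marginal VaR(B) = -2.0
        System.out.println("portfolio VaR = " + total
                + ", marginal VaR(B) = " + marginal);
    }
}
```

In this toy example the two desks offset each other, so the marginal VaR of desk B is negative: a diversification effect that summing standalone VaRs would miss entirely.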

Figure 1: Time Savings

5.3 Implementation of a typical risk management application

ActivePivot’s key differentiator is that it aggregates objects instead of just numbers. Figure 2 is a simplified representation for the implementation of a typical risk management application showing how ActivePivot works to achieve real-time aggregation.

Figure 2: Example Deployment

[Diagram: trades and market data feed GigaSpaces caches (market data and Greeks results) and a DataSynapse grid; the risk engine handles pricing requests, publishes Greeks results and notifies ActivePivot, which performs real-time aggregation and refreshes results on the trader workstation.]

Trade sources send objects to ActivePivot directly from trading systems such as Murex, Calypso, Summit, OpenLink or in-house applications. This process can be performed as a batch update at start-of-day initialization, as intraday periodic refreshes, or as a real-time feed using either middleware technology such as Tibco or MQSeries, or a distributed cache shared with the source system. Each source system can use its own data model, which may also differ between batch and real-time updates. Intraday updates can be new trades or amendments to existing ones, which ActivePivot automatically replaces in the cubes.

To enrich the trade objects, client-specific business logic is plugged into ActivePivot. This step provides additional measures the trade source system does not supply, such as data bucketing (e.g. 1M, 3M), or calls an external service such as a quantitative library or risk engine. It can also bring in additional attributes that allow end-users to drill down, perhaps using an external reference database.
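
A bucketing enricher of the kind described might look like the following sketch (the thresholds and names are illustrative assumptions, not a market standard):

```java
// Sketch of a trade-enrichment step: deriving a maturity-bucket attribute
// (1M, 3M, ...) that the source system does not provide, so end-users can
// drill down by bucket. Threshold values here are illustrative only.
public class BucketEnricher {
    public static String maturityBucket(int daysToMaturity) {
        if (daysToMaturity <= 31)  return "1M";
        if (daysToMaturity <= 92)  return "3M";
        if (daysToMaturity <= 183) return "6M";
        if (daysToMaturity <= 366) return "1Y";
        return "1Y+";
    }

    public static void main(String[] args) {
        System.out.println(maturityBucket(45));   // 3M
        System.out.println(maturityBucket(400));  // 1Y+
    }
}
```

The derived bucket then becomes an ordinary dimension level in the cube, even though no source system ever stored it.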

Other calculations can be plugged in to work on aggregated measures, possibly using external data sources such as market data prices. These post-aggregation calculators can perform non-linear aggregations, for example VaR. They can also react to real-time updates of data sources, as well as to real-time updates of aggregated measures resulting from new or amended trade objects from source trade systems. The central point is that ActivePivot is flexible and open, so internal IT teams can tailor applications to meet the exact needs of their business and end-users.

End-users can query aggregated measures and post-calculated measures through a wide range of user interfaces, including Excel. Some user interfaces can work with real-time updates and alerts can be defined and created using client-specific business logic.

A typical ActivePivot implementation project generally takes about two to three months with one or two consultants and runs with up to three incremental phases.

1. Cube: Define the cube(s) with a few lines of Java code. This step simply consists of naming dimensions and their levels, naming measures and selecting aggregator functions (such as “SUM”).

2. Object: Define the object source(s). Each source, whether batch or real-time, can use Quartet’s facades for cache, JMS, file readers, or XML or CSV translation to speed up implementation.

3. Calculator: Implement a calculator for each object and/or source combination. This is where mapping is plugged in: object attributes are renamed to match dimension-level and measure names, extra measures can be calculated on the fly, and calls to external services (on a grid or point-to-point) can be made. Because calculators are written in Java, they offer unlimited scope for customisation. If objects already contain attributes for each level and measure, this step is not required; ActivePivot’s introspection mechanism will do the work.
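
The three steps above might be sketched as follows (all class and method names here are hypothetical illustrations, not Quartet’s actual API):

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of the three implementation steps; the types and
// method names are illustrative, not Quartet's actual API.
public class CubeSetup {
    // Step 1 - Cube: name dimensions and levels, measures and aggregators
    record Dimension(String name, List<String> levels) {}
    record Measure(String name, String aggregator) {}
    record CubeDef(List<Dimension> dims, List<Measure> measures) {}

    // A source object as it might arrive from a trading system (step 2
    // would wire this to a cache, a JMS queue or a file reader)
    record Trade(String book, String ccy, double notional) {}

    // Step 3 - Calculator: map object attributes onto dimension levels and
    // measures, adding an extra measure computed on the fly
    static Object[] calculate(Trade t) {
        double notionalEur = t.notional() * 0.9;   // assumed conversion rate
        return new Object[]{t.book(), t.ccy(), t.notional(), notionalEur};
    }

    public static void main(String[] args) {
        CubeDef cube = new CubeDef(
            List.of(new Dimension("Book", List.of("Desk", "Trader")),
                    new Dimension("Currency", List.of("Ccy"))),
            List.of(new Measure("Notional", "SUM"),
                    new Measure("NotionalEUR", "SUM")));
        Trade trade = new Trade("Rates", "USD", 100.0);
        System.out.println(cube.dims().size() + " dimensions, row: "
                + Arrays.toString(calculate(trade)));
    }
}
```

The point of the phasing is that each step is a small, testable unit, which is consistent with the two-to-three-month implementation timescale cited above.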

A proof-of-concept project can also run through steps 1 to 3. Figure 3 shows ActivePivot’s architecture and how it typically fits into an FI’s existing systems.

Figure 3: ActivePivot Architecture

[Diagram: reporting, feeder, market data and configuration services surround the ActivePivot cube(s). Clients (Excel, MDX viewers, browsers) connect over HTTP/XMLA, Comet, JavaScript, RMI/WS, GWT (Ajax) and MDX. The cube layer comprises stores, live calculators, indexers, aggregators, post-processors and APIs, fed from databases, caches, JMS and files, with object storage, caching, security and triggers, plus a cubes/schema catalogue and user preferences.]

6- From theory to practice

Several clients have already implemented ActivePivot for real-time position keeping and data aggregation, enabling the generation of real-time business information such as VaR and market risk. These clients told Chartis that ActivePivot’s combination of advanced OLAP and in-memory technology allows them to address the challenges of producing real-time BI.

In interviews and reviews of actual implementations, clients told Chartis that the ability to implement ActivePivot relatively quickly is another feature they found very useful. Our research also revealed the solution appeals to clients seeking components for in-house-developed systems as well as those deploying large vendor-supplied ones. Many clients intend to reuse the application for other projects requiring real-time capabilities, which also makes ActivePivot a cost-effective solution.

6.1 Position keeping and reporting

Real-time position keeping and reporting is increasingly a critical part of risk management at banks and broker-dealers handling large transaction volumes across many asset classes and instruments. Being able to generate a snapshot of risk positions at any time during a trading day, and to analyze or “slice and dice” data as market conditions change, will soon become a standard requirement at firms with an advanced approach to risk management. Many position-keeping and reporting tools on the market lack the ability to deliver real-time results because they use older OLAP technology and databases.

ActivePivot enables firms to overcome the barriers to real-time risk intelligence. CMC Markets, one of the largest online derivatives trading firms, recently developed a real-time position-keeping tool using ActivePivot for its event-driven OLAP component. The firm required a single tool that was able to cope with the wide array of asset classes and instruments in which the broker deals including FX, equities and commodities.

As a firm that caters to institutional, day-trading and retail clients, CMC Markets wanted to move from an architecture where customer positions, revaluations and collateral requirements were computed a few times a day to one where positions could be viewed in real time. Improving risk management and gaining a deeper understanding of client activity was one of its primary goals. Computing clients’ net positions and equity more frequently would allow the firm to control risk better by delivering real-time positions to its risk engines. The firm wanted a tool that could view and analyze client positions in real time, with multiple views and the ability to “slice and dice” data. Improving the client experience and ability to trade was another goal: the firm wanted to give its clients better and faster margin calculations to increase their trading potential, as well as a more up-to-date view of their open risk positions.

CMC Markets takes a build-and-buy approach to designing risk management systems. It studied position-keeping tools from established players such as Fidessa and SunGard, and also considered building its own in-memory margin and position server before choosing ActivePivot. The trading firm realized that, by using an object-oriented OLAP component with an in-memory server such as ActivePivot, it could build the tool it needed and use it more than once. The firm’s chief information officer said that using ActivePivot allowed the firm to build, quickly and at a lower cost, a better product than those currently offered by established players.

Embedding ActivePivot into its trading platform provided the firm with a flexible BI solution that met its objectives for enhancing client experience and improving risk management. It achieved real-time and intraday monitoring of client activity, margining and collateral, and is now benefiting from real-time analytics. The firm is considering adding more functions to ActivePivot, for example real-time VaR computation.

Another client, WestLB, one of Germany’s largest banks, was able to plug in ActivePivot and benefit from its real-time, incremental, object-based aggregation OLAP technology to fill a gap when migrating its treasury trading transactions to a new risk management system. The client required real-time position keeping for FX trading and money-market products, with the ability to generate reports on P&L, risk and cash flow by counterparty, currency pair and book. All screens needed to update in real time in response to market data changes and new trades. The bank also needed a platform that could handle the large volumes of its FX business and the heavy market data refreshes that continually change the calculations and data the front office needs. For this client, swift implementation was key, and it turned out to be easier to plug in ActivePivot than to tailor the new system to its unique reporting requirements.

During a short proof of concept, ActivePivot was used to aggregate and load data quickly from the bank’s real-time market data feed and transaction flow. At the same time, WestLB found that ActivePivot’s real-time, incremental, object-based aggregation OLAP technology could handle all the functional requirements the legacy system satisfied, as well as many extras the legacy system could not offer.

Along with the benefits of advanced OLAP technology, ActivePivot allows WestLB to analyze data at many levels using its slice-and-dice functionality. With this first application complete, the bank is now looking to use ActivePivot for other applications requiring real-time functionality. The deployment also allowed the bank to retire its legacy treasury management system, achieving significant cost savings.

6.2 Data aggregation

After the collapse of Lehman Brothers in September 2008, it became clear to one of Quartet FS’s large European banking customers that it could no longer afford to wait a few hours for a risk report. Developing real-time risk reporting became a priority, and the bank started to build this capability using its in-house IT team.

The bank has a real-time market risk system, developed in house, which takes all its interest-rate, FX and bond exposures and computes market risk figures. It calculates all the first- and second-order Greeks and real-time P&L. It runs on a farm of servers, with a large server aggregating the results. What the bank wanted was a real-time aggregation tool.

However, for computing complex numbers, the bank required something better than the traditional OLAP cube, whose disadvantage is that it aggregates only data and numbers, whereas ActivePivot aggregates objects. Its in-memory cube is updated each time a new object is inserted.

The bank found this approach to be an elegant solution, because it could then compute VaR on the fly as a complex defined measure on the object. That ability was the killer application as far as this bank was concerned. The fact that the bank then had a solution capable of working in real time from the start was also deemed an advantage.

As a component solution, ActivePivot worked well for this bank. Most of the bank’s risk management systems have been developed in house, and the bank did not want to buy a monolithic risk system. Its overall risk IT strategy has been to buy best-in-class tools and plug-ins and to use vendors that concentrate on a single product. Once ActivePivot was selected, it was deployed within a few months. The solution has met the requirements of the initial use case of real-time aggregation, and the bank has now embarked on a project to use ActivePivot for real-time VaR calculations and reporting.

6.3 Eliminating the database bottleneck

Another large European bank has deployed ActivePivot as part of its proprietary Monte Carlo simulation application for VaR. This firm deals in plain vanilla products as well as options, credit derivatives and hybrid structures. Previously, all calculations were deployed on a grid computing architecture and completed during the overnight batch schedule. Producing the VaR reports consumed a large share of the batch schedule because of the massive data store required; reporting was a time-consuming process.

Other OLAP tools add an additional layer on top of the data warehouse, which actually increases processing time. ActivePivot allowed the bank to remove the database bottleneck, because it is designed to work incrementally and its in-memory cube is updated each time a new object is inserted. Using it, the bank’s grid produces risk vectors continuously and, as they are produced, ActivePivot detects the changes and updates its cube.

This process works in parallel with the grid calculation. When the last vector is computed, the cube is immediately ready for analysis. The architecture eliminates the need for an expensive database server, and decreases the time allocated for reporting in the nightly batch. In addition, using ActivePivot allowed the bank to grow its trading volume without increasing the size of the grid farm.

7- Conclusion

Coping with the volatility and uncertainty that characterize today’s markets means firms need to be able to react quickly to emerging market changes. Firms dealing in large volumes of complex financial instruments, particularly derivatives, now realize that risk needs to be managed and understood in real time. For the first time, a new generation of BI tools designed specifically for capital markets and trading firms is making real-time risk intelligence possible. Firms using these BI tools can see their risk positions across different asset classes and instruments in real time; others are using them for real-time VaR calculation. This development is an important step forward in firms’ ability to manage risk.

To date, enterprise-wide integrated risk management systems have not been implemented in a meaningful way at any institution. Instead, firms seeking a “single version of the truth”, or a single view of risk positions, have been forced to build their own systems. Those sophisticated enough to design their own risk management technology architecture are using these advanced BI applications for risk intelligence: they can be plugged into an existing architecture and are easy to implement, cost-effective, adaptable and reusable.

These advanced BI tools combine the latest OLAP and in-memory technologies to enable complex event processing (CEP) and thus real-time risk intelligence. Leading vendors in risk BI have created event-driven OLAP, which aggregates objects as well as simple numbers. These objects are held in-memory, where multiple users can access them instantly. Event-driven OLAP is designed to work with any object, makes no assumptions about data sources, and is not limited to a database or Excel cells. Working on objects allows solutions using event-driven OLAP to react to real-time data services, as opposed to snapshots persisted to a database, which allows them to work at sub-second latencies.

The combination of event-driven OLAP and in-memory technology to create advanced BI tools means that, for the first time, clients can perform a number of key risk management functions and calculations in real time. Financial institutions are using such solutions for real-time aggregation, position keeping and real-time VaR. They can also be used for limits monitoring, real-time margining, potential exposure analysis, liquidity management, securities lending, and as event dashboards. Firms can now make decisions continuously with current metrics instead of making judgements based on out-of-date information. These tools also allow firms to “slice and dice” data to obtain different risk metrics, again in real time.

Advanced BI tools like Quartet FS’s ActivePivot are already delivering benefits to clients. In addition to real-time position keeping, clients are using the tool for market risk and VaR calculations, as well as real-time risk management for commodities trading. Clients who implement the tool for one purpose have quickly found other uses for it, instead of having to bring in different solutions to meet their various BI needs. These advanced BI tools are sold as components, which firms can plug into their existing architecture with relative ease. This so-called pluggable architecture means clients retain complete control over their systems and remain independent from the vendor.

Chartis believes vertically focused vendors with innovative CEP and BI applications embedding specific domain knowledge will win out. Vendors of such solutions are more likely to help customers reach the goal of enterprise-wide risk intelligence and provide a cost-effective solution.

8- Further Reading

RiskTech 100 2009

Market Risk Technology Solutions 2010

Risk & Finance Integration

Credit Risk Management Systems 2010

All related research can be found at www.chartis-research.com