BCBS 239: The Hidden Game Changer
www.catalyst.co.uk
Why robust risk data increases profitability, strategic value, market stability – and the Board’s ability to make the right decisions

November 2014

Despite crisis, regulation and even new announcements from the FSB, banks do not manage data well. This failure is systemic: dangerous not just for individual banks but for the market as a whole.
How can this be? Banks deal in numbers. Getting them right is, quite literally, how they make money. Every day, banks worldwide deal with almost unimaginable amounts of financial data. Most of the time, albeit with some sizeable, persistent flaws, they do it well: it is their core business. But the difficult data are not the financial accounts; they are management information and, in particular, risk data – possibly the most complex numbers of all. Most banks simply do not have the right information in the right place at the right time to present to the right people in the right way that will enable them to make the best decisions.

This is a serious, strategic matter. Risk data is among the most important information any bank must manage. Accountability for risk cannot be delegated: responsibility for risk management lies inescapably and directly with the Board. Yet at the same time, Boards cannot afford to drown in detail. When making high-stakes decisions, they must rely on others across the firm to provide accurate, timely, ‘sensible’ data. That data must be trustworthy, not only to satisfy regulators but also so that the bank’s own clients and shareholders can trust the bank’s judgements.

BCBS 239 tackles the potentially catastrophic issue of unreliable risk data head on. Strikingly, it also offers banks a vision of increased profitability, strategic value and enhanced market stability. Achieving that will, however, be tough. How can all parts of the bank collaborate to aggregate multiple data sources, align diverse stakeholders and reconcile disparate processes in a timely, meaningful manner across hard-wired structural, technological and behavioural silos? This paper shows how to achieve exactly that.
© Catalyst Development Ltd 2
What’s the problem?

BCBS 239 allows globally – and, in all likelihood, domestically – systemically important banks (G-SIBs and D-SIBs) a three-year timeframe for compliance with a set of 14 Principles. The first banks required to comply must do so by January 2016. This, however, is far more than a matter of compliance. Implementing the Principles will ensure banks – and their Boards – have better risk data with which to make crucial decisions, are less exposed to loss and are safer from shocks.

The clear intention of BCBS 239 is to effect radical, beneficial change through a series of clear principles and straightforward timelines. Implementing them is a different matter, presenting banks with a dual challenge of deeply ingrained complexity and hard-wired culture. It is well known that one key issue thrown up by the financial crisis of 2007 onwards was the potentially catastrophic weakness of a global economy relying on individual banks that became ‘too big to fail’. Hidden from view has been the equally devastating consequence of banks being ‘too complex to manage’, or at least too siloed to manage well. Nowhere is this more true than with regard to the fundamental management information that any organisation must manage well.
Historically, data ‘aggregation’ was seen as the solution: creating lumps of theoretically more digestible information as the basis for key decisions. Unfortunately, the more aggregation, the harder it is to preserve meaning and, while good risk data is hard to aggregate well, bad risk data is dangerous to aggregate at all. The underlying problem of data quality persists.
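To make the aggregation hazard concrete, here is a minimal sketch (the field names and validation flag are invented for illustration, not any bank’s actual schema): unless quality is checked first, aggregation silently absorbs bad inputs, so a safer aggregator refuses them and surfaces them for remediation rather than summing them.

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    counterparty: str
    amount: float      # exposure in a common reporting currency
    validated: bool    # has this record passed source-level quality checks?

def aggregate_exposures(exposures):
    """Sum exposures per counterparty, refusing to aggregate unvalidated data.

    The aggregate is only as trustworthy as its weakest input, so records
    that failed validation are surfaced for follow-up, not silently summed.
    """
    totals, rejected = {}, []
    for e in exposures:
        if not e.validated:
            rejected.append(e)
            continue
        totals[e.counterparty] = totals.get(e.counterparty, 0.0) + e.amount
    return totals, rejected

book = [
    Exposure("Bank A", 100.0, True),
    Exposure("Bank A", 250.0, True),
    Exposure("Bank B", 75.0, False),   # bad source data: excluded and reported
]
totals, rejected = aggregate_exposures(book)
# totals covers only validated records; the rejected list drives remediation
```

The design choice is the point: the unvalidated record is not dropped silently but returned alongside the aggregate, preserving the audit trail that aggregation otherwise destroys.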
1 http://www.bis.org/publ/bcbs239.pdf
Timing has also been a major hurdle, and in risk, timeliness is everything. Further, even those banks which report risk well to the Board take too long and do so manually, opening up the dangerous combination of outdated data compounded by human error. Now, the Basel Committee are determined to avoid any recurrence of the “severe consequences” of weak risk data aggregation and risk reporting. Conscious that these important but inconspicuous matters may be slipping onto the ‘slow track’ as memories of the financial crisis start to fade, they have laid out a number of potentially game-changing principles1.
What’s the solution?

One of the sharpest lessons from the financial crisis, echoed in BCBS 268, was the inadequacy of banks’ IT and in particular their data architecture. As with data quality, this IT inadequacy persists today. Addressing it will help banks recover, anticipate and prevent future problems while at the same time becoming more efficient, more profitable and part of a better-functioning market community. In our opinion, BCBS 239 is a game changer precisely because it proposes a clear set of actions that allow the Board, the bank and the market to make a dramatic impact on the quality of risk management through one fundamental dimension: data. However, while overtly both simple and sensible, implementing BCBS 239 will not be easy. Effective adoption demands:
- significant culture change and better governance to establish true data ownership and its associated responsibilities;
- the establishment of a consistent front to back data modelling approach, embedded within the organisation to form the backbone of any data management strategy;
- the establishment of data quality measurement and controls, as part of a mature data management strategy.
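A data quality scorecard of the kind just described can be sketched in outline (the dimensions, thresholds and field names here are illustrative assumptions, not BCBS 239 prescriptions):

```python
def quality_scorecard(records, required_fields, max_age_days):
    """Score a data set on three illustrative quality dimensions.

    Each score is the fraction of records passing that dimension:
    completeness (no missing required fields), validity (amount is a
    non-negative number) and timeliness (data no older than max_age_days).
    """
    n = len(records)
    if n == 0:
        return {"completeness": 0.0, "validity": 0.0, "timeliness": 0.0}
    complete = sum(all(r.get(f) is not None for f in required_fields) for r in records)
    valid = sum(isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0
                for r in records)
    timely = sum(r.get("age_days", max_age_days + 1) <= max_age_days for r in records)
    return {
        "completeness": complete / n,
        "validity": valid / n,
        "timeliness": timely / n,
    }

records = [
    {"id": 1, "amount": 10.0, "age_days": 0},
    {"id": 2, "amount": -5.0, "age_days": 1},   # fails validity
    {"id": 3, "amount": None, "age_days": 9},   # fails completeness & timeliness
]
scores = quality_scorecard(records, required_fields=("id", "amount"), max_age_days=2)
```

In practice such scores would feed the kind of data quality scorecard the governance framework tracks over time, turning “data quality” from an assertion into a measured trend.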
A view from the industry

The challenges of effective implementation are clear from BCBS 2682 (the December 2013 report based on the self-assessment responses of thirty banks to BCBS 239), as well as from industry forums and discussions with clients. These fall into broad themes as follows:
- Many banks face difficulties in establishing strong data aggregation governance, architecture and processes. As a result, they resort to extensive manual workarounds which are likely to impair risk data aggregation and reporting.
- Nearly half the self-assessment banks reported material non-compliance on Principle 2 (data architecture/IT infrastructure), Principle 3 (accuracy/integrity) and Principle 6 (adaptability).
Fig 1: Self Assessment Summary
2http://www.bis.org/publ/bcbs268.pdf
- A third of banks said they currently do not expect to comply fully with at least one Principle by the deadline.
- Banks may also have overstated their compliance in various ways. For example:
  - there was clear doublespeak, in that the risk reporting principles scored better on compliance yet have a clear dependency on the other two categories;
  - banks’ own assessments did not necessarily include all material group entities, all levels of management reporting and all types of material risk (i.e. not just credit and market risk).

The report also underlined those requirements where the regulator has real concerns over the banks’ ability to comply. These are summarised in the next section.
Basel self-assessment themes

There are several recurring themes beyond the automation and control requirements already familiar to banks. In fact, the guidance is underpinned by a consistent requirement for data classification, consistency, documentation, quality and ownership. In our opinion, banks need the following three high-level attributes to comply and to benefit fully from compliance:
- a coherent data management strategy, underpinned by a detailed understanding of the data and the interrelationships via data modelling solutions;
- a culture of data ownership and responsibility for risk data running through the organisation;
- robust oversight through an effective governance structure, driven by policy set by senior management.
Fig 2: Catalyst Data Modelling Framework
The route to compliance

We have created a clear, high-impact data modelling framework (illustrated below) that addresses the core challenges, fast-tracks the bank’s weakest areas and forms part of a coherent data management strategy that will achieve BCBS 239 compliance. Our approach goes beyond minimum compliance, providing a properly designed data management framework that significantly benefits the bank’s overall business strategy. In particular, it paves the way for:
- a clearer path to architecture simplification and application rationalisation, persistent cost drivers within the industry;
- a consistent approach to dealing with organisational complexity and data aggregation. Many banks are being challenged by clients to be more transparent and to provide better evidence to support their reporting.
14 Principles | Organisational Model | Control Framework | Programme Delivery | Application Architecture | Data Architecture
Governance | Governance, culture & communication strategy | Policy, TOM & data quality scorecard | DM strategy & programme governance | IT governance & accountability | Data governance & modelling standards/tools
Data architecture & IT infrastructure | Accountability, roles & responsibilities | Principles, data quality & architectural controls | Application, integration & data model change | Fit-for-purpose Risk target architecture (F2B) | Data ownership, concept model, LDM & PDM
Accuracy & integrity | Source data ownership & delivery responsibilities | Input, validation & reconciliation controls | Source application, data, model & process change | Target “golden source”, automation & validation | Taxonomies, metadata, common standards & IDs
Completeness | Risk requirements & source data delivery | Input, validation & reconciliation controls | Source application, data, model & process change | Source data, application and functional coverage | Full FIBO-aligned LDM, with lineage & derivation
Timeliness | Risk requirements & source data delivery | SLAs, delivery & monitoring controls | Application, integration, model & process change | Architecture design, automation & integration | State & communication models with contracts
Adaptability | Risk ownership & source data responsiveness | Stress/crisis procedures & scenario testing controls | Organisation, application, process & model change | Architecture design, flexibility & extensibility | LDM rules & constraints
Reporting accuracy | Precision requirements & risk data responsibilities | Precision validation & reconciliation controls | Organisational, process & risk application change | Application reports accurately reflect risk | Data lineage, derivation & aggregation
Comprehensiveness | Risk profile alignment & forward forecasting | Owner review & sign-off on risk data & reports | Organisational, process & risk application change | Exposure and position data for all risk areas | LDM aligned to bank complexity & risk profile
Clarity & usefulness | Risk data ownership & delivery responsibilities | Recipient review & sign-off on report clarity | Risk applications change & value-add interpretation | Meaningful & tailored risk application reports | Clear LDM design for core risk reporting
Frequency | Ad hoc requests & risk delivery responsibilities | Monitoring, review & scenario test controls | Risk applications & process change | Architecture, integration & system monitoring | State & communication models with contracts
Distribution | User segregation & risk delivery responsibilities | Recipient sign-off, access & segregation controls | Risk applications & process change | Security architecture & segregation capability | State models and user access & profiling
Supervisory review | Supervisor responsibility & regulator preparation | Supervisory & RCA controls | Project lifecycle support, DM review cycle | Application & architecture fit-for-purpose review | Regulatory & industry alignment (FIBO)
Remedial actions & supervisory measures | Supervisor empowered & regulator alignment | Data quality measures & LDM production controls | Change management support, DM change cycle | Risk front to back test & change environment | LDM to facilitate business decisions
Home/host cooperation | Global supervisory & regulator alignment | Governance controls & global change controls | Global project & change management alignment | Global application & architecture | Global and entity-wide LDM view
IT Data Management Operating Model
While it is essential to consider the complete picture, our framework also recognises nuances. For instance, banks may be mature – or more mature – in certain areas and, as a consequence, closer to compliance with those requirements in time to meet January 2016. Our model also addresses any data aggregation problem including, but not limited to, all aspects of BCBS 239.

For example, a front to back data model will not be as daunting for the bank if tools are available to automate the pre-loading of logical data from key end-user risk reports. The red areas highlight where we believe the biggest challenges lie and where we provide solutions to the areas of greatest weakness. The grid highlights, and gives more specific detail behind, the key requirements for stronger strategy, culture and oversight.
Benefitting from the wider vision

Our framework combines organisational development, operating model, risk, regulatory and operational expertise with IT and data modelling talent and tools, to produce a framework that not only specifically targets BCBS 239 compliance but also provides a clear template for a data management strategy for the whole organisation. The framework contains five streams of activity, addressing each principle in turn. Put together, they provide comprehensive coverage and a transparent communication strategy, to the organisation internally and to the regulators externally.

The data architecture stream requires particular focus, because only those banks which are most mature in developing a front to back data modelling approach are likely even to approach satisfying the BCBS 239 requirements by the relevant deadline. Even the most advanced banks recognise that achieving “real” data ownership and governance as part of an overall Data Management strategy is a major challenge.
BCBS 239 offers a rare opportunity for banks to recognise that excellent data management is a solution, not a burdensome administrative problem, and to embed quality measurement and data modelling best practice within the bank’s change culture.
Addressing areas of greatest need

Without belittling the range and complexity of issues posed by the BCBS 239 requirements, the reality is that most banks have a standard, well-rehearsed response. Our experience, reinforced by significant client feedback, confirms that for the majority of banks, the areas highlighted in our framework require the most fundamental change. Given the timeframes, they also require immediate attention. Specifically, both the Data Architecture and Organisational Model streams should be immediate high priorities.
Tackling data architecture

For most banks, Data Architecture is likely to be the least mature stream of activity. While the majority will model data, our experience shows that even the most mature have an inconsistent, piecemeal approach to the roll-out of data models and are a considerable distance from an effective front to back model. The approach here should be to start with the critical risk reporting data and work back to source data, before expanding to cover risk data, principle validation and gap remediation. Our tools fast-track the creation of data models, allowing clients to:
- develop working risk data models in a matter of hours, reflecting ‘as is’ risk systems;
- synthesise wide ranges of data into congruent, industry-standard forms;
- embed data quality, provenance and controls into architecture from the start;
- deploy to desktops with web browser access;
- enable rapid reporting, semantics and analytics over current applications, virtually independently of current problems and disconnects;
- provide clear evidence and easy access for regulatory reporting.
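The “start with the report and work back to source” approach can be illustrated with a toy mapping exercise (all field and system names here are invented for illustration): take the fields a risk report actually consumes, trace each back to a candidate source system, and treat anything unmapped as a remediation item.

```python
# Toy 'as-is' mapping exercise: which source system feeds each report field?
report_fields = ["counterparty_id", "net_exposure", "collateral_value", "stress_pnl"]

# Hypothetical source inventory discovered during analysis
source_catalogue = {
    "counterparty_id": "crm_master",
    "net_exposure": "risk_engine",
    "collateral_value": "collateral_system",
    # 'stress_pnl' has no identified source yet
}

# Split report fields into mapped lineage and remediation gaps
mapped = {f: source_catalogue[f] for f in report_fields if f in source_catalogue}
gaps = [f for f in report_fields if f not in source_catalogue]
# Each gap is a concrete work item before the model can be called front to back
```

Trivial as the mechanics are, this is the core analytical loop: the value lies in building the source catalogue, which is exactly the analysis that tooling can accelerate but not replace.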
This approach does not negate the fact that the analysis needs to be done and the data needs to be modelled. However, automation into the tool and effective distribution at the back end will make a major impact on productivity by focussing effort on core analysis. There is a consensus among banks that the challenge of modelling risk data fully, front to back, cannot reasonably be met by January 2016. It is therefore even more critical to be able to offer the regulator strong evidence of progress and activity. Rapid progress along a coherent Data Management strategy directly aligned with BCBS 239 is likely to be far more favourably received than a programme of system changes with additional manual controls and supervisory oversight which, while necessary, will not be satisfactory.
Achieving real culture change

A requirement for fundamental cultural change underpins the whole of BCBS 239. Responsibility is placed firmly at the top table, and existing approaches to data stewardship and control are judged insufficient. There is also a clear expectation that data ownership and responsibility will be required organisation-wide. This change will in itself be complex to achieve. In particular, data providers will need to be responsible for, and confident in, the quality of all the data they contribute to risk aggregation and reporting processes. For the Board to be able to act on risk data in good faith, there will need to be a clear chain of ownership and accountability of risk data right back to source. Currently, data validation and stewardship responsibility lives ‘downstream’. BCBS 239 requires a fundamental change – one that will not readily be accepted by data providers, who will (rightfully) claim they cannot be held responsible for how data is used elsewhere.
It will be essential to engage with these concerns in any proposals, transitional or handover arrangements, as well as to ensure ownership and responsibility are correctly modelled in the underpinning data lifecycle.
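The “chain of ownership back to source” can be made concrete with a minimal lineage sketch (system names, owners and transforms are hypothetical): each step in a risk figure’s journey carries a named accountable owner, so the chain can be walked from the Board report back to the original booking.

```python
from dataclasses import dataclass

@dataclass
class LineageStep:
    system: str      # application or report holding the data at this step
    owner: str       # accountable individual, desk or function
    transform: str   # what happened to the data at this step

def ownership_chain(lineage):
    """Walk a risk figure's lineage from report back to source,
    returning the system and accountable owner at each step."""
    return [(step.system, step.owner) for step in reversed(lineage)]

# Source-to-report lineage for a single aggregated exposure figure
lineage = [
    LineageStep("trade_capture", "FX desk", "original booking"),
    LineageStep("risk_engine", "Market Risk IT", "revaluation & netting"),
    LineageStep("board_report", "CRO office", "aggregation to Board pack"),
]
chain = ownership_chain(lineage)
# chain[0] is the report and its owner; chain[-1] the original source and its owner
```

The point of modelling ownership explicitly is that accountability becomes a queryable property of the data lifecycle rather than an organisational assertion.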
Benefits beyond basic compliance

“Effective implementation of the Principles should increase the value of the bank.”

BCBS 239 aims to instil the need for a safe and coherent risk data management strategy through a formal framework of principles. It is, however, widely recognised that, done well, it will also bring far-reaching benefits, well beyond compliance. In our view, all Boards should seize this opportunity to address fundamental, hitherto hidden problems and put in place a robust foundation for growth.

A full front to back model covering all data (not just risk data) may feel a distant aspiration. But starting with the risk data use case will, in time, allow programmes right across the business to benefit from the stronger foundations this new, more robust approach creates. Once a bank has a mature approach to data modelling, its data models will be seamlessly and inextricably linked, from business conceptual or semantic models, through logical and physical data models, and ultimately to production data. This means it will be able to use data models to make far more meaningful decisions on programme development, based on a firm view of reality. As data becomes better understood and modelled through aggregation processes and reporting, the bank will also be able to build more effective automated solutions within an improved target application and integration architecture. This will in turn avoid four key risks and costs:
- significant man-days’ effort required by manual adjustments to risk, finance and regulatory reports;
- high levels of operational risk created by manual or semi-manual data manipulation and transformation;
- significant man-days’ effort required to rework data analysis as each programme or project initialises;
- the proliferation of applications that duplicate functionality and create new views of the same underlying data.
Currently, perhaps the two most obvious, topical use cases are Application Simplification or Rationalisation and Client Reporting. The latter is a further instance of creating clarity on aggregated and reported data, albeit to clients rather than regulators. The former is more fundamental: once applications are truly mapped to a consistent data model built under a common language, organisations will be able to drive towards a properly federated architecture, with single (and best) applications each being the single source for the data for which they are responsible.
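The “single best application per data domain” idea can be sketched as a simple registry that rejects a second claimant for the same domain (domain and application names are hypothetical):

```python
class GoldenSourceRegistry:
    """Each data domain has exactly one owning application: registering a
    second application for the same domain is an architecture violation."""

    def __init__(self):
        self._owners = {}

    def register(self, domain, application):
        # Re-registering the same application is harmless; a rival claim is not
        if domain in self._owners and self._owners[domain] != application:
            raise ValueError(
                f"{domain!r} is already owned by {self._owners[domain]!r}; "
                f"{application!r} must consume that source, not duplicate it"
            )
        self._owners[domain] = application

    def owner_of(self, domain):
        return self._owners[domain]

registry = GoldenSourceRegistry()
registry.register("counterparty", "crm_master")
registry.register("positions", "risk_engine")
# registry.register("counterparty", "spreadsheet_42")  # would raise ValueError
```

In a real architecture the “registry” would be governance policy and a catalogued target architecture rather than code, but the invariant is the same: one domain, one golden source, and every other application a consumer.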
Conclusion

Clearly, any G-SIB which has not already done so must immediately initiate a programme, with the appropriate governance and budgetary and resource commitment, to deliver over the next 14 months. Those who have already started work should ensure that they are embarked on a programme that will deliver wider benefits than compliance alone. Compliance will demand substantial effort. Banks should quickly establish a Data Management strategy that targets their current data modelling weaknesses and makes real headway in breaking down the cultural barriers to business data ownership and responsibility.
Once established, other key challenges, such as automation and data quality, should be addressed with the right framework and under the right governance. D-SIBs should assume that they too will shortly be held to the same standards, as regional regulators are currently actively considering this. Although D-SIBs have not yet been notified of their status, it is relatively simple to work out whether a bank will be designated as a D-SIB3. Above all, whether or not officially designated, all banks should embrace BCBS 239. If managed properly, it offers benefits far beyond compliance – an integrated data management framework for risk data that acts as a powerful tool, allowing your organisation to measure, analyse, communicate and take business action on risk faster, more safely, more accurately and with greater accountability than ever before.
How can we help?

Our approach to BCBS 239 is innovative and unique. We combine market-leading experts in risk and regulation, IT and data modelling, operating model, governance and organisational development. Our framework not only allows banks to expedite compliance, but also delivers the wider strategic impact intended, namely “gains in efficiency, reduced probability of losses and enhanced strategic decision-making and ultimately increased profitability… Effective implementation of the Principles should increase the value of the bank.” 4
3 http://www.bis.org/publ/bcbs233.pdf
4 http://www.bis.org/publ/bcbs239.pdf
Meet our authors
About Catalyst
We are experts in optimising our clients’ balance sheet, reducing the total cost of trading and enabling regulatory compliance. We work in joint teams with our clients, combining our experience in financial markets and programme execution to deliver results. We provide honest guidance to help you succeed. We are Catalysts for enduring excellence.
Christian Lee Christian leads Catalyst’s Clearing, Risk and Regulatory team. He is acknowledged as a world-leading authority on risk, with in-depth specialist knowledge of OTC clearing and experience in a variety of risk management roles and specialisms, including market and credit risk, financial markets, middle office and regulatory matters. As Head of Risk at LCH, Christian managed the Lehman default: the biggest market-shaping event of recent times.
Stephen Loosley Stephen is a leading expert in OTC clearing. Having originally joined LCH.Clearnet as an analyst, he left as Head of SwapClear’s middle office, running risk and operations for $300tr of interest rate swaps across 50 banks. Stephen leads Catalyst’s international delivery teams in specialist OTC clearing, risk and regulatory projects.
Disclaimer: Comments in this paper are based on Catalyst’s understanding of the global regulatory landscape as at November 2014.
This document is neither intended to be comprehensive, nor to provide legal or accounting advice.
Fraser Sharpe
Fraser is an expert in IT and banking operations and a specialist in applications development, business analysis, programme and project management, global and regional departmental leadership, front to back technology, project lifecycle and business functions. Through the diversity of this experience, Fraser has extensive, detailed product, functional and process expertise.