Taking a Holistic Approach to Regulatory Change

AxiomSL | Inside View - Data Governance and BCBS239

May 23, 2016

AxiomSL CEO Alex Tsigutkin recently joined a group of banking and regulatory experts at the annual North American Financial Information Summit in New York on May 18, 2016 for a conversation on the pace of regulatory change and its consequences for data managers.

Throughout a wide-ranging discussion, the group agreed on a number of core enterprise-level and industry-wide approaches that will help financial institutions succeed as waves of regulation become the new normal—specifically around data governance strategy, risk aggregation best practices, and data interoperability.

Below is a quick roundup of these points and how they fit together.

Governance: Defining the Strategic Approach

The panelists, including voices from Deutsche Bank, State Street and the Federal Reserve, first agreed that new breeds of regulatory framework—from liquidity requirements to BCBS 239 to capital adequacy reviews—require firms to take a step back, examine their entire data management framework and reassess how their data is governed.

Two areas were highlighted for consideration. First, as a strategic starting point, banks should consider integrating their risk and finance data functions. While these functions have historically sat in separate spaces, rules from global prudential regulators increasingly require them to align and collaborate on risk reporting, with finance taking on a broader role and greater responsibility for information accuracy and data quality than ever before.

Second, banks need to strengthen their use of technology “enablers”—including workflows, analytics, data lineage and other functionalities—as they refine data governance to cope with regulatory requirements. These functionalities get key data in front of the right eyes faster and more accessibly, while behind the scenes they allow institutions to adapt to new regulations quickly without reengineering their systems.
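To make the data lineage “enabler” concrete, below is a minimal sketch in Python of how lineage metadata might travel with a reported figure. The metric, system names and transformation labels are hypothetical, invented for illustration; this is not a description of AxiomSL’s platform or any particular implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    """A reported figure plus the trail back to its sources (illustrative only)."""
    metric: str                   # line item on a regulatory report
    value: float
    source_systems: list[str]     # upstream systems the value was drawn from
    transformations: list[str]    # steps applied along the way
    as_of: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# A hypothetical liquidity figure that can be traced end to end on demand.
hqla = LineageRecord(
    metric="HQLA_Level1",
    value=1.2e9,
    source_systems=["treasury_ledger", "collateral_mgmt"],
    transformations=["fx_normalize(USD)", "aggregate(sum)"],
)

print(f"{hqla.metric} = {hqla.value:,.0f}",
      f"from {hqla.source_systems} via {hqla.transformations}")
```

The point of a record like this is that when a regulator asks where a number came from, the answer is carried with the number itself rather than reconstructed by hand.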

Grappling with Risk Aggregation

No topic in data governance is more challenging than risk data aggregation and reporting (RDAR), a market now worth over $10 billion in its own right, so the group tackled this sprawling challenge in depth.

Many of the strategic issues cited above are especially acute in this area. Aggregating risk information at a global bank is incredibly challenging: the myriad systems involved reflect differences across the many jurisdictions in which the bank operates, merger and acquisition histories that complicate mapping to the relevant data, bolted-on business lines, and short-term workarounds built to meet stringent deadlines. Ensuring that controls are in place throughout this entire process, so the institution can adapt quickly to change, is a monumental task.

Two streams of regulatory pressure are pushing this part of the organization to the fore. One is timeliness: major banks must have accurate position data available for a wider variety of reporting purposes, and more often. The other is examination of data governance itself. The 14 BCBS 239 principles, for example, highlight the need to look beyond the finished reporting product and delve into the underlying logic of the process. The implicit message is that RDAR must now be viewed as an enterprise-level capability, rather than a siloed, surface-only exercise. Industry progress on this front has been slow, but arguably productive.

Coping with both of these pressures is a heavy lift, requiring strengthened aggregation capabilities—bridging multiple systems together without interruption—as well as expertise in developing data lineage and controls. Banks that actively engage with RDAR will find ways to justify the technology investment involved, including the chance to shape standardization efforts, while those lagging behind expose themselves to increasing operational risk and even penalties down the line.
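As a hedged illustration of the aggregation problem, the sketch below combines position data from two hypothetical source systems with different record layouts into one firm-wide view; the system names, fields and figures are invented for the example.

```python
# Two hypothetical source systems carrying the same bank's positions
# in different shapes (all names and records invented).
equities_system = [
    {"desk": "EQ-NY", "isin": "US0378331005", "qty": 10_000, "ccy": "USD"},
]
fixed_income_system = [
    {"book": "FI-LDN", "identifier": "US912828U816",
     "position": 5_000_000, "currency": "USD"},
]

def normalize(record: dict, mapping: dict) -> dict:
    """Rename source-specific fields into a common schema before aggregating."""
    return {target: record[source] for target, source in mapping.items()}

common = (
    [normalize(r, {"unit": "desk", "instrument": "isin",
                   "amount": "qty", "ccy": "ccy"}) for r in equities_system]
    + [normalize(r, {"unit": "book", "instrument": "identifier",
                     "amount": "position", "ccy": "currency"})
       for r in fixed_income_system]
)

# Firm-wide exposure by currency: the kind of figure RDAR must produce quickly.
totals: dict[str, float] = {}
for row in common:
    totals[row["ccy"]] = totals.get(row["ccy"], 0) + row["amount"]
print(totals)  # {'USD': 5010000}
```

At a real institution the mapping step alone, multiplied across hundreds of systems and jurisdictions, is where much of the cost and operational risk lives.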

Interoperable Data Standards

Indeed, standardization remains a crucial topic for regulators and banks alike. Part of the reason risk reporting is so difficult is that, as transactions take place, taxonomic and ontological differences in data management across firms (and even within them) can lead to significant delays, errors and exceptions to manage, and a mess for the subsequent reporting process.
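A toy example of the taxonomy problem, with all codes invented: two systems label the same counterparty type differently, and anything that cannot be translated into a shared vocabulary falls out as an exception to be worked by hand.

```python
# Each system's local vocabulary, mapped to a shared taxonomy (codes invented).
SYSTEM_A = {"BK": "credit_institution", "HF": "hedge_fund"}
SYSTEM_B = {"BANK": "credit_institution", "FUND-H": "hedge_fund"}  # no entry for "SPV"

trades = [("BK", SYSTEM_A), ("FUND-H", SYSTEM_B), ("SPV", SYSTEM_B)]

# Anything that cannot be translated into the shared vocabulary becomes
# an exception to resolve by hand before reporting can proceed.
exceptions = [code for code, vocab in trades if vocab.get(code) is None]
print("exceptions to resolve:", exceptions)  # ['SPV']
```

Every such exception is a delay, and every divergent vocabulary multiplies them.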

Perhaps the industry could get away with this before, but higher volumes of activity, together with heightened scrutiny of systemic risk and especially the interrelationships between banks, will expose these issues far more readily today. Tracking the iterations the data goes through becomes a challenge as well, leaving a mess to sort out before reports can finally be produced; those reports, in turn, may be deliverable in slightly different formats depending on the country. The twin goals of transparency and operational efficiency become harder to achieve as a result.

In short, this is a major—yet broadly fixable—problem, the panel said.

Any holistic approach to data governance must include collective efforts at speaking the same language. Interoperable standards have already proven highly valuable, with XBRL cited as one successful illustration. A single interoperable scheme of this kind—as opposed to XML-based reporting schemas that vary by national regulator—would make it much simpler for firms to report, while giving regulators the more precise view into exposures they are looking for. Progress here could have the additional benefit of standardizing data transmission further upstream as well.
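To suggest what such a shared scheme looks like in practice, here is a minimal sketch using Python’s standard library to tag a single fact in an XBRL-like way. The taxonomy, concept and attribute names are simplified illustrations and do not reproduce the actual XBRL specification.

```python
import xml.etree.ElementTree as ET

# One fact tagged once against a shared concept, readable by any regulator
# using the same taxonomy (taxonomy and concept names are illustrative only).
root = ET.Element("report", attrib={"taxonomy": "example-prudential-2016"})
fact = ET.SubElement(root, "fact", attrib={
    "concept": "Tier1Capital",     # a shared concept, not a national field name
    "entity": "EXAMPLE-LEI-0001",
    "period": "2016-03-31",
    "unit": "USD",
})
fact.text = "15000000000"

print(ET.tostring(root, encoding="unicode"))
```

Tagged once against a common taxonomy, the same fact can serve multiple regulators without being reshaped for each national format.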

As is often the case, the issue is actually less about expense—making these changes, and leveling the playing field for the software community, would cost relatively little—and more about engagement and negotiation.

But much like functional integration and stronger aggregation capabilities, standardization will provide the necessary technical bedrock for firms as they face fast-moving and wide-ranging regulatory change in the years to come.