Does Your Firm Distrust Its Own Data?

July 27, 2017
By Eugene Grygo

A new survey from AxiomSL finds that a significant number of executives at securities firms harbor doubts about the reliability of the data that is the lifeblood of their operations.

(AxiomSL, a risk data management and regulatory reporting solutions vendor, recently surveyed 132 senior-level risk and regulatory executives in North America and Asia-Pacific (APAC) about their regulatory technology concerns. One of the key takeaways from the survey is that more than half of the Asia-Pacific respondents at the executive level and more than one-third of North American executives say that they do not have faith in the accuracy of their firms’ data. FTF News got time with Gordon Elliot, the chief operating officer (COO) of AxiomSL, and asked him about this disturbing distrust of data.)

Q: Why do APAC firms distrust their own data?

A: Regulators in countries like Singapore and Australia are implementing significant changes to core regulatory reporting requirements, some of which had not changed in the last 15 years.

Although most firms will already have the information needed to meet existing requirements, very often these numbers were pulled, using reference data, from legacy systems working in silos that may not be able to consolidate or reconcile with one another.

Over the years, banks have developed a patchwork of data sources and systems that creates unnecessary friction, compromising data lineage and quality. In that vein, it’s not entirely accurate to say that banks distrust their data per se; it is more that they are becoming increasingly aware of the shortcomings of their existing data collection processes and data quality.

Q: What are they doing to make their data reliable?

A: The Basel Committee on Banking Supervision [BCBS] guidelines known as BCBS 239, the principles for effective risk data aggregation and risk reporting, triggered the need among financial firms to enhance their data governance processes.

For example, BCBS 239 comes on the heels of the Federal Reserve’s “Seven Principles of an Effective Capital Adequacy Process,” requiring that chief financial officers establish internal controls and data accuracy attestation requirements for their FR Y-14 reports and CCAR [Comprehensive Capital Analysis and Review].

This CFO attestation is more than a sign-off; it is a comprehensive process that includes continuous monitoring to bring end-to-end accountability for data, and it is one that many of our tier-one clients have taken with extraordinary seriousness. But they are not alone.
Broker-dealers are feeling similar urgency of their own around Net Capital Calculations, while asset managers cope with greater shareholding disclosure requirements, and tier-two and tier-three banks prepare for their own eventual regulatory mandates.

In other words, a data governance approach to risk reporting is no longer just a conceptual priority; it already reaches far beyond the touchstone concerns of BCBS 239.

As a result, institutions of all sizes have new impetus to create a single, integrated and holistic reporting platform, one that allows firms to collaborate across desks, business units and geographies, and that delivers capabilities from internal analysis and aggregation through final regulatory disclosure in a comprehensive manner.

Though the details and flavors vary significantly, all of these initiatives (CCAR, BCBS 239, IFRS 9, etc.) — whether frameworks, standards or prescriptive requirements — reinforce much of what AxiomSL has supported all along: an effective governance framework that establishes data lineage, standardizes and streamlines metadata characteristics, and is integrated across enterprise functions like risk, finance and operations as much as possible.

It’s the only way to cope with the business and regulatory requirements yet to come.

It is truly the foundation, and having robust and agile technology in place today is vital to ensure that management and regulators alike have full control over, and confidence in, the firm’s data and processes.

Q: What is data lineage?

A: Data lineage is key to the foundation of data strategy and quality.

Understanding the origin of the data and the flows and transformations it has gone through gives an organization a complete picture that ensures data integrity. For regulatory disclosure, it is critical to be able to respond quickly to inquiries about the numbers in regulator-facing reports.

The right lineage solution brings business and technology users together and should be able to answer questions like the following (a minimal sketch follows this list):

  • Which data sources are used in the generation of a report or reporting line?
  • What data transformation and business logic is applied to populate a reporting line/cell?
  • Can I export data lineage information in a readable and printable format to share with team members, keep track of business logic updates between different reporting cycles and for audit?
  • Can I get the list of all different reports that utilize a data source?
  • What will be the impact of updating or decommissioning a data attribute?
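
To make these questions concrete, here is a minimal, hypothetical sketch in Python of a lineage graph that can answer two of them: which data sources feed a reporting line, and what is affected when a feed or attribute is updated or decommissioned. The feed names, report line and transformation descriptions are invented for illustration and are not drawn from AxiomSL’s products.

```python
# Hypothetical lineage-graph sketch: node names and business logic
# descriptions are illustrative only, not any vendor's actual model.
from collections import defaultdict

class LineageGraph:
    def __init__(self):
        self.downstream = defaultdict(set)  # node -> nodes it feeds
        self.upstream = defaultdict(set)    # node -> nodes it is fed by
        self.logic = {}                     # (source, target) -> transformation applied

    def add_flow(self, source, target, logic=""):
        """Record that `source` flows into `target` via the given business logic."""
        self.downstream[source].add(target)
        self.upstream[target].add(source)
        self.logic[(source, target)] = logic

    def sources_of(self, node):
        """Which data sources (direct or indirect) feed a report or reporting line?"""
        found, stack = set(), [node]
        while stack:
            for parent in self.upstream[stack.pop()]:
                if parent not in found:
                    found.add(parent)
                    stack.append(parent)
        return found

    def impact_of(self, node):
        """What is affected downstream if this attribute is updated or decommissioned?"""
        found, stack = set(), [node]
        while stack:
            for child in self.downstream[stack.pop()]:
                if child not in found:
                    found.add(child)
                    stack.append(child)
        return found

# Illustrative usage with made-up feeds and a made-up reporting line.
g = LineageGraph()
g.add_flow("gl_balances", "credit_exposure", logic="aggregate by counterparty")
g.add_flow("trade_feed", "credit_exposure", logic="join on counterparty id")
g.add_flow("credit_exposure", "FR Y-14 / line 12", logic="map to regulatory cell")

print(g.sources_of("FR Y-14 / line 12"))  # {'credit_exposure', 'gl_balances', 'trade_feed'}
print(g.impact_of("trade_feed"))          # {'credit_exposure', 'FR Y-14 / line 12'}
```

A production lineage tool would of course also carry the documentation and metadata needed to export such a trail in a readable, auditable format, as the questions above suggest.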

Industry initiatives such as BCBS 239 have pushed data lineage to the fore.

Whereas data lineage was once primarily a back-office and audit concern, it has now moved forward to front-office end users, who require access to granular detail about data flows and governance rules without being overwhelmed. Meanwhile, regulatory bodies have begun examining metadata more closely as well.

Very few data governance tools have historically achieved such a balance of drill-down and traceability capability, documentation and filtering.

Q: What process governance changes would improve data quality?

A: To enhance process governance, firms need to adopt a change management approach built on a collaborative, integrated platform.

Change management is no longer confined to the finance and controllers’ departments.

Instead, it has extended across enterprise IT, the CDO [chief data officer] office and business functions, to the CFO, to treasury and to risk and accounting.

It’s no longer enough to just produce a report for regulators; they want to know about data sources, aggregation and how the information has navigated through the entire workflow process, with full transparency. They are asking not only for granular and accurate regulatory reporting, but are also requiring banks to demonstrate full data and process lineage across core enterprise financial and risk management functions. Further, as stated earlier, they want top management at firms to attest to the accuracy of the information.

In summary, to adapt to the changing regulatory and business environment and provide better process governance, firms will need to build a robust, transparent and flexible data-driven infrastructure that should:

  • Deliver the scalability to operate on larger data sets, and run risk analysis on longer historical time periods;
  • Improve data management to curate and store much larger sets of data;
  • Provide flexible infrastructure to adapt quickly to new and more stringent regulatory, business and technical mandates;
  • Re-use and leverage data and processes while avoiding siloed responses to deliver trusted information;
  • And provide data lineage that identifies the source of every input and demonstrates how each input navigates the entire workflow process (a minimal sketch follows this list).
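
As a rough illustration of that last point, the hypothetical Python sketch below records, for every derived figure, the sources it ultimately came from and the ordered workflow steps applied along the way. The feed names, step names and numbers are invented and do not represent any particular platform.

```python
# Hypothetical sketch of process lineage: every derived value carries the
# feeds it came from and the ordered steps that produced it.
from dataclasses import dataclass, field

@dataclass
class Traced:
    value: float
    source: str                                   # where the number ultimately came from
    history: list = field(default_factory=list)   # ordered workflow steps applied so far

def apply_step(step_name, func, *inputs):
    """Run one workflow step and append it to the provenance trail."""
    result = func(*(t.value for t in inputs))
    history = [h for t in inputs for h in t.history] + [step_name]
    sources = ", ".join(sorted({t.source for t in inputs}))
    return Traced(value=result, source=sources, history=history)

# Illustrative flow from raw feeds to a reported figure.
loans = Traced(1_250_000.0, source="core_banking_feed")
provisions = Traced(75_000.0, source="risk_engine_feed")

net = apply_step("net_of_provisions", lambda a, b: a - b, loans, provisions)
reported = apply_step("round_to_thousands", lambda x: round(x, -3), net)

print(reported.value)    # 1175000.0
print(reported.source)   # core_banking_feed, risk_engine_feed
print(reported.history)  # ['net_of_provisions', 'round_to_thousands']
```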