From Key Regulatory Drivers to Key Business Drivers

AxiomSL | Inside View – Data Governance and BCBS 239

May 12, 2017 – Alex Tsigutkin, CEO

Today’s financial services firms sit at a confluence of major internal and external change. They must address more intrusive, fragmented, and data-intensive regulatory regimes that demand greater economies of scale, information consistency, and operational transparency. As a result, optimizing business processes and enforcing stronger data integrity and controls is a necessity for responding successfully to the wide range of prudential reporting requirements and rising standards around enterprise data management, including but not limited to BCBS 239, FRTB, MiFID II, CFO Attestation, and IFRS 9/CECL. A strong IT infrastructure that provides data lineage, automated workflow, analytics, reconciliation, and other business functions is crucial for firms facing these evolving mandates. At the same time, C-suites and top executives are asking their operations and technology teams to do more with smaller budgets. Investments must therefore be optimized to deliver timely, cost-effective, and demonstrable results. In short: operational optimization.

AxiomSL’s data-driven infrastructure enables financial firms to adopt a strategic posture and acts as a catalyst for an integrated Risk, Finance, and Operations data environment, where close coordination with other data-related initiatives (e.g., Basel III, stress testing, BCBS 239/RDARR, CCAR, liquidity, FRTB, MiFID, IFRS 9) is needed to optimize regulatory spending through streamlined processes. The data-driven approach also strengthens internal controls across the enterprise by giving senior executives the capability to design and monitor processes while enforcing control frameworks at the business-unit and product level.

In summary, to adapt to the changing regulatory and business environment, firms will need to build a robust, transparent, and flexible infrastructure that can: a) deliver the scalability to operate on larger data sets and run risk analysis over longer historical periods; b) improve data management to curate and store much larger volumes of data; c) adapt quickly to new and more stringent regulatory, business, and technical mandates; d) reuse and leverage data and processes, avoiding siloed responses, to deliver trusted information; and e) provide data lineage that identifies the source of every input and shows how that data moves through the entire workflow, as illustrated in the sketch below.
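To make requirement (e) concrete, the following is a minimal, hypothetical sketch of record-level lineage tracking in Python. The LineageRecord class, its field names, and the netting and currency-conversion steps are illustrative assumptions for this article only, not a description of AxiomSL’s platform; the point is simply that every reported value carries its source and the ordered list of transformations applied to it, so it can be traced end to end.

```python
# Hypothetical sketch of record-level data lineage tracking.
# Names and steps are illustrative, not a vendor implementation.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class LineageRecord:
    """A data value together with its source and the steps applied to it."""
    value: float
    source: str                       # originating system or file
    steps: List[str] = field(default_factory=list)

    def apply(self, step_name: str, fn: Callable[[float], float]) -> "LineageRecord":
        """Apply a transformation and append it to the lineage trail."""
        return LineageRecord(
            value=fn(self.value),
            source=self.source,
            steps=self.steps + [step_name],
        )

    def trace(self) -> str:
        """Human-readable lineage: source -> each step -> current value."""
        path = " -> ".join([self.source] + self.steps)
        return f"{path} = {self.value}"


# Usage: a gross exposure sourced from a trading system, netted and
# converted before it lands in a regulatory report cell.
exposure = LineageRecord(value=1_000_000.0, source="trading_system.trades")
exposure = exposure.apply("apply_netting", lambda v: v * 0.85)
exposure = exposure.apply("convert_to_usd", lambda v: v * 1.10)
print(exposure.trace())  # prints the full path from source to reported value
```

In a production environment the same idea would be carried by the platform’s metadata rather than by application code, but the sketch shows why lineage answers both questions regulators ask: where did this number come from, and what happened to it along the way.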