October 26, 2016 –
In the face of pressures from the incoming IFRS 9 regime, walls between risk and finance divisions must tumble and co-operation and communication take centre stage, says Jean-Bernard Caen, subject matter expert at AxiomSL, who fleshes out this challenge in a conversation with Risk.net
IFRS 9 has been designed to value assets and liabilities in a more risk-sensitive manner than under the prevailing IAS 39 framework. The pressures of the incoming regime can only be met if the risk and finance divisions within banks learn to work more closely together. Presently, risk and accounting data is aggregated separately and obeys different rules and norms. Yet inputs from across these divisions are needed to generate IFRS 9 reports.
How would you characterise the challenge of implementing IFRS 9 for financial institutions?
Jean-Bernard Caen: Banks have assigned a high priority to their implementation projects and allocated significant budgets to them. However, these projects are not proceeding as efficiently as they should. The major reason is the difficulty getting risk and finance professionals to communicate. They have been accustomed to working independently, and they have their own language, culture and data frameworks that do not fit easily together. This is a real constraint.
In this market landscape of analysis and regulatory compliance, financial institutions that are successfully implementing BCBS 239 and data management within their organisations are better prepared for IFRS 9. AxiomSL's enterprise data management capabilities include the ability to map, aggregate and enrich source data and models across risk and finance, automate workflow processes, and reconcile and validate data, as well as deliver visualisation, collaboration and reporting tools.
How will the relationship between the chief risk officer (CRO) and chief financial officer (CFO) need to change in an IFRS 9 environment?
Jean-Bernard Caen: They will have to learn to communicate. At present, they have very clear and separate responsibilities. The CFO signs off on the accounts and is used to talking in terms of assets and liabilities – precise values. In contrast, the CRO is in charge of ensuring that the firm is compliant with regulatory requirements and he speaks the language of probabilities and statistics. Up to now, accounting has been mainly backward-looking; with the introduction of risk sensitivity, forward-looking risk items will change the nature of accounting. Under IFRS 9 they will have to co-operate. The CRO will be delegated responsibility to set up the credit risk models to assess asset impairments, while the CFO will take charge of mapping those model outputs onto the published accounts.
At present, accounting is straightforward – for an asset you book the initial value and, under certain conditions, show amortisation. Under IFRS 9, accounting becomes much more complex. The value of an asset will be the discounted value of its projected cashflows. Moreover, these projections will need to exhibit a granularity beyond what the risk function is used to producing under the exposure-at-default approach. Both functions will have to embrace this new logic and come to a shared understanding.
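The valuation logic described above can be sketched in a few lines of Python. The cashflows, the flat annual discount rate and the yearly compounding are illustrative assumptions for this sketch, not a prescription from the standard:

```python
# Illustrative sketch: valuing an asset as the present value of its
# projected cashflows, as IFRS 9 requires. All figures are hypothetical.

def present_value(cashflows, annual_rate):
    """Discount a list of yearly cashflows (years 1, 2, ...) at a flat rate."""
    return sum(cf / (1 + annual_rate) ** t
               for t, cf in enumerate(cashflows, start=1))

# A loan projected to pay 50 a year for three years plus 1,000 at maturity,
# discounted at an assumed flat rate of 5%
projected = [50.0, 50.0, 1050.0]
value = present_value(projected, annual_rate=0.05)
print(round(value, 2))
```

Because the projections, not just the booked value, drive the number, any change in the expected cashflows flows straight through to the accounts.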
What are the challenges involved in producing the correct assessment of expected credit loss?
Jean-Bernard Caen: With IFRS 9, the International Accounting Standards Board is demanding that firms take an economic approach to credit risk. This means calculating the risk of loss on an asset up to maturity with proper recovery estimates and a forward-looking stance, rather than relying on simplistic models, such as the regulatory internal ratings-based model. Essentially, banks must completely review how they assess credit risk – an annual review of credit ratings will no longer be sufficient.
The second challenge will be to assess the probability of default to maturity, rather than simply to a one-year horizon, which requires access to vast amounts of data to generate these estimates. The third challenge will be for banks to include forward-looking scenarios, as under IFRS 9 they will have to be explicit about how they see the economy evolving.
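As a rough illustration of the move from a one-year horizon to maturity, a lifetime probability of default can be built up from yearly conditional PDs via survival probabilities. The figures below are hypothetical, and real term structures would be estimated from the large data histories mentioned above:

```python
# Hypothetical sketch: extending a one-year view of default to maturity.
# yearly_pds are assumed conditional (marginal) default probabilities
# for each future year; all figures are illustrative.

def cumulative_pd(yearly_pds):
    """Probability of defaulting at any point before maturity."""
    survival = 1.0
    for pd in yearly_pds:
        survival *= (1.0 - pd)
    return 1.0 - survival

# A five-year exposure with an assumed 2% conditional PD each year
pds = [0.02] * 5
print(round(cumulative_pd(pds), 4))  # noticeably above the one-year 2%
```

The lifetime figure is materially higher than the one-year figure, which is precisely why a one-year horizon is no longer sufficient under IFRS 9.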
What should financial institutions prioritise: modelling expected losses under IFRS 9 or ensuring data governance is up to scratch?
Jean-Bernard Caen: Data. Without the right data, model outputs will be flawed. Modelling is more ‘sexy’. Banks have plenty of people skilled at modelling and too few dedicated to data management. The large amounts of data needed to satisfy IFRS 9 requirements are either not stored in firms’ existing databases or, if they are, have not been used for years and may prove unreliable.
A focus on modelling can send banks down the wrong path. For example, they have to transition from using their one-year through-the-cycle probability-of-default models to point-in-time models and lifetime models for assessing impairments. However, what I have learned from industry surveys is that many banks are considering making this transition by merely installing a multiplier into their one-year models. That is not the right way to go about this. The right way is to start with the data – analysing the term structure of projected defaults by product and counterparty type, and sourcing useful economic data to produce forward-looking scenarios. Some banks are considering asking rating agencies to provide this data, which is a possible first step.
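A hypothetical comparison makes the point. Scaling the one-year PD by a multiplier ignores the term structure of projected defaults, so it can misstate lifetime risk relative to a survival-based calculation built from data. All figures below are invented for illustration:

```python
# Illustrative comparison (hypothetical figures): a naive multiplier on the
# one-year PD versus a lifetime PD built from an assumed data-driven term
# structure of conditional yearly PDs.

one_year_pd = 0.02
horizon = 5

# Naive approach: scale the one-year figure by the number of years
naive_lifetime = one_year_pd * horizon

# Term-structure approach: conditional PDs vary by year (e.g. seasoning)
term_structure = [0.010, 0.018, 0.025, 0.022, 0.020]
survival = 1.0
for pd in term_structure:
    survival *= 1.0 - pd
lifetime_pd = 1.0 - survival

print(round(naive_lifetime, 4), round(lifetime_pd, 4))
```

With these invented inputs the two approaches diverge, and the gap depends entirely on the shape of the term structure, which only data analysis can reveal.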
What is the relationship between BCBS 239 and IFRS 9, and how should institutions factor this into their IFRS 9 implementation plans?
Jean-Bernard Caen: BCBS 239 sets out rules and principles that banks must apply to their risk data aggregation processes and reporting. To be compliant, banks must adopt processes to ensure the quality and reliability of their risk data, conduct more granular analyses of this data and increase the frequency of data reviews. In short, firms have to update their IT infrastructures.
However, this revamp does not apply to accounting data. Yet IFRS 9 will require the use of data that is being constrained by BCBS 239. Once again, this means there will need to be greater co-operation between risk and accounting functions. The ideal scenario is one where BCBS 239-constrained data is easy for the accounting function to access and any questions as to the origin of the data are readily answerable.
How can risk and accounting data be reconciled to fulfil IFRS 9 requirements?
Jean-Bernard Caen: Risk data and accounting data will always be structured differently. The goal should be for banks to be able to link accounting data with risk data. This requires a uniform segmentation of assets and liabilities into products and portfolios – one that is coherent across risk and finance.
From a technology perspective, the IFRS 9 requirement to converge risk and finance environments is now more compelling than ever. Duplication of work – such as parallel processes making the same set of calculations without co-ordination with other teams – should be minimised, and these processes should be automated to improve the soundness and reliability of the banks’ results.
AxiomSL’s data-driven structure enables banks to adopt a strategic posture beyond merely complying with minimum reporting and compliance requirements. AxiomSL’s change management platform acts as a catalyst to move to a converged risk and finance data environment, where close co-ordination with other data-related initiatives such as Basel III, stress testing, BCBS 239, Comprehensive Capital Analysis and Review regulation, and the Fundamental Review of the Trading Book provides senior managers with the ability to monitor redesigned processes and enforce control frameworks at business unit and product levels.
How important is it for institutions to be able to trace the lineage of data inputs into their IFRS 9 systems?
Jean-Bernard Caen: The way most banks work today is to originate data in their production systems and run it through an extract, transform and load (ETL) process. The data is then fed into an intermediary database from which the bank produces its regulatory and management reports. The problem with this approach is that it is difficult to maintain data quality and lineage when data is transformed by the process. Furthermore, if there is a change at the production system level, then the ETL processes have to be modified accordingly, meaning that frequent and painful reviews are the norm.
This also means that the data trail can go cold. If a production system is changed, and data generated under the previous system is warehoused separately, you lose the ability to data mine and trace that data to its original source. This limits the flexibility and resilience of the data environment.
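One way to picture full data lineage (a minimal sketch, not AxiomSL's actual implementation) is to carry an ordered trail of provenance metadata with every data point as it moves through transformations; all names and figures here are illustrative:

```python
# Minimal sketch of lineage tracking: each data point carries an ordered
# trail recording its originating system and every transformation applied.
# System names, step names and values are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DataPoint:
    value: float
    lineage: list = field(default_factory=list)  # ordered provenance trail

    def transform(self, fn, step_name):
        """Apply a transformation and record it in the lineage trail."""
        return DataPoint(fn(self.value), self.lineage + [step_name])

# Originate in a production system, then transform en route to a report
raw = DataPoint(1250.0, ["loan_system:LN-001"])
reported = raw.transform(lambda v: v / 1000, "convert_to_thousands")

print(reported.value, "<-", " -> ".join(reported.lineage))
```

With a trail like this, any figure in a report can be walked back step by step to the production system that originated it, rather than going cold at the intermediary database.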
AxiomSL delivers a change management approach that demonstrates full data and process lineage across enterprise core financial, treasury, operations and risk management functions. Generated reports can be programmed so a user can trace a data point straight to the initiating production system. Put simply, you can take any current or historical report and determine where the data originally comes from, allowing for seamless data mining. You must still reconcile changes in the production system with the reporting system, but this automation makes isolating a data point’s lineage much simpler.
Today, banks face important choices about how to respond strategically to the convergence of risk and finance data environments. They have to reduce business-as-usual costs and achieve regulatory and accounting compliance, all the while enhancing business decision processes; adopting temporary tactical solutions will impose structural drags on a firm’s ability to meet its expected results and manage its future obligations.
This article was originally published in the Risk.net Chartis Market Report on IFRS 9.