Financial institutions are under pressure from regulators to automate their regulatory reporting, which means having effective and efficient tools in place. In recent years, regulators have launched a growing number of innovative projects focused on using technology and data to streamline processes and reduce overheads. In the near future, regulators will expect a holistic, readily accessible view of both the reporting process and a firm's granular data.
Off the back of this, Solidatus partner Suade have launched a new webinar series - 'RegTech: Future data standards'. Throughout this series, they have been exploring the importance of data standards, artificial intelligence and interoperability within financial services. In this webinar, our Co-Founder Philip Dutton joined them to discuss key areas including:
- Interoperability: the key component to efficiency and agility
- The importance of auditability and transparency
- The importance of data quality and accuracy in data lineage
- The role of data standards in achieving data lineage and interoperability
- Technology solutions for achieving compliance with BCBS 239
- How to make the most of your risk data
Enabling financial institutions to fully understand where their data comes from, where it ends up, and the many changes and transformations it undergoes throughout its lifecycle is an immense challenge. Not only does this involve large volumes of data, but also many disparate systems across some of the largest institutions in the world. It is essential that these systems are interoperable. However, interoperability between systems within a financial institution is only one side of the story.
Given the complexity of their organisations, financial institutions inevitably engage many different technology solution providers. To ensure effective governance, these providers' systems must also be interoperable. Data lineage, auditability, and interoperability go hand in hand: with an effective API setup across a financial institution's internal and external systems, tracking the origin, development, and transformation of data becomes far more manageable.
Philip and Oliver addressed the relationship between data lineage and data and system interoperability. The two are essential cogs in the machine of effective data management in financial institutions. If disparate systems and databases within an institution store data in different formats, even well-designed APIs and powerful data lineage tools will struggle to deliver effective data management. This is where data standardisation is key. Many financial regulators have been working on data standardisation in recent years, including the ECB, the Bank of England, the FCA, and the MAS. Principle 3 (Accuracy and Integrity) of BCBS 239 also refers to a dictionary of the terms commonly used for the various identifiers in a bank's risk management systems.
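To make the idea concrete, here is a minimal sketch, in Python, of how a shared data dictionary can bridge systems that store the same data under different local field names while recording lineage along the way. All system and field names here are hypothetical illustrations, not part of any real product or standard.

```python
# Hypothetical shared data dictionary: maps each system's local field
# names onto common standard terms agreed across the institution.
DATA_DICTIONARY = {
    "trading_system": {"cpty_id": "counterparty_id", "notional_amt": "notional"},
    "risk_system": {"counterpartyRef": "counterparty_id", "exposure": "notional"},
}

def normalise(system, record):
    """Rename a record's local fields to standard terms and log each
    mapping as a lineage entry (source system, source field, standard field)."""
    mapping = DATA_DICTIONARY[system]
    standardised, lineage = {}, []
    for field, value in record.items():
        standard = mapping.get(field, field)  # pass unknown fields through
        standardised[standard] = value
        lineage.append({
            "source_system": system,
            "source_field": field,
            "standard_field": standard,
        })
    return standardised, lineage

# Two systems describing the same counterparty in different local formats
# end up with identical standardised field names, plus an audit trail.
trade, trade_lineage = normalise("trading_system",
                                 {"cpty_id": "CP-001", "notional_amt": 5_000_000})
risk, risk_lineage = normalise("risk_system",
                               {"counterpartyRef": "CP-001", "exposure": 5_000_000})
print(trade)  # {'counterparty_id': 'CP-001', 'notional': 5000000}
print(risk)   # {'counterparty_id': 'CP-001', 'notional': 5000000}
```

Once both records share standard terms, they can be joined, reconciled, and audited regardless of which system produced them, which is the practical pay-off of the data dictionary that BCBS 239's Principle 3 points towards.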
You may also be interested in:
'Understand the State of Local Government Data' with Devon County Council, Solidatus & Peak
Panel Discussion: ESG Data and Reporting - Standardising a Complex and Evolving Landscape
'Leveraging Data Lineage to Deliver Tangible Business Benefits' Webinar with A-Team
A Global Wake-Up Call: Unpicking the Complexities of ESG
Data Strategy Spotlight: Future Proofing Your Data Strategy for Today's Complex Regulatory Landscape