By Ben Herzberg
Banks and financial institutions recognize the value of data in boosting their operations. In a 2020 survey of global organizations, the banking sector led in data-driven decision-making, with 65% of respondents reporting that they rely on data. Because of this, banks need stronger data management strategies that can be used to identify areas of improvement.
The Fundamental Review of the Trading Book (FRTB) regulation is expected to be implemented by banks in January 2023 and will push for a more robust data management strategy. The range of regulatory rules in the proposal requires banks to make significant changes to business processes, data, and technology, and to adopt comprehensive governance and reporting policies.
So, how exactly will FRTB push for a stronger data management strategy, and what should you do about it? Let’s take a closer look.
What is FRTB?
FRTB was first introduced following the 2008 global financial crisis. This set of rules specifies the minimum regulatory capital requirements for banks’ wholesale trading activities. The financial crisis exposed weaknesses in financial institutions, such as risky transfers of positions between books and the blurred boundary between the trading book and the banking book.
During the crisis, some of the biggest US banks failed; Lehman Brothers, then the nation’s fourth-largest investment bank, filed for bankruptcy after accumulating too many mortgage-backed securities. Such failures prompted an immediate regulatory response to ensure that banks can survive unexpected losses from trading activities.
How FRTB will push for a stronger data management strategy
Consistent data architecture
FRTB implementation requires large volumes of data to ensure banks comply with the stringent requirements. Because of the breadth of these requirements, banks have to make major changes to their technology, data architecture, and data strategy. For instance, banks operating in different regions under multiple regulators might have to split their data by regulator and create a data architecture that covers all jurisdictions and branches.
Each country has different financial regulations. If a bank has branches in New York and London, the two market risk systems may use different business timestamps when calculating their profit and loss statements. To comply with FRTB rules, such banks have to develop a data management strategy that is consistent across every location.
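To make this concrete, here is a minimal Python sketch of aligning regional P&L snapshots to a single global business timestamp. The locations and snapshot times are hypothetical; a real market risk system would apply this at the data-architecture level rather than in application code.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical regional P&L snapshots, each stamped in local market time.
regional_snapshots = {
    "new_york": datetime(2022, 6, 30, 17, 0, tzinfo=ZoneInfo("America/New_York")),
    "london": datetime(2022, 6, 30, 22, 0, tzinfo=ZoneInfo("Europe/London")),
}

def to_global_business_timestamp(local_ts: datetime) -> datetime:
    """Convert a regional snapshot time to a single UTC business timestamp."""
    return local_ts.astimezone(ZoneInfo("UTC"))

# Verify every region's snapshot maps to the same global cut-off time,
# so P&L figures are comparable across jurisdictions.
global_times = {loc: to_global_business_timestamp(ts)
                for loc, ts in regional_snapshots.items()}
assert len(set(global_times.values())) == 1, "Regional P&L snapshots are misaligned"
```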
Data democratization
Data democratization involves making digital data available to non-technical users for analytical purposes, allowing organizations to make better data-driven decisions. When all functional teams have the data they need, they can identify opportunities and new ideas that help implement the FRTB rules.
To prepare for FRTB implementation, banks will have to engage all teams and departments, including the front office, finance, and risk. They can do this by appointing an employee or executive to run the overall implementation program, supported by a team with representatives from each department.
The solution can’t be delivered by the IT team alone. FRTB is a complex, large-scale regulatory program involving many requirements around the reporting of a bank’s trading activities; therefore, information must flow seamlessly between departments.
Alongside the increase in the number of data consumers, proper security and access controls need to be put in place to limit risk. As the scale grows, this calls for greater automation of data security.
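As a simplified sketch of what such automation might look like, the snippet below applies a role-based column policy to trading data. The roles, columns, and policy table are hypothetical; in practice this logic would live in a dedicated access-control layer, not in application code.

```python
# Hypothetical role-based column policy for trading data.
COLUMN_POLICY = {
    "risk_analyst": {"trade_id", "desk", "notional", "pnl"},
    "front_office": {"trade_id", "desk", "notional"},
    "auditor": {"trade_id", "desk", "pnl"},
}

def filter_row(role: str, row: dict) -> dict:
    """Return only the columns the given role is allowed to see."""
    allowed = COLUMN_POLICY.get(role, set())
    return {col: val for col, val in row.items() if col in allowed}

trade = {"trade_id": "T-1001", "desk": "rates", "notional": 5_000_000, "pnl": -12_500}
print(filter_row("front_office", trade))  # notional is visible, pnl is masked out
```

The benefit of a central policy table is that adding a new team of data consumers means adding one policy entry rather than hand-granting permissions system by system.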
Increased data integrity
FRTB expects data integrity from banks and financial institutions to prevent the kind of systemic losses on trading books that occurred during the financial crisis. By increasing integrity, banks can avoid duplicate versions of data, reduce data manipulation, and improve the reconciliation process.
For this to succeed, financial institutions have to make core changes in the departments that handle risk calculations and implement controls that maintain data integrity. One way to do this is to set up a team to oversee the implementation of the FRTB rules; this team should also be responsible for maintaining the bank’s data security.
Additionally, banks will need a better data management strategy to trace and manage all data sources and usage. This way, they can reduce the risk exposure and eliminate the weak data reporting practices of previous years.
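One simple way to make data traceable is to record the source, a content hash, and a load timestamp for every dataset that feeds a risk calculation, so any later discrepancy can be traced back to its origin. The sketch below illustrates the idea; the dataset and feed names are hypothetical.

```python
import hashlib
from datetime import datetime, timezone

lineage_log: list[dict] = []

def register_dataset(name: str, source: str, payload: bytes) -> str:
    """Record where a dataset came from, plus a hash of its contents."""
    digest = hashlib.sha256(payload).hexdigest()
    lineage_log.append({
        "dataset": name,
        "source": source,
        "sha256": digest,
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    })
    return digest

# Later, verify that the data used in a risk run is byte-identical
# to what was originally loaded from the source.
original = register_dataset("eod_prices", "vendor_feed_a", b"...price records...")
assert hashlib.sha256(b"...price records...").hexdigest() == original
```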
New technologies
So far, the implementation of FRTB has been delayed until January 2023 to give banks more time to prepare. Some financial institutions will need to adopt new technologies, such as moving their computing and storage to the cloud, to comply with the new set of rules.
For instance, a smaller financial institution may focus on the standardized approach and buy a vendor solution because it’s more affordable. A larger bank, on the other hand, may choose to use a vendor’s market data for Non-Modellable Risk Factors (NMRFs) or for historical market data. The NMRF rules require banks to regularly collect real price observations for executed trades. Sourcing this kind of data is new to most banks, but it’s required to pass the compliance test.
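As a rough sketch of why this data matters: under the revised (2019) Basel standard, a risk factor is generally modellable if, over the preceding 12 months, it has at least 24 real price observations with no 90-day period containing fewer than four, or at least 100 observations in total; anything that fails becomes an NMRF and attracts a capital add-on. The simplified check below is illustrative only, and real implementations must follow the full regulatory text.

```python
from datetime import date, timedelta

def is_modellable(observation_dates: list[date], as_of: date) -> bool:
    """Simplified risk factor eligibility test (RFET) sketch:
    pass with >= 100 observations over the past 12 months, or with
    >= 24 observations and no 90-day period containing fewer than 4."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) >= 100:
        return True
    if len(obs) < 24:
        return False
    # Slide a 90-day window across the year and require >= 4 observations in each.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True
```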
Technology advancements enable faster and more accurate data-driven financial decisions. If a bank needs to conduct a data harmonization process for FRTB, it can reuse the same technology for other ongoing programs, which also helps meet the FRTB objectives.
Summary
As with other major changes in the financial world, banks have to start preparing for the implementation of FRTB early. The set of rules comes with multiple requirements around data management and reporting to ensure that banks manage their trading practices much better.
Data is continuously transforming the financial sector and bringing new opportunities for banks. A robust data management strategy will help banks align their operations across all jurisdictions and make it easier to comply with the upcoming FRTB rules.
About the Author
Ben Herzberg is an experienced tech leader and book author with a background in endpoint security, analytics, and application and data security. Ben has held roles such as CTO of Cynet and Director of Threat Research at Imperva. He is currently the Chief Scientist at Satori, the DataSecOps platform.