David K Kelly, Managing Director, Quant Foundry

Many industries, including finance, have touted blockchain as a method for counterparties to transact with each other where no trusted central clearer is available. The finance industry is awash with trusted central institutions, and yet there are initiatives to introduce blockchain to reduce the cost of doing business (loan trading) as well as to keep track of inventory (stock lending).

For banks managing market and credit risk, most have a mix of internally built and vendor solutions across multiple locations. Each Front Office group sends data to a central risk function, which then normalises the data to calculate capital. One major regulatory challenge (BCBS 239) covers the need for data provenance, so that a risk professional can trace every item in a disclosure report back to its source.

A fragmented architecture, where participants send data in batches from multiple locations, makes this problematic: the risk process compresses front-office data to fit into engines such as VaR, and the results are tagged downstream for local reporting.

Fragmentation creates a lack of trust between the data receiver and the data provider in the risk process, so the handover between them is likely to be rudimentary.

The most frequently espoused principle is that there should be only “one version of the truth”. The reality is that each instrument has multiple valuations (depending on timing) and multiple risk sensitivities, depending on who is using them. The trader needs a level of precision to hedge, whereas the risk manager needs sensitivities that he or she can aggregate.

We can restate this principle: there should be “a single narrative of how the valuation and risk of an instrument evolve over its lifecycle”.

Banks should consider recording all risk positions on a blockchain as a way of managing this narrative and achieving data provenance. Each blockchain would grow, in the way bitcoins are “traded”, each time a participant (Trader, Risk, Operations, Finance) adds new information. All participants should record the information used to calculate capital on the blockchain.
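
As a rough illustration of the mechanics, the Python sketch below gives each transaction its own append-only chain, where every participant's update commits to the hash of the previous block, so history cannot be rewritten unnoticed. The trade identifier, event names and payload fields are hypothetical, not a production design.

    import hashlib
    import json
    import time

    def block_hash(block):
        # Hash the canonical JSON form of a block's contents.
        return hashlib.sha256(
            json.dumps(block, sort_keys=True).encode("utf-8")).hexdigest()

    class TransactionChain:
        """Append-only chain of risk-data updates for one transaction."""

        def __init__(self, trade_id):
            self.trade_id = trade_id
            self.blocks = []

        def append(self, participant, payload):
            # Each new block commits to the previous block's hash, so any
            # later tampering with history is detectable.
            prev = block_hash(self.blocks[-1]) if self.blocks else None
            block = {
                "trade_id": self.trade_id,
                "participant": participant,  # Trader, Risk, Operations, Finance
                "payload": payload,          # e.g. booking details, sensitivities
                "timestamp": time.time(),
                "prev_hash": prev,
            }
            self.blocks.append(block)
            return block

    chain = TransactionChain("IRS-000123")
    chain.append("Trader", {"event": "booking", "notional": 10_000_000})
    chain.append("Risk", {"event": "eod_sensitivities", "dv01": 9850.0})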

The critical component is that the “trade” of adding data requires a consensus between providers and receivers, similar to that between buyers and sellers. At the point the administrator of the blockchain validates the new data, the updated blockchain gives complete trust that the information provided is the best available.
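
A minimal sketch of that validation step, reusing the TransactionChain above and assuming a simple two-party sign-off (HMAC keys stand in for real digital signatures; all names are illustrative): the administrator appends a block only when the provider's and receiver's independently signed views of the data agree.

    import hashlib
    import hmac
    import json

    def sign(key, payload):
        # Sign the canonical JSON form of the payload.
        message = json.dumps(payload, sort_keys=True).encode("utf-8")
        return hmac.new(key, message, hashlib.sha256).hexdigest()

    def commit_if_agreed(chain, participant, provider_view, receiver_view,
                         provider_key, receiver_key):
        # Provider and receiver each sign the data as they see it.
        provider_sig = sign(provider_key, provider_view)
        receiver_sig = sign(receiver_key, receiver_view)
        # Commit only if the receiver signed exactly the data the provider
        # signed; any discrepancy between the two views blocks the append.
        if receiver_sig != sign(receiver_key, provider_view):
            raise ValueError("provider and receiver disagree on the payload")
        agreed = dict(provider_view,
                      provider_sig=provider_sig, receiver_sig=receiver_sig)
        return chain.append(participant, agreed)

    commit_if_agreed(chain, "Finance",
                     {"event": "eod_pnl", "pnl": -12500.0},
                     {"event": "eod_pnl", "pnl": -12500.0},
                     b"provider-key", b"receiver-key")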

The blockchain sits below the “Process Layer”, which houses all the processes that consume and generate data: transaction booking, EOD sensitivity calculation, sensitivity and tenor mapping, pricing model configuration, data tagging and proxying, as well as governance and change processes such as methodology management, model monitoring and proxy methodology.
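
One way to picture what those processes would mandate on-chain is a catalogue of event types and the fields each process must write before its block is accepted. The event names and fields below are assumptions for illustration, not a standard.

    # Illustrative catalogue: which fields each process-layer event must
    # carry before it is accepted onto the chain.
    PROCESS_EVENTS = {
        "transaction_booking":  ["trade_id", "product", "notional", "counterparty"],
        "eod_sensitivities":    ["trade_id", "risk_factor", "sensitivity", "as_of"],
        "tenor_mapping":        ["trade_id", "raw_tenor", "mapped_tenor"],
        "pricing_model_config": ["trade_id", "model_id", "model_version"],
        "data_tagging":         ["trade_id", "tag", "reporting_scope"],
        "proxying":             ["trade_id", "risk_factor", "proxy_factor", "rationale"],
        "methodology_change":   ["methodology_id", "version", "approval_ref"],
    }

    def validate_event(event_type, payload):
        # Reject a block whose payload is missing mandated fields.
        missing = [f for f in PROCESS_EVENTS[event_type] if f not in payload]
        if missing:
            raise ValueError(f"{event_type} missing fields: {missing}")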

Such a migration reduces the spaghetti of data flows between the Front Office and Risk/Finance systems, many of which are still in batch-file form. Most importantly, the blockchain provides an audit trail of how each feature of a transaction changes during the day and over time.
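
Reusing the TransactionChain sketch above, replaying one trade's blocks recovers exactly this audit trail: every change to a feature (here the hypothetical "dv01" field) is attributable to a participant and a time, which is the traceability BCBS 239 asks for.

    def feature_history(chain, feature):
        # Walk the chain in order and collect every value the feature took.
        return [(block["timestamp"], block["participant"],
                 block["payload"][feature])
                for block in chain.blocks
                if feature in block["payload"]]

    for when, who, value in feature_history(chain, "dv01"):
        print(when, who, value)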

Once the blockchain has a critical amount of information, the bank can migrate its VaR, CVA, Stress and PLA engines to point to the blockchain and send disaggregated results back (e.g. contributory VaR, stress ladders). The bank can then build out an “Analytics Layer” that automates all the risk reporting, along with monitoring tasks such as modellability checks, limit utilisation, model usage and NMRF stress scenarios, and control tools to manage the extent of proxying, VaR backtesting and P&L eligibility tests. Each piece of analytics would have its own blockchain to store results where it makes sense to operate at an aggregate level.
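
For instance, the Analytics Layer could read the disaggregated results back off the chain and aggregate them without re-running the engines. The sketch below sums per-trade contributory VaR into desk totals and expresses each as limit utilisation; the event name, field names and limits are illustrative assumptions.

    from collections import defaultdict

    VAR_LIMITS = {"RatesDesk": 5_000_000.0}  # assumed desk limits

    def desk_var_utilisation(blocks):
        totals = defaultdict(float)
        for block in blocks:
            payload = block["payload"]
            if payload.get("event") == "contributory_var":
                # Contributory VaR is additive, so desk totals are a
                # simple sum of per-trade contributions from the engine.
                totals[payload["desk"]] += payload["cvar"]
        return {desk: total / VAR_LIMITS[desk]
                for desk, total in totals.items() if desk in VAR_LIMITS}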

For counterparty risk, each name would also constitute a blockchain where participants can store critical information around CSA and netting agreements. The counterparty blockchain would also accommodate exposure profiles for EPE/PFE reporting.
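
A block on such a counterparty chain might look like the sketch below, where the netting-set, CSA and profile fields are assumed for illustration; peak PFE then falls straight out of the stored profile.

    counterparty_block = {
        "counterparty": "BANK-XYZ",
        "event": "exposure_profile",
        "netting_set": "NS-001",
        "csa": {"threshold": 0.0, "min_transfer": 250_000.0},
        # Expected and peak exposure at each future time point (in years).
        "profile": [
            {"t": 0.5, "epe": 1_200_000.0, "pfe_95": 4_800_000.0},
            {"t": 1.0, "epe": 1_500_000.0, "pfe_95": 5_900_000.0},
            {"t": 2.0, "epe": 1_100_000.0, "pfe_95": 5_100_000.0},
        ],
    }

    peak_pfe = max(point["pfe_95"] for point in counterparty_block["profile"])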

The advantage of introducing blockchain into the Risk and Finance process is that it does not change the overall system architecture. What it does provide is an emerging single source for the state of each transaction that participants can access with confidence for numerous processes and analytics. The result is the unwinding of bilateral and mostly duplicated relationships between systems and processes that are costly to maintain and change, as well as evidence of compliance with BCBS 239.

Given that the blockchain ledger sits underneath the current architecture, the delivery team can introduce the paradigm shift in a series of small steps. When the blockchain reaches critical mass (2-3 years) and has gained status as the single version of the truth, the quant teams can then build methodologies such as VaR directly on the blockchain, eliminating the inefficiencies and running costs of the legacy engines.