In this field, we are involved in major Big Data lake developments whose main users are the regulatory and risk departments, driven by the growing need to comply with the various standards and regulatory frameworks demanded by legislators and central banks in these markets.
We needed to adapt the Data Lake to comply with the new IOSCO regulation, which requires entities trading derivatives to exchange initial margin (IM) collateral when a contract is formalised. The primary objective of this project is to provide the sensitivity data needed to calculate this Initial Margin correctly and to validate the model afterwards through the Backtesting and Benchmarking calculations required by the regulator.
We tackled the specific developments required by this regulation by first defining and analysing the requirements, then moving on to design and implementation: data modelling, ETL processes built on Scala/Spark, user testing, and integration into production (deployment, process optimisation, etc.).
Once the project had been deployed, we supported it through a centre of excellence performing software quality assurance, process monitoring, analysis and optimisation, coordination of updates and migrations, as well as the development of “velocity tools”: common utilities offered as a shared service to all the projects on the platform.
The main technologies are Scala/Spark for processing, Hive for storage, and Control-M for job scheduling.
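As a rough illustration of the kind of Scala/Spark ETL step this stack implies (the table names, column names, and aggregation below are hypothetical examples, not details from the actual project), a batch job scheduled by Control-M might read raw trade sensitivities from a Hive table and net them per counterparty and risk class before the IM calculation:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SensitivityEtl {
  def main(args: Array[String]): Unit = {
    // Enable Hive support so the job can read and write the lake's managed tables
    val spark = SparkSession.builder()
      .appName("im-sensitivity-etl")
      .enableHiveSupport()
      .getOrCreate()

    // args(0) is the business date passed in by the scheduler (e.g. Control-M)
    val asOfDate = args(0)

    // Hypothetical source table: one row per trade and risk-factor sensitivity
    val raw = spark.table("risk.raw_sensitivities")

    // Net sensitivities per counterparty and risk class, the granularity
    // at which an IM model would typically consume them
    val aggregated = raw
      .filter(col("as_of_date") === lit(asOfDate))
      .groupBy(col("counterparty_id"), col("risk_class"))
      .agg(sum(col("sensitivity")).alias("net_sensitivity"))
      .withColumn("as_of_date", lit(asOfDate))

    // Write the result back to a Hive table for the downstream IM calculation
    aggregated.write
      .mode("overwrite")
      .saveAsTable("risk.im_sensitivities")

    spark.stop()
  }
}
```

In a real deployment the write would more likely target a date-partitioned table with dynamic partition overwrite, so each scheduled run replaces only its own business date.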