Managing Cloud Migration: a scalable middleware for event-driven banking

Background

New use cases involving artificial intelligence and near real-time actions call for dedicated cloud solutions in the banking industry. Account changes, transactions and other operations, however, are handled by a variety of systems and require complex workflows in the backend. Migrating this data flow to a new cloud-based solution therefore often proves to be a major obstacle.

This was the case for a well-known German bank that needed to migrate one of its core banking systems from a legacy solution to a new implementation based on Temenos, a software suite with packaged AI and API cloud services built specifically for the banking industry. Before the bank approached Data Reply, one migration attempt had already failed.

The migration required changes to many other systems, both internal and external, so that they could integrate with the new solution and seamlessly provide and retrieve data. Transactions, account operations and other scenarios required different actions to be carried out in specific sequences before the information could reach Temenos. Most of these scenarios were provided to Data Reply as Business Process Model and Notation (BPMN) diagrams describing the flow of data and further requirements.

AN EVENT-DRIVEN ARCHITECTURE

Data Reply was tasked with building a new middleware for the migration to Temenos from scratch. The Reply team chose an event-driven approach to satisfy the real-time requirements of many of the workflows. For an efficient exchange of events between the different systems, Data Reply decided to use the Confluent Platform, based on Apache Kafka.

To orchestrate the event flow through the Kafka layer, the consultants agreed to leverage the Camunda Workflow Engine. Camunda allows workflows to be represented as BPMN diagrams, so the business logic is kept in a central location. This enables easy and efficient monitoring and inspection of the data flow and noticeably increases transparency. Events can then be sourced in different ways, depending on several factors (see the sketch after this list):

  • Is the system internal or external?
  • Is the system capable of writing to Kafka?
  • If the system is incapable of writing to Kafka, what alternative connectivity options are there?
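As a minimal sketch of this orchestration, assuming the Camunda 7 Java API and the Apache Kafka consumer client, an incoming event could be handed to Camunda so that the deployed BPMN diagram drives all subsequent steps. The broker address, topic name, process key and variable names below are hypothetical placeholders, not the project's actual identifiers.

// Illustrative sketch: hand incoming banking events from Kafka to Camunda so
// the deployed BPMN workflow decides the next steps. Names are placeholders.
import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.camunda.bpm.engine.ProcessEngine;
import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.RuntimeService;

public class EventToWorkflowBridge {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");               // assumed broker address
        props.put("group.id", "workflow-bridge");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();
        RuntimeService runtimeService = engine.getRuntimeService();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("account-operations"));      // hypothetical topic
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                    // One BPMN process instance per event; the diagram deployed
                    // in Camunda determines which steps follow.
                    runtimeService.startProcessInstanceByKey(
                            "accountOperation",                     // hypothetical process key
                            rec.key(),                              // business key, e.g. operation id
                            Map.of("payload", rec.value()));
                }
            }
        }
    }
}

Keeping the routing logic in the BPMN model means it remains visible and adjustable in Camunda instead of being hard-coded in the connector.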

CONNECTING INTERNAL SYSTEMS AND TEMENOS

The internal systems that are capable of it produce events directly to the Kafka layer. Status updates and responses to the actions they requested are delivered back over the middleware's event streaming platform.
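As a rough illustration of this pattern, a Kafka-capable internal system could publish an account operation to the middleware along the following lines; the broker address, topic name and JSON fields are assumptions made for the example, not the bank's actual schema.

// Illustrative sketch: a Kafka-capable internal system publishing an
// account-operation event to the middleware. All names are placeholders.
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AccountEventPublisher {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");   // assumed broker address
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

        // Hypothetical event payload; the real schema is defined by the project.
        String event = "{\"operation\":\"OPEN_ACCOUNT\",\"customerId\":\"C-1042\",\"currency\":\"EUR\"}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by customer id keeps all events of one customer ordered
            // within a single partition.
            producer.send(new ProducerRecord<>("account-operations", "C-1042", event));
            producer.flush();
        }
    }
}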

For systems that do not natively integrate with Kafka, Temenos itself for example, custom-developed connectors perform the translation between those systems and the middleware. All custom components and connectors for the different banking systems were developed with a microservice approach, ensuring that each service performs exactly one job and that quality standards are met.

For example, Temenos exposes ActiveMQ queues to push and retrieve information in OFS and XML format. Since all communication between the other systems was performed in JSON, a custom connector was built to translate JSON messages from Kafka into OFS before pushing them to the Temenos queue, while a second connector performs the inverse job, translating Temenos XML responses back into JSON.
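The following simplified sketch illustrates the idea behind such a connector: consume a JSON event from Kafka, build an OFS-style request string and push it to a Temenos ActiveMQ queue over JMS. The OFS layout, queue name and field mapping shown here are illustrative assumptions rather than the production format.

// Simplified sketch of a JSON-to-OFS connector: Kafka in, ActiveMQ (JMS) out.
// The OFS string layout, queue name and field names are assumptions.
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import javax.jms.Connection;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonToOfsConnector {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");
        props.put("group.id", "json-to-ofs");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        ObjectMapper mapper = new ObjectMapper();

        // JMS connection to the Temenos-facing ActiveMQ broker (assumed URL and queue).
        Connection jms = new ActiveMQConnectionFactory("tcp://temenos-mq:61616").createConnection();
        jms.start();
        Session session = jms.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Queue queue = session.createQueue("t24.inbound");
        MessageProducer producer = session.createProducer(queue);

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("account-operations"));
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                    JsonNode event = mapper.readTree(rec.value());
                    // Build an OFS-style request string (structure heavily simplified).
                    String ofs = "ACCOUNT,/I/PROCESS,,"
                            + "CUSTOMER:1:1=" + event.get("customerId").asText()
                            + ",CURRENCY:1:1=" + event.get("currency").asText();
                    producer.send(session.createTextMessage(ofs));
                }
            }
        }
    }
}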


CONNECTING EXTERNAL PARTNERS

External partners leveraging the client’s payment services also needed to be connected to the middleware solution in order to create accounts or process consumer transactions. Data Reply enabled this through a REST API layer used both for data ingestion and for querying the status of operations triggered by external systems.
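A minimal sketch of those two roles, assuming a Spring Boot REST layer, could look like the following; the endpoint paths, payload fields and the OperationService boundary are hypothetical.

// Illustrative sketch of the REST API layer's two roles: ingesting partner
// requests (which become events) and exposing the operation status.
import java.util.Map;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PartnerOperationsController {

    // Hypothetical service boundary: publishes to Kafka and reads status from the database.
    public interface OperationService {
        String submitAccountCreation(Map<String, Object> request);
        Map<String, Object> statusOf(String operationId);
    }

    private final OperationService operations;

    public PartnerOperationsController(OperationService operations) {
        this.operations = operations;
    }

    // Ingestion: accept an account-creation request and publish it as an event.
    @PostMapping("/accounts")
    public ResponseEntity<Map<String, String>> createAccount(@RequestBody Map<String, Object> request) {
        String operationId = operations.submitAccountCreation(request);
        return ResponseEntity.accepted().body(Map.of("operationId", operationId));
    }

    // Status query: read the current workflow state of a previously triggered operation.
    @GetMapping("/operations/{id}")
    public ResponseEntity<Map<String, Object>> getStatus(@PathVariable("id") String id) {
        return ResponseEntity.ok(operations.statusOf(id));
    }
}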

The status is held in an Oracle database, which the middleware updates whenever transactions or account operations change their state in the workflow. The middleware also checks data integrity through Know Your Customer (KYC) and suspicious-activity checks. For the KYC checks, another set of custom connectors was implemented; these perform REST API calls to external systems and make the workflows progress according to the outcome. In the end, all information flows into the Temenos core banking system.
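The sketch below illustrates this connector pattern under the assumption that the BPMN process waits at a message event for the check result: the connector calls an external KYC REST service and correlates the outcome back into the Camunda workflow. The provider endpoint, message name and variable names are invented for the example.

// Illustrative sketch of a KYC connector: call an external KYC REST service
// and feed the outcome back into Camunda via message correlation.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.camunda.bpm.engine.ProcessEngines;
import org.camunda.bpm.engine.RuntimeService;

public class KycCheckConnector {

    private final HttpClient http = HttpClient.newHttpClient();
    private final RuntimeService runtimeService =
            ProcessEngines.getDefaultProcessEngine().getRuntimeService();

    public void runKycCheck(String operationId, String customerJson) throws Exception {
        // Call the external KYC provider (hypothetical endpoint).
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://kyc-provider.example/api/checks"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(customerJson))
                .build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());

        boolean approved = response.statusCode() == 200
                && response.body().contains("\"result\":\"APPROVED\"");   // simplified parsing

        // Tell the waiting BPMN process instance how to continue.
        runtimeService.createMessageCorrelation("kycResult")              // assumed message name
                .processInstanceBusinessKey(operationId)
                .setVariable("kycApproved", approved)
                .correlate();
    }
}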

To enable external partners to interact with the middleware, the REST API layer exposes endpoints for operations such as creating a profile, creating an account, submitting transactions and more. All endpoints are secured with mutual TLS authentication and are only reachable through a VPN tunnel between the data centers.

However, an external partner can neither perform actions on the resources of other partners nor query their status. Information from different partners flows into the same Kafka topics and the same infrastructure, but is segregated at the REST API layer. This data segregation is transparent to the external partners.
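One way such segregation could be implemented, sketched here assuming Spring-style X.509 client-certificate authentication, is to derive the partner identity from the mutual-TLS certificate and scope every lookup to that partner. The repository and class names are hypothetical.

// Illustrative sketch of partner-level data segregation at the REST layer:
// the partner identity comes from the authenticated client certificate and
// every status query is scoped to that partner.
import java.security.Principal;
import java.util.Map;
import java.util.Optional;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PartnerScopedStatusController {

    // Hypothetical repository: looks an operation up only within one partner's data.
    public interface OperationStatusRepository {
        Optional<Map<String, Object>> findByIdAndPartner(String operationId, String partnerId);
    }

    private final OperationStatusRepository repository;

    public PartnerScopedStatusController(OperationStatusRepository repository) {
        this.repository = repository;
    }

    @GetMapping("/operations/{id}")
    public ResponseEntity<Map<String, Object>> getStatus(@PathVariable("id") String id,
                                                         Principal partner) {
        // With X.509 authentication the principal name is derived from the
        // partner's client certificate; other partners' operations return 404.
        return repository.findByIdAndPartner(id, partner.getName())
                .map(ResponseEntity::ok)
                .orElse(ResponseEntity.notFound().build());
    }
}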

THE ACHIEVEMENTS

Within four months, Data Reply enabled the client's migration by developing and integrating a middleware that is scalable by design, taking advantage of the Confluent Platform and Apache Kafka. Built with a microservice approach in mind, the large set of customized connectors and components allows the middleware to be scaled horizontally if needed.

The client is also able to establish workflows with near real-time behaviour. The event-driven architecture increases the speed at which information on transactions and account operations reaches the core banking system. The end-to-end workflow, from receiving the triggering event to delivering the information to the core banking system, now takes only a minute, including an automated Know Your Customer (KYC) check and the necessary data transformation or mapping.

Furthermore, the solution requires minimal operational and maintenance effort, which decreases the total cost of ownership (TCO).


DATA REPLY

Data Reply is the Reply group company offering a broad range of advanced analytics and AI-powered data services. We operate across different industries and business functions, enabling our clients to achieve meaningful outcomes through the effective use of data. We have strong competences in Big Data Engineering, Data Science and IPA; we build Big Data platforms and implement ML and AI models in a manner that is repeatable, efficient, scalable, simple and yet secure. We support companies in combinatorial optimization processes with Quantum Computing techniques that enable engines with high computational performance.