Transforming Banking with Apache Kafka

  • Written By Coforge
  • 08/10/2020

Back in July, in partnership with Confluent, Coforge held a webinar on the transformation that event streaming and event-driven architectures are driving in modern banking. In this blog (and infographic) we summarise the key takeaways from that webinar, showcasing how forward-looking banks are getting ahead of the curve with real-time streaming.

When it comes to banking, event streaming and event-driven architecture are transforming the way banks operate on a daily basis. In the tables below, we examine these transformations:

Payment Processing

| WITHOUT EVENT STREAMING | WITH EVENT STREAMING |
| --- | --- |
| Batch orientated, with settlements happening T+1 or T+2 | Real-time payment processing, with high throughput that easily scales on a small hardware footprint |
| Legacy MQ technologies that are difficult and costly to scale | De-coupled producers and consumers |
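
To make the "de-coupled producers and consumers" row concrete, here is a minimal sketch of a payment-event producer using the confluent-kafka Python client. The broker address, topic name (`payments`) and event payload are illustrative assumptions, not details from the webinar:

```python
from confluent_kafka import Producer

# Hypothetical broker address, for illustration only.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Kafka acknowledges each event asynchronously; the producer never
    # waits on any downstream consumer.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Payment event delivered to {msg.topic()}[{msg.partition()}]")

# Publish a payment event; settlement systems, fraud checks and ledgers
# can all consume the same event independently.
producer.produce(
    "payments",
    key="account-42",
    value='{"amount": 150.00, "currency": "GBP"}',
    callback=on_delivery,
)
producer.flush()
```

Because the producer only talks to Kafka, the settlement, fraud and ledger consumers can each scale (or fail) independently, which is the de-coupling that legacy point-to-point MQ setups make difficult.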

Retail Banking

| WITHOUT EVENT STREAMING | WITH EVENT STREAMING |
| --- | --- |
| Nightly updated account balance | Real-time account updates |
| Batch fraud checks | Real-time credit card fraud alerts |
| Limited view of the customer | 360 / omni-channel view of the customer |
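
As a rough sketch of how a real-time fraud alert could hang off the same event stream, the consumer below reads the assumed `payments` topic and flags large transactions as they arrive. The group id, threshold and payload fields are illustrative assumptions:

```python
import json
from confluent_kafka import Consumer

# Illustrative settings; broker, group.id and topic are assumptions.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "fraud-alerts",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments"])

ALERT_THRESHOLD = 10_000  # hypothetical cut-off for unusually large transactions

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for the next event
        if msg is None or msg.error():
            continue
        payment = json.loads(msg.value())
        if payment["amount"] > ALERT_THRESHOLD:
            # A real deployment would publish to an alerts topic or
            # trigger a notification, in real time rather than in a
            # nightly batch run.
            print(f"ALERT: large transaction {payment}")
finally:
    consumer.close()
```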

Capital Markets

| WITHOUT EVENT STREAMING | WITH EVENT STREAMING |
| --- | --- |
| Batch oriented compliance reporting (OATS, FRTB) and a batch view of risk | The ability to aggregate disparate risk systems to get an intra-day view of market & credit risk |
| Spreadsheet-driven trading | Real-time pricing using stream processing |
| Manual clearing and settlements | Integrated and automated clearing & settlement |
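
To illustrate the intra-day risk row, here is a hedged sketch that subscribes to two assumed topics (`market-risk` and `credit-risk`) and maintains a running per-desk exposure; a production system would more likely use Kafka Streams or ksqlDB than this hand-rolled loop:

```python
import json
from collections import defaultdict
from confluent_kafka import Consumer

# Assumed topic names for two separate risk systems feeding one view.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "intraday-risk-view",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["market-risk", "credit-risk"])

# Running intra-day exposure per trading desk, updated per event
# instead of in an overnight batch.
exposure = defaultdict(float)

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())  # assumed fields: "desk", "exposure"
    exposure[event["desk"]] += event["exposure"]
    print(f"{event['desk']}: intra-day exposure {exposure[event['desk']]:.2f}")
```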

[Infographic: the most successful event-streaming use cases in banking]

If you would like to find out how to become a data-driven organisation with event streaming, Kafka and Confluent, give us a call or email us at Salesforce@coforge.com.

Other useful links:

What is Kafka? The Top 5 things you should know

Coforge Expert Kafka Services

Coforge Confluent Services
