The digital transformation enables a connected world. People, vehicles, factories, cities, digital services, and other “things” communicate with each other in real-time to provide a safe environment, efficient processes, and a fantastic user experience. This scenario only works well with data processing in real-time at scale. This blog post shares a presentation that explains why Apache Kafka plays a key role in these industries and use cases, and in connecting the different stakeholders.
Event Streaming with Apache Kafka plays a key role in processing massive volumes of data in real-time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
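To make this concrete, here is a minimal Kafka Streams sketch of such real-time processing: it consumes raw events from a source topic, filters and transforms them on the fly, and writes the results to a downstream topic. The topic names, broker address, and transformation logic are hypothetical placeholders, not taken from the presentation.

```java
// Minimal Kafka Streams sketch: continuous, decoupled stream processing.
// Topic names and broker address are hypothetical placeholders.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class SensorProcessor {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sensor-processor");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("sensor-events");

        raw.filter((key, value) -> value != null && !value.isEmpty())
           .mapValues(value -> value.toUpperCase()) // stand-in for real enrichment logic
           .to("sensor-events-enriched");           // downstream consumers stay decoupled

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The key point is the decoupling: producers, this processor, and downstream consumers only share topics, so each side can evolve and scale independently.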
I want to give you an overview of existing use cases for event streaming technology in a connected world across supply chains, industries, and customer experiences that emerge at these interdisciplinary data intersections:
The characteristics and requirements of these industries and sectors are not new. They require data integration, data correlation, and real decoupling. What has changed is the massively increased volume of data.
Real-time messaging solutions have existed for many years. Hundreds of platforms exist for data integration (including ETL and ESB tooling and specific IIoT platforms). Proprietary monoliths have monitored plants, telco networks, and other infrastructures in real-time for decades. But now, Kafka combines all of the above characteristics in an open, scalable, and flexible infrastructure that operates mission-critical workloads at scale in real-time, and it is taking over the world of connecting data.
“Apache Kafka vs. MQ/ETL/ESB” goes into more detail on this discussion.
Before we jump into the presentation, I want to cover one key trend I see across industries: a streaming data exchange with Apache Kafka.
TL;DR: If you use event streaming with Kafka in your projects (for reasons like real-time processing, scalability, and decoupling), and your partner does the same, then it does NOT make sense to put a REST / HTTP API in the middle. Instead, the partners should integrate in a streaming way, as the sketch below illustrates.
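The following sketch shows what integrating a partner “in a streaming way” can look like: events are consumed directly from the partner's Kafka cluster and forwarded to the local cluster as they arrive, with no request-response API in between. The cluster addresses and topic names are hypothetical, and a production deployment would typically use a replication tool such as MirrorMaker 2 rather than hand-rolled bridge code.

```java
// Hedged sketch of a streaming partner integration: consume from the
// partner's cluster, forward to the local cluster as events arrive.
// Addresses and topic names are hypothetical placeholders.
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class PartnerBridge {
    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "partner-cluster:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "partner-bridge");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "local-cluster:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            consumer.subscribe(List.of("partner-shipment-events"));
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    // Forward each partner event to the local topic as it arrives;
                    // no polling of a REST endpoint, no request-response coupling.
                    producer.send(new ProducerRecord<>("shipment-events", record.key(), record.value()));
                }
            }
        }
    }
}
```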
APIs and API Management still have their value for some use cases, of course. Check out the comparison of “Event Streaming with Apache Kafka vs. API Gateway / API Management with Mulesoft or Kong” for more details.
Here is the slide deck covering various use cases and architectures to realize a connected world with Apache Kafka from different perspectives:
The on-demand video recording walks you through the above presentation:
Connecting the world is a key requirement across industries. Many innovative digital services are only possible through collaboration between stakeholders. Real-time messaging, integration, continuous stream processing, and replication between partners are required. Event Streaming with Apache Kafka helps with the implementation of these use cases.
What are your experiences and plans for event streaming to connect the world? Did you already build applications with Apache Kafka to connect your products and services to partners? Let’s connect on LinkedIn and discuss it! Stay informed about new blog posts by subscribing to my newsletter.