Event streaming is happening all over the world. This blog post explores real-life examples across industries for use cases and architectures leveraging Apache Kafka. Learn about architectures for real-world deployments from Audi, BMW, Disney, Generali, PayPal, Tesla, Unity, Walmart, William Hill, and more. Use cases include fraud detection, mainframe offloading, predictive maintenance, cybersecurity, edge computing, track & trace, live betting, and much more.
The following sections show a few of the use cases and architectures. Check out the slide deck and video recording at the end for all examples and the architectures from the companies mentioned above.
Apache Kafka is an event streaming platform that provides messaging, persistence, data integration, and data processing capabilities. Its key characteristics include high scalability for millions of messages per second, high availability for mission-critical workloads (including backward compatibility and rolling upgrades), and cloud-native operations.
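The combination of messaging and persistence rests on Kafka's core abstraction: a partitioned, append-only log in which consumers track their own offsets and can replay retained records. The following is a conceptual plain-Python sketch of that idea, not the real Kafka API; all class and method names are illustrative.

```python
class MiniLog:
    """Toy model of Kafka's core abstraction: a partitioned,
    append-only log with consumer-controlled offsets."""

    def __init__(self, num_partitions=3):
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # Records with the same key always land in the same partition,
        # which preserves per-key ordering.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

    def consume(self, partition, offset):
        # Consumers track their own offsets and can re-read ("replay")
        # any retained record -- unlike a traditional message queue,
        # consuming does not delete anything.
        return self.partitions[partition][offset:]

log = MiniLog()
p, _ = log.produce("sensor-1", 21.5)
log.produce("sensor-1", 22.0)
# Reading the partition from offset 0 replays both events, in order.
print(log.consume(p, 0))
```

This is why Kafka supports both real-time consumers and late joiners reading historical data from the same topic.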
Hence, the number of different use cases is almost endless. If you learn one thing from the examples in this blog post, remember that Kafka is not just a messaging system! While data ingestion into a Hadoop data lake was the first prominent use case, such ingestion pipelines account for less than 5% of Kafka deployments today. Business applications, streaming ETL middleware, real-time analytics, and edge/hybrid scenarios are some of the other examples:
The following covers a few of these architectures and use cases. The presentation afterward goes into much more detail, with examples from various companies across various industries:
SIEM and cybersecurity are becoming more important across industries. Kafka is used as an open and scalable data integration and processing platform, often combined with dedicated SIEM solutions:
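To make this concrete, here is the kind of stateful correlation rule such a pipeline might run, for example as a Kafka Streams or ksqlDB application: flag a source IP that produces several failed logins within a sliding time window. This is a hedged plain-Python sketch; the event schema, threshold, and function names are assumptions for illustration, not a real product's API.

```python
from collections import defaultdict, deque

def detect_brute_force(events, window_sec=60, threshold=3):
    """events: iterable of (timestamp, source_ip, outcome) tuples,
    assumed ordered by timestamp. Returns the set of IPs that produced
    `threshold` failed logins within any `window_sec`-second window."""
    failures = defaultdict(deque)  # per-IP state, as a stream processor would keep it
    alerts = set()
    for ts, ip, outcome in events:
        if outcome != "login_failed":
            continue
        q = failures[ip]
        q.append(ts)
        # Drop failures that fell out of the sliding window.
        while q and ts - q[0] > window_sec:
            q.popleft()
        if len(q) >= threshold:
            alerts.add(ip)
    return alerts

events = [
    (0,   "10.0.0.5", "login_failed"),
    (10,  "10.0.0.5", "login_failed"),
    (20,  "10.0.0.9", "login_ok"),
    (30,  "10.0.0.5", "login_failed"),  # third failure within 60s -> alert
    (500, "10.0.0.7", "login_failed"),
]
print(detect_brute_force(events))  # → {'10.0.0.5'}
```

In production, the input would be a Kafka topic fed by firewalls, proxies, and authentication systems, and the alerts would be another topic consumed by the SIEM.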
Kafka and machine learning are a great combination for data integration, data processing, model training, model deployment, online monitoring, and other ML tasks. The most recent innovation, discussed in detail at the latest Kafka Summit, is a simplified Kafka ML architecture for streaming model training without the need for another data lake such as HDFS or S3, or a separate processing framework like Spark:
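The essence of streaming model training is that the model is updated record by record as events arrive from the topic, so no intermediate storage layer is needed to stage training data. The sketch below shows that pattern with a single-feature linear model and plain-Python stochastic gradient descent; it is a stand-in for a real consumer loop feeding a framework like TensorFlow, and all names and the toy data are assumptions for illustration.

```python
class OnlineLinearModel:
    """Single-feature linear model trained by SGD, one record at a
    time -- the update pattern behind streaming model training."""

    def __init__(self, lr=0.05):
        self.w = 0.0
        self.b = 0.0
        self.lr = lr

    def update(self, x, y):
        # One gradient step on squared error for a single streamed record.
        err = (self.w * x + self.b) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

    def predict(self, x):
        return self.w * x + self.b

model = OnlineLinearModel()
# Pretend each (x, y) pair is a record consumed from a Kafka topic;
# the true relationship in this toy stream is y = 2x + 1.
for _ in range(200):
    for x in (0.0, 1.0, 2.0, 3.0):
        model.update(x, 2 * x + 1)
# After consuming the stream, model.predict(10.0) is close to 21.
```

Because the model state lives in the consuming application, the same stream can also drive retraining and online monitoring of model quality.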
Data aggregation and correlation at scale in real-time are key concepts for building innovative business applications and adding business value. The following is one example of doing continuous calculations for betting. It includes a synthetic delay to “adjust the live betting odds”:
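A minimal sketch of this pattern, with made-up formula, event names, and delay parameter: a stream processor keeps the running match score as state, recomputes the odds on every incoming event, and applies a synthetic delay before publishing each update downstream.

```python
import time

def recompute_odds(goals_home, goals_away):
    # Toy odds model (illustrative only): the further the home team
    # leads, the shorter its odds for a home win.
    return round(2.0 * (0.8 ** (goals_home - goals_away)), 2)

def publish_with_delay(odds, delay_sec=0.0):
    # The synthetic delay gives the bookmaker time to "adjust" the
    # live odds before new bets are accepted against them.
    time.sleep(delay_sec)
    return {"market": "home_win", "odds": odds}

score = {"home": 0, "away": 0}  # the state a stream processor would keep
for event in ["goal_home", "goal_home", "goal_away"]:
    side = "home" if event == "goal_home" else "away"
    score[side] += 1
    update = publish_with_delay(recompute_odds(score["home"], score["away"]))
print(update)  # → {'market': 'home_win', 'odds': 1.6}
```

In a real deployment, the score and odds state would live in a Kafka Streams state store, and the delayed updates would be written to an output topic consumed by the betting frontend.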
While betting is a controversial example, it demonstrates the power of stateful stream processing very well. I am sure you already have ideas on how to apply this pattern to your own industry.
Here are the slides from my presentation about Kafka examples across industries:
Here is the video recording with all the use cases and examples from various companies across the globe and industries:
Kafka is used everywhere across industries for event streaming, data processing, data integration, and building business applications and microservices. It is deployed successfully in mission-critical deployments at scale at Silicon Valley tech giants, startups, and traditional enterprises. Scenarios include cloud, multi-cloud, hybrid, and edge infrastructures.
What are your experiences with Apache Kafka and its ecosystem for event streaming? Which use cases and architectures did you deploy? What are your status quo and future strategy? Let’s connect on LinkedIn and discuss it!