Kafka Streams + H2O.ai + TensorFlow (Video Recording / Live Demo)

I do a lot of presentations these days at meetups and conferences with one focus: how to leverage Apache Kafka and Kafka Streams to apply analytic models (built with H2O, TensorFlow, DeepLearning4J and other frameworks) in scalable, mission-critical environments. Since many attendees have asked for it, I created a video recording of this talk (focusing on the live demos).

I also see many Confluent customers struggling with the same challenge: deploying analytic models to a mission-critical, scalable production environment. This is a completely different story than “just” developing a great, accurate model in R or Python. Educating them on how Apache Kafka and Kafka Streams can help here is a key task for me these days 🙂 It leads to many very interesting and disruptive use cases! I will blog more about this in the coming months. For example, I will show how to train neural networks based on autoencoders and use them for anomaly detection in predictive maintenance, fraud detection, customer churn prediction and similar use cases. These neural networks are then deployed and monitored with Apache Kafka and its Streams API, as sketched below.
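To give a first idea of this deployment pattern, here is a minimal sketch (not taken from the talk itself): a Kafka Streams application applies a pre-trained autoencoder to a stream of sensor events and forwards everything with a high reconstruction error to an anomaly topic. The AutoencoderModel class, the topic names and the threshold are hypothetical placeholders.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class AnomalyDetectionApp {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "autoencoder-anomaly-detection");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Hypothetical wrapper around a pre-trained autoencoder,
        // loaded once at startup and reused for every event.
        AutoencoderModel model = AutoencoderModel.load("/models/autoencoder");

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> sensorEvents = builder.stream("sensor-input");

        // An autoencoder reconstructs its input; a high reconstruction error
        // means the event does not look like the "normal" data it was trained on.
        sensorEvents
            .filter((key, value) -> model.reconstructionError(parseFeatures(value)) > 0.85)
            .to("anomalies");

        new KafkaStreams(builder.build(), props).start();
    }

    private static double[] parseFeatures(String csv) {
        String[] parts = csv.split(",");
        double[] features = new double[parts.length];
        for (int i = 0; i < parts.length; i++) {
            features[i] = Double.parseDouble(parts[i]);
        }
        return features;
    }
}
```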

Abstract of the Session: Apache Kafka + Machine Learning

Intelligent real-time applications are a game changer in any industry. This session explains how companies from different industries build intelligent real-time applications. The first part of the session shows how to build analytic models with R, Python or Scala. No matter which language you favor, you can leverage open source machine learning / deep learning frameworks like TensorFlow, DeepLearning4J or H2O.ai. The second part discusses how to deploy these analytic models to your own applications or microservices, leveraging the Apache Kafka cluster and Kafka’s Streams API instead of setting up a new, complex stream processing cluster. The session focuses on live demos and shares lessons learned for executing analytic models in a highly scalable, mission-critical and performant way.
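As a rough illustration of the second part, here is a minimal sketch of that deployment pattern with H2O: the model, exported by H2O as generated Java code, is embedded directly into a Kafka Streams topology and scores every event in-flight. The EasyPredictModelWrapper and RowData classes come from H2O's genmodel library; the generated POJO class name, topic names and feature name are hypothetical.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import hex.genmodel.easy.EasyPredictModelWrapper;
import hex.genmodel.easy.RowData;
import hex.genmodel.easy.prediction.BinomialModelPrediction;

public class ModelServingApp {

    public static void main(String[] args) {
        // H2O exports a trained model as generated Java code (a POJO).
        // The class name below is hypothetical; it depends on your export.
        EasyPredictModelWrapper model =
            new EasyPredictModelWrapper(new GeneratedH2OModelPojo());

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams-model-serving");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("model-input");

        // Score every event in-flight; no separate model server and
        // no additional stream processing cluster is required.
        input.mapValues(value -> {
            try {
                RowData row = new RowData();
                row.put("feature", value); // hypothetical single-feature record
                BinomialModelPrediction prediction = model.predictBinomial(row);
                return prediction.label;
            } catch (Exception e) {
                return "prediction-error";
            }
        }).to("model-output");

        new KafkaStreams(builder.build(), props).start();
    }
}
```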

Key Takeaways for the Audience

  • Insights are hidden in historical data, e.g. on big data platforms such as Hadoop.
  • Machine learning and deep learning find these insights by building analytic models.
  • Stream processing uses these models (without redeveloping them) to act in real time.
  • Different open source frameworks for machine learning, such as TensorFlow, DeepLearning4J or H2O.ai, can be used to build the analytic models (see the TensorFlow sketch after this list).
  • Apache Kafka, its Streams API and machine learning can be combined to build, apply and monitor analytic models.
  • Kafka Streams lets you use analytic models in your own streaming microservices; the session shares best practices for building and deploying analytic models in real time with Apache Kafka’s Streams API.
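The takeaways mention TensorFlow as well, so here is a minimal sketch of scoring with a TensorFlow SavedModel from Java using TensorFlow's Java API. The model path and the tensor names "input" and "output" are assumptions that depend on how the model was exported.

```java
import java.util.List;
import org.tensorflow.SavedModelBundle;
import org.tensorflow.Tensor;

public class TensorFlowScoring {

    public static void main(String[] args) {
        // Load a pre-trained TensorFlow model exported as a SavedModel.
        try (SavedModelBundle bundle = SavedModelBundle.load("/models/tf-model", "serve");
             Tensor<?> input = Tensor.create(new float[][] {{1.0f, 2.0f, 3.0f}})) {

            // Feed the input tensor, fetch the prediction.
            List<Tensor<?>> results = bundle.session().runner()
                .feed("input", input)
                .fetch("output")
                .run();

            float[][] prediction = new float[1][1];
            results.get(0).copyTo(prediction);
            System.out.println("Prediction: " + prediction[0][0]);
        }
    }
}
```

The same scoring call can then be placed inside a Kafka Streams mapValues() step, exactly like the H2O example above: load the model once at startup, then apply it to every event.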

Code Examples on GitHub (Java, Kafka Streams, TensorFlow, H2O.ai)

You can find the Java code examples and analytic models for H2O and TensorFlow in my GitHub project.

Just clone the repository and run “mvn clean package”. Then take a look at the unit tests to understand how to apply analytic models with Apache Kafka’s Streams API.
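If you want to experiment beyond the existing tests: such a topology can be unit-tested entirely without a Kafka broker. Here is a minimal sketch using TopologyTestDriver from Kafka's kafka-streams-test-utils artifact; the repository's own tests may use a different harness, and ModelServingApp.buildTopology() is a hypothetical factory method for the topology under test.

```java
import static org.junit.Assert.assertEquals;

import java.util.Properties;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;
import org.junit.Test;

public class ModelServingTest {

    @Test
    public void shouldScoreRecordWithoutRealBroker() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "model-serving-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");

        // buildTopology() stands in for the topology under test.
        try (TopologyTestDriver driver =
                 new TopologyTestDriver(ModelServingApp.buildTopology(), props)) {

            TestInputTopic<String, String> input = driver.createInputTopic(
                "model-input", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> output = driver.createOutputTopic(
                "model-output", new StringDeserializer(), new StringDeserializer());

            // Pipe one record through the topology and check the prediction.
            input.pipeInput("sensor-1", "1.0,2.0,3.0");
            assertEquals("expected-label", output.readValue());
        }
    }
}
```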

Video Recording: Apache Kafka + Kafka Streams + H2O.ai + TensorFlow

Finally, here we go with the video recording:

As always, I appreciate any comments (feedback, questions, criticism)… Have fun watching the video.

You can also find the corresponding slide deck on www.slideshare.net.

