
When to Choose Apache Kafka vs. Azure Event Hubs vs. Confluent Cloud for a Microsoft Fabric Lakehouse

Choosing between Apache Kafka, Azure Event Hubs, and Confluent Cloud for data streaming is critical when building a Microsoft Fabric Lakehouse. Apache Kafka offers scalability and flexibility but requires self-management and additional components for security and governance. Azure Event Hubs provides a fully managed service with tight Azure integration but has limitations in Kafka compatibility, scalability, and advanced features. Confluent Cloud delivers a complete, managed data streaming platform for analytical and transactional scenarios with enterprise features like multi-cloud support and disaster recovery. Each option caters to different needs, and this blog post will guide you in selecting the right data streaming solution for your use case.

This is part three of a blog series about Microsoft Fabric and its relation to other data platforms on the Azure cloud:

  1. What is Microsoft Fabric for Azure Cloud (Beyond the Buzz) and how it Compares (or Competes) with Snowflake and Databricks
  2. How Microsoft Fabric Complements Data Streaming (Apache Kafka, Flink, et al.)
  3. When to Choose Apache Kafka vs. Azure Event Hubs vs. Confluent Cloud for a Microsoft Fabric Lakehouse

Subscribe to my newsletter to get an email about a new blog post every few weeks.

Please read the other two articles to understand why Microsoft Fabric is not a silver bullet for every data problem, and how data streaming and Microsoft Fabric complement each other. This article focuses on choosing the right data streaming service for Microsoft Fabric data ingestion, and for many other use cases beyond the lakehouse.

Apache Kafka – The De Facto Standard for Data Streaming

Apache Kafka has established itself as the cornerstone of data streaming, offering far more than traditional messaging systems. It provides a persistent event log that guarantees ordering, enables true decoupling of data producers and consumers, and ensures data consistency across real-time, batch, and request-response APIs. Kafka Connect, which facilitates seamless integration with various data sources and sinks, and Kafka Streams, which allows for continuous stateless and stateful stream processing, complement the Kafka architecture. With its robust capabilities, Kafka is used by over 150,000 organizations worldwide. This underscores its status as a new software category, as recognized in the Forrester Wave for Streaming Data.
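
The core abstraction is easiest to see in code. Below is a minimal sketch of a Kafka producer in Java; the broker address, topic name (orders), and payload are hypothetical placeholders. The producer appends events to a persistent, ordered log, and any number of consumers read that log independently, which is what makes the decoupling possible.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // hypothetical broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The record key determines the partition; ordering is guaranteed per partition.
            producer.send(new ProducerRecord<>("orders", "order-42", "{\"status\":\"CREATED\"}"));
            producer.flush();
        }
    }
}
```

Any number of consumers, whether real-time applications, batch lakehouse ingestion jobs, or request-response APIs, can read the same topic at their own pace, without coordinating with the producer.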

Benefits:

  • Vibrant Open Source Community: Kafka’s extensive community fosters continuous innovation and support, ensuring that the platform remains at the forefront of data streaming technology.
  • Reliability and Scalability: Kafka is battle-tested in diverse environments, offering unmatched reliability and scalability for critical applications.
  • Continuous Innovation: Kafka’s evolution is marked by significant advancements, such as the removal of ZooKeeper and support for tiered storage. Upcoming features include queues for Kafka and support for two-phase commit transactions, further enhancing its capabilities.

Cons:

  • Self-Managed Complexity: Operating Kafka as a self-managed system can be challenging, especially for critical use cases requiring 24/7 uptime and low latency.
  • Core-Only Offering: Kafka’s core requires additional components for a complete solution, including security, data governance, connectivity, operations tooling, monitoring, and support.
  • Cloud Integration: In cloud environments where SaaS solutions like Microsoft Fabric, Snowflake, Databricks, and MongoDB Atlas are prevalent, self-managed Kafka may not be the most cost-effective option from a total cost of ownership (TCO) perspective.

In summary, self-managed Apache Kafka rarely makes sense in the cloud when you leverage other SaaS offerings like Microsoft Fabric, Snowflake, Databricks, MongoDB Atlas, etc., not least from a TCO perspective.

The Kafka Protocol as Standard for Data Streaming

The Kafka protocol has become a de facto standard for many cloud-native services, such as Azure Event Hubs, Confluent's KORA Engine, or WarpStream. These services build on the Kafka protocol, in some cases without relying on the open source Kafka implementation at all, to enable a cloud-native experience.
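
A practical consequence of this standardization: application logic written against the Kafka client API is portable across all of these services. The consumer sketch below (topic name and consumer group are hypothetical) works unchanged against any Kafka-compatible endpoint; only the connection properties differ, as the service-specific sketches later in this post show.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PortableConsumer {
    // Identical application logic for any Kafka-protocol-compatible service;
    // only the connection Properties passed in differ per service.
    public static void consume(Properties connectionProps) {
        connectionProps.put("group.id", "lakehouse-ingest"); // hypothetical consumer group
        connectionProps.put("key.deserializer", StringDeserializer.class.getName());
        connectionProps.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(connectionProps)) {
            consumer.subscribe(List.of("orders")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```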

Azure Event Hubs vs. Confluent Cloud for Kafka as a Service

Plenty of Kafka cloud services exist by now. Every large software and cloud vendor (Amazon, Microsoft, Google, IBM, Oracle, etc.) has some kind of fully managed or partially managed Kafka cloud offering. While Confluent is the leader in the cloud-agnostic Kafka space, there are plenty of other vendors, such as Cloudera, Instaclustr, Aiven, Redpanda, and StreamNative, to name a few.

Check out the latest data streaming landscape to learn more about all these Kafka (and Flink) vendors and their trade-offs.

The following focuses on a comparison of Azure Event Hubs and Confluent Cloud, the two most common options for Kafka on the Azure cloud. Each offers unique advantages and limitations. This is not a complete list, but it covers the most critical aspects to compare.

Azure Event Hubs – Fully Managed Azure Service Using the Kafka Protocol

Azure Event Hubs is a proprietary, real-time data ingestion service on Microsoft Azure, designed for large-scale data ingestion into lakehouses. While it offers some Kafka API compatibility, it is not a complete replacement for Kafka.

Benefits of Azure Event Hubs

  • Fully Managed Service: In contrast to most competing Kafka cloud services, Azure Event Hubs is truly fully managed: the vendor does not just provision some brokers and hand over all the operations, tuning, and bug-fixing to the end user.
  • Real-Time Data Ingestion: Event Hubs enables organizations to capture and process data in real time from a wide range of sources, including IoT devices, applications, and cloud services.
  • Integration with Azure Ecosystem: Event Hubs seamlessly integrates with other Azure services, such as Azure Stream Analytics, Azure Functions, and Azure Data Lake Storage, providing a comprehensive ecosystem for building end-to-end data processing and analytics solutions.
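
Because Event Hubs exposes a Kafka-compatible endpoint, a standard Kafka client connects to it with configuration only. The sketch below follows Microsoft's documented pattern for the Kafka endpoint (port 9093, SASL PLAIN with the literal username "$ConnectionString"); the namespace name and connection string are hypothetical placeholders.

```java
import java.util.Properties;

public class EventHubsKafkaConfig {
    // Connection settings for the Kafka endpoint of an Event Hubs namespace.
    // The namespace name and connection string are hypothetical placeholders.
    public static Properties create(String namespace, String connectionString) {
        Properties props = new Properties();
        // The Kafka endpoint listens on port 9093 of the namespace.
        props.put("bootstrap.servers", namespace + ".servicebus.windows.net:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        // Event Hubs authenticates Kafka clients with the literal username
        // "$ConnectionString" and the namespace connection string as the password.
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"$ConnectionString\" password=\"" + connectionString + "\";");
        return props;
    }
}
```

Passing these properties to the portable consumer sketch above is all it takes; no Event Hubs SDK is required on the client side.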

Limitations of Azure Event Hubs

  • Partial Kafka Compatibility: Event Hubs supports some Kafka APIs but lags behind in version updates. For instance, it recently added support for the Transaction API and Kafka Streams but is still several Kafka versions behind.
  • Scalability Constraints: Event Hubs can elastically scale only to a given quota, with low limits on partitions and latencies exceeding 100ms at gigabytes per second scale.
  • Short Data Retention: The Standard tier offers only a 7-day retention policy, making it unsuitable for long-term storage or as a system of record.
  • Separate Stream Processing: Requires additional services like Azure HDInsight on AKS for Flink or Azure Stream Analytics for stream processing. The “no code option” available through Azure Stream Analytics is yet another separate PaaS service requiring integration and has its own set of quotas and limitations.
  • Cost and Total Cost of Ownership (TCO): Costs can be high for certain workloads, and because Event Hubs lacks a complete data streaming platform, it often requires integration with multiple other products to achieve comprehensive functionality.

Confluent Cloud – Fully Managed Data Streaming Platform Powered by Apache Kafka and Flink

Confluent Cloud offers a fully managed data streaming platform powered by Apache Kafka and Flink and integrates seamlessly with the Azure ecosystem. As a strategic Microsoft partner, Confluent provides a unified security, management, and billing experience, with integrations across Azure services.

Benefits of Confluent

  • Fully Managed Service: In contrast to most competing Kafka cloud services, Confluent Cloud is truly fully managed: the vendor does not just provision some brokers and hand over all the operations, tuning, and bug-fixing to the end user. Unlike Azure Event Hubs, it includes an entire data streaming platform, not just the Kafka streaming service.
  • Comprehensive Data Streaming Platform: Confluent Cloud's fully managed service includes capabilities to stream, process, and integrate data. It includes data governance and security features for the most critical and privacy-sensitive projects.
  • Azure Integration: Pay with Azure cloud credits in the Azure marketplace and enjoy seamless integration with Azure services, including Microsoft Fabric, SQL Data Warehouse, Synapse, Cosmos DB, Databricks Analytics, Azure ML, Azure Data Lake Storage, and Azure Blob Storage.
  • Edge, Hybrid and Multi-Cloud: Confluent extends beyond just the Azure cloud, offering seamless deployment options at the edge, on-premises, and across multi-cloud environments to provide unparalleled flexibility and scalability for diverse data streaming needs.
  • Data Streaming Expertise: Confluent provides a robust data streaming product, along with unparalleled expertise and comprehensive support and consulting, enabling organizations to effectively leverage data streaming technologies to their fullest potential.
  • Innovation: If you want to get the latest features (and security fixes) for data streaming (also beyond just Kafka), Confluent is the only way to go for a cloud service.
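
Connecting the same client code to Confluent Cloud is, again, just configuration: SASL PLAIN over TLS with a cluster API key and secret. The bootstrap endpoint and credentials below are hypothetical placeholders for the values shown in the cluster settings.

```java
import java.util.Properties;

public class ConfluentCloudKafkaConfig {
    // Connection settings for a Confluent Cloud cluster. The bootstrap endpoint,
    // API key, and API secret are hypothetical placeholders for the values
    // shown in the cluster settings.
    public static Properties create(String bootstrapServers, String apiKey, String apiSecret) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers); // e.g. a *.azure.confluent.cloud:9092 endpoint
        props.put("security.protocol", "SASL_SSL");
        // Confluent Cloud authenticates via SASL PLAIN with a cluster API key and secret.
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
        return props;
    }
}
```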

Drawbacks of Confluent

  • Vendor Lock-in: If you choose any SaaS, you are always locked in. The same is true for any Azure service (including Azure Event Hubs). While the benefits usually outweigh this drawback, some organizations only choose open source and build everything themselves. With Confluent, however, you can migrate across cloud providers. And because Confluent is powered by open source Kafka, you can also migrate back to a vendor-less implementation if you really want to.
  • Cost Complexity: Confluent's fully managed services typically result in a lower TCO with less operational risk than alternatives, though open source or CSP data streaming offerings may appear to have a lower monthly cost before networking and operational management are considered. Make sure to review all available Confluent Cloud products with your account team to understand the SKUs that make the most sense for you and to get the best pricing. Different offerings exist for less critical applications, high volume, and small startups. And do a TCO and risk analysis: there is a lot of potential hidden cost (like networking and other cloud provider charges).

Technical Decision: Find the Right Apache Kafka Option for Your Use Cases (Beyond the Lakehouse)

Azure Event Hubs works well as the data ingestion layer into Microsoft Fabric (if you can live with the drawbacks listed above). However, it has many limitations, so it is often easy to qualify out Azure Event Hubs as the right Kafka solution.

Qualifying a product out because of its limitations is often much easier than trying to fit several products into an architecture and comparing them feature by feature.

Choosing the right Kafka option requires careful consideration of your specific use cases. Here are scenarios where Azure Event Hubs may not be suitable:

  • Multiple Consumers: Beyond simple lakehouse ingestion, Kafka is usually the data fabric for diverse data sources and sinks, including databases like Oracle and MongoDB, SaaS applications like Salesforce and ServiceNow, and microservices built with Java, Python, JavaScript, and Go.
  • Operational and Analytical Use Cases: Unified data storage and infinite retention with native Apache Iceberg integration are essential for using a data streaming platform for operational and analytical use cases.
  • Critical SLAs and/or High Throughput: Transactional workloads require uptime guarantees, low latency (even at scale), and a good disaster recovery strategy across multiple clusters.
  • Serverless Stream Processing: Leverage a complete serverless architecture as part of the data streaming platform for efficient stream processing. Implement a shift left architecture for better data quality and reduced costs (see the sketch after this list).
  • Data Contracts, Policy Enforcement and Governance: Ensure robust data governance with features like data lineage, data catalog, self-service data portal, audit logs, end-to-end encryption, and data contracts and policy enforcement.
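
To make the stream processing bullet concrete, here is a hedged Kafka Streams sketch of the shift-left pattern: validating events in flight, before they land in the lakehouse, so that every downstream consumer reads already-curated data. The topic names, application id, and validation rule are hypothetical placeholders; the same logic could be expressed in Flink SQL.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ShiftLeftValidation {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "orders-validation"); // hypothetical
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder endpoint
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("orders.raw"); // hypothetical input topic
        // Shift left: enforce data quality in the stream, so every downstream
        // consumer (lakehouse, microservices, ML) reads already-validated events.
        raw.filter((key, value) -> value != null && value.contains("\"orderId\""))
           .to("orders.validated"); // hypothetical curated topic

        new KafkaStreams(builder.build(), props).start();
    }
}
```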

If you have any of the above requirements, it is an easy decision to qualify out Azure Event Hubs. Instead, look at Confluent and other vendors that provide the required capabilities.

Strategic Decision: Data Streaming Organization (Beyond a Lakehouse)

When embarking on a data streaming journey, it’s essential to focus on business value and long-term strategy. Establishing a data streaming organization with a center of excellence can maximize the platform’s strategic value.

(Figure: Building a data streaming organization with a center of excellence. Source: Confluent)

Don’t just look at the first use case: a data streaming platform is strategic and adds more value as more people use the same data products. Expertise and 24/7 support are crucial, and Confluent excels in this area with its dedicated focus on data streaming and its vast customer base. By fostering a data-driven culture, organizations can unlock the full potential of their data streaming investments.

Data Streaming in Azure Cloud: Choose the Right Tool for the Job

Choosing the right data streaming platform – Apache Kafka, Azure Event Hubs, or Confluent Cloud – depends on your specific use case within the Microsoft Fabric Lakehouse and beyond. Apache Kafka offers flexibility and scalability but requires self-management. Azure Event Hubs is a good choice for plain data ingestion into the Azure ecosystem powered by OneLake and Microsoft Fabric, but it has limitations in Kafka compatibility and lacks advanced features for more complex enterprise architectures and, especially, critical operational workloads. Confluent Cloud provides a full-featured, managed service with enterprise-level capabilities, making it ideal for strategic deployments across multiple use cases. Each option has its strengths, and careful consideration of your requirements will guide you to the best fit.

What cloud services do you use for data streaming on the Azure cloud? Is the use case just data ingestion into one lakehouse or do you have multiple consumers of the data? Do you also build operational applications with the Apache Kafka ecosystem, maybe including hybrid cloud or disaster recovery scenarios? Let’s connect on LinkedIn and discuss it! Stay informed about new blog posts by subscribing to my newsletter.
