Open API and Omnichannel with Apache Kafka in Healthcare

IT modernization and innovative new technologies are changing the healthcare industry significantly. This blog series explores how data streaming with Apache Kafka enables real-time data processing and business process automation. Real-world examples show how traditional enterprises and startups increase efficiency, reduce cost, and improve the human experience across the healthcare value chain, including pharma, insurance, providers, retail, and manufacturing. This is part five: Open API and Omnichannel. Examples include Care.com and Invitae.

Blog Series – Kafka in Healthcare

Many healthcare companies leverage Kafka today. Use cases exist in every domain across the healthcare value chain. Most companies deploy data streaming in different business domains, and use cases often overlap. I have categorized a few real-world deployments into different technical scenarios and added real-world examples for each.

Stay tuned for blog posts around these topics. Subscribe to my newsletter to get an email after each publication (no spam or ads).

Open API and Omnichannel in Healthcare

An open API is a publicly available application programming interface that provides developers with programmatic access to a proprietary software application or web service. APIs are sets of requirements that govern how one application can communicate and interact with another. Keep in mind that an open API is not necessarily only for B2B communication; internal divisions in larger organizations often leverage the same approach for business and technical reasons.

Omnichannel in healthcare bridges the fragmentation between health providers, hospitals, pharmaceutical companies, and patients. Open APIs enable services and applications to improve the customer experience in the healthcare industry.

Omnichannel and customer 360 are already prevalent and thriving in the retail industry, often powered by Apache Kafka. Healthcare is far behind; many use cases still rely on paper and fax.

Streaming Data Exchange and Data Sharing with Apache Kafka

Currently, many APIs are provided by organizations for access via HTTP. However, synchronous request-response communication is an anti-pattern for many data streaming use cases around Apache Kafka. For that reason, data streaming with Apache Kafka is complementary to traditional API management tools like MuleSoft Anypoint, IBM API Connect, Apigee, or Kong.

The market is changing, though. On the one side, API management tools adopt data streaming APIs (native Kafka APIs, AsyncAPI as a schema standard, or proprietary interfaces). On the other side, native streaming replication between Kafka clusters is emerging more and more often for data sharing in real time at any scale.

Both HTTP API integration and native data streaming are valid options, depending on the use case. Let’s explore two healthcare case studies where data sharing and open APIs serve different purposes.
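As a minimal sketch of the event-driven side of this picture (not taken from any of the case studies below), the following Java producer publishes a domain event to a Kafka topic. In contrast to a synchronous HTTP request-response call, the producer neither knows nor waits for its consumers; any number of downstream applications can subscribe later. The broker address, topic name, and JSON payload are illustrative assumptions.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class AppointmentEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The producer only appends the event to the topic; it does not call
            // or wait for any downstream consumer (portal, billing, analytics, ...).
            ProducerRecord<String, String> event = new ProducerRecord<>(
                    "patient-appointments",   // hypothetical topic name
                    "patient-123",            // key keeps events per patient in order
                    "{\"patientId\":\"patient-123\",\"status\":\"SCHEDULED\"}");
            producer.send(event, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Event written to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // close() flushes any in-flight events
    }
}
```

An API management layer can still sit in front of such a topic (for discovery, reporting, or monetization), but the data itself flows asynchronously and stays decoupled from the callers.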

Care.com – Trusted Caregivers via Kafka and Open API

Care.com is an online marketplace for care services, including senior care and housekeeping. Their cloud-native Bravo Platform provides a simple, unified IT architecture to streamline go-to-market initiatives.

Care.com moved from a monolithic architecture to a truly decoupled, scalable microservices platform powered by the Apache Kafka ecosystem. The migration from Confluent Platform to Confluent Cloud allowed the team to focus on business problems. The data streaming infrastructure is serverless and consumed as a service.

The Schema Registry enables data governance across applications built with various technologies like Java, .NET, and Go. The “Care APIs” (inspired by Google APIs) define all data and service contracts with Protobuf so that different stakeholders can communicate via an open API with enforced schemas. Additional capabilities enhance security for PII data with fine-grained RBAC and data lineage.
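To make the contract-enforcement idea concrete, here is a hedged sketch (not Care.com’s actual code) of a Java producer configured with Confluent’s Protobuf serializer and Schema Registry. CaregiverBooked stands in for a hypothetical class generated by protoc from one of the “Care APIs” .proto contracts; the topic name and URLs are assumptions.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer;

public class CareApiProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // The Protobuf serializer registers and validates the schema against Schema Registry
        // on every write, so a producer cannot publish data that breaks the contract.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaProtobufSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081"); // assumed Schema Registry endpoint

        // CaregiverBooked is a hypothetical message class generated from a .proto contract.
        try (KafkaProducer<String, CaregiverBooked> producer = new KafkaProducer<>(props)) {
            CaregiverBooked event = CaregiverBooked.newBuilder()
                    .setBookingId("booking-42")
                    .setCaregiverId("cg-7")
                    .build();
            producer.send(new ProducerRecord<>("caregiver-bookings", event.getBookingId(), event));
        }
    }
}
```

Consumers in .NET or Go use the same registered schema, which is what makes the polyglot governance work in practice.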

Invitae – Omnichannel for 24/7 Production and Data Science

Invitae is a biotechnology company that provides DNA-based testing to detect genetic abnormalities beyond what can be identified through traditional methodologies. Gene panels and single-gene testing are used for a broad range of clinical areas, including hereditary cancer, cardiology, neurology, pediatric genetics, metabolic disorders, immunology, and hematology.

While I have no idea what the above means, I like the technical details of how Invitae leverages a data streaming platform to provide a tremendous omnichannel experience from research to end-users. Their platform brings comprehensive genetic information into mainstream medical practice to improve the quality of healthcare for billions of people.

Invitae’s Kafka Summit talk, “From Zero to Streaming Healthcare in Production,” went into detail about their data streaming architecture.

The omnichannel chain of Invitae’s data flow does not stop after the DNA testing: Genetic results are often just the beginning. Invitae’s interactive, educational portal and caring genetic counselors can help patients understand their results and what to do next.

The truly decoupled infrastructure and the open API enable other stakeholders to join in and consume the data. That’s a considerable paradigm shift: Building an application entirely of streams is now possible. Additionally, data science teams consume the same data for AI and machine learning use cases.
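A hedged sketch of what “other stakeholders join in and consume the data” can look like in practice: a separate consumer group, for example for the data science team, reads the same topic at its own pace, without touching the producers or the existing consumers. The topic and group names are illustrative assumptions, not Invitae’s.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class DataScienceConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Each stakeholder uses its own consumer group and therefore its own offsets:
        // the patient portal, counseling tools, and data science pipelines read the
        // same events completely independently of each other.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "data-science-team");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"); // replay history for model training
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("genetic-test-results")); // hypothetical topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand the event to a feature pipeline or ML model instead of a report UI.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```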

Open API and Omnichannel with Kafka for Improved Patient and Customer Experience

Open APIs and data sharing are significant game-changers for the healthcare industry. Enabling the same principles not just via “normal” APIs (REST/HTTP) but also via data streaming empowers innovative new use cases.

Data streaming with Kafka and open APIs can be implemented natively as a streaming data exchange, or combined with complementary API management solutions to add capabilities such as advanced reporting or monetization of service consumption.

How do you leverage data streaming with Apache Kafka in the healthcare industry? What architecture does your platform use? Which products do you combine with data streaming? Let’s connect on LinkedIn and discuss it! Stay informed about new blog posts by subscribing to my newsletter.
