This blog post explores the state of data streaming for the energy and utilities industry. The evolution of utility infrastructure, energy distribution, customer services, and new business models requires real-time end-to-end visibility, reliable and intuitive B2B and B2C communication, and integration with pioneering technologies like 5G for low latency or augmented reality for innovation. Data streaming allows integrating and correlating data in real-time at any scale to improve most workloads in the energy sector.
I look at trends in the utilities sector to explore how data streaming helps as a business enabler, including customer stories from SunPower, 50hertz, Powerledger, and more. A complete slide deck and on-demand video recording are included.
The energy & utilities industry is fundamental for a sustainable future. Gartner explores the Top 10 Trends Shaping the Utility Sector in 2023: “In 2023, power and water utilities will continue to face a variety of forces that will challenge their business and operating models and shape their technology investments.
Utility technology leaders must confidently compose the future for their organizations in the midst of uncertainty during this volatile period of energy transition — a future that requires your organizations to be both agile and resilient.”
The increased use of digital tools makes the expected structural changes in the energy system possible:
Artificial Intelligence (AI) with technologies like Machine Learning (ML) and Generative AI (GenAI) is a hot topic across all industries. Innovation around AI disrupts many business models, tasks, business processes, and labor.
NVIDIA created an excellent diagram showing the various opportunities for AI in the energy & utilities sector. It separates the scenarios by segment: upstream, midstream, downstream, power generation, and power distribution:
McKinsey & Company explains that “the cyberthreats facing electric-power and gas companies include the typical threats that plague other industries: data theft, billing fraud, and ransomware. However, several characteristics of the energy sector heighten the risk and impact of cyberthreats against utilities:”
Adopting trends like predictive maintenance, track & trace, proactive sales and marketing, or threat intelligence is only possible if enterprises in the energy sector can provide and correlate information at the right time in the proper context. Real-time, which means using the information within milliseconds, seconds, or minutes, is almost always better than processing data later (whatever later means):
Data streaming combines the power of real-time messaging at any scale with storage for true decoupling, data integration, and data correlation capabilities. Apache Kafka is the de facto standard for data streaming.
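To make this concrete, here is a minimal sketch of the publish/subscribe pattern behind data streaming, using the confluent-kafka Python client. The broker address and the topic name (grid-events) are illustrative assumptions, not a reference setup:

```python
# Minimal produce/consume sketch with the confluent-kafka Python client.
# Broker address and topic name are illustrative placeholders.
from confluent_kafka import Producer, Consumer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("grid-events", key="substation-42", value='{"voltage": 231.7}')
producer.flush()  # block until the broker acknowledges the event

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "grid-analytics",     # consumer groups enable scaling and decoupled consumption
    "auto.offset.reset": "earliest",  # replay the topic from the beginning
})
consumer.subscribe(["grid-events"])

msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```

The same event can be consumed by many independent applications at their own pace; that is the decoupling the storage layer provides.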
“Apache Kafka for Smart Grid, Utilities and Energy Production” is a great starting point to learn more about data streaming in the industry, including a few case studies not covered in this blog post, such as:
“After creating a collaborative team that merged customer experience and digital capabilities, one North American utility went after a 30 percent reduction in its cost-to-serve customers in some of its core journeys.”
As the Utilities Analytics Institute explains: “Utilities need to ensure that the data they are collecting is high quality, specific to their needs, preemptive in nature, and, most importantly, real-time.” The following five characteristics are crucial to add value with real-time data:
Smart meters are a perfect example of increasing business value with real-time data streaming. As Clou Global confirms: “The use of real-time data in smart grids and smart meters is a key enabler of the smart grid”.
Possible use cases include:
Processing and correlating events from smart meters with stream processing is just one IoT use case. You can leverage “Apache Kafka and Apache Flink for many Industrial IoT and Manufacturing 4.0 use cases”.
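To illustrate the idea, the following sketch aggregates consumption per meter in one-minute tumbling windows with a plain Python consumer. The topic name, message format, and broker address are assumptions for illustration only; a production deployment would typically use Kafka Streams or Apache Flink for stateful windowing like this:

```python
# Simplified per-meter aggregation in one-minute tumbling windows.
# Topic name, message schema, and broker address are illustrative assumptions;
# production systems would use Kafka Streams or Apache Flink instead.
import json
from collections import defaultdict
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "meter-aggregator",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["smart-meter-readings"])

windows = defaultdict(float)  # (meter_id, minute_bucket) -> consumed kWh

while True:
    msg = consumer.poll(timeout=1.0)
    if msg is None or msg.error():
        continue
    reading = json.loads(msg.value())  # e.g. {"meter_id": "m-17", "ts": 1700000000, "kwh": 0.02}
    bucket = reading["ts"] // 60       # one-minute tumbling window
    key = (reading["meter_id"], bucket)
    windows[key] += reading["kwh"]
    print(key, windows[key])
```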
And there is so much more if you expand your thinking from upstream through midstream to downstream applications to “transform the global supply chain with data streaming and IoT”.
Accenture points out that 84% of companies in the energy & utilities market use cloud SaaS solutions and 79% use cloud PaaS solutions, for various reasons:
This is a general statistic, but it applies to all components of the data-driven enterprise, including data streaming. A company does not just move one specific application to the cloud in isolation; that would be counterproductive from a cost and security perspective. Hence, most companies start with a hybrid architecture and bring more and more workloads to the public cloud over time.
The energy & utilities industry applies various trends for enterprise architectures for cost, flexibility, security, and latency reasons. The three major topics I see these days at customers are:
Let’s look deeper into some enterprise architectures that leverage data streaming for energy & utilities use cases.
Energy and utilities require data infrastructure everywhere. While most organizations have a cloud-first strategy, there is no way around running some workloads at the edge outside a data center for cost, security, or latency reasons.
Data streaming is available everywhere:
Data synchronization across environments, regions and clouds is possible with open-source Kafka tools like MirrorMaker. However, this requires additional infrastructure and development/operations efforts. Innovative solutions like Confluent’s Cluster Linking leverage the Kafka protocol for real-time replication. This enables much easier deployments and significantly reduced network traffic.
Kafka deployments look different depending on where they need to run.
Fully managed serverless offerings like Confluent Cloud are highly recommended in the public cloud to focus on business logic with reduced time-to-market and TCO.
In a private cloud, data center or edge environment, most companies deploy on Kubernetes today to provide a similar cloud-native experience.
Kafka can also be deployed on industrial PCs (IPC) and other industrial hardware. Many use cases exist for data streaming at the edge. Sometimes, a single broker (without high availability) is good enough.
No matter how you deploy data streaming workloads, a key value is the unidirectional or bidirectional synchronization between clusters. Often, only curated and relevant data is sent to the cloud for cost reasons. Also, command & control patterns can start a business process in the cloud and send events to the edge.
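The following sketch illustrates the “curate at the edge, forward to the cloud” idea: a small Python forwarder reads from a local edge broker and sends only critical events to a fully managed cloud cluster. All addresses, credentials, thresholds, and topic names are placeholders; full topic replication would normally be handled by MirrorMaker or Cluster Linking rather than custom code:

```python
# Sketch of edge-to-cloud curation: only critical events leave the plant.
# Broker addresses, credentials, topic names, and the threshold are placeholders.
import json
from confluent_kafka import Consumer, Producer

edge = Consumer({
    "bootstrap.servers": "edge-broker.local:9092",  # e.g. a single broker on an industrial PC
    "group.id": "edge-to-cloud-forwarder",
    "auto.offset.reset": "latest",
})
edge.subscribe(["sensor-readings"])

cloud = Producer({
    "bootstrap.servers": "<cloud-bootstrap-server>:9092",  # fully managed cluster endpoint
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
})

while True:
    msg = edge.poll(timeout=1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())
    if event.get("temperature_c", 0) > 90:  # curate: forward only critical readings
        cloud.produce("critical-alarms", key=msg.key(), value=msg.value())
        cloud.poll(0)  # serve delivery callbacks without blocking
```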
The energy sector still operates many monoliths: inflexible and closed software and hardware products. This is changing in this decade. OT/IT modernization and the digital transformation require open APIs, flexible scale, and decoupled applications (from different vendors).
Many companies leverage Apache Kafka to build a postmodern data historian to complement or replace existing expensive OT middleware:
Just to be clear: Kafka and other IT software like Spark, Flink, or Amazon Kinesis are NOT hard real-time. They cannot be used for safety-critical use cases with deterministic systems like autonomous driving or robotics. That is the domain of C, Rust, and other embedded software.
However, data streaming connects the OT and IT worlds. As part of that, connectivity with robotic systems, intelligent vehicles, and other IoT devices is the norm for improving logistics, integration with ERP and MES, after-sales service, and more.
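As one simplified example of bridging the OT world into Kafka, the sketch below subscribes to an MQTT topic with paho-mqtt and forwards every message into a Kafka topic. Broker addresses and topic names are assumptions; in practice, this integration is typically handled by Kafka Connect connectors, OPC UA gateways, or similar tooling rather than hand-written scripts:

```python
# Simplified OT-to-IT bridge: forward MQTT sensor messages into Kafka.
# Addresses and topic names are illustrative; real deployments usually rely on
# Kafka Connect or dedicated IoT gateways instead of custom scripts.
import paho.mqtt.client as mqtt
from confluent_kafka import Producer

kafka = Producer({"bootstrap.servers": "localhost:9092"})

def on_message(client, userdata, message):
    # Re-publish the raw OT payload to Kafka, keyed by the MQTT topic name.
    kafka.produce("ot-sensor-data", key=message.topic, value=message.payload)
    kafka.poll(0)

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt 2.x; use mqtt.Client() on 1.x
client.on_message = on_message
client.connect("mqtt-broker.local", 1883)
client.subscribe("plant/+/telemetry")  # wildcard across all machines in the plant
client.loop_forever()
```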
Learn more about this discussion in two articles:
So much innovation is happening in the energy & utilities sector. Automation and digitalization change how utilities monitor infrastructure, build customer relationships, and create completely new business models.
Most energy service providers use a cloud-first approach to improve time-to-market, increase flexibility, and focus on business logic instead of operating IT infrastructure. And elastic scalability gets even more critical with all the growing networks, 5G workloads, autonomous vehicles, drones, and other innovations.
Here are a few customer stories from worldwide energy & utilities organizations:
This blog post is just the starting point. Learn more about data streaming in the energy & utilities industry in the following on-demand webinar recording, the related slide deck, and further resources, including pretty cool lightboard videos about use cases.
The video recording explores the energy & utilities industry’s trends and architectures for data streaming. The primary focus is on the data streaming case studies. Check out our on-demand recording:
If you prefer learning from slides, check out the deck used for the above recording:
The state of data streaming for energy & utilities is fascinating. New use cases and case studies come up every month. This includes better data governance across the entire organization, real-time data collection and processing across hybrid edge and cloud infrastructures, data sharing and B2B partnerships for new business models, and many more scenarios.
We recorded lightboard videos showing the value of data streaming simply and effectively. These five-minute videos explore the business value of data streaming, related architectures, and customer stories. Stay tuned; I will update the links in the next few weeks and publish a separate blog post for each story and lightboard video.
And this is just the beginning. Every month, we will talk about the status of data streaming in a different industry. Manufacturing was first, financial services second, then retail, telcos, gaming, and so on. Check out my other blog posts.
Let’s connect on LinkedIn and discuss it! Join the data streaming community and stay informed about new blog posts by subscribing to my newsletter.