
Real Time Locating System (RTLS) with Apache Kafka for Transportation and Logistics

A Real-Time Locating System (RTLS) enables identifying and tracking the location of objects or people in real-time. It is used everywhere in transportation and logistics across industries. A postmodern RTLS requires an open architecture and high scalability. This blog post explores the use cases for RTLS, the challenges of existing implementations, and why more and more RTLS implementations rely on Apache Kafka as an open, scalable, and reliable event streaming platform.

Real-Time Locating / Tracking System (RTLS) in Supply Chain and Logistics

RTLS is a key part of many use cases across verticals. Many manufacturing processes and supply chains rely on accurate real-time information about assets and people. Other innovative scenarios could not exist without RTLS at all. For instance, think about ride-sharing, car-sharing, or food delivery.

An RTLS enables identifying and tracking the location of objects or people in real-time. Some examples:

  • Tracking automobiles through an assembly line
  • Locating pallets of merchandise in a warehouse
  • Finding medical equipment in a hospital
  • Tracking tools, machines, and people (where legal) on a construction site

An RTLS has three key goals:

  • Improve safety
  • Control security
  • Optimize processes and productivity

Wireless RTLS tags are attached to objects or worn by people, and in most RTLS, fixed reference points receive wireless signals from the tags to determine their location. However, more and more use cases require outdoor tracking, too. In many cases, a postmodern RTLS combines indoor and outdoor location tracking.

Challenges of Today’s Location and Tracking Systems

RTLS have existed for a long time, and plenty of products are available on the market. While they differ in their characteristics and features, most traditional RTLS share at least some of the following technical challenges:

  • Monolithic
  • Proprietary
  • Limited Scalability
  • No Hardware Flexibility
  • Single Purpose Solution
  • Limited Integration Capabilities
  • Limited Tracking Technologies

Many vendors invest in their RTLS products. Similar to CRM, ERP, and MES systems, many next-generation RTLS are based on Kafka to solve these challenges. So feel free to check the above characteristics with your favorite vendor and ask how they plan to solve (or have already solved) them.

Many enterprises prefer building their own custom postmodern RTLS. This approach allows for an open, flexible solution. A custom RTLS is typically built to include innovative and differentiating features that add business value and optimize business processes.

A Postmodern RTLS for Multi-Purpose Use Cases and Architectures

From my conversations with customers across industries, I learned that use cases and requirements for RTLS have changed in recent years. In addition to solving the above technical challenges, two key differences establish a postmodern view of how to define an RTLS:

  1. RTLS is not just about location anymore. Applications leverage enhanced metadata such as speed, direction, or spatial orientation. Data integration and correlation are key for adding business value and improving processes (see the sketch after this list).
  2. The combination of indoor and outdoor tracking via hybrid architectures enables a multi-purpose RTLS.
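
To make the first point more tangible, here is a minimal sketch of what such an enriched location event could look like as a ksqlDB stream. All names and fields (asset_positions, speed_kmh, heading_degrees, orientation) are assumptions for illustration, not a prescribed schema:

```sql
-- Hypothetical stream of enriched location events.
-- Each event carries location plus metadata such as speed and heading.
CREATE STREAM asset_positions (
    asset_id VARCHAR KEY,      -- tag or vehicle identifier
    latitude DOUBLE,
    longitude DOUBLE,
    speed_kmh DOUBLE,          -- enhanced metadata: speed
    heading_degrees DOUBLE,    -- enhanced metadata: direction
    orientation VARCHAR,       -- enhanced metadata: spatial orientation
    event_time BIGINT          -- event timestamp in epoch milliseconds
  ) WITH (
    KAFKA_TOPIC = 'asset_positions',
    VALUE_FORMAT = 'JSON',
    PARTITIONS = 6,
    TIMESTAMP = 'event_time'
  );
```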

Some examples of indoor location tracking: asset tracking and monitoring, non-linear production lines, geofencing for safety (e.g., around cobots), and distance enforcement (e.g., during Covid-19). Outdoor track & trace enables regional or global logistics, routing, and end-to-end monitoring (e.g., for construction areas).

A key requirement of a modern RTLS is the ability to integrate with different technologies. This includes location tracking technologies such as Radio Frequency (RF), Infrared (IR), RFID, Beacons, Wi-Fi, Bluetooth, UWB, GPS, GSM, 5G, etc. But that's not all: the RTLS also needs to integrate with the rest of the enterprise reliably, in real-time, and at scale. This includes MES, ERP, APS, CRM, data lakes, and many other applications.
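
As a sketch of what such an integration can look like, the following statement registers a source connector through ksqlDB to ingest GPS data from an MQTT broker into a Kafka topic. It assumes a Kafka Connect cluster with the Confluent MQTT Source Connector installed; the broker address, MQTT topic filter, and target topic are placeholders:

```sql
-- Sketch: ingest raw GPS payloads from an MQTT broker into Kafka.
-- Assumes the Confluent MQTT Source Connector is available on the Connect cluster.
CREATE SOURCE CONNECTOR mqtt_gps_source WITH (
  'connector.class' = 'io.confluent.connect.mqtt.MqttSourceConnector',
  'mqtt.server.uri' = 'tcp://mqtt-broker:1883',  -- placeholder broker address
  'mqtt.topics'     = 'vehicles/+/gps',          -- placeholder MQTT topic filter
  'kafka.topic'     = 'asset_positions_raw'      -- raw payloads, parsed downstream
);
```

The raw payloads landing in that topic would then be parsed into a structured stream such as the asset_positions example above.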

Use Cases for a Postmodern RTLS

Many use cases leverage a postmodern RTLS to improve processes or build innovative new applications that were not possible before. Some examples:

  • Locate and manage assets within a facility, such as finding a misplaced tool cart in a warehouse or medical equipment in a hospital
  • Notification of new locations, such as an alert if a tool cart has improperly left the facility
  • Combine the identities of multiple items placed in a single location, such as on a pallet
  • Locate customers, for example, in a restaurant, for delivery of food or service
  • Maintain proper staffing levels of operational areas, such as ensuring guards are in the proper locations in a correctional facility
  • Quickly and automatically account for all staff after or during an emergency evacuation
  • Automatically track and time stamp the progress of people or assets through a process, such as following a patient’s emergency room wait time, time spent in the operating room, and total time until discharge
  • Clinical-grade locating to support acute care capacity management
  • Replay past events to understand the mass movements of workflows
  • Plan future location requirements
  • Auditing for compliance cases
  • Etc.

Two important notes here:

  1. Many of these use cases have existed for a long time. But once again: check the challenges discussed above. The requirements have changed regarding scale, flexibility, and other characteristics.
  2. As you can see, most of these use cases require not just location tracking but also data correlation in real-time. That is where the optimization or added business value comes from.

Vehicle Tracking Systems in Other Industries

Transportation and logistics are the obvious industries for real-time tracking systems. But industries not traditionally known for using vehicle tracking systems have started to use them in creative ways to improve their processes or businesses. Here are a few examples:

  • The hospitality industry has caught on to this technology to improve customer service. For example, a luxury hotel in Singapore has installed vehicle tracking systems in their limousines to ensure they can welcome their VIPs when they reach the hotel.
  • Vehicle tracking systems used in food delivery vans may alert if the temperature of the refrigerated compartment moves outside of the range of safe food storage temperatures.
  • Car rental companies are also using it to monitor their rental fleets.

The following sections explore an example using the scenario around transportation and logistics with truck delivery. Let’s look at how Apache Kafka and Event Streaming can help implement a postmodern RTLS.

Kafka-native Real-Time Locating / Tracking System (RTLS)

The following picture shows a multi-purpose Kafka-native RTLS for transportation and logistics:

The example shows three use cases of how produced events (“P”) are consumed and processed:

  • (“C1”) Real-time alerting on a single event: Monitor assets and people and send an alert to a controller, mobile app, or any other interface if an issue occurs.
  • (“C2”) Continuous real-time aggregation of multiple events: Correlate data while it is in motion. Calculate averages, enforce business rules, apply analytic models for predictions on new events, or run any other business logic (see the ksqlDB sketch after this list).
  • (“C3”) Batch analytics on all historical events: Take all historical data to find insights, e.g., for analyzing issues of the past, planning future location requirements, or training analytic models.
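
The first two patterns can be sketched in ksqlDB, reusing the hypothetical asset_positions stream from above; the speed threshold and window size are arbitrary values for illustration:

```sql
-- (“C1”) Real-time alerting on a single event:
-- emit an alert whenever an asset reports an unsafe speed.
CREATE STREAM speed_alerts AS
  SELECT asset_id, latitude, longitude, speed_kmh
  FROM asset_positions
  WHERE speed_kmh > 90          -- assumed threshold for illustration
  EMIT CHANGES;

-- (“C2”) Continuous real-time aggregation of multiple events:
-- average speed per asset over a 5-minute tumbling window.
CREATE TABLE avg_speed_per_asset AS
  SELECT asset_id,
         AVG(speed_kmh) AS avg_speed_kmh,
         COUNT(*) AS events_in_window
  FROM asset_positions
  WINDOW TUMBLING (SIZE 5 MINUTES)
  GROUP BY asset_id
  EMIT CHANGES;
```

For (“C3”), the same retained events can be consumed again from the beginning of the topic or offloaded to a data lake for batch analytics and model training.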

The Kafka-native RTLS can run in the data center, cloud, or closer to the edge, e.g., in a factory close to the shop floor and production lines.

Hybrid Kafka Architecture for Transportation and Logistics for RTLS and Track&Trace

One of the benefits of Apache Kafka is the freedom to deploy the infrastructure as needed. On one end, Kafka can be deployed as a single broker in a vehicle (like a truck or train). On the other end, a global Kafka infrastructure can span multiple cloud providers, regions, countries, or even continents and integrate with tens or hundreds of factories or other edge locations. The reality is often somewhere in the middle. Most enterprises start small and roll it out across locations and countries over time.

The following shows a pretty powerful hybrid architecture for a Kafka-native RTLS:

In the above scenario, the hybrid architecture includes:

  • A 5G infrastructure with public telco and private 5G Campus networks
  • Confluent Cloud as a fully managed event streaming platform in the cloud
  • Confluent Platform deployed at the edge in the 5G Campus leveraging AWS Wavelength
  • Real-time integration with assets and people at the edge and in the cloud
  • Real-time integration with enterprise applications such as APS, CRM, or ERP systems
  • Data correlation of edge and cloud data (replicated bi-directionally in real-time with tools such as Confluent’s Cluster Linking or Apache Kafka’s MirrorMaker 2)

This is obviously just one sample architecture. Again, you are totally free to design your own architecture with the components and technologies you need for your use cases.

An RTLS is heavily connected to the whole Supply Chain Management (SCM) process. As Kafka plays a key role in many supply chains, it is also a perfect fit for building real-time asset tracking.

Let’s now move over to two public use cases for location-based transportation and logistics with Kafka-native technologies.

Example: Bosch – Location-based Construction Site Management

The global supplier Bosch has built a track & trace application leveraging Apache Kafka and Confluent Cloud: construction site management that analyzes sensors, machines, and workers.

Use cases include collaborative planning, inventory and asset management, and tracking, managing, and locating tools and equipment anytime and anywhere.

The example is close to the hybrid architecture I showed in the last section. The solution spans multiple construction areas in various regions and integrates with the event streaming platform running in the cloud.

Let’s now take a look at another advanced use case for a real-time location service.

Location-Analytics and Geofencing with Kafka and ksqlDB

A geofence is a virtual perimeter for a real-world geographic area and is used for location analytics in real-time. A geofence can be dynamically generated, such as a radius around a point location, or it can be a predefined set of boundaries (such as school zones or neighborhood boundaries).

The use of a geofence is called geofencing. One example involves the location-aware device of a location-based service (LBS) user entering or exiting a geofence. This activity could trigger an alert to the device's user and a message to the geofence operator. Or, in the case of a factory, it could enforce distancing during the Covid-19 pandemic.

Guido Schmutz from Trivadis has done great work on this topic: “Location Analytics and Real-time Geofencing using Apache Kafka and KSQL”. It is actually quite simple to implement with ksqlDB:
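
The queries in the referenced article differ in detail, but a minimal geofencing sketch on the hypothetical asset_positions stream could look as follows; the geofence center (arbitrary coordinates here) and the 2 km radius are assumptions for illustration:

```sql
-- Sketch: flag assets that leave a circular geofence around a site.
-- The center coordinates and the 2 km radius are placeholder values.
CREATE STREAM geofence_violations AS
  SELECT asset_id,
         latitude,
         longitude,
         GEO_DISTANCE(latitude, longitude, 48.1351, 11.5820, 'KM') AS distance_km
  FROM asset_positions
  WHERE GEO_DISTANCE(latitude, longitude, 48.1351, 11.5820, 'KM') > 2.0
  EMIT CHANGES;
```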

These ksqlDB queries create continuous stream processing applications that analyze and correlate sensor data in motion in real-time. As ksqlDB is a Kafka-native technology, it is possible to process millions of events per second in a reliable, scalable, and secure way.

Example: Lyft – Real-Time Map-Matching to Provide Accurate Locations

The ride-sharing giant Lyft shared a great example of location analytics in real-time. Lyft implemented map-matching to track customers based on GPS information from its mobile app.

Lyft describes two main use cases for map-matching:

  1. At the end of a ride, to compute the distance traveled by a driver to calculate the fare.
  2. In real-time, to provide accurate locations to the ETA team and make dispatch decisions as well as to display the drivers’ cars on the rider app.

As the GPS signal is often weak, Lyft enhances and correlates the data with other data sets to get more accurate information. For instance, Lyft also uses location data from free public Wi-Fi hotspots close to the customer.

This is a great outdoor example of a modern, scalable RTLS. And once again, this example shows that the real added value of real-time data comes from data correlation. It does not help if you only use real-time messaging and then process the data in batch mode in a data lake.

Open, Scalable, Multi-Purpose, Real-Time RTLS based on Kafka is the New Black

A Real-Time Locating System (RTLS) enables identifying and tracking the location of objects or people in real-time. This is not a new problem. But the requirements have changed…

A postmodern RTLS provides an open architecture and high scalability. For this reason, more and more RTLS implementations rely on Apache Kafka as an open, scalable, and reliable event streaming platform.

Last but not least, if you wonder what the term “real-time” actually means in “RTLS” (no matter if Kafka-based or not), check out the article “Apache Kafka is NOT Hard Real-Time BUT Used Everywhere in Automotive and Industrial IoT” to understand what real-time really means.

What are your experiences with RTLS architectures and applications? Did you already use Apache Kafka? Which approach works best for you? What is your strategy? Let’s connect on LinkedIn and discuss it! Stay informed about new blog posts by subscribing to my newsletter.

Kai Waehner

builds cloud-native event streaming infrastructures for real-time data processing and analytics
