Systems Integration in the NoSQL Era with Apache Camel and Talend (MongoDB, Neo4j, HBase, AWS S3, Hazelcast, CouchDB)

In February 2013, I was at ApacheCon NA 2013 in Portland, Oregon, USA. It was a small but great conference. I met so many awesome Apache experts and learned a lot about several Apache projects.

Besides all of the Hadoop-related projects, I was especially interested in Apache Syncope, an open source system for managing digital identities in enterprise environments, and Apache Streams, a new Incubator project that aims to develop a scalable server for the publication, aggregation, filtering and re-exposure of enterprise social activities.

My session was named “Systems Integration in the NoSQL Era with Apache Camel”. I showed how to integrate several different NoSQL databases, such as MongoDB (document), Neo4j (graph), HBase (column), AWS S3 (key-value), and Hazelcast (in-memory), using Apache Camel with just a text editor and an IDE. Besides that, I showed some open source tooling on top of Camel: Talend ESB. With Talend, you can use a graphical user interface and all Camel code is generated; you just have to configure your routes.
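To give an idea of what such an integration looks like in plain Java, here is a minimal sketch of a Camel route that picks up JSON files from a folder and inserts them into MongoDB. It assumes the camel-core, camel-file and camel-mongodb dependencies plus the MongoDB Java driver are on the classpath; the registry name myMongo, the database and collection names, and the input folder are hypothetical, and exact endpoint options vary between Camel versions.

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class FileToMongoRoute {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();

        // Register the MongoDB client under the name referenced in the endpoint URI.
        MongoClient mongo = MongoClients.create("mongodb://localhost:27017");
        context.getRegistry().bind("myMongo", mongo);

        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // Poll JSON files from a local folder and insert each one as a document.
                from("file:data/inbox?noop=true")
                    .convertBodyTo(String.class)
                    .to("mongodb:myMongo?database=shop&collection=orders&operation=insert")
                    .log("Inserted document: ${body}");
            }
        });

        context.start();
        Thread.sleep(10_000); // let the route poll for a short while, then shut down
        context.stop();
    }
}
```

Swapping MongoDB for HBase, AWS S3 or Hazelcast is mostly a matter of changing the endpoint URI and pulling in the corresponding Camel component; the routing DSL itself stays the same, which is exactly the point of the talk.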

Here are the slides from my talk:

[Slides embedded from www.slideshare.net]

If you have any further questions, feel free to write a comment or contact me via Twitter, email or social networks (LinkedIn, Xing).


Best regards,

Kai Wähner (Twitter: @KaiWaehner)
