
Cloud Integration with Apache Camel and Amazon Web Services (AWS): S3, SQS and SNS

The integration framework Apache Camel already supports several important cloud services (see my overview article at https://www.kai-waehner.de/blog/2011/07/09/cloud-computing-heterogeneity-will-require-cloud-integration-apache-camel-is-already-prepared for more details). This article describes the combination of Apache Camel and the Amazon Web Services (AWS) interfaces of Simple Storage Service (S3), Simple Queue Service (SQS) and Simple Notification Service (SNS). Thus, the concept of Infrastructure as a Service (IaaS) is used to access messaging systems and data storage without having to set up or configure any infrastructure yourself.

Registration to AWS and Setup of Camel

First, you have to register with Amazon Web Services (for free). Most AWS services include a free monthly quota, which is absolutely sufficient to play around and develop some simple applications. As its name states, AWS uses technology-independent web services. Besides, APIs for several different programming languages are available to ease development. By the way, Camel uses the AWS SDK for Java (http://aws.amazon.com/sdkforjava), of course. The documentation is detailed and easy to understand, including tutorials, screenshots and code examples.

Hint 1:

You should read the introductions to S3, SQS and SNS (go to http://aws.amazon.com and click on "Products") and play around with the AWS Management Console (http://aws.amazon.com/console) before you continue. This step is very easy and takes less than one hour. Afterwards, you will have a much better understanding of AWS and of where Camel can help you!

Hint 2:

It really helps to look at the source code of the camel-aws component, as it shows you how Camel uses the AWS Java API internally. If you want to write tests, you can do it the same way. In the past, I was afraid of looking at the "complex" source code of open source frameworks. But there is no need to be scared! The camel-aws component (like most other Camel components) consists of only a few classes. Everything is easy to understand. Reading it helps you to understand Camel internals and the AWS API, and to spot and solve errors when exceptions occur in your code.

At the time of writing, the current Camel version 2.8 supports three AWS services: S3, SQS and SNS. All of them use similar concepts. Therefore, they are included in one single Camel component: "camel-aws". You have to add the libraries to your existing Camel project. As always, the simplest way is to use Maven and add the following dependency to the pom.xml:

<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-aws</artifactId>
    <version>${camel-version}</version>
</dependency>

Configuration of the Camel Endpoint

The implementation and configuration of all three services is very similar. The URI looks like this (the code shows the SQS service):

aws-sqs://queue-name[?options]

There are two alternatives to configure your endpoint.

Using Parameters

The easy way is to use two parameters in the URI of your endpoint: "accessKey" and "secretKey" (you receive both after your AWS registration).

aws-sqs://unique-queue-name?accessKey=INSERT_ME&secretKey=INSERT_ME

Be aware of the following problem, which can result in a strange, unhelpful exception (thanks to Brendan Long):

You'll need to URL encode any +'s in your secret key (otherwise, they'll be treated as spaces). + = %2B, so if your secret key was "my+secret\key", your Camel URL should have "secretKey=my%2Bsecret\key".

"Within the query string, the plus sign is reserved as shorthand notation for a space. Therefore, real plus signs must be encoded. This method was used to make query URIs easier to pass in systems which did not allow spaces."

Source: W3C URI Recommendations
<http://www.w3.org/Addressing/URL/4_URI_Recommentations.html#z5>
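The encoding rule above can also be applied programmatically. The following sketch (plain JDK, no Camel required; the key value is a made-up placeholder) shows how a secret key could be URL-encoded before it is appended to the endpoint URI:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class SecretKeyEncoder {

    // Encodes the secret key so that reserved characters such as '+'
    // survive the query string ('+' would otherwise be read as a space).
    public static String encode(String secretKey) {
        try {
            return URLEncoder.encode(secretKey, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException("UTF-8 is always supported", e);
        }
    }

    public static void main(String[] args) {
        // Hypothetical secret key containing a '+'
        String rawKey = "my+secretKey";
        String endpointUri = "aws-sqs://unique-queue-name?accessKey=INSERT_ME"
                + "&secretKey=" + encode(rawKey);
        // The '+' has become %2B in the query string
        System.out.println(endpointUri);
    }
}
```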

Adding a configured AmazonClient to the Registry

If you need to do more configuration (e.g. because your system is behind a firewall), you have to add an AmazonClient object to your registry. The following code shows an example using SQS, but SNS and S3 use exactly the same concept.

@Override
protected JndiRegistry createRegistry() throws Exception {
    JndiRegistry registry = super.createRegistry();
    AWSCredentials awsCredentials = new BasicAWSCredentials("INSERT_ME", "INSERT_ME");
    ClientConfiguration clientConfiguration = new ClientConfiguration();
    clientConfiguration.setProxyHost("myProxyHost");
    clientConfiguration.setProxyPort(8080);
    AmazonSQSClient client = new AmazonSQSClient(awsCredentials, clientConfiguration);
    registry.bind("amazonSQSClient", client);
    return registry;
}

This example overrides the createRegistry() method of a JUnit test (extending CamelTestSupport). You can also add this configuration to your runtime Camel application, of course.

Apache Camel and the Simple Storage Service (S3)

Simple Storage Service (S3) is a key-value-store. You can store small to very large data. The usage is very easy. You create buckets and put key-value data into these buckets. You can also create folders within buckets to organize your data. That’s it. You can monitor your buckets using the AWS Management Console – an intuitive GUI supporting most AWS services.
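To illustrate the bucket/key-value concept, here is a tiny, purely conceptual in-memory model (plain Java, no AWS involved). This is not the S3 API, just the idea of buckets holding key-value pairs:

```java
import java.util.HashMap;
import java.util.Map;

// A minimal conceptual model of S3: buckets are namespaces,
// each holding simple key-value pairs. NOT the AWS API,
// just an illustration of the storage model.
public class MiniS3 {

    private final Map<String, Map<String, byte[]>> buckets = new HashMap<>();

    public void createBucket(String bucketName) {
        buckets.putIfAbsent(bucketName, new HashMap<>());
    }

    public void put(String bucket, String key, byte[] value) {
        createBucket(bucket); // like AWS, create the bucket on demand
        buckets.get(bucket).put(key, value);
    }

    public byte[] get(String bucket, String key) {
        Map<String, byte[]> b = buckets.get(bucket);
        return b == null ? null : b.get(key);
    }
}
```

Folders are only a naming convention on top of this model: a key such as "reports/2011/july.csv" is displayed as a folder hierarchy by the Management Console.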

The following example shows both alternatives for accessing the Amazon services (as described above): parameters and the AmazonClient.

// Transfer data from your file inbox to the AWS S3 service
from("file:files/inbox")
    // This is the key of your key-value data
    .setHeader(S3Constants.KEY, simple("This is a static key"))
    // Using parameters for accessing the AWS service
    .to("aws-s3://camel-integration-bucket-mwea-kw?accessKey=INSERT_ME&secretKey=INSERT_ME&region=eu-west-1");

// Transfer data from the AWS S3 service to your file outbox
from("aws-s3://camel-integration-bucket-mwea-kw?amazonS3Client=#amazonS3Client&region=eu-west-1")
    .to("file:files/outbox");


There are some additional parameters; for instance, you can specify the desired AWS region or delete data after receiving it (see http://camel.apache.org/aws-s3.html and the corresponding SQS and SNS pages for more details about parameters and message headers).

As you can see in the code, you can use the AWS-S3 endpoint both for producing and for consuming messages. Each bucket name must be globally unique, thus you have to add some specific information, such as your company name, to it.
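DNS-compliant bucket names also have to follow some naming rules (to my knowledge: 3 to 63 characters, lowercase letters, digits, hyphens and dots, starting and ending with a letter or digit). A small sketch of such a check; this helper is my own, not part of camel-aws:

```java
public class BucketNames {

    // Checks common DNS-compliant S3 bucket naming rules:
    // 3-63 characters, lowercase letters, digits, hyphens and dots,
    // starting and ending with a letter or digit.
    public static boolean isValid(String name) {
        return name != null
                && name.matches("[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]");
    }

    public static void main(String[] args) {
        System.out.println(isValid("camel-integration-bucket-mwea-kw")); // true
        System.out.println(isValid("My_Bucket"));                        // false
    }
}
```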

Hint:

If a bucket does not exist, Camel creates it automatically (as the AWS API does). The same concept is used for SQS queues and SNS topics.

Apache Camel and the Simple Queue Service (SQS)

The Simple Queue Service (SQS) is similar to a JMS provider such as WebSphere MQ or ActiveMQ (but with some differences). You create queues and send messages to them. Consumers receive the messages. Contrary to most other AWS services, you cannot monitor queues directly in the AWS Management Console. You have to use the service "CloudWatch" (http://aws.amazon.com/cloudwatch) and start an EC2 instance to monitor queues and their content.

As you can see in the following code example, the syntax and concepts are almost the same as for the S3 service:

from("file:inbox")
    .to("aws-sqs://camel-integration-queue-mwea-kw?accessKey=INSERT_ME&secretKey=INSERT_ME");

from("aws-sqs://camel-integration-queue-mwea-kw?amazonSQSClient=#amazonSQSClient")
    .to("file:outbox?fileName=sqs-${date:now:yyyy.MM.dd-hh:mm:ss:SS}");


Again, you can use the AWS-SQS endpoint for producing and for consuming messages. Each queue name must be unique.

There are two important differences to JMS (quoted from the AWS documentation):

Q: How many times will I receive each message?

Amazon SQS is engineered to provide “at least once” delivery of all messages in its queues. Although most of the time each message will be delivered to your application exactly once, you should design your system so that processing a message more than once does not create any errors or inconsistencies.

Q: Why are there separate ReceiveMessage and DeleteMessage operations?

When Amazon SQS returns a message to you, that message stays in the queue, whether or not you actually received the message. You are responsible for deleting the message; the delete request acknowledges that you’re done processing the message. If you don’t delete the message, Amazon SQS will deliver it again on another receive request.
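Because of these "at least once" semantics, your consumer should be idempotent. A minimal, purely illustrative sketch (plain Java; in a real system the seen IDs would live in a persistent store, and Camel's Idempotent Consumer EIP does the same job):

```java
import java.util.HashSet;
import java.util.Set;

// Illustrates idempotent message processing for "at least once" delivery:
// a message that is delivered twice is only processed once.
public class IdempotentHandler {

    private final Set<String> processedIds = new HashSet<>();
    private int processedCount = 0;

    // Returns true if the message was processed, false if it was a duplicate.
    public boolean handle(String messageId, String body) {
        if (!processedIds.add(messageId)) {
            return false; // duplicate delivery, ignore it
        }
        processedCount++; // stand-in for the real business logic
        return true;
    }

    public int getProcessedCount() {
        return processedCount;
    }
}
```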

Apache Camel and the Simple Notification Service (SNS)

The Simple Notification Service (SNS) acts like JMS topics. You create a topic, consumers subscribe to the topic and then receive notifications. Several transport protocols are supported: HTTP(S), Email and SQS. Further interfaces will be added in the future, e.g. the Short Message Service (SMS) for mobile phones.
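Conceptually, a topic simply fans each published message out to all current subscribers. A tiny publish/subscribe sketch of that idea (plain Java, no AWS involved):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// A minimal topic: every published message is fanned out
// to all subscribers, like an SNS topic or a JMS topic.
public class MiniTopic {

    private final List<Consumer<String>> subscribers = new ArrayList<>();

    public void subscribe(Consumer<String> subscriber) {
        subscribers.add(subscriber);
    }

    public void publish(String message) {
        for (Consumer<String> subscriber : subscribers) {
            subscriber.accept(message); // e.g. deliver via HTTP, email or SQS
        }
    }
}
```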

Contrary to S3 and SQS, Camel only offers a producer endpoint for this AWS service. You can only create topics and send messages via Camel. The reason is simple: Camel already offers endpoints for consuming these messages, as HTTP, email and SQS components are already available.

There is one tradeoff: at the moment, a consumer cannot subscribe to topics using Camel. The AWS Management Console has to be used. A very interesting discussion can be found in the Camel JIRA issue regarding the following questions: Should Camel be able to subscribe to topics? Should the producer contain this feature, or should there be a consumer? In my opinion, there should be a consumer which is able to subscribe to topics; otherwise, Camel is missing a key part of the AWS SNS service! Please read the discussion and contribute your opinion: https://issues.apache.org/jira/browse/CAMEL-3476.

Apache Camel is Ready for the Cloud Computing Era

AWS offers many more services for the cloud. It probably does not make sense to integrate every one of them into Camel, but more AWS services will be supported in the future. For instance, SimpleDB and the Relational Database Service (RDS) are already planned and make sense, too: http://camel.apache.org/aws.html.

The conclusion is easy: Apache Camel is ready for the cloud computing era. Several important cloud services are already supported. Cloud integration will become very important in the future, and Camel is well on its way. Hopefully, we will see more cloud components soon.

I will continue to write articles about other Camel cloud components (and new AWS add-ons, of course). For instance, a component for the Platform as a Service (PaaS) product Google App Engine (GAE) is already available.

If you have any additional important information, questions or other feedback, please write a comment. Thank you in advance…

Best regards,

Kai Wähner (Twitter: @KaiWaehner)


Comments

  • My original contribution of camel-sns contained a consumer for easily configuring and consuming the data from the topic. This was not included in the release for what I would argue are still unknown reasons as opposed to the "simple" ones you've listed above. If you can figure out why it was not accepted then perhaps you could address them.

    The code for the SNSConsumer exists in the patch file attached to that issue. My suggestion is to get it working locally and see if it meets your needs. If you put together a couple of examples of it being used then perhaps the Camel team will accept it. Note that the integration tests are extensive so you should be able to put together something easily enough.

  • Hey Mark,

    I already read this in the discussion. I also think that it should be included in the release...

    Best regards,
    Kai Wähner (Twitter: @KaiWaehner)

  • Kai, yet another great blog post about Apache Camel.

    @Mark, I haven't been much involved with your contribution from CAMEL-3476, besides just some code reviews, to ensure the code is in line with Camel.

    I was not aware some feature was left out. I suggest to create a new JIRA ticket and ask for the piece to be included. For me it seems a good idea to have that consumer as well.

  • I think it's best for someone else to open an issue and lobby for the inclusion of this feature. I have nothing new to add in terms of arguments or code that isn't already attached to the issue. Feel free to use whatever you'd like in terms of code as I've already donated what's in that patch.

  • Hi I am trying to use Camel / Spring for AWS-S3. I can collect files but when I try to upload the Bucket is created but no data is transferred Example route. (Data Changed for Security)

    <route>
                  <from uri="activemq:queue:msg.out"/>
                       <setHeader headerName="S3Constants.KEY">
                             <constant>"FGTYUUIOYTTTTTDF1234567890"</constant>        
                          </setHeader>
                     <to  uri="aws-s3://pnt-phil-test.testcreate3?accessKey=DDDDDD3HYBZMVBZZZZZ&amp;secretKey=DDDDDDtBEIfQ45LVOvnzvYy3kxdgetjYUiZZZZZZ&amp;amazonS3Endpoint=s3.amazonaws.com&amp;region=eu-west-1"/>
    </route>

    SNS and SQS work perfectly as expected.

    Any assistance would be appreciated

    Phil 
