Building an Application with a RabbitMQ Acorn Service – Part 2

Oct 18, 2023 by Janakiram MSV
In the previous part, we built an Acorn Service that exposes an instance of RabbitMQ. In this final part, we will build two Acorns – a publisher that periodically publishes messages to a queue and a subscriber that reads from the same queue.

Ensure the Acorn Service is running before proceeding to the next steps.

Step 1 – Create an Acorn Job acting as a Publisher

Create a directory called pub to store all the relevant files for the publisher.

Under pub/src, create the Python script app.py below, which publishes to a queue called hello:

```python
import pika, os

# Access the CLOUDAMQP_URL environment variable and parse it
url = os.environ.get('CLOUDAMQP_URL')
params = pika.URLParameters(url)

connection = pika.BlockingConnection(params)
channel = connection.channel()          # start a channel
channel.queue_declare(queue='hello')    # declare a queue

channel.basic_publish(exchange='', routing_key='hello', body='Hello CloudAMQP!')
print(" [x] Sent 'Hello CloudAMQP!'")
connection.close()
```

Notice that the script receives the URL with the RabbitMQ endpoint and credentials through the environment variable.
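To see what that single URL encodes, here is a standalone sketch using Python's standard urllib.parse, which breaks the URL down the same way pika.URLParameters does. The URL value below is made up for illustration; the real value is whatever the Acorn Service injects via CLOUDAMQP_URL:

```python
from urllib.parse import urlsplit, unquote

# A made-up example of the AMQP URL shape the service injects;
# in the tutorial the real value arrives via CLOUDAMQP_URL.
url = "amqp://myuser:s3cret@moose.rmq.cloudamqp.com/myvhost"

parts = urlsplit(url)
print(parts.scheme)             # amqp
print(parts.username)           # myuser
print(parts.password)           # s3cret
print(parts.hostname)           # moose.rmq.cloudamqp.com
print(unquote(parts.path[1:]))  # myvhost (the virtual host)
```

A single connection string like this carries the endpoint, the credentials, and the virtual host, which is why the scripts need nothing else to reach the broker.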

Now, create the Dockerfile for the Python script.

```dockerfile
FROM python:alpine3.9
RUN pip install pika==1.1.0
COPY ./src /src
ENTRYPOINT ["python", "-u", "/src/app.py"]
```

It’s time to define the Acorn that runs this container as a Job. It will be scheduled to publish a message every minute.

```
services: "rabbitmq-cloudamqp-server": {
	external: "rabbitmq-01"
}

jobs: {
	"rabbitmq-pub": {
		build: context: "."
		env: {
			CLOUDAMQP_URL: "@{service.rabbitmq-cloudamqp-server.data.url}"
		}
		schedule: "* * * * *"
	}
}
```

In the services section of the Acornfile, we reference the Acorn Service that is already running. We then define the job and pass the CloudAMQP URL exposed by the service to the Python script as the CLOUDAMQP_URL environment variable. The schedule field uses standard cron syntax; "* * * * *" runs the job every minute.

Run the Acornfile to kick off the job.

```shell
acorn run -n publisher .
```

You can see that the job is publishing the messages from the logs:

```shell
acorn logs -f publisher
```

[Screenshot: publisher job logs (PubSub-01.png)]

While this is running, let’s create the subscriber.

Step 2 – Create an Acorn acting as a Subscriber

Create a directory called sub to store all the relevant files for the subscriber.

Under sub/src, create the Python script app.py below, which subscribes to the same hello queue:

```python
import pika, os

# Access the CLOUDAMQP_URL environment variable and parse it
url = os.environ.get('CLOUDAMQP_URL')
params = pika.URLParameters(url)

connection = pika.BlockingConnection(params)
channel = connection.channel()          # start a channel
channel.queue_declare(queue='hello')    # declare a queue

def callback(ch, method, properties, body):
    print(" [x] Received " + str(body))

channel.basic_consume('hello', callback, auto_ack=True)
print(' [*] Waiting for messages:')
channel.start_consuming()
```
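One thing to note about this script: with auto_ack=True, a message is removed from the queue as soon as it is delivered, even if the callback then fails. A common alternative is manual acknowledgment, where the callback acks only after processing succeeds. The sketch below illustrates that pattern; FakeChannel and FakeMethod are hypothetical stand-ins for the pika objects, so the logic can be exercised without a broker:

```python
processed = []

def callback(ch, method, properties, body):
    # Process first, acknowledge after, so a crash mid-processing
    # leaves the message on the queue for redelivery.
    processed.append(body.decode())
    ch.basic_ack(delivery_tag=method.delivery_tag)

# Hypothetical stand-ins mimicking the objects pika passes to the callback.
class FakeMethod:
    delivery_tag = 1

class FakeChannel:
    def __init__(self):
        self.acked = []
    def basic_ack(self, delivery_tag):
        self.acked.append(delivery_tag)

ch = FakeChannel()
callback(ch, FakeMethod(), None, b"Hello CloudAMQP!")
print(processed)  # ['Hello CloudAMQP!']
print(ch.acked)   # [1]
```

With a real channel, you would register this callback via channel.basic_consume('hello', callback), leaving auto_ack at its default of False.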

Let’s create the Dockerfile and Acornfile to run the subscriber. The Dockerfile is identical to the publisher’s:

```dockerfile
FROM python:alpine3.9
RUN pip install pika==1.1.0
COPY ./src /src
ENTRYPOINT ["python", "-u", "/src/app.py"]
```

The Acornfile defines the subscriber as a long-running container rather than a scheduled job:

```
services: "rabbitmq-cloudamqp-server": {
	external: "rabbitmq-01"
}

containers: {
	"rabbitmq-sub": {
		build: context: "."
		env: {
			CLOUDAMQP_URL: "@{service.rabbitmq-cloudamqp-server.data.url}"
		}
	}
}
```

Let’s run the subscriber and watch the logs to confirm that it receives the messages.

```shell
acorn run -n subscriber .
acorn logs -f subscriber
```

[Screenshot: subscriber logs (PubSub-02.png)]

We can see the messages flowing from the publisher to the subscriber via the RabbitMQ instance running in the cloud.

This tutorial walked you through all the steps involved in publishing Acorn Services and consuming them from multiple Acorns. If you’d like to learn more about getting started with Acorn, view our getting started workshop.