This is the sixth part of Building Microservice Applications with Azure Container Apps and Dapr. The topics we’ll cover are:
- Tutorial for building Microservice Applications with Azure Container Apps and Dapr – Part 1
- Deploy backend API Microservice to Azure Container Apps – Part 2
- Communication between Microservices in Azure Container Apps – Part 3
- Dapr Integration with Azure Container Apps – Part 4
- Azure Container Apps State Store With Dapr State Management API – Part 5
- Azure Container Apps Async Communication with Dapr Pub/Sub API – (This Post)
- Azure Container Apps with Dapr Bindings Building Block – Part 7
- Azure Container Apps Monitoring and Observability with Application Insights – Part 8
- Continuous Deployment for Azure Container Apps using GitHub Actions – Part 9
- Use Bicep to Deploy Dapr Microservices Apps to Azure Container Apps – Part 10
- Azure Container Apps Auto Scaling with KEDA – Part 11
- Azure Container Apps Volume Mounts using Azure Files – Part 12
- Integrate Health probes in Azure Container Apps – Part 13
Azure Container Apps Async Communication with Dapr Pub/Sub API
In this post, we will introduce a new background service named “ACA-Processor Backend”, following our architecture diagram. This new service will be responsible for sending notification emails to task owners to notify them that a new task has been assigned to them. We could do this in the Backend API and send the email right after saving the task, but we want to offload this process to another service and keep the Backend API responsible for managing tasks data only.
To do this the right way, we need to decouple the two services from each other, which means we are going to rely on the Publisher-Subscriber (Pub/Sub) pattern.
The main advantage of this pattern is that it offers loose coupling between services: the sender/publisher of the message doesn’t know anything about the receivers/consumers, and you can even have multiple consumers consuming a copy of the message in totally different ways. Think of adding another consumer responsible for sending a push notification to the task owner (if we had a mobile app channel).
The publisher/subscriber pattern relies on a message broker which is responsible for receiving the message from the publisher, storing it to ensure durability, and delivering it to the interested consumer(s) to process. Consumers don’t need to be available when the message is stored in the broker; they can process the messages at a later time, once they are available. The diagram below gives a high-level overview of how the pub/sub pattern works.
The source code for this tutorial is available on GitHub. You can check the demo application too.
Overview of Dapr Pub/Sub API
If you have implemented the Pub/Sub pattern before, you have already noticed that a lot of plumbing is needed on the publisher and subscriber components in order to publish and consume messages. In addition, each message broker has its own SDK and implementation, so you need to write your code in an abstracted way that hides the broker-specific implementation details and makes it easier for publishers and consumers to re-use. What Dapr offers here is a building block that significantly simplifies implementing pub/sub functionality.
In very simple words, the Dapr pub/sub building block provides a platform-agnostic API framework to send and receive messages. Your producer/publisher services publish messages to a named topic. Your consumer services subscribe to a topic to consume messages.
To try this out, we can directly invoke the Pub/Sub API and publish a message to Redis locally. If you remember from this post, when we initialized Dapr in our local development environment, it installed a Redis container instance locally, so we can use Redis to publish and subscribe to messages. If you navigate to the path “<UserProfile>\.dapr\components” you will find a file named “pubsub.yaml”, which contains the properties needed to access the local Redis instance. The pub/sub broker component file structure can be found at this link.
I want to have more control and provide my own component name, so let’s create a pub/sub component file in our components folder. Go ahead and add a new file named “dapr-pubsub-redis.yaml” under the folder “components” with the content below:
```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: taskspubsub
spec:
  type: pubsub.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
```
To try out the Pub/Sub API, run the Backend API from VS Code by running the command below, or by using the Run and Debug tasks we created in the previous posts. Don’t forget to include the property “--components-path” if you are using dapr run from PowerShell.
```shell
dapr run --app-id tasksmanager-backend-api --app-port 7088 --dapr-http-port 3500 --app-ssl --components-path "../components" dotnet run
```
Now let’s try to publish a message by sending a POST request to http://localhost:3500/v1.0/publish/taskspubsub/tasksavedtopic with the request body below. Don’t forget to set the “Content-Type” header to “application/json”.
```json
{
    "taskId": "fbc55b2c-d9fa-405e-aec8-22e53f4306dd",
    "taskName": "Testing Pub Sub Publisher",
    "taskCreatedBy": "tjoudeh@bitoftech.net",
    "taskCreatedOn": "2022-08-12T00:24:37.7361348Z",
    "taskDueDate": "2022-08-25T00:00:00",
    "taskAssignedTo": "Taiseer@mail.com"
}
```
Looking at the endpoint, we can break it down as follows:
- The value 3500: the Dapr HTTP listening port; it is the port number upon which the Dapr sidecar is listening.
- The value taskspubsub: the name of the selected Dapr pub/sub component.
- The value tasksavedtopic: the name of the topic to which the message is published.
If all is configured correctly, you should receive an HTTP 204 response from this endpoint, which indicates that the message was published successfully by the message broker (Redis) into the topic named “tasksavedtopic”. You can check that the topic was created successfully by using the Redis Xplorer extension in VS Code.
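If you prefer the command line, a rough equivalent sketch is below. It assumes the Dapr sidecar started earlier is listening on port 3500, and that the local Redis container created by “dapr init” has the default name “dapr_redis” — adjust both if your setup differs:

```shell
# Publish a task message to the "tasksavedtopic" topic via the local Dapr sidecar
curl -i -X POST "http://localhost:3500/v1.0/publish/taskspubsub/tasksavedtopic" \
  -H "Content-Type: application/json" \
  -d '{
        "taskId": "fbc55b2c-d9fa-405e-aec8-22e53f4306dd",
        "taskName": "Testing Pub Sub Publisher",
        "taskAssignedTo": "Taiseer@mail.com"
      }'

# The Dapr Redis pub/sub component stores each topic as a Redis Stream, so we can
# verify the message landed by checking the stream length inside the container:
docker exec dapr_redis redis-cli XLEN tasksavedtopic
```

A successful publish returns HTTP 204 with an empty body, and the stream length increases by one for each message published.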
Note: Some message brokers allow topics to be created automatically when a message is sent to a topic that has not been created before; that is why the topic “tasksavedtopic” was created automatically.
Right now those published messages are sitting in the message broker topic doing nothing, as we don’t have any subscribers bound to the broker on the topic “tasksavedtopic” that are interested in consuming and processing them. So let’s add a consumer to consume the messages.
Setting up the Backend Background Processor Project
Step 1: Create a Web API Project
Now we will add a new ASP.NET Core Web API project named “TasksTracker.Processor.Backend.Svc”. The configuration will be as in the image below: check “Enable Docker” as we are going to containerize the application and deploy it to ACR, and make sure “Linux” is selected for the Docker OS setting.
Step 2: Add Models (DTO)
Now we will add the DTO which will be used to deserialize the published message. Add a new file named “TaskModel.cs” under a new folder named “Models” and paste in the code below:
```csharp
namespace TasksTracker.Processor.Backend.Svc.Models
{
    public class TaskModel
    {
        public Guid TaskId { get; set; }
        public string TaskName { get; set; } = string.Empty;
        public string TaskCreatedBy { get; set; } = string.Empty;
        public DateTime TaskCreatedOn { get; set; }
        public DateTime TaskDueDate { get; set; }
        public string TaskAssignedTo { get; set; } = string.Empty;
        public bool IsCompleted { get; set; }
        public bool IsOverDue { get; set; }
    }
}
```
Step 3: Install Dapr Client NuGet package
Now we will install the Dapr SDK for .NET Core in the Backend Background Service, so we can subscribe to the message broker topic programmatically. To do so, open the project file “TasksTracker.Processor.Backend.Svc.csproj” and add the NuGet package reference below:
```xml
<ItemGroup>
  <PackageReference Include="Dapr.AspNetCore" Version="1.8.0" />
</ItemGroup>
```
Step 4: Create an API endpoint for the consumer to subscribe to the topic
Now we will add an endpoint that is responsible for subscribing to the topic we are interested in on the message broker. This endpoint will start receiving the messages published by the Backend API producer. Add a new controller named “TasksNotifierController.cs” under the “Controllers” folder and use the code below:
```csharp
using Dapr.Client;
using Microsoft.AspNetCore.Mvc;
using TasksTracker.Processor.Backend.Svc.Models;

namespace TasksTracker.Processor.Backend.Svc.Controllers
{
    [Route("api/tasksnotifier")]
    [ApiController]
    public class TasksNotifierController : ControllerBase
    {
        private readonly IConfiguration _config;
        private readonly ILogger _logger;

        public TasksNotifierController(IConfiguration config, ILogger<TasksNotifierController> logger)
        {
            _config = config;
            _logger = logger;
        }

        public IActionResult Get()
        {
            return Ok();
        }

        [Dapr.Topic("taskspubsub", "tasksavedtopic")]
        [HttpPost("tasksaved")]
        public async Task<IActionResult> TaskSaved([FromBody] TaskModel taskModel)
        {
            _logger.LogInformation("Started processing message with Task Name '{0}'", taskModel.TaskName);

            // Do the actual sending of emails here; return 200 OK to consume the message
            return Ok();

            // In case we need to return the message back to the topic, return HTTP 400
            //return BadRequest();
        }
    }
}
```
What we have implemented here is the following:
- We have added an action method named “TaskSaved” which can be accessed on the route “api/tasksnotifier/tasksaved”.
- We have decorated this action method with the attribute “Dapr.Topic”, which accepts the name of the Dapr pub/sub component to target as its first argument, and the topic to subscribe to as its second argument, in our case “tasksavedtopic”.
- The action method expects to receive a “TaskModel” object.
- Once a message is received by this endpoint, we can start our business logic to trigger sending an email (more about this later) and then return a 200 OK response to indicate that the consumer processed the message successfully, so the broker can delete it.
- If anything goes wrong while sending the email (i.e. the email service is not responding) and we want to retry processing the message at a later time, we return a 400 Bad Request, which informs the message broker that the message needs to be retried based on the broker’s configuration.
- If we need to drop the message because we know it will not be processed even after retries (i.e. the recipient email address is not formatted correctly), we return a 404 Not Found response, which tells the message broker to drop the message and move it to the dead-letter or poison queue.
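As a side note (this comes from the Dapr pub/sub specification and is not used in this tutorial), an HTTP subscriber can also signal the outcome explicitly by returning a 200 response with a small JSON body, instead of relying solely on status codes:

```json
{ "status": "RETRY" }
```

The accepted values are "SUCCESS" (consume the message), "RETRY" (redeliver it later), and "DROP" (log a warning and discard it).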
You may now ask yourself how the consumer was able to identify which subscriptions are available and on which routes they can be found. The answer is that at startup of the consumer service, the Dapr runtime calls the application on a well-known endpoint to identify and create the required subscriptions. The well-known endpoint can be reached at:
http://localhost:<appPort>/dapr/subscribe
When you invoke this endpoint, the response will contain an array of all the topics the application will subscribe to. Each entry includes a route to call when the topic receives a message; this was generated because we used the attribute “Dapr.Topic” on the action method. That means when a message is published via the pub/sub component “taskspubsub” on the topic “tasksavedtopic”, it will be routed to the action method “/api/tasksnotifier/tasksaved” and consumed there.
In our case, a sample response will be as below:
```json
[
  {
    "pubsubname": "taskspubsub",
    "topic": "tasksavedtopic",
    "route": "/api/tasksnotifier/tasksaved"
  }
]
```
At this link, you can find a detailed diagram of how consumers discover and subscribe to those endpoints.
Step 5: Register Dapr and Subscribe Handler at the Consumer startup
Open the file “Program.cs” in the project “TasksTracker.Processor.Backend.Svc” and paste in the code below:
```csharp
namespace TasksTracker.Processor.Backend.Svc
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var builder = WebApplication.CreateBuilder(args);

            // Add services to the container.
            builder.Services.AddControllers().AddDapr();

            var app = builder.Build();

            app.UseHttpsRedirection();
            app.UseAuthorization();
            app.UseCloudEvents();

            app.MapControllers();
            app.MapSubscribeHandler();

            app.Run();
        }
    }
}
```
What we’ve done is the following:
- The extension method “AddDapr” registers the necessary services to integrate Dapr into the MVC pipeline. It also registers a “DaprClient” instance in the dependency injection container, which can then be injected anywhere into your service. We will see how we inject DaprClient into the controller constructor later on.
- The extension method “UseCloudEvents” adds the CloudEvents middleware to the ASP.NET Core middleware pipeline. This middleware unwraps requests that use the CloudEvents structured format, so the receiving method can read the event payload directly. You can read more about CloudEvents here; it is a spec for describing event data in a common and standard way.
- In order to make the endpoint http://localhost:<appPort>/dapr/subscribe available so the consumer can respond with its available subscriptions, we call “MapSubscribeHandler”. When this endpoint is called, it will automatically find all Web API action methods decorated with the “Dapr.Topic” attribute and instruct Dapr to create subscriptions for them.
With all those bits in place, we are ready to run the publisher service (“Backend API”) and the consumer service (“Backend Background Service”) and test the pub/sub pattern end to end. To do so, run the commands below in a PowerShell console, ensuring you are in the right root folder for each:
```shell
~\TasksTracker.ContainerApps\TasksTracker.TasksManager.Backend.Api> dapr run --app-id tasksmanager-backend-api --app-port 7088 --dapr-http-port 3500 --app-ssl --components-path "../components" dotnet run

~\TasksTracker.ContainerApps\TasksTracker.Processor.Backend.Svc> dapr run --app-id tasksmanager-backend-processor --app-port 7263 --dapr-http-port 3502 --app-ssl --components-path "../components" dotnet run
```
Notice that we gave the new Backend background service a Dapr App Id with the name “tasksmanager-backend-processor” and a Dapr HTTP port with the value “3502”.
Let’s try to publish a message now from the Backend API, similar to what we did earlier, by sending this POST request:
```http
POST /v1.0/publish/taskspubsub/tasksavedtopic HTTP/1.1
Host: localhost:3500
Content-Type: application/json
Content-Length: 293

{
    "taskId": "fbc55b2c-d9fa-405e-aec8-22e53f4306dd",
    "taskName": "Testing Pub Sub Publisher 5",
    "taskCreatedBy": "tjoudeh@bitoftech.net",
    "taskCreatedOn": "2022-08-12T00:24:37.7361348Z",
    "taskDueDate": "2022-08-25T00:00:00",
    "taskAssignedTo": "Taiseer@mail.com"
}
```
Keep an eye on the terminal logs of the backend background processor; you will see that the message is received and consumed by the action method “api/tasksnotifier/tasksaved”, and an information message is logged in the terminal to indicate the processing of the message. It should be similar to the image below:
Hint: You can use the VS Code Dapr extension, which we installed in this post, to publish the message directly. It will look similar to the image below:
Step 6: Update VS Code tasks and launch configuration files
In order to run the 3 services together and debug them in VS Code, we need to update the tasks.json and launch.json files to include the new service we added, exactly as we did in this post. You can use this file to update tasks.json and this file to update launch.json.
Use the Dapr .NET Client SDK to publish messages
Step 1: Update Backend API to publish a message when a task is saved
Now we need to update our Backend API to publish a message to the message broker when a task is saved (a new task is added, or an existing task’s assignee changes on update).
To do this, open the file named “TasksStoreManager.cs” under the project “TasksTracker.TasksManager.Backend.Api” and update it as follows:
```csharp
//Add new private method
private async Task PublishTaskSavedEvent(TaskModel taskModel)
{
    _logger.LogInformation("Publish Task Saved event for task with Id: '{0}' and Name: '{1}' for Assignee: '{2}'",
                            taskModel.TaskId, taskModel.TaskName, taskModel.TaskAssignedTo);
    await _daprClient.PublishEventAsync("dapr-pubsub-servicebus", "tasksavedtopic", taskModel);
}

//Update the below method:
public async Task<Guid> CreateNewTask(string taskName, string createdBy, string assignedTo, DateTime dueDate)
{
    var taskModel = new TaskModel()
    {
        TaskId = Guid.NewGuid(),
        TaskName = taskName,
        TaskCreatedBy = createdBy,
        TaskCreatedOn = DateTime.UtcNow,
        TaskDueDate = dueDate,
        TaskAssignedTo = assignedTo,
    };

    _logger.LogInformation("Save a new task with name: '{0}' to state store", taskModel.TaskName);
    await _daprClient.SaveStateAsync<TaskModel>(STORE_NAME, taskModel.TaskId.ToString(), taskModel);
    await PublishTaskSavedEvent(taskModel);
    return taskModel.TaskId;
}

//Update the below method:
public async Task<bool> UpdateTask(Guid taskId, string taskName, string assignedTo, DateTime dueDate)
{
    _logger.LogInformation("Update task with Id: '{0}'", taskId);
    var taskModel = await _daprClient.GetStateAsync<TaskModel>(STORE_NAME, taskId.ToString());

    if (taskModel != null)
    {
        // Capture the current assignee before overwriting it, so we can detect
        // an assignee change and publish an event only in that case.
        var currentAssignee = taskModel.TaskAssignedTo;

        taskModel.TaskName = taskName;
        taskModel.TaskAssignedTo = assignedTo;
        taskModel.TaskDueDate = dueDate;
        await _daprClient.SaveStateAsync<TaskModel>(STORE_NAME, taskModel.TaskId.ToString(), taskModel);

        if (!taskModel.TaskAssignedTo.Equals(currentAssignee, StringComparison.OrdinalIgnoreCase))
        {
            await PublishTaskSavedEvent(taskModel);
        }
        return true;
    }
    return false;
}
```
Notice the new private method “PublishTaskSavedEvent” added to the class. All we have to do is call “PublishEventAsync” on the DaprClient and pass it the pub/sub component name; in our case I named it “dapr-pubsub-servicebus”, as we are going to use Azure Service Bus as the message broker in the next steps (feel free to use any name you want). The second parameter, “tasksavedtopic”, is the name of the topic the publisher sends the task model to. That’s it for publishing: no extra changes are needed to start publishing async messages from the Backend API.
Step 2: Update Backend Background Processor to consume messages and send actual email using SendGrid
Open the controller named “TasksNotifierController.cs” under the Backend Processor project and update it as in the code below:
```csharp
using Dapr.Client;
using Microsoft.AspNetCore.Mvc;
using SendGrid;
using SendGrid.Helpers.Mail;
using TasksTracker.Processor.Backend.Svc.Models;

namespace TasksTracker.Processor.Backend.Svc.Controllers
{
    [Route("api/tasksnotifier")]
    [ApiController]
    public class TasksNotifierController : ControllerBase
    {
        private readonly IConfiguration _config;
        private readonly ILogger _logger;
        private readonly DaprClient _daprClient;

        public TasksNotifierController(IConfiguration config, ILogger<TasksNotifierController> logger, DaprClient daprClient)
        {
            _config = config;
            _logger = logger;
            _daprClient = daprClient;
        }

        [Dapr.Topic("dapr-pubsub-servicebus", "tasksavedtopic")]
        [HttpPost("tasksaved")]
        public async Task<IActionResult> TaskSaved([FromBody] TaskModel taskModel)
        {
            _logger.LogInformation("Started processing message with Task Name '{0}'", taskModel.TaskName);

            var sendGridResponse = await SendEmail(taskModel);

            if (sendGridResponse.Item1)
            {
                return Ok($"SendGrid response status code: {sendGridResponse.Item2}");
            }
            return BadRequest($"Failed to send email, SendGrid response status code: {sendGridResponse.Item2}");
        }

        private async Task<Tuple<bool, string>> SendEmail(TaskModel taskModel)
        {
            var apiKey = _config.GetValue<string>("SendGrid:ApiKey");
            var sendEmailResponse = true;
            var sendEmailStatusCode = System.Net.HttpStatusCode.Accepted;
            var client = new SendGridClient(apiKey);

            var from = new EmailAddress("taiseer.joudeh@gmail.com", "Tasks Tracker Notification");
            var subject = $"Task '{taskModel.TaskName}' is assigned to you!";
            var to = new EmailAddress(taskModel.TaskAssignedTo, taskModel.TaskAssignedTo);
            var plainTextContent = $"Task '{taskModel.TaskName}' is assigned to you. Task should be completed by the end of: {taskModel.TaskDueDate.ToString("dd/MM/yyyy")}";
            var htmlContent = plainTextContent;
            var msg = MailHelper.CreateSingleEmail(from, to, subject, plainTextContent, htmlContent);

            var response = await client.SendEmailAsync(msg);
            sendEmailResponse = response.IsSuccessStatusCode;
            sendEmailStatusCode = response.StatusCode;

            return new Tuple<bool, string>(sendEmailResponse, sendEmailStatusCode.ToString());
        }
    }
}
```
What we’ve done in the code above is the following:
- We’ve updated the attribute “Dapr.Topic” to use the same pub/sub component name used in the publisher, “dapr-pubsub-servicebus”. We then added a new method responsible for processing the received message: it takes the assignee’s email and tries to send an email using the SendGrid API.
- We return 200 OK if SendGrid was able to send the email successfully, and 400 Bad Request if it failed, which allows the consumer service to retry processing the message.
- Don’t forget to add the NuGet package named “SendGrid” (version “9.28.1”) to the backend processor project. We also read the “SendGrid:ApiKey” value from AppSettings; we will read this value from environment variables once we deploy the service to Azure Container Apps. Open the file “appsettings.json” and add the config below:
```json
{
  "SendGrid": {
    "ApiKey": "",
    "IntegrationEnabled": false
  }
}
```
Use Azure Service Bus as a Message Broker for the Dapr Pub/Sub API
Now we will switch our implementation to use Azure Service Bus as the message broker. Redis worked perfectly for local development and testing, but we need to prepare ourselves for cloud deployment. To do so, we need to create an Azure Service Bus namespace, and then a topic.
Step 1: Create Azure Service Bus Namespace and a Topic
You can do this from the Azure Portal, or use the PowerShell commands below to create the services. I will assume you are using the same PowerShell session from the previous posts, so the variables still hold the right values. You will need to change the namespace name, as this one is already used by me.
```shell
$NamespaceName="taskstracker"
$TopicName="tasksavedtopic"

##Create service bus namespace
az servicebus namespace create --resource-group $RESOURCE_GROUP --name $NamespaceName --location eastus

##Create a topic under the namespace
az servicebus topic create --resource-group $RESOURCE_GROUP --namespace-name $NamespaceName --name $TopicName

##List the connection string
az servicebus namespace authorization-rule keys list --resource-group $RESOURCE_GROUP --namespace-name $NamespaceName --name RootManageSharedAccessKey --query primaryConnectionString --output tsv
```
Step 2: Create a Dapr Pub/Sub API Component file for the Azure Container Apps deployment
Go ahead and add a new file named “containerapps-pubsub-svcbus.yaml” under the folder “aca-components” and paste in the code below:
```yaml
# pubsub.yaml for Azure Service Bus component
componentType: pubsub.azure.servicebus
version: v1
metadata:
- name: connectionString
  secretRef: sb-root-connectionstring
secrets:
- name: sb-root-connectionstring
  value: "<value here>"
# Application scopes
scopes:
- tasksmanager-backend-api
- tasksmanager-backend-processor
```
Notice that we are not setting the Azure Service Bus connection string here, as that would not be secure. Instead, we are using “secretRef”, which allows us to set the actual “sb-root-connectionstring” value after we deploy this component to the Azure Container Apps environment. This way the key will not be stored in source control by mistake.
We’ve also used the “scopes” property to limit access to only those 2 applications (the Frontend app is not concerned with the pub/sub pattern). More on scopes at this link.
Deploy the Backend Background Processor and the Backend API Projects to Azure Container Apps
Step 1: Build the Backend Background Processor and the Backend API App images and push them to ACR
As we have done previously, we need to build and push both app images to ACR so they are ready to be deployed to Azure Container Apps. Continue using the same PowerShell console and paste in the code below (make sure you are in the directory “TasksTracker.ContainerApps”):
```shell
$BACKEND_SVC_NAME="tasksmanager-backend-processor"

az acr build --registry $ACR_NAME --image "tasksmanager/$BACKEND_API_NAME" --file 'TasksTracker.TasksManager.Backend.Api/Dockerfile' .

az acr build --registry $ACR_NAME --image "tasksmanager/$BACKEND_SVC_NAME" --file 'TasksTracker.Processor.Backend.Svc/Dockerfile' .
```
Step 2: Add Azure Service Bus Dapr Pub/Sub Component to Azure Container Apps Environment
We need to run the command below to add the yaml file “.\aca-components\containerapps-pubsub-svcbus.yaml” to the Azure Container Apps environment. To do so, run the following PowerShell command:
```shell
az containerapp env dapr-component set `
  --name $ENVIRONMENT --resource-group $RESOURCE_GROUP `
  --dapr-component-name dapr-pubsub-servicebus `
  --yaml '.\aca-components\containerapps-pubsub-svcbus.yaml'
```
Notice that we set the component name to “dapr-pubsub-servicebus” when we added it to the Container Apps environment.
Once the command completes, from the Azure Portal navigate to your Container Apps environment, select “Dapr Components”, then click on the “dapr-pubsub-servicebus” component. Provide your Azure Service Bus connection string in the Secrets text box for the secret “sb-root-connectionstring” and click the “Edit” button. It will be similar to the image below:
Step 3: Create a new Azure Container App to host the new Backend Background Processor
Now we need to create a new Azure Container App. This new container app needs the following capabilities in place:
- Ingress should be disabled (no access via HTTP at all; this is a background processor responsible for processing published messages).
- Dapr needs to be enabled.
- Setting the value of SendGrid API in the secrets store and referencing it in the environment variables.
To achieve the above, run the PowerShell script below, and notice how we omitted the ingress property entirely, which disables ingress for this container app:
```shell
az containerapp create `
  --name "$BACKEND_SVC_NAME" `
  --resource-group $RESOURCE_GROUP `
  --environment $ENVIRONMENT `
  --image "$ACR_NAME.azurecr.io/tasksmanager/$BACKEND_SVC_NAME" `
  --registry-server "$ACR_NAME.azurecr.io" `
  --min-replicas 1 `
  --max-replicas 5 `
  --cpu 0.25 --memory 0.5Gi `
  --enable-dapr `
  --dapr-app-id $BACKEND_SVC_NAME `
  --dapr-app-port 80 `
  --secrets "sendgrid-apikey=Replace with your SendGrid API Key" `
  --env-vars "SendGrid__ApiKey=secretref:sendgrid-apikey" `
  --query configuration.ingress.fqdn
```
Step 4: Deploy new revisions of the Backend API to Azure Container Apps
As we’ve done multiple times before, we need to update the Azure Container App hosting the Backend API with a new revision, so that our code changes for publishing messages after a task is saved are available to users. To do so, run the PowerShell script below:
```shell
## Update Backend API App container app and create a new revision
az containerapp update `
  --name $BACKEND_API_NAME `
  --resource-group $RESOURCE_GROUP `
  --revision-suffix v20220829-1 `
  --cpu 0.25 --memory 0.5Gi `
  --min-replicas 1 `
  --max-replicas 2
```
With this in place, you should be able to test the 3 services end to end, and the task assignee should receive a notification email similar to the one below:
In the next post, we will cover how we are going to use Dapr bindings with Azure Container Apps. Stay tuned, and happy coding!
Hello Taiseer,
great series, really what I was looking for!
There is one thing I’m not really sure about regarding container apps and Dapr Pub/Sub: Is it required that publisher and consumer(s) are in the same environment?
Thank you!
Hello Sven, thanks for your comment.
No, there is no need for the publisher and subscriber to be in the same ACA environment. The subscriber service only watches for messages on a certain topic in order to consume them; it doesn’t care where the messages come from (where they were produced).
You can check my latest post here and see how I used Service Bus Explorer to publish messages; this could be a console application or any external application that acts as a producer. One thing to note when using the Dapr Pub/Sub API: the message is wrapped into a “data” property because CloudEvents is used. So if you want to create a new publisher (i.e. a console application), you need to wrap your message body in a “data” property. Check the CloudEvents spec here.
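To illustrate the point above, an external publisher would wrap the payload in a CloudEvents envelope along these lines (the attribute values here are hypothetical placeholders; see the CloudEvents spec for the required attributes):

```json
{
  "specversion": "1.0",
  "type": "com.dapr.event.sent",
  "source": "external-publisher",
  "id": "00000000-0000-0000-0000-000000000000",
  "datacontenttype": "application/json",
  "data": {
    "taskId": "fbc55b2c-d9fa-405e-aec8-22e53f4306dd",
    "taskName": "Testing Pub Sub Publisher",
    "taskAssignedTo": "Taiseer@mail.com"
  }
}
```

The subscriber’s CloudEvents middleware unwraps the envelope, so the action method still receives only the inner “data” object.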
Following your post to use the Pub/Sub API, the message is not getting received in the API. I followed the article as is.
Hello VJ,
There are many moving parts here. Anything such as log messages that you can obtain from Container Apps logs or daprd will help troubleshoot your issue. Go to your container app, select log stream, and watch the logs after you publish a message to a topic.
Hi Taiseer,
I tried with your code as is from GitHub, and the trigger is not happening. I tried to use Service Bus Explorer. I will try to deploy the publisher and check again. The issue I am seeing is the pod becoming unhealthy every time I set the Dapr port, so it appears to be some configuration.
Hi Taiseer,
Thank you, I was able to get it working. There were 2 issues that caused my setup not to work:
1. There was a typo in my YAML file
2. Once I corrected the YAML, the error showed up.
For the benefit of others following the blog: in the post, the pub/sub component is referenced as “taskspubsub”, but in the YAML on GitHub it is “dapr-pubsub-servicebus”.
Just an fyi to anyone else following this article. I couldn’t get the manual POST request to work. It returned “failed getting app id either from the URL path or the header dapr-app-id”
I added dapr-app-id as tasksmanager-backend-api and it still didn’t work, just returned a 404.
However, publishing a message via the code did work and was consumed. So even if you can’t get the manual request displayed in this article to work, it doesn’t mean your setup isn’t working, and you can continue on.
Thanks for sharing, I will look again at why it’s not working, but I’m confident it should work with a manual POST operation. I will recheck and update you if something needs to change.