Running Background Tasks in .NET Core

Introduction

You may often want to run background tasks without blocking the main thread. A common example is a web API that accepts a request from the client, immediately returns a response, and completes the actual work later in the background.

There are various tasks that we want to execute in the background without blocking the main thread:

  • Sending an email
  • Image processing
  • Calling third-party APIs

There are various ways to run background tasks:

  • Hangfire
  • Quartz.NET
  • Azure WebJobs
  • An Azure Function listening to Azure Service Bus events
  • Windows Task Scheduler to execute tasks repeatedly

It is generally recommended to use a broker such as Azure Service Bus, RabbitMQ, or Kafka, which retains your data even if your application restarts for some reason, rather than an in-memory implementation.

In this post, we are going to review how the combination of the following allows you to run fire-and-forget/background tasks without blocking the request thread (an in-memory implementation):

  • BackgroundService: Available since .NET Core 2.1, BackgroundService allows you to run long-running tasks in the background of a host.
  • Channel: Available via the System.Threading.Channels NuGet package (and in-box in later versions of .NET), a Channel provides a thread-safe queue for exchanging data between producers and consumers.

Types of Channel

  1. Unbounded Channel: A channel that can hold any number of items. This can be created with the Channel.CreateUnbounded method.
  2. Bounded Channel: A channel with a maximum capacity. This can be created with the Channel.CreateBounded method.
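
As a quick illustration, here is a minimal sketch (not part of the sample project; the variable names are mine) showing how both kinds of channels are created and used:

using System.Threading.Channels;

// Unbounded channel: the writer never waits because there is no capacity limit.
var unbounded = Channel.CreateUnbounded<string>();

// Bounded channel: capacity of 10; FullMode decides what happens when it is full.
var bounded = Channel.CreateBounded<string>(new BoundedChannelOptions(10)
{
    FullMode = BoundedChannelFullMode.Wait // writers wait until space frees up
});

// Producer side
await bounded.Writer.WriteAsync("work item");

// Consumer side
var item = await bounded.Reader.ReadAsync();
Console.WriteLine(item);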

Executing Background Task – Implementation

Let’s implement the example mentioned in the introduction of this post.

There’s an API endpoint that accepts a request and adds the request/model to a thin wrapper around a channel. A background service continuously reads from that channel and processes the request/model as soon as it becomes available.
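
The snippets below also assume a simple Todo model with a Name property (the real definition lives in the linked GitHub repo); a minimal sketch is enough to follow along:

// Hypothetical minimal model; the actual one is in the repository.
public record Todo(string Name);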


Abstraction: IBackgroundTodoQueue

IBackgroundTodoQueue is an abstraction over the bounded channel. This abstraction is injected into the API endpoint, which queues the Todo item to be processed by the background service.

public interface IBackgroundTodoQueue
{
    ValueTask PushAsync(Todo todo);

    ValueTask<Todo> PullAsync(CancellationToken cancellationToken);
}

Implementation: BackgroundTodoQueue

The implementation of the above abstraction defines a bounded channel with the following options:

  • Capacity: 100 objects
  • Behaviour when the queue is full: Wait

using System.Threading.Channels;

public class BackgroundTodoQueue : IBackgroundTodoQueue
{
    private readonly Channel<Todo> _queue;

    private readonly ILogger<BackgroundTodoQueue> _logger;

    public BackgroundTodoQueue(ILogger<BackgroundTodoQueue> logger)
    {
        var opts = new BoundedChannelOptions(100) { FullMode = BoundedChannelFullMode.Wait };

        _queue = Channel.CreateBounded<Todo>(opts);

        _logger = logger;
    }

    public async ValueTask PushAsync(Todo todo)
    {
        await _queue.Writer.WriteAsync(todo);
    }

    public async ValueTask<Todo> PullAsync(CancellationToken cancellationToken)
    {
        var workItem = await _queue.Reader.ReadAsync(cancellationToken);

        return workItem;
    }
}

Implementation: API

I have created an API endpoint that receives the Todo item, schedules it to be processed by the BackGroundTodoService, and immediately informs the end user that the Todo item has been scheduled. The background service continuously reads from the queue and processes each Todo item as soon as it receives one.

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddSingleton<IBackgroundTodoQueue, BackgroundTodoQueue>();

builder.Services.AddHostedService<BackGroundTodoService>();

var app = builder.Build();

app.MapPost("/todoitems", async Task<IResult> (Todo todo, IBackgroundTodoQueue backgroundTaskQueue) =>
{
    await backgroundTaskQueue.PushAsync(todo);

    return Results.Ok("Todo execution scheduled successfully");
});

app.Run();
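
To see the fire-and-forget behaviour in action, you can post a Todo item to the endpoint. The call below is a hypothetical client example; it assumes the minimal Todo model sketched earlier and that the API listens on http://localhost:5000 (adjust the port to your launch profile):

using System.Net.Http.Json;

using var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000") };

// Returns almost immediately; the Todo is processed later by the background service.
var response = await client.PostAsJsonAsync("/todoitems", new Todo("Buy milk"));

Console.WriteLine(await response.Content.ReadAsStringAsync());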

Implementation: BackgroundService

BackGroundTodoService inherits from BackgroundService and overrides ExecuteAsync, a long-running method that keeps looping until stoppingToken signals that the process should stop. Inside the loop, we call the queue's PullAsync method, which waits until a new Todo item is added to the queue. As soon as a Todo item is returned, we process it.

public class BackGroundTodoService : BackgroundService
{
    private readonly ILogger<BackGroundTodoService> _logger;

    private readonly IBackgroundTodoQueue _taskQueue;

    public BackGroundTodoService(ILogger<BackGroundTodoService> logger, IBackgroundTodoQueue taskQueue)
    {
        _logger = logger;

        _taskQueue = taskQueue;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        const string serviceName = nameof(BackGroundTodoService);

        _logger.LogInformation("{ServiceName} is running", serviceName);

        await ProcessTodoQueueAsync(stoppingToken);
    }

    private async Task ProcessTodoQueueAsync(CancellationToken stoppingToken)
    {
        await Task.Yield();

        while (!stoppingToken.IsCancellationRequested)
        {
            try
            {
                await Task.Delay(5000, stoppingToken);

                var task = await _taskQueue.PullAsync(stoppingToken);

                //other operations - db/api call

                Console.WriteLine($"Todo {task.Name} retrieved and executed");
            }
            catch (OperationCanceledException operationCanceledException)
            {
                _logger.LogInformation(operationCanceledException,
                    "Operation was cancelled because host is shutting down");
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "Error occurred executing task work item");
            }
        }
    }
}
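
If you also want to log or clean up when the host shuts down, you can override StopAsync inside BackGroundTodoService. This override is not part of the original sample; it is a minimal sketch:

    public override async Task StopAsync(CancellationToken cancellationToken)
    {
        _logger.LogInformation("{ServiceName} is stopping", nameof(BackGroundTodoService));

        // Let the base class cancel stoppingToken and wait for ExecuteAsync to complete.
        await base.StopAsync(cancellationToken);
    }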

In the above implementation we queue a data object (the Todo). However, we could instead queue a piece of work, a delegate that doesn't execute immediately, and let the background service execute it, as sketched below.
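
A minimal sketch of that variation follows. Here the channel carries delegates instead of Todo objects; the IBackgroundWorkQueue and BackgroundWorkQueue names are illustrative and not part of the sample project:

using System.Threading.Channels;

public interface IBackgroundWorkQueue
{
    ValueTask PushAsync(Func<CancellationToken, ValueTask> workItem);

    ValueTask<Func<CancellationToken, ValueTask>> PullAsync(CancellationToken cancellationToken);
}

public class BackgroundWorkQueue : IBackgroundWorkQueue
{
    // Same bounded-channel settings as before: capacity 100, writers wait when full.
    private readonly Channel<Func<CancellationToken, ValueTask>> _queue =
        Channel.CreateBounded<Func<CancellationToken, ValueTask>>(
            new BoundedChannelOptions(100) { FullMode = BoundedChannelFullMode.Wait });

    public async ValueTask PushAsync(Func<CancellationToken, ValueTask> workItem) =>
        await _queue.Writer.WriteAsync(workItem);

    public async ValueTask<Func<CancellationToken, ValueTask>> PullAsync(CancellationToken cancellationToken) =>
        await _queue.Reader.ReadAsync(cancellationToken);
}

// In the background service loop, pull the delegate and invoke it:
// var workItem = await _taskQueue.PullAsync(stoppingToken);
// await workItem(stoppingToken);

The API endpoint would then push a delegate instead of the model, for example await backgroundTaskQueue.PushAsync(ct => ProcessTodoAsync(todo, ct)); where ProcessTodoAsync is whatever work you want executed in the background (a hypothetical method here).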

The source code can be found in the GitHub repo:

https://github.com/nilact/ApiFireAndForget
