
So it’s 2019 now, almost 2020. More and more companies are migrating their solutions to the cloud, and in my profession, building cloud and web solutions, I see them moving to architectures better suited for cloud environments: classic web systems are replaced by serverless systems, and huge IT systems are torn apart into microservices. Since Azure Functions have matured, they’re a really good replacement for classic ASP.NET web solutions running on a Web App or on IIS, so I started using these serverless solutions more and more. Because I also like to create user-friendly software, I often use the SignalR real-time framework to notify the user of processes going on on the server. For example, when sending a command to a serverless function, you may want to inform the user whether processing that command was successful (or not). In the past you needed a web host to run SignalR, and running a web host in the cloud is relatively expensive. Today, SignalR is one of the native cloud services Azure can deliver. In this blog, I’m going to implement this SignalR service.

The demo project

Developers always create demo projects to try something new. The idea is great, but there’s never time to finish the project, so it lands in the trash can somewhere in the next five years. So for this blog, to show how the SignalR Service works, I… Yes… created a demo project. It’s an Angular front-end project uploading images to Azure Blob Storage. An Azure Function is triggered by the blob creation and starts resizing the image into two versions: a thumbnail and a fairly decent web size (1024 x 768). Image references are stored in Azure Table Storage, and once both images are sized correctly, the state of the image in Table Storage is set to available. Then a message is broadcast using SignalR, which enables the front-end to respond. Pretty awesome, and you could use this exact same scenario for importing data, for example: just upload the data, make a function that imports it, and report status through SignalR.

So first I navigated to the Azure Portal and started creating a SignalR Service.

Once Azure has created the resource, navigate to the newly created SignalR Service and open the Keys blade. Here you’ll find two keys and two connection strings. Copy one of the connection strings; you’re going to need it in the Azure Functions project. Then navigate to the CORS blade and check that there’s an allowed origin *. If not, add it. You may want to change this to a valid endpoint once your system goes to production, but for this demo you’ll be fine. Please note that I selected Serverless as the Service Mode. This mode should only be selected when you use SignalR from an Azure Functions project.
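By the way, if you prefer the Azure CLI over the portal, creating the service and listing its keys looks roughly like this. The resource names are made up and the exact flags may differ per CLI version, so treat it as a sketch:

az signalr create --name my-signalr --resource-group my-rg --sku Free_F1 --service-mode Serverless
az signalr key list --name my-signalr --resource-group my-rg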

Next Up, The Functions project

Now open Visual Studio; I used VS 2019 (16.3.18) and Azure Functions v2. Create a new Azure Functions project and see if it contains a local.settings.json file. If not, create it and add the copied connection string value as a setting called ‘AzureSignalRConnectionString’. Your local.settings.json should look like this (or something similar):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureSignalRConnectionString": "Endpoint=https://your-signalr.service.signalr.net;AccessKey=/--secred-access-key-here--/;Version=1.0;"
  },
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "http://localhost:4200",
    "CORSCredentials": true
  }
}

The Angular client makes HTTP requests to the negotiate function to initiate the connection negotiation. When the client application is hosted on a different domain than the Azure Function app, cross-origin resource sharing (CORS) must be enabled on the Function app or the browser will block the requests. This is why I also added some CORS settings in the settings file. I know my Angular client is going to run on localhost port 4200. Once again, you may want to change these settings once you go to production.

As you all know, an Azure Function is fired by a trigger and may use bindings (input and/or output) to consume external data or services, or to send data to external services. We’re going to use a SignalR output binding, which means we send data out to the SignalR Service. This data fires an event on the client, which can be handled accordingly. The bindings for the SignalR Service can be installed by adding a NuGet package to your project. Look for the package called Microsoft.Azure.WebJobs.Extensions.SignalRService; my project used version 1.0.2, just so you know.
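From the command line, that’s:

dotnet add package Microsoft.Azure.WebJobs.Extensions.SignalRService --version 1.0.2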

Now it’s time to implement the negotiate endpoint. SignalR uses this endpoint to initiate a connection and determine server and client capabilities. In your Azure Functions project, create a new function with an HTTP trigger which looks like this:

[FunctionName("negotiate")]
public static SignalRConnectionInfo SignalRNegotiate(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post")]  HttpRequestMessage req,
    [SignalRConnectionInfo(HubName = "notifications")] SignalRConnectionInfo connectionInfo)
{
    return connectionInfo;
}

That's pretty much all there is to it. This endpoint allows a client to connect to the SignalR Service. Connecting to this endpoint redirects to your SignalR Service, which in turn returns its capabilities (like available transport types and so on).
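For reference, the payload the negotiate endpoint returns is a small JSON document containing the URL of your SignalR Service and an access token. It looks something like this (values made up, obviously):

{
  "url": "https://your-signalr.service.signalr.net/client/?hub=notifications",
  "accessToken": "--jwt-access-token-here--"
}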

I explained that I persist a reference to uploaded pictures in Table Storage. Once a file is uploaded and successfully scaled, I send a command on a queue that sets an availability flag on the picture entity in Table Storage. When the table entity is successfully updated, I send a message through the SignalR Service.

The function looks like this (I stripped out the code that doesn’t add value for this demo):

[FunctionName("PictureStatusCommandQueueHandler")]
public static async Task PictureStatusCommandQueueHandler(
    [QueueTrigger(Constants.QueueNamePictureStatusCommands)] string pictureStatusCommandMessage,
    [Table(TableNames.Pictures)] CloudTable picturesTable,
    [SignalR(HubName = SignalRHubNames.NotificationsHub)] IAsyncCollector signalRMessages,
    ILogger log)
{
    log.LogInformation("Picture status command retrieved");
    SetStorageConsumptionCommand consumptionCommand = null;
    ...
    if (...)
    {

	...
        Update the table entity here
        ...

        var pictureDto = new PictureDto
        {
            CreatedOn = entity.Timestamp,
            Id = Guid.Parse(entity.RowKey),
            Name = entity.Name,
            Url = picturesTable.ServiceClient.BaseUri.ToString()
        };
        await signalRMessages.AddAsync(
            new SignalRMessage
            {
                Target = "newPicture",
                Arguments = new object[] { pictureDto }
            });
        }
    }
    return consumptionCommand;
}

So what happens here is basically that I create a Data Transfer Object (DTO) which I want to push to the client, and I use SignalR as the mechanism to do that for me. The DTO is converted to JSON and passed to the client. The Target here (newPicture) is the event that will be raised client-side, and the arguments can be seen as the payload of that message.

The Angular project

Before we run into a discussion that doesn’t make sense: I’m a cloud solution architect and I really like C# and the Microsoft development stack. I also have a strong affinity with Angular. The fact that I use Angular for this demo doesn’t mean it’s the best solution; Vue, React and all the other frameworks/component libraries work fine! So I created this Angular project and, inside that project, created a service. This service uses the @aspnet/signalr package, so you need to install that. For your information, my demo project used version 1.1.4.

npm i @aspnet/signalr

or yarn if you like

yarn add @aspnet/signalr

Now back to the service. Since the service is quite large, I created a GitHub Gist here. The service contains a connect and a disconnect function. The endpoint to connect to is your Azure Functions project URL: http://{az-functions-project}/api

By connecting to that location, the SignalR client will send a post request to the negotiate endpoint of your Azure Functions project, and the SignalR service does the rest for you.

Now if you scroll down to line 22 of the gist, you see this code:

this.connection.on('newPicture', (dto: PictureDto) => {
    console.log('New picture arrived');
});

This fragment subscribes to the ‘newPicture’ event. Remember the Azure Function in which we sent a message with the target ‘newPicture’? Well, this is the event handler on the client handling that event. In this case a message is written to the browser’s console, but you also get the dto of type PictureDto, which contains the actual information about the image as it was passed by the Azure Function.

Now create a component that consumes the realtime service and calls the service’s connectSignalR() function and you’re good to go!!

I have quite some history with SignalR, so I expected a very complicated solution. It took me some time to figure out how the SignalR service is implemented, but mostly because I expected something difficult. The reality is that the SignalR Service integrates extremely well and lowers the complexity bar big time! Have fun experimenting!


So I ran into Azure Functions and realized I had totally missed something there. One of my co-workers is a serverless advocate and drew my attention to it about a year ago, and so I started exploring the world of serverless. My first impression was that it was hard to learn and complicated, but those thoughts turned out not to be true… It’s just different. A different way of thinking and a different way of programming.

So, as a lot of developers do, I started a project that made sort of sense and started learning while the project evolved. And now a year has passed. What happened during that year? I created a couple of GitHub repos for the project, threw them away, re-created repos and threw them away as well… And then, a few weeks ago, I started a new repo with some code that I thought was worth sharing. And that’s where we are today…

TL;DR – A cool and awesome URL shortener project running Azure Functions in probably the cheapest way possible; hit https://4dn.me/azfuncs.

Answer the question please!?

So the question remains: why are Azure Functions so cool? Well, because you implement them in the easiest way possible. They’re triggered by native cloud services and thus integrate very well into every cloud solution. They scale like a maniac, so huge amounts of traffic are no problem. Oh, and wait… I almost forgot to mention that running Azure Functions is cheap… Really cheap!!

So the project I was talking about is the classic URL shortener project. You paste in a long endpoint URL, the service stores the URL and returns a short code which can be used to visit the URL.

I added login functionality so users are able to manage their short links and change the short code, so it’s even easier to remember, as long as the short code is unique.

Finally I want to track hits on each short link so you can see how many hits a short link received and even see the most recent hits in a graph.

If users don’t want to log in, they can just paste a URL and have it shortened, but they miss the advantage of being able to change the short code and extend the lifetime of a short link. All links expire; users are able to set or change the expiration date, but anonymous visitors cannot.

So what is an Azure Function?

Basically, it’s very simple: an Azure Function is just a piece of code, running because it’s executed by a trigger. You want to keep functions lean and clean. Ideally, functions have a single purpose (responsibility) and rely as little as possible on code libraries. For example, importing Entity Framework in an Azure Function runs fine and works perfectly. However, the EF library is large and makes your slim and lean function a big rhino running through an Azure datacentre. What you’re looking for is an agile free runner, able to manoeuvre through the datacentre at lightning speed.

To help you, there’s a mechanism called bindings. So functions have a trigger, and bindings. With bindings, you are able to connect to other cloud services like storage, the Service Bus, Event Grid, SendGrid and more. And best of all, if the binding you need is not available by default, you’re free to create one yourself. Bindings are input (stuff coming in) or output (stuff going out).

A tiny example

An easy example is sending email. Sending an email message is a relatively heavy process for web applications, and sending one within a web request may block additional incoming requests. You don’t want this kind of process inside your web request. Writing a function that sends these email messages for you makes your system more distributed and, best of all, removes the heavy process from your web request. Basically, you would store the email message in blob storage and add a message to a queue. A function with a queue trigger, an input binding reading the message from blob storage, and an output binding to send the message using SendGrid would be an excellent solution. And best of all, you’ve just removed the pressure from your web app.
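To make that concrete, a minimal sketch of such a function could look like the code below. Note that this is my illustration, not code from the demo project: the queue name, container name and addresses are assumptions, and it relies on the Microsoft.Azure.WebJobs.Extensions.SendGrid package (with the SendGrid API key in your app settings, and using SendGrid.Helpers.Mail for the message types).

[FunctionName("SendQueuedEmail")]
public static async Task SendQueuedEmail(
    // Trigger: the queue message contains the name of the blob holding the email body
    [QueueTrigger("email-queue")] string blobName,
    // Input binding: read the email body from blob storage using the queue message
    [Blob("emails/{queueTrigger}", FileAccess.Read)] string emailBody,
    // Output binding: hand the message over to SendGrid
    [SendGrid] IAsyncCollector<SendGridMessage> messages,
    ILogger log)
{
    var message = new SendGridMessage();
    message.AddTo("recipient@example.com");                   // hypothetical recipient
    message.SetFrom(new EmailAddress("noreply@example.com")); // hypothetical sender
    message.SetSubject("Your message");
    message.AddContent("text/html", emailBody);

    await messages.AddAsync(message);
    log.LogInformation($"Email for blob {blobName} handed off to SendGrid.");
}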

So how does my demo app work?

An endpoint URL is passed to the backend, which generates a unique short code and stores the link in table storage. Pretty straightforward.

[FunctionName("CreateShortLink")]
public static async Task<HttpResponseMessage> CreateShortLink(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "links")] HttpRequestMessage req,
    [Table(Constants.TableStorageLinksTableName)] CloudTable table,
    ILogger log)

This function uses an HTTP trigger to fire (i.e. it waits for a web request). It uses an input binding to table storage and accepts a CloudTable, so I can query for existing short codes and store the new short link if everything is fine.

Then a couple of validations take place and a unique short code is generated (a sketch of what that generator could look like follows below). In the end, I use the table to store the new short link:

var entity = new ShortLinkEntity
{
    ShortCode = validShortCode,
    RowKey = Guid.NewGuid().ToString(),
    CreatedOn = DateTimeOffset.UtcNow,
    EndpointUrl = shortLinkDto.EndpointUrl,
    ExpiresOn = expirationDate,
    PartitionKey = Constants.TableStorageLinksPartitionKey,
    Timestamp = DateTimeOffset.UtcNow,
    TotalHits = 0,
    OwnerId = owner
};
var operation = TableOperation.Insert(entity);
var result = await table.ExecuteAsync(operation);
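The generator itself isn’t shown in the post, but to give you an idea, a naive implementation could look like this sketch. The base-62 alphabet, the code length of 6 and the retry loop are my assumptions, not necessarily what the repo does:

private const string Alphabet = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";

private static async Task<string> GenerateUniqueShortCodeAsync(CloudTable table)
{
    var random = new Random();
    while (true)
    {
        // Draw 6 random characters from the base-62 alphabet
        var code = new string(Enumerable.Range(0, 6)
            .Select(_ => Alphabet[random.Next(Alphabet.Length)])
            .ToArray());

        // Only accept the code if no existing short link uses it
        var query = new TableQuery<ShortLinkEntity>().Where(
            TableQuery.GenerateFilterCondition("ShortCode", QueryComparisons.Equal, code));
        var segment = await table.ExecuteQuerySegmentedAsync(query, null);
        if (!segment.Results.Any())
        {
            return code;
        }
    }
}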

Then I return an HTTP response containing information about the new short link.

Now, when one of the short links is hit, the system needs to find out whether the short code exists and retrieve the endpoint associated with that short link. But because this is a cool and fancy Azure Functions demo app, I also want to track hits per short link, so I write a ‘hit’ to a storage queue.
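The post doesn’t show that lookup-and-redirect function, but it could look something like this sketch (the route, the status codes and the exact lookup are my assumptions):

[FunctionName("RedirectShortLink")]
public static async Task<HttpResponseMessage> RedirectShortLink(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "{shortCode}")] HttpRequestMessage req,
    string shortCode,
    [Table(Constants.TableStorageLinksTableName)] CloudTable table,
    [Queue(Constants.TableStorageQueueHits)] IAsyncCollector<ShortLinkHitDto> hitsQueue,
    ILogger log)
{
    // Look up the short link by its short code
    var query = new TableQuery<ShortLinkEntity>().Where(
        TableQuery.GenerateFilterCondition("ShortCode", QueryComparisons.Equal, shortCode));
    var segment = await table.ExecuteQuerySegmentedAsync(query, null);
    var entity = segment.Results.FirstOrDefault();
    if (entity == null)
    {
        return req.CreateResponse(HttpStatusCode.NotFound);
    }

    // Drop a 'hit' on the queue; a separate function processes it asynchronously
    await hitsQueue.AddAsync(new ShortLinkHitDto
    {
        ShortCode = entity.ShortCode,
        RowKey = entity.RowKey,
        HitOn = DateTimeOffset.UtcNow
    });

    // Redirect the visitor to the original endpoint URL
    var response = req.CreateResponse(HttpStatusCode.Redirect);
    response.Headers.Location = new Uri(entity.EndpointUrl);
    return response;
}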

A different function will be triggered when a message arrives on that queue, and starts processing the information about that hit. Here is the entire function:

[FunctionName("ProcessIncomingHit")]
public static async void Run(
    [QueueTrigger(Constants.TableStorageQueueHits)]ShortLinkHitDto hitDto,
    [Table(Constants.TableStorageLinksTableName)] CloudTable table,
    [Table(Constants.TableStorageHitsTableName)] CloudTable hitsTable,
    ILogger log)
{
            
    log.LogInformation($"Hit received for processing {hitDto.ShortCode}");
    var fetchOperation =
        TableOperation.Retrieve(Constants.TableStorageLinksPartitionKey, hitDto.RowKey);
    var retrievedResult = await table.ExecuteAsync(fetchOperation);
    if (retrievedResult.Result is ShortLinkEntity shortLinkEntity)
    {
        var hitEntity = new HitEntity
        {
            PartitionKey = Constants.TableStorageHitsPartitionKey,
            RowKey = Guid.NewGuid().ToString(),
            ShortCode = hitDto.ShortCode,
            HitOn = hitDto.HitOn,
            Timestamp = DateTimeOffset.UtcNow
        };


        shortLinkEntity.TotalHits = shortLinkEntity.TotalHits + 1;
        var insertOperation = TableOperation.Insert(hitEntity);
        await hitsTable.ExecuteAsync(insertOperation);
        var updateOperation = TableOperation.InsertOrReplace(shortLinkEntity);
        await table.ExecuteAsync(updateOperation);
    }
}

Obviously, the function is triggered by the arrival of a message on the storage queue. I added bindings to the original short links table and to a hits table. The original short links table is used to increment the total hits counter of the short link, and I also add a new entity to the hits table. That table is used by an aggregate function that allows me to draw a graph of the hits over the past week.
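That aggregate function isn’t shown here either, but the idea is a simple query-and-group over the hits table. A minimal sketch, assuming HitOn is a DateTimeOffset and a one-week window:

private static async Task<Dictionary<DateTime, int>> GetHitsPerDayAsync(CloudTable hitsTable)
{
    // Fetch all hits from the past week
    var since = DateTimeOffset.UtcNow.AddDays(-7);
    var query = new TableQuery<HitEntity>().Where(
        TableQuery.GenerateFilterConditionForDate(
            "HitOn", QueryComparisons.GreaterThanOrEqual, since));

    var hits = new List<HitEntity>();
    TableContinuationToken token = null;
    do
    {
        var segment = await hitsTable.ExecuteQuerySegmentedAsync(query, token);
        hits.AddRange(segment.Results);
        token = segment.ContinuationToken;
    } while (token != null);

    // Group the hits per day; these are the data points for the graph
    return hits
        .GroupBy(h => h.HitOn.UtcDateTime.Date)
        .ToDictionary(g => g.Key, g => g.Count());
}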

The full source code can be found here.