In the last article we saw how to use the parallel state in a Step Function. In this article we’ll see how we can use the Wait state using the serverless framework.

The wait state delays the execution of the state machine for a certain amount of time. By default, it returns the same object that it receives.

What are we going to code

We are going to code the following step function:

As you can see we’re going to have an initial function that creates a result with a field called DelaySeconds. Then, we’ll have the wait state and finally a result state that will format the output.

Coding the lambdas

Following the steps of the previous article, create two lambdas called InitLambda and ResultLambda with the following code:

InitLambda

public class InitLambda
{
    public InitResult Init(string input){
        return new InitResult(int.Parse(input));
    }
}
    
public class InitResult{

    public InitResult(int delay)
    {
        DelaySeconds = delay;
    }
    public int DelaySeconds {get;set;}
}

ResultLambda

public class ResultLambda
{
    public string Result(InitResult input){
        return $"The seconds delayed are {input.DelaySeconds}";
    }
}

public class InitResult{

    public InitResult(int delay)
    {
        DelaySeconds = delay;
    }
    public int DelaySeconds {get;set;}
}

// You can have the result class in a shared library

Creating the step function

Now it’s time to create the step function. The code is very similar to our original code, but we’re now introducing a new kind of state. Here are the interesting bits:

stepFunctions:
    stateMachines:
        testParallelStepFunction:
            definition:
                StartAt: Init
                States:
                    Init:
                        Type: Task
                        Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.initService}-${opt:stage}-Init
                        Next: WaitSeconds
                    WaitSeconds:
                        Type: Wait
                        Seconds: 10
                        Next: Result
                    Result:
                        Type: Task
                        Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.resultService}-${opt:stage}-Result
                        End: true

As you can see, we have a new state called WaitSeconds, which is of type Wait. In this first case we’re specifying that we want to wait 10 seconds. Let’s run the step function from the UI and see if it waits the desired time.

wait 10 seconds

It works!

Let’s see what other alternatives we have.

Specifying a timestamp

It’s possible that we need a step to be executed at a certain time. If we want that, we can specify the Timestamp field:

WaitSeconds:
    Type: Wait
    Timestamp: "2017-06-20T20:58:00Z"
    Next: Result

The timestamp, as the documentation says, must conform to the RFC3339 profile of ISO 8601, with the further restrictions that an uppercase T must separate the date and time portions, and an uppercase Z must denote that a numeric time zone offset is not present. In our case, we’re saying that we want to wait until 2017/06/20 20:58 UTC.

Let’s deploy and execute the step function from the UI to see if it works:

wait timestamp

Non-hardcoded duration

We don’t always need to hardcode the value of the duration or the timestamp. We can use a path into the state’s input data to specify it. If we want to do that, we need to define the state in this way:

WaitSeconds:
    Type: Wait
    SecondsPath: "$.DelaySeconds"
    Next: Result

In our case, we’re going to use the DelaySeconds field of the input data to read the number of seconds we want to wait. We can do the same thing with the timestamp using the TimestampPath field.
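
As an illustration, a wait state driven by a timestamp coming from the input data could be defined like this (the WaitUntil input field is an assumption, not part of the example above):

WaitSeconds:
    Type: Wait
    TimestampPath: "$.WaitUntil"
    Next: Result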

Summary

We’ve seen another possible state that you can use when defining a Step Function: the wait state. We’ve seen how to configure this kind of state in four different ways: Seconds, Timestamp, SecondsPath and TimestampPath.

In the last article we saw how to create a very basic step function using .Net Core and the serverless framework. Today we’ll see how to use one of the more useful states in a Step Function: the parallel state.

The parallel state allows you to create parallel branches of execution in your state machine. Using it, you’ll be able to run several tasks in parallel and then collect the results in another task, that will be executed only if all the parallel tasks finish correctly.

The task that collects the results of the parallel tasks will receive an array with all the results. The only limitation we have using .Net Core is that the results must be of the same type, so we can have an array of that type. I imagine you can do fancier things using Javascript.

What are we going to code

We are going to code the following step function:

step function with parallel state

As you can see we’re going to have an initial function that makes a simple transformation of the input, two parallel functions and a function that collects the results.

Coding the lambdas

Following the steps of the previous article, create four lambdas called InitLambda, ParallelOneLambda, ParallelTwoLambda and ResultLambda with the following code:

InitLambda

public class InitLambda
{
    public string Init(string input){
        return $"Start processing {input}";
    }
}

ParallelOneLambda

public class ParallelOneLambda
{
   public ParallelResult ParallelOne(string request)
   {
       return new ParallelResult(DateTime.UtcNow, $"Parallel one output: {request}");
   }
}

public class ParallelResult
{
    public ParallelResult(DateTime dateTime, string result)
    {
        DateTime = dateTime;
        Result = result;
    }

    public DateTime DateTime {get; set;}
    public string Result {get;set;}
}

ParallelTwoLambda

public class ParallelTwoLambda
{
   public ParallelResult ParallelTwo(string request)
   {
       return new ParallelResult(DateTime.UtcNow, $"Parallel two output: {request}");
   }
}

public class ParallelResult
{
    public ParallelResult(DateTime dateTime, string result)
    {
        DateTime = dateTime;
        Result = result;
    }

    public DateTime DateTime {get; set;}
    public string Result {get;set;}
}

// You can have the result class in a shared library

ResultLambda

public class ResultLambda
{
    public string Result(ParallelResult[] input){
        return $"The result of the first task is {input[0].Result} and the result of the second task is {input[1].Result}";
    }
}

public class ParallelResult
{
    public ParallelResult(DateTime dateTime, string result)
    {
        DateTime = dateTime;
        Result = result;
    }

    public DateTime DateTime {get; set;}
    public string Result {get;set;}
}

// You can have the result class in a shared library

As you can see, the result lambda receives an array of ParallelResult.

Creating the step function

Now it’s time to create the step function. The code is very similar to our original code, but we’re now introducing a new kind of state. Here are the interesting bits:

stepFunctions:
    stateMachines:
        testParallelStepFunction:
            definition:
                StartAt: Init
                States:
                    Init:
                        Type: Task
                        Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.initService}-${opt:stage}-Init
                        Next: ParallelProcessing
                    ParallelProcessing:
                        Type: Parallel
                        Branches:
                        - StartAt: ParallelOne
                          States:
                              ParallelOne:
                                  Type: Task
                                  Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.parallelOneService}-${opt:stage}-ParallelOne
                                  End: true
                        - StartAt: ParallelTwo
                          States:
                              ParallelTwo:
                                  Type: Task
                                  Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.parallelTwoService}-${opt:stage}-ParallelTwo
                                  End: true
                        Next: Result
                    Result:
                        Type: Task
                        Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.resultService}-${opt:stage}-Result
                        End: true

As you can see, we have a new state called ParallelProcessing, which is of type Parallel and has two branches: ParallelOne and ParallelTwo. Each branch follows the state machine definition language and, in this case, contains only one task.

We can now deploy the lambdas and the step function and invoke it:

sls invoke stepf --name testParallelStepFunction --data '"asdf"'

And see that we have the expected result:

step function result

A couple of remarks

A branch can be a state machine by itself

A branch is not limited to just one task inside it; it can have several. The only requirement is that the branch must have a final task (one with End: true).

branch with several tasks

The results always come in order

The results in the array received by the step after a parallel state always come in the same order: the order in which the branches are defined. So, although ParallelOne ends later than ParallelTwo, the result of ParallelOne will come in the first position of the array.
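
To make the ordering concrete, the input received by the Result task would look roughly like this (the DateTime values are made up for illustration; the Result strings follow from the lambdas above when the execution input is "asdf"):

[
    { "DateTime": "2017-06-20T20:58:01Z", "Result": "Parallel one output: Start processing asdf" },
    { "DateTime": "2017-06-20T20:58:00Z", "Result": "Parallel two output: Start processing asdf" }
]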

Summary

We’ve seen one of the more powerful states of a Step Function: the parallel state. We’ve seen how we can add more than one task inside each of its branches and how to collect the results.

If my good friend Alan Gorton is right, we’d better be prepared. In this article we’ll see how we can develop AWS Lambda functions using .Net Core and deploy them, together with Step Functions, using the serverless framework.

Installing dependencies

You should be able to follow this tutorial using a Windows machine or a Mac. The first step is to install all the things we’re going to need.

Let’s start with .Net Core. The limitation that AWS imposes is that we have to target netcoreapp1.0, so we can just download the latest version of .Net Core. To do that, go to https://www.microsoft.com/net/core and follow the instructions.

The next step is to install the serverless framework. You will need to have NodeJs installed and then follow the instructions they have on their website. It’s very straightforward.

And we’re good to go! Nothing else is needed.

Accessing AWS

To be able to run the serverless framework commands, we’ll need an account with its access keys. There are a couple of ways to configure them in your environment, as explained in https://serverless.com/framework/docs/providers/aws/guide/credentials/. In my case, I’m going to use a profile. You can use whatever you want.
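
For reference, a named profile lives in ~/.aws/credentials and looks roughly like this (the key values are placeholders; the profile name is the one we’ll reference later in the serverless.yml files):

[serverless-admin-vgaltes]
aws_access_key_id = <your_access_key_id>
aws_secret_access_key = <your_secret_access_key>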

Folder structure

We’re going to have a root folder that will contain the different folders for each lambda function. The only file that will exist in that folder will be the serverless.yml for the step function. So, go ahead and create the root folder

mkdir TestServerlessStepFunctions

We’re now ready to create our first lambda function.

Creating a lambda function

The serverless framework allows you to create a project scaffolding for your lambda project. It has a couple of limitations though:

  • It creates a project that you need to work with using version 1.0.0-preview2-003131 of the .Net Core SDK. That’s not the latest version; it’s the version that still uses the project.json file.
  • It doesn’t allow you to create an F# project.

I’ll send a pull request to fix both problems, but it’s not there yet.

Apart from some utility files, the framework only adds a file to a standard project, so we can create the project by ourselves using the .Net Core CLI and manually add that file. Let’s do that:

dotnet new classlib --framework netcoreapp1.0 --name UppercaseLambda --output UppercaseLambda

This creates a new C# project targeting netcoreapp1.0 called UppercaseLambda in a folder called UppercaseLambda. That’s all we need.

Rename the cs file to UppercaseLambda.cs and copy the following content there:

using System;
using Amazon.Lambda.Core;

// Assembly attribute to enable the Lambda function's JSON input to be converted into a .NET class.
[assembly: LambdaSerializerAttribute(typeof(Amazon.Lambda.Serialization.Json.JsonSerializer))]

namespace TestServerlessStepFunctions
{
    public class UppercaseLambda
    {
        public string Uppercase(string request)
        {
            return request.ToUpperInvariant();
        }
    }
}

In order to be able to build this project, we’ll need to add a package. So, go to the project folder and run:

dotnet add package Amazon.Lambda.Serialization.Json

We’re good to go now. Let’s restore the dependencies

dotnet restore


Build the project

dotnet build


And publish it

dotnet publish


To be able to deploy the lambda function, we need to create a zip file with all the files needed to execute it. We’ll need to create the zip in a folder we can reference later, so let’s create the folder and the zip:

mkdir bin/Debug/netcoreapp1.0/package
zip -Xrj bin/Debug/netcoreapp1.0/package/UpperCaseLambda.zip bin/Debug/netcoreapp1.0/publish/

zip

As you can see, we’re creating a zip with the contents of the publish folder.

We’re just one step away from being able to deploy our Lambda function. If we want to use the serverless framework to do it, we need to create a file called serverless.yml in the project folder. Create the file and copy the following contents:

service: uppercaseService

provider:
    name: aws
    runtime: dotnetcore1.0
    profile: serverless-admin-vgaltes
    region: us-east-1
    stage: dev

package:
    artifact: bin/Debug/netcoreapp1.0/package/UpperCaseLambda.zip

functions:
    Uppercase:
        handler: UppercaseLambda::TestServerlessStepFunctions.UppercaseLambda::Uppercase

If we use different yaml files for each lambda function, we need to specify a different service name for each lambda. In the provider section, we’re telling the framework which provider we’re going to use, as well as the runtime, user profile, region and stage. Stages are a good way to manage different environments inside the same AWS account.

In the package section we’re specifying which package we’re going to deploy.

And finally, we define our functions, in our case just one. In the handler, we specify where the function lives: in our case, in the UppercaseLambda assembly, in a class called UppercaseLambda inside a namespace called TestServerlessStepFunctions, with a method called Uppercase.
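
As a general rule, the handler string for a .Net Core Lambda follows this pattern (the angle-bracket names are placeholders):

handler: <AssemblyName>::<Namespace>.<ClassName>::<MethodName>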

We’re prepared to deploy the function. We just need to run the following command:

sls deploy -v

sls deploy uppercase

The -v flag gives us verbose output. If everything is configured correctly, we’ll see the lambda deployed to the account.

We can now try the lambda to see if it’s been correctly deployed. To do that, just run

sls invoke -f Uppercase --data "asdf"

We should see “ASDF” as the output.

sls invoke uppercase

A basic step function

Now that we have a lambda function, we can use it inside a Step Function. A Step Function is a nice way to orchestrate Lambda functions. You can learn more here https://aws.amazon.com/step-functions/

To be able to deploy a Step Function using the serverless framework we’ll need to install a plugin. So, let’s go to the root folder and type:

npm install --save serverless-step-functions

The step function will not have any code associated with it. It will just call the Lambdas we create. So, we just need to create a serverless.yml file in the root folder. Create the file and copy the following contents:

service: TestStepFunctions

custom:
    accountId: <your_account_id>
    uppercaseService: uppercaseService

provider:
    name: aws
    runtime: dotnetcore1.0
    profile: serverless-admin-vgaltes
    region: us-east-1
    stage: dev

stepFunctions:
    stateMachines:
        testStepFunction:
            definition:
                StartAt: Uppercase
                States:
                    Uppercase:
                        Type: Task
                        Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.uppercaseService}-${opt:stage}-Uppercase
                        End: true

plugins:
- serverless-step-functions

Notice that, as we explained previously, we’re using a different service name here. In the custom section we’re defining some custom variables that we’ll use to properly reference the Lambda function. The meat of the file is in the stepFunctions section. There we’re defining a new state machine called testStepFunction that has a single state, called Uppercase, which is a Task. In the Resource field, we’re referencing the ARN of the lambda function we created previously. To do that, we use the custom variables declared previously.

Finally, but very important, we need to tell the framework that we’re using the step functions plugin.

And that’s all! We can now deploy the Step function. Just type:

sls deploy -v

sls deploy step function

We should see the step function deployed correctly. It’s time to invoke the Step Function and see it working:

sls invoke stepf --name testStepFunction --data '"asdf"'

sls invoke step function

Voilà!!

A bit of FSharp

We’ve seen how to develop a Lambda function using C#. In the end, we’re just compiling the code and publishing it, so we can do the same with an F# project. Let’s do it!

First of all, let’s create a new F# project. Go to your root folder and type

dotnet new classlib -lang F# --name SayHelloLambda --output SayHelloLambda

Unfortunately, you can’t change the target framework using the CLI in this case. So, go to the SayHelloLambda.fsproj file and change the target framework to netcoreapp1.0.
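
The relevant part of the .fsproj should end up looking roughly like this (only the property group is shown; the rest of the generated project file stays as it is):

<PropertyGroup>
    <TargetFramework>netcoreapp1.0</TargetFramework>
</PropertyGroup>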

As we did with the C# project, we need to add the assembly attribute to our module. Let’s start adding the required package:

dotnet add package Amazon.Lambda.Serialization.Json

Now you can change the contents of the fs file (you can rename it as well if you want). Copy the following contents:

namespace TestServerlessStepFunctions

module SayHelloLambda =
    open Amazon.Lambda.Core
    [<assembly: LambdaSerializerAttribute(typeof<Amazon.Lambda.Serialization.Json.JsonSerializer>)>]
    do()

    let sayHello name =
        sprintf "Hello %s" name

As you can see, we’re composing a new string as the return value of our Lambda. We can now build and package the project:

dotnet restore
dotnet build
dotnet publish
mkdir bin/Debug/netcoreapp1.0/package
zip -Xrj bin/Debug/netcoreapp1.0/package/SayHelloLambda.zip bin/Debug/netcoreapp1.0/publish/

We just need to add the serverless.yml file. Copy the following contents:

service: sayHelloService

provider:
    name: aws
    runtime: dotnetcore1.0
    profile: serverless-admin-vgaltes
    region: us-east-1
    stage: dev

package:
    artifact: bin/Debug/netcoreapp1.0/package/SayHelloLambda.zip

functions:
    SayHello:
        handler: SayHelloLambda::TestServerlessStepFunctions.SayHelloLambda::sayHello

Nothing new here. We’re ready to deploy now. Let’s do it:

sls deploy -v

Our Lambda function developed in F# is deployed now. Let’s try it:

sls invoke -f SayHello --data "Vicenç"

sls invoke sayHello

Great! The F# Lambda function is working now! Time to add the lambda to our Step function. Let’s start adding a new custom variable:

custom:
    accountId: <your_account_id>
    uppercaseService: uppercaseService
    sayHelloService: sayHelloService

And now add the new Lambda to the Step function and link the execution of the previous one to this one:

testStepFunction:
  definition:
    StartAt: Uppercase
    States:
      Uppercase:
        Type: Task
        Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.uppercaseService}-${opt:stage}-Uppercase
        Next: SayHello
      SayHello:
        Type: Task
        Resource: arn:aws:lambda:${opt:region}:${self:custom.accountId}:function:${self:custom.sayHelloService}-${opt:stage}-SayHello
        End: true

And that’s all we need. You can now deploy the step function:

sls deploy -v

And test it:

sls invoke stepf --name testStepFunction --data '"asdf"'

You should see “Hello ASDF” as output.

sls invoke step function

Summary

We’ve seen quite a few things in this article. We’ve discovered Step functions, which are a nice way to orchestrate Lambda functions. We’ve seen as well how we can develop Lambda functions using C# or F# thanks to .Net Core. As a side effect, we’ve seen how we can develop all of this using a Mac laptop and not a Windows machine. Hope you enjoyed it!

The other day my friend Jero wrote an article explaining how to write a Google Cloud Function using Javascript. The goal of this article is to do the same exercise but using Azure Functions and F#.

First steps

First of all, you will need to set up a couple of things. Don’t worry, both of them are free. The first is an Azure account. You can create one for free and get £150 of credit (and you’re not going to spend a single pound doing this exercise).

Second (not strictly necessary, as you can edit an Azure Function directly in the browser, but highly recommended), you need an IDE to develop F#. Download VSCode, install the Ionide extension and you’ll be in a perfect position.

If we want to develop locally, we’ll need to install the Azure Functions CLI. Unfortunately, this tool is only available on Windows right now. It will help us a little bit with the function creation. Right now it’s a convenient way to create an Azure Function, but it’s not strictly necessary. In the end, it just creates some files that you could create manually if you wanted.

To install the Azure Functions CLI you need to follow these steps:

  • Install nodejs
  • Update npm: npm install npm@latest -g
  • Install Azure Functions npm package: npm i -g azure-functions-cli

Now we can create the Azure Function folder. Create the folder where you want to work and type func init ManOrWoman. This will create a git repository with some files inside.

Func Init

Now you can push the repo to GitHub.

Continuous deployment

What we are going to do in this example is enable continuous deployment from GitHub. We will link our Azure Function to a repo on GitHub so that every time we push a change the function will be automatically deployed. That’s probably not something you’d want to do in a professional project, but it’s good enough for now. Hopefully, we’ll see how to deploy an Azure Function in a more professional way in a future article.

Let’s go to the Azure portal and create a new Function App:

  • App Name: ManOrWoman
  • Resource Group: ManOrWomanRG
  • Hosting Plan: use an App Service Plan -> create new -> Free Tier (or a Consumption plan)
  • Storage account: (create new) manorwomansa

When Azure finishes creating the function you can set up the continuous deployment using Github:

  • Click on Start from source control
  • Follow the steps to set up a new github deployment

Now we’re ready to create the function itself. Go to your console (in your work folder), type func new and follow these steps:

  • Select a language -> choose F#
  • Select a template -> httptrigger
  • FunctionName -> ManOrWoman

Push the changes and wait until they are deployed. Don’t worry, it only takes a few seconds. Now we’re ready to test the function using a browser. If we try to execute the function by going to the URL https://manorwoman.azurewebsites.net/api/ManOrWoman?name=Vicenc we’re going to get a 401 Unauthorised response. This is because we have the authorisation level set to function and therefore we need to provide a code (key) to be able to run the function. Something like this:

https://manorwoman.azurewebsites.net/api/ManOrWoman?code=<the_code>&name=Vicenc
You can find your code in the Azure Portal under Manage inside your function.

If we’d like to make this function public, we should go to our function.json file and change the authLevel from function to anonymous. Let’s change it, push the changes and try to access the function this way: https://manorwoman.azurewebsites.net/api/ManOrWoman?name=Vicenc. We’re good now and we don’t need to provide any key.
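
As a sketch, assuming the default httpTrigger binding that the template generates, the relevant part of function.json would look roughly like this after the change:

{
    "disabled": false,
    "bindings": [
        {
            "authLevel": "anonymous",
            "name": "req",
            "type": "httpTrigger",
            "direction": "in"
        },
        {
            "name": "res",
            "type": "http",
            "direction": "out"
        }
    ]
}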

Code

Let’s code the function itself. What we’re going to do is, given a name, check it against the list of people’s names in Spain. To do that, we’re going to download the data from the government and export it as CSV, one file for men’s names and another one for women’s names. We need to upload these files to our function environment, and we’ll do that using Kudu:

  • Go to https://manorwoman.scm.azurewebsites.net
  • Click on Debug Console -> CMD
  • Navigate to data
  • Create a folder called spain
  • Create two files, women.csv and men.csv
  • Edit the files and copy the content from the files you’ve recently created

We’ll need the same kind of structure in our local environment to be able to test the function. So, in your function folder, create the same folder structure (/data/spain) and copy those two files there.

As we’re using csv as our data source, we’re going to use the CSV type provider to read the data. Let’s start defining some types we’re going to use. Create a file called types.fs and copy the following content:

namespace Types
    open FSharp.Data

    type NameData = CsvProvider<"Order,Name,Frequency,AverageAge", HasHeaders = true, 
                                            Schema = "Order(int),Name,Frequency(int), AverageAge(float)">

    type NameStatistic = {Frequency: int}

    type Result = {
        Gender: string
        Frequency: int
        Percentage:float
    }

And now create a file called Statistics.fs and copy the following content:

module Statistics
    open Types
    open FSharp.Data

    let getGenderStatistics (fileName:string) (name:string) =
        let names = NameData.Load(fileName)

        let nameData =
            names.Rows
            |> Seq.filter(fun r -> r.Name = name.ToUpperInvariant() )
            |> Seq.tryHead
        
        match nameData with
        | None -> None
        | Some x -> Some {NameStatistic.Frequency = x.Frequency}

    let getNameStatistics (name: string) (folder:string) =
        let statistics =
            [|folder + "men.csv"; folder + "women.csv"|]
            |> Array.map(fun x -> getGenderStatistics x name)

        let calculatePercentage (x:int) (y:int) = 
            float x * 100.0 / (float x + float y)

        match statistics with
        | [|Some m;Some w|] -> 
            match (m.Frequency > w.Frequency) with
            | true -> Some {Gender = "Man"; Frequency = m.Frequency; Percentage = calculatePercentage m.Frequency w.Frequency}
            | false -> Some {Gender = "Woman"; Frequency = w.Frequency; Percentage = calculatePercentage w.Frequency m.Frequency}
        | [|Some m;None|] -> 
            Some {Gender = "Man"; Frequency = m.Frequency; Percentage = 100.0} 
        | [|None;Some w|] -> 
            Some {Gender = "Woman"; Frequency = w.Frequency; Percentage = 100.0} 
        | _ -> None

Basically, what we’re doing here is finding the name in both files, seeing where it is more common, and returning the result.

What we need to do now is to use this function in the run.fsx script. Edit the script with the following content:

#r "System.Net.Http"
#r "Newtonsoft.Json"
#r "Fsharp.Data"
#load "Types.fs"
#load "Statistics.fs"

open System // needed for Environment.ExpandEnvironmentVariables
open System.Net
open System.Net.Http
open System.Net.Http.Headers
open Newtonsoft.Json
open Statistics

let Run(req: HttpRequestMessage, log: TraceWriter) =
    async {
        log.Info(sprintf "F# HTTP trigger function processed a request.")

        let name =
            req.GetQueryNameValuePairs()
            |> Seq.tryFind (fun q -> q.Key.ToLowerInvariant() = "name")

        let folder = Environment.ExpandEnvironmentVariables(@"%HOME%\data\spain\")

        let response =
            match name with
            | Some x ->
                let statistics = getNameStatistics x.Value folder
                match statistics with
                | Some y -> 
                    let json = JsonConvert.SerializeObject(y)
                    let jsonResponse = sprintf "%s" json
                    req.CreateResponse(HttpStatusCode.OK, jsonResponse, "text/plain")
                | None -> req.CreateResponse(HttpStatusCode.BadRequest, "We haven't found the name")
            | None ->
                req.CreateResponse(HttpStatusCode.BadRequest, "Specify a Name value")

        return response

    } |> Async.RunSynchronously

In this piece of code, we’re just parsing the input, calling our code, and returning the result. The “interesting” part comes when returning the result. As you can see, we’re returning it as “text/plain”. That’s because of how ASP.Net works internally; you will face the same problem developing a Web API project. If you’re doing that, you can use Mark Seemann’s solution or Isaac Abraham’s solution. If not, you have two options:

  • Specify the media type as “text/plain”
  • Use DataMemberAttribute to provide your own names.

I think the second option is the best one because we keep our media type as “application/json”. So, let’s make some changes.

In types.fs:

  • Add: open System.Runtime.Serialization
  • Change the result type to:

[<DataContract>]
type Result = {
    [<field: DataMember(Name="Gender")>] Gender: string
    [<field: DataMember(Name="Frequency")>] Frequency: int
    [<field: DataMember(Name="Percentage")>] Percentage: float
}

In run.fsx:

  • Add: #r "System.Runtime.Serialization"
  • Change the way we create the response to:

match statistics with
| Some y -> req.CreateResponse(HttpStatusCode.OK, y)
| None -> req.CreateResponse(HttpStatusCode.BadRequest, "We haven't found the name")

Dependencies

Our last step is to specify the dependencies of the function. In our case, we just need the FSharp.Data dependency. Open the project.json file and add "FSharp.Data": "2.3.2" inside the dependencies node.
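
As a sketch, assuming the standard Azure Functions project.json layout (which targets net46), the whole file would end up looking something like this:

{
    "frameworks": {
        "net46": {
            "dependencies": {
                "FSharp.Data": "2.3.2"
            }
        }
    }
}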

Push the changes and… voilà!!

Result

Summary

In this article, we’ve seen an introduction to Azure Functions using F#. We’ve developed a toy project to learn the basics. But we can’t develop a serious project in this way; we need things like testing and a better deployment story. Hopefully, we’ll see that in future posts.

When you’re not sure about how to do something, you generally do a checklist of steps to follow. The first day I went to my current client, I had a list of steps: buy the train ticket, go to Kings Cross, collect the train ticket, take the train, take a cab, ask for Mr X, etc. I’m not using that list anymore because I now know how to do it.

It’s a common practice in agile teams to have a definition of ready and a definition of done. The definition of ready defines what a story has to have defined in order to be taken into the sprint. Typical things are acceptance criteria, business value and risk. The definition of done defines what the team considers a story has to look like to be considered done. Typical things are unit tests, demoed to the PO, deployed to Pre (or ideally to Prod), reviewed. You can even customise a Jira template to include the checklist! Amazing! (NO)

The agile literature suggests that your goal should be to evolve these checklists as the team gets comfortable with the existing one, including more things and becoming more awesome. I don’t agree with that.

Your goal should be to get rid of those checklists. Your goal should be to be a team mature enough not to need to be reminded that a story, to be considered done, has to be reviewed by someone else. Your team should be mature enough to define the acceptance criteria before starting to work on a story. You don’t need a checklist for that; you just need to be professional.

You should aim to get rid of these checklists, but they are useful in the meantime.

The more mature the team is, the less established process it needs.