
Colin Webb

AWS Lambda with Rust

AWS Lambda has supported custom runtimes using Docker images for a long time now. Any language that can run in Docker can run in AWS Lambda. Therefore, we can write AWS Lambda functions in Rust - a fun and fast language.

This quick guide gets you deploying a Rust Lambda using the AWS CLI.

The Plan

We'll be using the aws-cli in this blog post. The concepts and basic building blocks are the same if you opt to use Terraform or CloudFormation.

So, what is the plan?

  • Write a Lambda function and a Dockerfile
  • Create an ECR repository (to hold the Docker image)
  • Upload a Docker image
  • Configure the lambda function & IAM roles/permissions
  • Invoke it
  • Profit!

Well, maybe not profit. But you'll at least have a Rust Lambda running in AWS.

Write a Lambda

Firstly, we need an AWS Lambda function to use.

A very simple project with a Cargo.toml and src/main.rs will do.

[package]
name = "hello_world"
version = "0.1.0"
edition = "2021"

[dependencies]
lambda_http = "0.7.1"
tokio = { version = "1", features = ["macros"] }

This is a simple Lambda function that returns an HTTP 200 response with the body "Hello world". I've chosen HTTP rather than something simpler, since I've spent a lot of time writing HTTP Lambdas in Rust - and it is slightly more exciting than a barebones example.

use lambda_http::{run, service_fn, Body, Error, Request, Response};

async fn function_handler(_event: Request) -> Result<Response<Body>, Error> {
    let resp = Response::builder()
        .status(200)
        .header("content-type", "text/html")
        .body("Hello world".into())
        .map_err(Box::new)?;
    Ok(resp)
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(function_handler)).await
}

Next, we need a Dockerfile to construct the Docker image that we'll later upload to ECR.

We're using the Docker builder pattern - also known, in the docs, as a multi-stage build. The pattern builds the Lambda function in one container, and then copies the binary into a second container. This means that the second container can be as small as possible since it doesn't contain the build toolchain.

# Build stage: compile the Lambda binary with the full Rust toolchain.
FROM rust:latest as build
WORKDIR /usr/src/lambda
COPY . .
RUN cargo build --release

# Runtime stage: a minimal distroless image containing only the compiled binary.
FROM gcr.io/distroless/cc-debian10
# The log level can be supplied at build time, e.g. `docker build --build-arg log_level=debug ...`
ARG log_level=info
ENV RUST_LOG=${log_level}
COPY --from=build /usr/src/lambda/target/release/hello_world /asset-output/bootstrap
ENTRYPOINT [ "/asset-output/bootstrap" ]

To build the image:

docker build -t $AWS_ACCOUNT_NUMBER.dkr.ecr.eu-west-2.amazonaws.com/lambda-blog-post .

Then we can start creating our AWS infrastructure, beginning with the ECR repository.

aws ecr create-repository --repository-name lambda-blog-post

In order to push our Docker image to ECR, we need to authenticate with the ECR service first.

aws ecr get-login-password | docker login --username AWS --password-stdin $AWS_ACCOUNT_NUMBER.dkr.ecr.eu-west-2.amazonaws.com

docker push $AWS_ACCOUNT_NUMBER.dkr.ecr.eu-west-2.amazonaws.com/lambda-blog-post:latest

We now have a Docker image in ECR, and can create a Lambda function that uses it.

Lambda Function and IAM

Firstly, a simple IAM role that lets the Lambda service assume it, with the basic execution policy attached.

# Create an IAM Role lambda-basic-execution.
aws iam create-role \
--role-name lambda-basic-execution \
--assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
        {
        "Effect": "Allow",
        "Principal": {
            "Service": "lambda.amazonaws.com"
        },
        "Action": "sts:AssumeRole"
        }
    ]
}'

# Attach AWSLambdaBasicExecutionRole to the role.
aws iam attach-role-policy \
--role-name lambda-basic-execution \
--policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

Then create the function:

aws lambda create-function \
--function-name rust_lambda \
--package-type Image \
--code ImageUri="$AWS_ACCOUNT_NUMBER".dkr.ecr.eu-west-2.amazonaws.com/lambda-blog-post:latest \
--role arn:aws:iam::"$AWS_ACCOUNT_NUMBER":role/lambda-basic-execution \
--architectures arm64

Note here that I'm developing and writing on a MacBook with an ARM processor, so I've set the architectures flag (it defaults to x86_64). If you develop on a different architecture from your deployment target, you'll need to play around with cross-compilation and/or switch to building with musl, a libc replacement that you can link statically (and therefore produces a more portable binary).
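
If you go down that route, a rough sketch (not part of the original post) of cross-compiling for an x86_64 Lambda from an ARM machine might look like the commands below. They assume a suitable linker for the musl target is installed on the host, or that you swap cargo for a wrapper like the cross tool.

# Assumed setup: a linker for the x86_64 musl target is available on the host.
rustup target add x86_64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-musl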

Testing it works

At this point we should have everything set up and deployed. We can test it by invoking the Lambda function.

This should output the invocation result to stdout and the Lambda's response to output.json:

# aws lambda invoke --function-name=rust_lambda output.json
{
    "StatusCode": 200,
    "ExecutedVersion": "$LATEST"
}

# cat output.json | jq .
{
  "statusCode": 200,
  "headers": {
    "content-type": "text/html"
  },
  "multiValueHeaders": {
    "content-type": [
      "text/html"
    ]
  },
  "body": "Hello world",
  "isBase64Encoded": false
}

Future Work

We've constructed a simple Rust Lambda function and deployed it to AWS. This is just the first step, and there's lots more that could be done to extend it.

For example, we could add additional Lambda functions to the same project - perhaps using Cargo's workspaces feature to build multiple binaries and deploy them separately. Or we could extend our current Lambda to handle multiple types of invocation or arguments.
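
To illustrate that second idea, here's a rough sketch (not from the original project - the routes and handler shape are made up) of routing on the request method and path inside a single handler:

use lambda_http::{Body, Error, Request, Response};

async fn function_handler(event: Request) -> Result<Response<Body>, Error> {
    // Route on the HTTP method and path of the incoming request.
    let (status, body): (u16, &str) = match (event.method().as_str(), event.uri().path()) {
        ("GET", "/hello") => (200, "Hello world"),
        _ => (404, "Not found"),
    };

    let resp = Response::builder()
        .status(status)
        .header("content-type", "text/html")
        .body(body.into())
        .map_err(Box::new)?;
    Ok(resp)
}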

One thing I would recommend is switching away from the aws-cli to a different deployment tool once you have a more complex setup. Terraform, AWS SAM, CloudFormation, and the like all work well, and help set up a structure that reduces complexity.

Another point, and one close to my heart, is how to test our Lambda functions. If we start extending the Lambda to use other tools and systems, we'll need to prevent regressions and unit-test our logic. As this is an introductory blog post, I won't go into too much detail here, but passing a fixture (such as a database client) from the main function into the handler is a good start. This allows us to mock out the database client in our tests and test the handler in isolation.

For example:

// Construct the shared client once in main and capture it in the handler closure.
let db = database::Database::new(config);
run(service_fn(|event: Request| function_handler(&db, event))).await
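
Building on that, here's a hedged sketch of the pattern - the Greeter trait and FakeGreeter struct are made-up stand-ins for something like the database client - showing a handler that depends on a trait, plus a unit test that supplies a fake implementation (main omitted for brevity):

use lambda_http::{Body, Error, Request, Response};

// Hypothetical trait standing in for an external dependency such as a database client.
trait Greeter {
    fn greeting(&self) -> String;
}

async fn function_handler(greeter: &impl Greeter, _event: Request) -> Result<Response<Body>, Error> {
    let resp = Response::builder()
        .status(200)
        .header("content-type", "text/html")
        .body(greeter.greeting().into())
        .map_err(Box::new)?;
    Ok(resp)
}

#[cfg(test)]
mod tests {
    use super::*;

    // A fake implementation used only by the test - no real database required.
    struct FakeGreeter;

    impl Greeter for FakeGreeter {
        fn greeting(&self) -> String {
            "Hello from the test".to_string()
        }
    }

    #[tokio::test]
    async fn handler_returns_200_with_fake_dependency() {
        let resp = function_handler(&FakeGreeter, Request::default()).await.unwrap();
        assert_eq!(resp.status(), 200);
    }
}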

Finally, since we are using Rust, we could remove Docker and build portable binaries that can be deployed directly instead. I hinted at this earlier, with the mention of cross-compilation and musl, and there are some good guides out there. I know many people prefer binaries over containers, and Rust is a great language for this.

Thanks for reading, and I hope you have fun experimenting with Rust and AWS Lambda!