Run AWS Lambda with Micronaut and Graal

Sabyasachi Bhattacharya
3 min read · Aug 16, 2020


AWS Lambda functions written in Java often suffer from the well-known cold start problem. While mechanisms like periodically invoking the Lambda to keep it warm do work, it would be far better if the platform itself helped us mitigate cold starts. With the advent of GraalVM and its ability to AOT-compile a Java program into a native image, we may now have that option. Below we will see how to run a GraalVM native image on AWS Lambda.

Running a GraalVM native image on AWS Lambda is possible thanks to a relatively new Lambda feature called custom runtimes. AWS Lambda comes with out-of-the-box support for Java 11 (Amazon Corretto), Node.js, Python, and Go, and it also provides a way to run a Lambda on a custom runtime.

To create a custom runtime, Lambda expects a file named bootstrap at the root of your Lambda artifact zip. This bootstrap file contains the code that initializes the Lambda and processes events. Next we will see how to use a Micronaut function to create a custom runtime and run the function as a GraalVM native image.

The Micronaut framework ships with support for AWS Lambda functions. Micronaut 2.0 provides a class, AbstractMicronautLambdaRuntime, which acts as the custom runtime. All we need to do is extend this class and override the method createRequestHandler. An example of a custom runtime looks like this.
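A minimal sketch of such a runtime, modeled on the example in the Micronaut documentation. The Book, BookSaved, and BookRequestHandler names are illustrative placeholders, and the class assumes the micronaut-function-aws-custom-runtime dependency is on the classpath:

```java
import com.amazonaws.services.lambda.runtime.RequestHandler;
import io.micronaut.function.aws.runtime.AbstractMicronautLambdaRuntime;
import java.net.MalformedURLException;

// The four type parameters are
// <RequestType, ResponseType, HandlerRequestType, HandlerResponseType>;
// here they are the same pairs for simplicity.
public class BookLambdaRuntime
        extends AbstractMicronautLambdaRuntime<Book, BookSaved, Book, BookSaved> {

    public static void main(String[] args) {
        try {
            // Starts the custom runtime: reads the Lambda environment
            // variables and begins polling the Runtime API for events.
            new BookLambdaRuntime().run(args);
        } catch (MalformedURLException e) {
            e.printStackTrace();
        }
    }

    @Override
    protected RequestHandler<Book, BookSaved> createRequestHandler(String... args) {
        // BookRequestHandler would be your MicronautRequestHandler<Book, BookSaved>.
        return new BookRequestHandler();
    }
}
```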

The snippet above mirrors the default scaffolded Micronaut code.

How does it work? A custom runtime needs to do the following:

Initialization tasks

  • Read the _HANDLER environment variable to get the name of the handler function.
  • Read LAMBDA_TASK_ROOT to find the directory that contains the Lambda function.
  • Read AWS_LAMBDA_RUNTIME_API to get the host and port of the AWS Lambda Runtime API.
  • Load the handler file and run any initialization code it has.
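The initialization step above can be sketched as a small class that collects these variables. This is a plain-Java illustration (the RuntimeEnv name is hypothetical), not Micronaut's implementation; the variable names come from the AWS Lambda custom runtime documentation:

```java
import java.util.Map;

// Reads the environment variables that AWS Lambda provides to a custom runtime.
public class RuntimeEnv {
    final String handler;    // _HANDLER, e.g. "example.Handler::handleRequest"
    final String taskRoot;   // LAMBDA_TASK_ROOT, directory containing the function code
    final String runtimeApi; // AWS_LAMBDA_RUNTIME_API, host:port of the Runtime API

    RuntimeEnv(Map<String, String> env) {
        this.handler = env.get("_HANDLER");
        this.taskRoot = env.get("LAMBDA_TASK_ROOT");
        this.runtimeApi = env.get("AWS_LAMBDA_RUNTIME_API");
    }

    // In the real runtime these would come from System.getenv().
    static RuntimeEnv fromSystem() {
        return new RuntimeEnv(System.getenv());
    }
}
```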

Processing tasks

  • Call the next-invocation API of the AWS Lambda Runtime API, get the event, create a context, and invoke the handler function.
  • Post the response back, or report any error.
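The processing loop above can be sketched in plain Java against the Lambda Runtime API's HTTP endpoints (invocation/next, invocation/{id}/response, invocation/{id}/error). This is an illustrative skeleton, not Micronaut's code; the RuntimeLoop class and its echo handler are hypothetical:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RuntimeLoop {
    static final String API_VERSION = "2018-06-01";

    static String nextInvocationUrl(String api) {
        return "http://" + api + "/" + API_VERSION + "/runtime/invocation/next";
    }

    static String responseUrl(String api, String requestId) {
        return "http://" + api + "/" + API_VERSION + "/runtime/invocation/" + requestId + "/response";
    }

    static String errorUrl(String api, String requestId) {
        return "http://" + api + "/" + API_VERSION + "/runtime/invocation/" + requestId + "/error";
    }

    public static void main(String[] args) throws Exception {
        String api = System.getenv("AWS_LAMBDA_RUNTIME_API");
        HttpClient client = HttpClient.newHttpClient();
        while (true) {
            // Long-poll the Runtime API for the next event.
            HttpResponse<String> event = client.send(
                HttpRequest.newBuilder(URI.create(nextInvocationUrl(api))).GET().build(),
                HttpResponse.BodyHandlers.ofString());
            String requestId = event.headers()
                .firstValue("Lambda-Runtime-Aws-Request-Id").orElseThrow();
            try {
                String result = handle(event.body());
                // Post the handler's result back as the invocation response.
                client.send(HttpRequest.newBuilder(URI.create(responseUrl(api, requestId)))
                        .POST(HttpRequest.BodyPublishers.ofString(result)).build(),
                    HttpResponse.BodyHandlers.discarding());
            } catch (Exception e) {
                // Report a failed invocation to the error endpoint.
                client.send(HttpRequest.newBuilder(URI.create(errorUrl(api, requestId)))
                        .POST(HttpRequest.BodyPublishers.ofString(
                            "{\"errorMessage\":\"" + e.getMessage() + "\"}")).build(),
                    HttpResponse.BodyHandlers.discarding());
            }
        }
    }

    // Placeholder handler: echoes the event JSON back.
    static String handle(String eventJson) {
        return eventJson;
    }
}
```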

A closer look at startRuntimeApi in AbstractMicronautLambdaRuntime shows that the method does exactly the tasks mentioned above.

Deep Dive

AbstractMicronautLambdaRuntime takes four generic types:

  • RequestType — the event JSON is deserialized into this model class.
  • ResponseType — the final response from the Lambda. This may not be the same as the response of the handler function.
  • HandlerRequestType — the input to the handler function.
  • HandlerResponseType — the response from the handler function, which can be further transformed into the ResponseType above.

Now that we are all set up with our custom runtime, we need to add the bootstrap file, which may look something like this:

#!/bin/sh
set -euo pipefail
./<native-image-exec-name> -Xmx128m -Djava.library.path=$(pwd)

The Lambda runtime will find this bootstrap file at the root of the deployed function and invoke it, which in turn invokes the main class of our custom runtime defined above. That class extends AbstractMicronautLambdaRuntime and does all the heavy lifting to initialize the Lambda, trigger the handler, and send the response back.

Now all that is left is to compile the function with the native-image tool, zip the bootstrap file together with the resulting executable, and use that zip as the Lambda artifact.


Deep Dive
It is also possible to write the initialization and invocation logic for the custom runtime in the bootstrap file yourself, without using Micronaut. Using Micronaut simply spares you from writing that code. You also get Micronaut's dependency injection and ahead-of-time compilation in your Lambda out of the box, and the framework has good support for GraalVM. But it is by no means necessary to use Micronaut in order to run a native image with a custom runtime.

So we have seen how to create a custom runtime with Micronaut and GraalVM. Micronaut hides much of the complexity of creating a custom runtime behind a convenient class to extend, and thanks to its easy integration with GraalVM, native-image generation becomes simpler as well.

Hope you liked this. Happy coding!

Read more posts at http://blog.sabyasachi.io, or follow me on Twitter @sabz2301.
