Published on Jul 19, 2022
AWS Lambda is one of my favorite AWS services. "Serverless" may not sound as cool or fancy as it did several years ago, but it has no doubt become one of the most essential services for many businesses today. Below, I'll present some of the ways I use Lambda in my daily work.
A "job scheduler" lambda is particularly useful for tasks running on defined schedules. For example, my company needs to generate some sort of report every day at a particular time.
The diagram above illustrates the idea. We create a lambda and attach a CloudWatch Events trigger to it. Whenever the event fires, it triggers the lambda, which then sends a message to a queue. A service subscribes to the queue and consumes the message. This way, we let AWS be the "restless whistler" that reminds our system to generate the report on time every day.
A "message consumer" is a lambda which, well, consumes messages. We often call such service a "worker" and it usually deals with background tasks asynchronously. While there are many benefits to run a worker as a serious container maintained by yourself or DevOps team, using a lambda sometimes can significantly reduce the the cost and maintenance effort. And it's very easy to implement it in terms of infrastructure:
You do need to be aware that Lambda has its own inconveniences and limitations. For example, to connect to a database in private subnets, a lambda needs to be attached to the VPC. There are pros and cons to running a lambda in a VPC, and you should weigh the trade-offs before converting all your worker services to lambdas.
This AWS article teaches you some essentials you need to know when using lambda with SQS.
This is rather obvious, as a Step Function more or less involves at least one lambda. We use Step Functions to orchestrate some long-running business workflows, and each of our state machines usually contains 5 to 20 lambdas. Step Functions is big enough to be a topic of its own, so I won't talk more about it here.
Our business requires the files uploaded to S3 to be virus-scanned before they can be downloaded by our users. If any file is detected to carry a virus, it cannot be touched, and any attempt to fetch it should result in a "Forbidden" error.
I implemented a solution following this article. First, we create a container image lambda that leverages ClamAV to scan files for viruses.
Then we hook the `s3:ObjectCreated` event up with the lambda, so that whenever a new file is uploaded to the S3 bucket, the lambda will be triggered to scan the file. If the result is bad, the lambda will mark the file by tagging it with `INFECTED`. We can also attach a policy to the S3 bucket to forbid the downloading of any file tagged with `INFECTED`. The infrastructure is simple and it looks like below.
You can also associate a CloudFront distribution with a Lambda@Edge function. In our usage, we need the lambda to serve as a "smart" router that forwards requests such as `@my-packages/my-pkg1/^1.2.3/main.js` to the latest compatible version, i.e. a version at or above `1.2.3` but below `2.0.0`.
We let the lambda run upon every viewer request arriving at CloudFront, and then we mutate its `uri`. For example, the code below rewrites all versions to the `latest` version:
```javascript
request.uri = request.uri.replace(
  /\/@my-packages\/([A-Za-z\d-]+)\/([A-Za-z\d.^~%]+|latest)\//,
  '/@my-packages/$1/latest/'
)
```
The code above routes requests such as `https://cdn.mydomain.com/@my-packages/pkg1/^2.3.1/main.js` to `https://cdn.mydomain.com/@my-packages/pkg1/latest/main.js`. It's fast and easy to build this, thanks to the AWS SAM CLI.
Of course, you can also choose the Serverless Framework, or even SST if you fancy AWS CDK.
This is a rather particular usage, for when you want a monorepo with AWS CodePipeline. We want to maintain a bunch of NPM packages in one Git repository and build/publish each of them only when the files it depends on change.
If you're interested in such a usage, take a look at this article, which will give you the general idea. In our case, we only want one repository and one pipeline, so our lambda is also responsible for dynamically generating a build script that contains only the packages that actually changed.
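A sketch of the change-detection part of such a lambda; the `packages/<name>/` layout and the npm-workspace build command are assumptions about the repository, not something CodePipeline prescribes:

```javascript
// Map a list of changed file paths to the set of affected package names.
function changedPackages(changedFiles) {
  const pkgs = new Set();
  for (const file of changedFiles) {
    const m = file.match(/^packages\/([^/]+)\//); // assumes a packages/<name>/ layout
    if (m) pkgs.add(m[1]);
  }
  return [...pkgs];
}

// Generate a build script covering only the changed packages.
function buildScript(pkgs) {
  return pkgs.map((p) => `npm run build --workspace=packages/${p}`).join('\n');
}
```

The lambda would feed the git diff of the triggering commit into `changedPackages` and hand the generated script to the build stage.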
This was a very quick overview of some of the AWS Lambda usages I've found practical in my work, and I hope it gives you some ideas of how to leverage the power of "being serverless".
© 2022 disasterdev.net. All rights reserved