4 posts tagged with "serverless-adapter"

· 9 min read
Vinícius Lourenço

Two paths inside a forest!

Image by Jens Lelie on Unsplash

This feature was initially requested by ClementCornut in issue #127.

Initially, I was a little unsure about publishing both ESM and CJS, but then I started to like the idea of exposing my packages as @h4ad/serverless-adapter/adapters/aws.

You can use it by installing the new version:

npm i @h4ad/[email protected]

The version 4.0.0 was released with a bug that didn't include the package files, so I released the version 4.0.1 to fix this issue.

In the previous version, since I only exported CommonJS, you needed to import the files as /lib/adapters/aws, which is not bad, but not exactly good either. This was necessary because I can't expose every file through the default export, as that would force you to install all the frameworks supported by this library.
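To illustrate what the change means for imports, here is a before/after sketch (the adapter name comes from the library's AWS adapters; treat the exact specifiers as illustrative):

```typescript
// before (v3, CJS-only): reaching into the build output
import { ApiGatewayV2Adapter } from '@h4ad/serverless-adapter/lib/adapters/aws';

// after (v4, subpath exports): clean specifier that works from both ESM and CJS
import { ApiGatewayV2Adapter } from '@h4ad/serverless-adapter/adapters/aws';
```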

That said, I ran into some problems while adding support for dual-package publishing, and I want to share them with you, and especially with my future self, in case you ever want to add dual-package publishing support to your own modules.


I already use vitest to test my package and to build the previous versions; it works great and is especially fast at running the tests (5.82s to run 456 tests across 52 files).

But the build configuration was a bit of a nightmare:

// some initial configuration lines...
...(!isTest && {
  esbuild: {
    format: 'cjs',
    platform: 'node',
    target: 'node18',
    sourcemap: 'external',
    minifyIdentifiers: false,
  },
  build: {
    outDir: 'lib',
    emptyOutDir: true,
    sourcemap: true,
    lib: {
      entry: path.resolve(__dirname, 'src/index.ts'),
      formats: ['cjs'],
    },
    rollupOptions: {
      external: ['yeoman-generator'],
      input: glob.sync(path.resolve(__dirname, 'src/**/*.ts')),
      output: {
        preserveModules: true,
        entryFileNames: entry => {
          const { name } = entry;
          const fileName = `${name}.js`;
          return fileName;
        },
      },
    },
  },
}),
All of this configuration was necessary because I needed the build output to match exactly the same structure as src, which is what let users import files as /lib/adapters/aws.

On my first attempt, I just tried to extend this configuration. After a few hours I managed to generate a decent package output, but it was missing some details that were very painful to deal with, such as correctly emitting d.cts files.

If you want to see how it turned out, or want to try doing this with vite directly, here is the vite.config.ts.

But then I started to give up and tweeted about it.

Suggestions from Twitter

I got some incredibly helpful messages on Twitter, and I will cover the suggestions I applied to finally be able to support dual-package publishing.

Re-exporting on mjs

This was a suggestion from Matteo Collina; he also sent me the package snap, which does this re-export, and which I also saw being used in Orama. It basically works like this:

Your state:

// state.cjs
const someDate = new Date();
module.exports = { someDate };

Define/export on cjs:

// index.cjs
const state = require('./state.cjs');
module.exports.state = state;

Re-export on mjs:

// index.mjs
import state from './state.cjs';
export { state };

This way, the state-isolation problem is solved, but I would need to either:

  • manually export all these files, or
  • use a tool to automate the process

Both ways would be a bit painful to maintain, so I didn't go that route.

This approach can be fine if your library maintains state and the codebase is pure javascript.
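To see why the re-export matters, here is a small self-contained sketch (plain objects stand in for module instances; the function name is hypothetical):

```javascript
// Hypothetical sketch: each call stands in for Node instantiating a module.
// If you ship separate CJS and ESM implementations of stateful code, each
// module system gets its own copy; re-exporting shares a single copy.
function instantiateStateModule() {
  return { hits: 0 };
}

// separate implementations: two independent instances
const cjsCopy = instantiateStateModule();
const esmCopy = instantiateStateModule();
cjsCopy.hits++;
console.log(cjsCopy.hits === esmCopy.hits); // false: state diverged

// re-export pattern: the ESM entry hands back the CJS instance
const cjsInstance = instantiateStateModule();
const esmReExport = cjsInstance; // what `export { state }` from './state.cjs' does
cjsInstance.hits++;
console.log(esmReExport.hits === cjsInstance.hits); // true: one shared instance
```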


tsup

Instead of going through this configuration hell in vite, Michele Riva suggested tsup, which is incredibly easy to use.

It took me less than 10 minutes to generate almost the same output as the previous vite configuration, but this time the output was correct, with d.cts files being generated.

My configuration file now looks like this:

import { defineConfig } from 'tsup';

export default defineConfig({
  outDir: './lib',
  clean: true,
  dts: true,
  format: ['esm', 'cjs'],
  outExtension: ({ format }) => ({
    js: format === 'cjs' ? '.cjs' : '.mjs',
  }),
  cjsInterop: true,
  // libEntries is basically all the entries I need to export,
  // like: adapters/aws, frameworks/fastify, etc...
  entry: ['src/index.ts', ...libEntries],
  sourcemap: true,
  skipNodeModulesBundle: true,
  minify: true,
  target: 'es2022',
  tsconfig: './',
  keepNames: true,
  bundle: true,
});

package.json exports

Since I have a lot of things to export, I also automated the exports configuration in package.json with:

// I do the same for adapters, frameworks and handlers
const resolvers = ['aws-context', 'callback', 'dummy', 'promise'];

const libEntries = resolvers.map(
  resolver => `src/resolvers/${resolver}/index.ts`,
);

const createExport = (filePath: string) => ({
  import: {
    types: `./lib/${filePath}.d.ts`,
    default: `./lib/${filePath}.mjs`,
  },
  require: {
    types: `./lib/${filePath}.d.cts`,
    default: `./lib/${filePath}.cjs`,
  },
});

const createExportReducer =
  (initialPath: string) => (acc: Record<string, object>, name: string) => {
    acc[`./${initialPath}/${name}`] = createExport(`${initialPath}/${name}/index`);
    acc[`./lib/${initialPath}/${name}`] = createExport(`${initialPath}/${name}/index`);
    return acc;
  };

const packageExports = {
  '.': createExport('index'),
  ...resolvers.reduce(createExportReducer('resolvers'), {}),
  // and I also do the same for adapters, frameworks and handlers.
};

// this command does the magic to update my package.json
execSync(`npm pkg set exports='${JSON.stringify(packageExports)}' --json`);

This works incredibly well and also keeps my package.json up to date.
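Concretely, the generated entries end up looking roughly like this in package.json (a sketch for the promise resolver; the exact file names depend on your entry paths):

```json
{
  "exports": {
    "./resolvers/promise": {
      "import": {
        "types": "./lib/resolvers/promise/index.d.ts",
        "default": "./lib/resolvers/promise/index.mjs"
      },
      "require": {
        "types": "./lib/resolvers/promise/index.d.cts",
        "default": "./lib/resolvers/promise/index.cjs"
      }
    }
  }
}
```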

Module not found on moduleResolution node

But there is one thing you should pay attention to: did you notice that I also export files with the ./lib prefix?

  "exports": {
    "./adapters/apollo-server": {...},
    "./lib/adapters/apollo-server": {...},
    ...
  }

The content of both entries is the same, but I do this to stay compatible with the node module resolution. Without this configuration, the IDE will show an import of @h4ad/serverless-adapter/adapters/apollo-server as resolved, but Node will not be able to find the file at runtime.

And the interesting part of doing this is that people who already import this package via /lib will still be able to import the code, without any modifications on their side.

Maybe I could have released this feature without it being a breaking change, but to be on the safe side, I released it as a breaking change anyway.


publint

publint is a tool I learned about in ESM Modernization Lessons, inside Early Attempts; this article was a suggestion from Luca Micieli.

With this tool, I detected several problems with my exports configuration; it will make your life a lot easier.

But there was one problem this tool didn't catch, which was pointed out by Michele Riva instead: I was exporting entries like this:

"exports": {
  "resolvers/promise": {...}
}

when I should export them with the ./ prefix:

"exports": {
  "./resolvers/promise": {...}
}

This small detail can make your whole configuration fail.


Verdaccio

Verdaccio was suggested by Abhijeet Prasad; with this tool you can run your own private registry for testing. Abhijeet uses it on the Sentry SDKs to run e2e tests.

With this tool, I was able to make sure the package was working correctly with the new dual-package publishing.
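A minimal local-registry round trip looks roughly like this (a sketch using Verdaccio's default port; these are not exact commands from the post):

```shell
# start a local registry (defaults to http://localhost:4873)
npx verdaccio &

# publish the package to the local registry instead of npmjs.com
npm publish --registry http://localhost:4873

# install it in a sample project to verify the published output
npm install @h4ad/serverless-adapter --registry http://localhost:4873
```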

Wrong moduleResolution

It took me a while to realize, but while testing the changes in a sample project, the imports still failed because TypeScript couldn't find the files.

So I remembered a suggestion from sami, who gave me some examples of projects he uses in BuilderIO/hydration-overlay, and I finally understood the difference in moduleResolution: my project was configured with node, while his was configured with bundler.

When I changed this setting, all imports started working and imports using /lib/adapters/aws started failing.

If you're like me and have no idea what this setting is about, the documentation says:

Specify the module resolution strategy:

'node16' or 'nodenext' for modern versions of Node.js. Node.js v12 and later supports both ECMAScript imports and CommonJS require, which resolve using different algorithms. These moduleResolution values, when combined with the corresponding module values, picks the right algorithm for each resolution based on whether Node.js will see an import or require in the output JavaScript code.

'node10' (previously called 'node') for Node.js versions older than v10, which only support CommonJS require. You probably won’t need to use node10 in modern code.

'bundler' for use with bundlers. Like node16 and nodenext, this mode supports package.json "imports" and "exports", but unlike the Node.js resolution modes, bundler never requires file extensions on relative paths in imports.

It makes sense why it was not working at all: the node resolution was not built to support exports, and only node16/nodenext and bundler handle it correctly.
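For reference, the change that fixed the sample project was along these lines (a minimal tsconfig sketch, not the actual file from the project):

```json
{
  "compilerOptions": {
    "module": "esnext",
    "moduleResolution": "bundler"
  }
}
```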

Doubling the package size

Something that made me a little worried was the package size: since I need to ship the code, types, and source maps twice, the package went from ~600 KB to ~1.5 MB.

I enabled minification to try to reduce the amount of code shipped, but if you use this library and don't have any kind of minification/bundling during your build, I highly recommend looking into libraries that can help reduce the size of the zip file you upload:


ESM packages were a nightmare to support some time ago, but the ecosystem is starting to solve these problems with new tools that bundle your project for you, instead of making you fight your own configuration files.

CJS is a no-brainer solution: almost no configuration and it works great, but it may not be ideal for your consumers/clients; some of them can hit issues like ClementCornut did, needing to import files with the full path: import awsPkg from "@h4ad/serverless-adapter/lib/adapters/aws/index.js";.

When I started adding this feature, I had no knowledge of how to publish dual packages; I basically went go-horse (rushing ahead with no plan) in the early hours of my implementation, and then I started learning more about how it works and how to properly configure the package.

This made me realize that dual-package publishing isn't the nightmare I initially thought; I just hadn't learned from the mistakes other people had already made, and I should have read more articles about it before I started implementing.

My sincere thanks to:

Without you, it would probably have taken me a lot longer to convince myself to go ahead and try again to add dual-package publishing support after the first failures.

· 3 min read
Vinícius Lourenço

A beautiful stream!

Image by Hendrik Cornelissen on Unsplash

It's been a long time since I wrote a post here, but I'm happy to share this new announcement.

First, are you new to this library?

First time?

Let me introduce the library first: I named it Serverless Adapter because my goal is to connect any serverless environment to any NodeJS framework.

So you can just plug in your framework, pick the correct handler for your serverless environment, choose the adapters, and then deploy your application!

What does this library support?

Currently, we support 8 NodeJS frameworks: Express, Fastify, tRPC, Apollo Server, NestJS, Deepkit, Koa and Hapi.

We also support 6 serverless environments: AWS, Azure, Google Cloud, Digital Ocean, Firebase and Huawei.

Talking about AWS, we support 10 different services, like API Gateway V1 and V2, SQS, SNS, etc., and you can combine them to use the same codebase and Lambda function to handle them all.


To understand the power of this composability, check out this article I wrote about how I went From a million invocations to a thousand with correct caching.

But okay, enough self-marketing, let's get to the main point of this article.

AWS Lambda Response Streaming

Today I'm rolling out support for AWS Lambda Streaming Response using AwsStreamHandler.

If you already use this library, just change DefaultHandler to AwsStreamHandler, and make sure you're using DummyResolver and ApiGatewayV2Adapter:

import { ServerlessAdapter } from '@h4ad/serverless-adapter';
import { AwsStreamHandler } from '@h4ad/serverless-adapter/handlers/aws';
import { DummyResolver } from '@h4ad/serverless-adapter/resolvers/dummy';
import { ApiGatewayV2Adapter } from '@h4ad/serverless-adapter/adapters/aws';
import app from './app';

export const handler = ServerlessAdapter.new(app)
  // .setHandler(new DefaultHandler())
  .setHandler(new AwsStreamHandler())
  .setResolver(new DummyResolver())
  .setAdapter(new ApiGatewayV2Adapter())
  // more options...
  // .setFramework(new ExpressFramework())
  .build();

Despite its name, ApiGatewayV2Adapter can be used to support API Gateway V2 and function URLs.


Response streaming is currently only available for Function URLs.

That's it :) Now you can use Function URLs and stream your content to the world!

Don't forget to enable the feature in your AWS Lambda function by changing the invoke mode from BUFFERED to RESPONSE_STREAM.
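If you prefer the AWS CLI, the switch is the invoke mode on the function URL configuration (the function name below is a placeholder):

```shell
aws lambda update-function-url-config \
  --function-name my-function \
  --invoke-mode RESPONSE_STREAM
```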


Well, if you're the type of person who, like me, needs to see the code working, here's a repository with several example projects using this library: serverless-adapter-examples.

Beyond HTTP Requests

Furthermore, not only can you receive HTTP requests using Function URLs, but you can combine your SQS queue and use the same codebase to process everything.

I haven't spent a lot of time testing it, but so far, any AWS service that supports this library can be hooked up to your Lambda function with RESPONSE_STREAM enabled.

The only thing you need to know is that the responses didn't always work as expected; I couldn't get SQS Partial Batch Response to work, for example.

But you can give it a try anyway; share your results with me on Twitter and I'll be happy to help if I can.


Well, I don't have much to say, but I hope you enjoy this new feature and use it to build amazing things.

I've spent the last 3 weeks trying to figure out how to make this work and I'm happy with the result.

If you're curious enough to learn more about how I implemented it, you can see this PR with all my struggles and thoughts over the weeks.

· 2 min read
Vinícius Lourenço

To the moon!

Now we have more Handlers, Frameworks and Adapters, let's see what's new.

From v2.3.2 to v2.6.0, compare the changes here.


42 commits, 6905 lines added, 601 lines deleted, that's the size of the changes since The Beginning.

I'm very proud of how things are going, I learned a lot by studying to implement these new things.

But, let's learn what's new in all these releases.

Azure and Firebase

You can now use this library to deploy your apps to Azure Functions and Firebase Functions.

More specifically, you can integrate with Http Trigger V4 on Azure and Http Events on Firebase.

These integrations just open the door of possibilities; in the future, I want to add support for more triggers in these clouds.

Check out the Azure and Firebase docs for how to integrate.

I also added examples for these clouds in the serverless-adapter-examples repository.


tRPC

tRPC allows you to easily build & consume fully typesafe APIs, without schemas or code generation.

tRPC is a framework that brings a new way of thinking about APIs: instead of REST or GraphQL, you build typesafe APIs that integrate easily with the client. It seems very promising.

So now you can deploy applications developed with tRPC to any cloud this library supports; have a look at the docs to learn more about how to use it.

That's all folks!

I have two more weeks to work on this library without worrying because I'm on vacation from university, so my next efforts will probably go into bringing more articles to this blog to show the full power of this library.

To give some spoilers for those of you who made it this far: I'll start by showing the benefits of using AWS Lambda integrated with API Gateway and SQS. I used this in a project at my company, and I managed to reduce a lot of stress on the database; we are now able to process 500k votes in minutes while staying under 15% CPU on a PostgreSQL database running on a t2.micro instance.

That's all for today, thank you!

· 2 min read
Vinícius Lourenço

Hello, welcome to my new library to help you integrate your API with the serverless world.

The development

It took me almost 5 months to build this library: refactoring was easy and testing was challenging, but documenting was the hardest part.

It took me almost 2 weeks to refactor @vendia/serverless-express, about a month and a half to create tests with 99% coverage, and the rest of the time was spent writing documentation for this library.

I currently added support for:

But it's just the beginning, I'm going to build more adapters to integrate with as much of the cloud as possible, just to be able to deploy my APIs on any cloud.

About me

I am a student at Facens university and I work for Liga, which is a sector within Facens that develops applications, websites, games and much more fun stuff.

I currently work on this library only in my spare time, and I need to balance it with my final thesis and my overtime projects, so it has been very challenging, but I am happy with the end result.


This library was originally created to help my company reduce costs with AWS SQS, but it has since turned into something I can spend my time developing, and a way to practice my English (I'm not a native speaker, as the typos might suggest) by writing all the documentation in English.


I need to thank @vendia for developing @vendia/serverless-express; all the logic and code here ended up being refactored from the code I read in serverless-express. Many thanks also to Chaguri, Liga, and the many other people who gave me their time and insights to create this library.

You can use it right now!

See the Introduction section to know more about the library.