Swagger Benchmarks

Measuring run-time overhead

Posted by Assaf Lavie on June 8, 2017

You know those benchmark posts that make you scroll to the end to get to the good stuff, and you’re like “shut up and show me the data already”?

This isn’t one of those posts.

Here is the overhead Swagger (the NPM package) adds to your Node.js request processing time for three major frameworks: Express, Restify and Connect.

benchmark bar chart


If you are familiar with Swagger, just skip to the next section.

Swagger (now actually called “OpenAPI”) lets you define REST APIs using a JSON/YAML format. Developers use it to publish their API specifications: resources, methods, parameters, error codes, etc.

It generates client and server code for most programming languages, which validates input and output for you. It also has a web-based API editor and sandbox in which API calls can be tested from the browser.

swagger editor screenshot

If you’re publishing REST APIs, this is pretty much the de facto standard for documenting them [1].
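A minimal spec for a hello-world endpoint might look something like this (an illustrative sketch in Swagger 2.0 format; the path and parameter names here are assumptions, not taken from the benchmark projects):

```yaml
# Illustrative Swagger 2.0 fragment: one GET endpoint with an
# optional query parameter and a single documented response.
swagger: "2.0"
info:
  title: Hello World
  version: "0.0.1"
paths:
  /hello:
    get:
      description: Returns a greeting to the caller
      parameters:
        - name: name
          in: query
          type: string
          required: false
      responses:
        "200":
          description: Success
          schema:
            type: string
```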

From a YAML spec like that, Swagger generates nice online documentation that looks like this:

swagger docs screenshot

Here at Binaris, we use Swagger to define the contracts between our micro-services. And we care a lot about latency, hence the benchmarking.


First, let’s give credit to Express for being the quickest on its own, without Swagger. It’s lean and mean.

Swagger introduces quite a bit of overhead. It has work to do, of course: validating all those requests doesn’t come free. The overhead appears to be nearly fixed, at about half a millisecond per request.

The bottom line: using Swagger shouldn’t affect your choice of framework, since it slows them all down by roughly the same amount.

How we tested

You can run the benchmarks yourself from the repo. Only docker-compose is required.

We used Apache Bench to run 10,000 GET operations against each project:

  • Express, Express with Swagger
  • Connect, Connect with Swagger
  • Restify, Restify with Swagger

The projects are basically the Hello, World examples that “swagger project create” generates for each framework. We then created a no-swagger (“vanilla”) version of each that exposes the same API endpoint.

The tests ran on an m4.large Amazon EC2 instance, and we averaged the results across multiple runs.

No concurrency was used with Apache Bench, since we’re measuring per-request latency rather than throughput.

We also tested Hapi and Sails, two additional frameworks supported by Swagger, but their results were so abysmally slow that we didn’t even include them in the chart above.

[1] Unless, of course, you’re a hipster who has already moved on to GraphQL, in which case this post is so 2017 and not for you.