Adds compression utilities to the Fastify reply object and a hook to decompress request payloads. Supports gzip, deflate, and brotli.
🛈 Note: In large-scale scenarios, use a proxy like Nginx to handle response compression.
⚠ Warning: Since @fastify/compress version 4.x, payloads compressed with the zip algorithm are not automatically uncompressed. This plugin focuses on response compression, and zip is not in the IANA Table of Content Encodings.
npm i @fastify/compress
Plugin version | Fastify version |
---|---|
^8.x | ^5.x |
^6.x | ^4.x |
^3.x | ^3.x |
^2.x | ^2.x |
^0.x | ^1.x |
Please note that if a Fastify version is out of support, then so are the corresponding versions of this plugin in the table above. See Fastify's LTS policy for more details.
This plugin adds two functionalities to Fastify: a compress utility and a global compression hook.
Currently, the following encoding tokens are supported, using the first acceptable token in this order:

br
gzip
deflate
* (no preference; @fastify/compress will use gzip)
identity (no compression)
If an unsupported encoding is received or the accept-encoding header is missing, the payload will not be compressed. To return an error for unsupported encoding, use the onUnsupportedEncoding option.

The plugin compresses payloads based on content-type. If absent, it assumes application/json.
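As an illustration, here is a minimal sketch (hypothetical route and payload, default options assumed) of how this negotiation plays out: a supported accept-encoding token results in a compressed response, while an unsupported one is passed through unchanged.

import fastify from 'fastify'

const app = fastify()
await app.register(import('@fastify/compress'))

// Payload above the default 1024-byte threshold so compression can apply
app.get('/', (req, reply) => {
  reply.type('text/plain').send('hello '.repeat(500))
})

const gzipped = await app.inject({
  method: 'GET',
  url: '/',
  headers: { 'accept-encoding': 'gzip' }
})
console.log(gzipped.headers['content-encoding']) // 'gzip'

const passthrough = await app.inject({
  method: 'GET',
  url: '/',
  headers: { 'accept-encoding': 'unsupported-token' }
})
console.log(passthrough.headers['content-encoding']) // undefined, not compressed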
The global compression hook is enabled by default. To disable it, pass { global: false }:
await fastify.register(
import('@fastify/compress'),
{ global: false }
)
Fastify encapsulation can be used to set global compression but run it only in a subset of routes by wrapping them inside a plugin.
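As a sketch (route names hypothetical), registering the plugin inside an encapsulated child plugin limits the global hook to the routes defined in that scope:

import fastify from 'fastify'

const app = fastify()

// Outside the encapsulated scope: not affected by the compression hook
app.get('/uncompressed', (req, reply) => {
  reply.send({ hello: 'world' })
})

await app.register(async (instance) => {
  await instance.register(import('@fastify/compress'), { global: true })

  // Inside the encapsulated scope: compressed when the client accepts it
  // and the payload is above the threshold
  instance.get('/compressed', (req, reply) => {
    reply.send({ hello: 'world' })
  })
})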
🛈 Note: If using the @fastify/compress plugin together with the @fastify/static plugin, @fastify/compress must be registered (with global hook) before registering @fastify/static.
Different compression options can be specified per route using the compress options in the route's configuration. Setting compress: false on any route will disable compression on the route even if global compression is enabled.
await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

// Only compress if the payload is above a certain size and use brotli
fastify.get('/custom-route', {
  compress: {
    inflateIfDeflated: true,
    threshold: 128,
    zlib: {
      createBrotliCompress: () => createYourCustomBrotliCompress(),
      createGzip: () => createYourCustomGzip(),
      createDeflate: () => createYourCustomDeflate()
    }
  }
}, (req, reply) => {
  // ...
})
This plugin adds a compress method to reply that compresses a stream or string based on the accept-encoding header. If a JS object is passed, it will be stringified to JSON. The compress method uses per-route parameters if configured, otherwise it uses global parameters.
import fs from 'node:fs'
import fastify from 'fastify'

const app = fastify()
await app.register(import('@fastify/compress'), { global: false })

app.get('/', (req, reply) => {
  reply
    .type('text/plain')
    .compress(fs.createReadStream('./package.json'))
})

await app.listen({ port: 3000 })
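Continuing the example above, a plain JS object can also be passed to reply.compress(); a minimal, hypothetical route:

app.get('/json', (req, reply) => {
  // The object is stringified to JSON before compression is applied
  reply.compress({ hello: 'world' })
})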
The threshold option sets the minimum byte size for response compression. Defaults to 1024.
await fastify.register(
import('@fastify/compress'),
{ threshold: 2048 }
)
mime-db is used to determine if a content-type should be compressed. Additional content types can be compressed via the customTypes option, using a regular expression or a function.
await fastify.register(
import('@fastify/compress'),
{ customTypes: /x-protobuf$/ }
)
or
await fastify.register(
import('@fastify/compress'),
{ customTypes: contentType => contentType.endsWith('x-protobuf') }
)
Set onUnsupportedEncoding(encoding, request, reply) to send a custom error response for unsupported encodings. The function can modify the reply and return a string | Buffer | Stream | Error payload.
await fastify.register(
  import('@fastify/compress'),
  {
    onUnsupportedEncoding: (encoding, request, reply) => {
      reply.code(406)
      return 'We do not support the ' + encoding + ' encoding.'
    }
  }
)
Response compression can be disabled by an x-no-compression header in the request.
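A minimal sketch (assuming an instance with the plugin registered globally and a / route) of opting a single request out of compression:

const res = await fastify.inject({
  method: 'GET',
  url: '/',
  headers: {
    'accept-encoding': 'gzip',
    'x-no-compression': 'true'
  }
})
// Sent uncompressed even though gzip was accepted
console.log(res.headers['content-encoding']) // undefined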
The inflateIfDeflated option enables an optional feature to inflate pre-compressed data if the client does not include one of the supported compression types in its accept-encoding header.
await fastify.register(
  import('@fastify/compress'),
  { inflateIfDeflated: true }
)

fastify.get('/file', (req, reply) =>
  // will inflate the file on the way out for clients
  // that indicate they do not support compression
  reply.send(fs.createReadStream('./file.gz')))
By default, @fastify/compress prioritizes compression as described above. Change this by passing an array of compression tokens to the encodings option:
await fastify.register(
import('@fastify/compress'),
// Only support gzip and deflate, and prefer deflate to gzip
{ encodings: ['deflate', 'gzip'] }
)
Compression can be tuned with brotliOptions and zlibOptions, which are passed directly to native Node.js zlib methods. See the class definitions.
import zlib from 'node:zlib'
import fastifyCompress from '@fastify/compress'

server.register(fastifyCompress, {
  brotliOptions: {
    params: {
      [zlib.constants.BROTLI_PARAM_MODE]: zlib.constants.BROTLI_MODE_TEXT, // useful for APIs that primarily return text
      [zlib.constants.BROTLI_PARAM_QUALITY]: 4, // default is 4, max is 11, min is 0
    },
  },
  zlibOptions: {
    level: 6, // default is typically 6, max is 9, min is 0
  }
});
By default, @fastify/compress removes the reply Content-Length header. Change this by setting removeContentLengthHeader to false, either globally or per route.
// Global plugin scope
await server.register(fastifyCompress, { global: true, removeContentLengthHeader: false });

// Route-specific scope
fastify.get('/file', {
  compress: { removeContentLengthHeader: false }
}, (req, reply) =>
  reply.compress(fs.createReadStream('./file.gz'))
)
This plugin adds a preParsing hook to decompress the request payload based on the content-encoding request header.
Currently, the following encoding tokens are supported:

br
gzip
deflate

If an unsupported encoding or invalid payload is received, the plugin throws an error. If the request header is missing, the plugin yields to the next hook.
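A minimal sketch (hypothetical route, default options) of a client sending a gzip-compressed JSON body that the handler receives already decompressed and parsed:

import zlib from 'node:zlib'
import fastify from 'fastify'

const app = fastify()
await app.register(import('@fastify/compress'))

app.post('/echo', (req, reply) => {
  // req.body is the decompressed, parsed JSON payload
  reply.send(req.body)
})

const res = await app.inject({
  method: 'POST',
  url: '/echo',
  payload: zlib.gzipSync(JSON.stringify({ hello: 'world' })),
  headers: {
    'content-type': 'application/json',
    'content-encoding': 'gzip'
  }
})
console.log(res.json()) // { hello: 'world' }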
The global request decompression hook is enabled by default. To disable it, pass { global: false }:
await fastify.register(
import('@fastify/compress'),
{ global: false }
)
Fastify encapsulation can be used to set global decompression but run it only in a subset of routes by wrapping them inside a plugin. Specify different decompression options per route using the decompress options in the route's configuration.
await fastify.register(
  import('@fastify/compress'),
  { global: false }
)

// Always decompress using gzip
fastify.get('/custom-route', {
  decompress: {
    forceRequestEncoding: 'gzip',
    zlib: {
      createBrotliDecompress: () => createYourCustomBrotliDecompress(),
      createGunzip: () => createYourCustomGunzip(),
      createInflate: () => createYourCustomInflate()
    }
  }
}, (req, reply) => {
  // ...
})
By default, @fastify/compress accepts all of the encodings listed above. Change this by passing an array of compression tokens to the requestEncodings option:
await fastify.register(
import('@fastify/compress'),
// Only support gzip
{ requestEncodings: ['gzip'] }
)
By default, @fastify/compress chooses the decompression algorithm based on the content-encoding header. One algorithm can be forced, and the header ignored, by providing the forceRequestEncoding option.
If the request payload is not compressed, @fastify/compress will try to decompress it anyway, resulting in an error.
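For example, a sketch of forcing gzip decompression for every request body at registration time:

await fastify.register(
  import('@fastify/compress'),
  { forceRequestEncoding: 'gzip' }
)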
The response error can be customized for unsupported request payload encodings by setting onUnsupportedRequestEncoding(request, encoding) to a function that returns an error.
await fastify.register(
  import('@fastify/compress'),
  {
    onUnsupportedRequestEncoding: (request, encoding) => {
      return {
        statusCode: 415,
        code: 'UNSUPPORTED',
        error: 'Unsupported Media Type',
        message: 'We do not support the ' + encoding + ' encoding.'
      }
    }
  }
)
The response error can be customized for undetectable request payloads by setting onInvalidRequestPayload(request, encoding, error) to a function that returns an error.
await fastify.register(
  import('@fastify/compress'),
  {
    onInvalidRequestPayload: (request, encoding, error) => {
      return {
        statusCode: 400,
        code: 'BAD_REQUEST',
        error: 'Bad Request',
        message: 'This is not a valid ' + encoding + ' encoded payload: ' + error.message
      }
    }
  }
)
Licensed under MIT.