
How to Serve Brotli and GZIP Compressed Content on GitLab Pages

Text compression reduces the size of HTTP response bodies and improves web page load times. GZIP and Brotli are both widely supported in modern web clients, and enabling text compression is a Lighthouse performance audit criterion.
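To get a feel for the savings, you can compress a small text sample locally. The file name and contents below are illustrative only, not measurements from this site:

```shell
# Write a small repetitive text sample, then gzip it.
# -k keeps the original file; -f overwrites any existing .gz.
yes 'the quick brown fox jumps over the lazy dog' | head -n 200 > sample.txt
gzip -k -f sample.txt
wc -c sample.txt sample.txt.gz   # the .gz file is a small fraction of the original
```

Repetitive text compresses especially well; real pages see smaller but still significant reductions.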

GitLab Pages doesn’t serve compressed HTML, CSS, or JavaScript files by default, but it is fast and straightforward to implement static GZIP and Brotli compression as part of your build and deployment workflow. After you generate the compressed files, the GitLab Pages servers deliver the appropriate format to each client based on its Accept-Encoding support. The approach described below should work with any static site generator. The examples are based on this site’s Hugo build workflow.

The GZIP and Brotli Compressed File Paths#

This file compression happens during the static site build workflow. Your gzip- (extension .gz) and brotli- (extension .br) compressed files must be located in the same directory as the uncompressed originals. Here’s an example site directory tree from the GitLab Pages documentation. It assumes your production distribution files are located in a top-level public directory.

├─┬ index.html
│ ├ index.html.br
│ └ index.html.gz
├── css/
│   └─┬ main.css
│     ├ main.css.br
│     └ main.css.gz
└── js/
    └─┬ main.js
      ├ main.js.br
      └ main.js.gz

Compress Files in Your GitLab Pages Build#

gzip is installed by default in the GitLab CI/CD runner environment. We’ll install brotli and compress all HTML, CSS, and JS files with a few lines in the .gitlab-ci.yml configuration file. This site is built with Hugo, and the example below runs the hugo static site build step before the file compression takes place. If you use a different static site generator, replace the hugo command with your generator’s build command(s) and compress the files afterward.

    - apk add --update brotli
    - hugo # replace with your static site generator build command if not using Hugo
    - find public -type f -regex '.*\.\(htm\|html\|txt\|text\|js\|css\)$' -exec gzip -f -k {} \;
    - find public -type f -regex '.*\.\(htm\|html\|txt\|text\|js\|css\)$' -exec brotli -f -k {} \;

The final two configuration lines above find all HTML, CSS, and JS files in the public directory by file extension, execute gzip and brotli compression, and write the compressed files with the appropriate file extensions in the same directory as the original file. Change public in these commands to your production distribution directory path if you use something else.
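You can watch the find-and-compress step work on a scratch directory. The tree below is a stand-in for a real build’s output, and only gzip is shown because brotli may not be installed locally; the brotli command behaves the same way and writes .br siblings:

```shell
# Build a tiny stand-in "public" tree, then run the same find + gzip step.
mkdir -p public/css public/js
echo '<h1>hello</h1>'  > public/index.html
echo 'body{margin:0}'  > public/css/main.css
echo 'console.log(1)'  > public/js/main.js
# Same command as in the CI configuration above (GNU find regex syntax).
find public -type f -regex '.*\.\(htm\|html\|txt\|text\|js\|css\)$' -exec gzip -f -k {} \;
find public -type f | sort   # every original now has a .gz sibling
```

Because gzip runs with -k, the originals stay in place, which is exactly the layout GitLab Pages expects.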

See my .gitlab-ci.yml build and distribution configuration file for a working example of the complete Hugo site build configuration that is used to compile and deploy this site.
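For orientation, a minimal pages job along those lines might look like the sketch below. The image tag and the rules block are assumptions for illustration, not copied from this site’s configuration:

```yaml
# Hypothetical minimal GitLab Pages job with static compression; adapt to your generator.
pages:
  image: registry.gitlab.com/pages/hugo:latest  # assumed image; use your own
  script:
    - apk add --update brotli
    - hugo # replace with your static site generator build command if not using Hugo
    - find public -type f -regex '.*\.\(htm\|html\|txt\|text\|js\|css\)$' -exec gzip -f -k {} \;
    - find public -type f -regex '.*\.\(htm\|html\|txt\|text\|js\|css\)$' -exec brotli -f -k {} \;
  artifacts:
    paths:
      - public
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```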

That’s all there is to it. Once the files are in your production environment, web clients with GZIP and Brotli support should begin to receive the compressed versions. This takes place without any further configuration from you.

Confirm Compressed Files in HTTP GET Responses with curl#

Here is a curl-based one-liner that reports the Content-Encoding header in HTTP GET responses for files on your site. Set the Accept-Encoding request header to any content-coding value to examine the response body encoding for a given asset.

The following commands send an HTTP GET request, configured with an Accept-Encoding header, for this site’s home page HTML file, pipe the response through head, and report all of the content-* response headers with a grep regular expression match. Inspect the content-encoding line for the compression format. There is no content-encoding response header when the file is served uncompressed. The content-length value is the response body size in bytes.

Uncompressed File#

# example.com is a placeholder; substitute a page URL from your site
curl -H "Accept-Encoding: *" -i https://example.com/ 2>/dev/null | head | grep -E "content-.*:"


content-type: text/html; charset=utf-8
content-length: 9279

gzip Compressed File#

curl -H "Accept-Encoding: gzip" -i https://example.com/ 2>/dev/null | head | grep -E "content-.*:"


content-encoding: gzip
content-type: text/html; charset=utf-8
content-length: 3187

brotli Compressed File#

curl -H "Accept-Encoding: br" -i https://example.com/ 2>/dev/null | head | grep -E "content-.*:"


content-encoding: br
content-type: text/html; charset=utf-8
content-length: 2638

You should be able to change the above URLs to any of your compressed file paths and test the content encoding in the response.

Hat tip to Mattias Geniar for the curl and head pipeline bits of the one-liners above in Test Gzip Compression of Site via Curl.

What about GitHub Pages?#

GitHub Pages gzip-compresses content by default; no configuration is needed. It just works. Despite user requests for Brotli compression dating back to at least 2019, GitHub Pages still does not support it. A response from one of GitHub’s Open Source Community Managers indicates that support is in the queue, but it has not launched to date.
