    This page will show you how to set up basic pipelines for Deno projects in GitHub Actions. The concepts explained on this page largely apply to other CI providers as well, such as Azure Pipelines, CircleCI or GitLab.

    Building a pipeline for Deno generally starts with checking out the repository and installing Deno:
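    A minimal job doing just that could look like the sketch below. The action versions and the use of the `denoland/setup-deno` action are assumptions on my part; pin whichever versions your project standardizes on.

    ```yaml
    name: ci

    on: [push, pull_request]

    jobs:
      build:
        runs-on: ubuntu-20.04
        steps:
          # Clone the repository into the runner.
          - uses: actions/checkout@v3
          # Install the Deno CLI; v1.x selects the latest stable 1.x release.
          - uses: denoland/setup-deno@v1
            with:
              deno-version: v1.x
    ```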

    To expand the workflow, just add any of the `deno` subcommands that you might need:

    ```yaml
    # Check if the code is formatted according to Deno's default
    # formatting conventions.
    - run: deno fmt --check

    # Scan the code for syntax errors and style issues. If
    # you want to use a custom linter configuration you can add
    # a configuration file with --config <myconfig>
    - run: deno lint

    # Run all test files in the repository and collect code coverage. The example
    # runs with all permissions, but it is recommended to run with the minimal
    # permissions your program needs (for example --allow-read).
    - run: deno test --allow-all --coverage=cov/

    # This generates a report from the collected coverage in `deno test --coverage`.
    # It is stored as a .lcov file which integrates well with services such as
    # Codecov, Coveralls and Travis CI.
    - run: deno coverage --lcov cov/ > cov.lcov
    ```

    As a Deno module maintainer, you probably want to know that your code works on all of the major operating systems in use today: Linux, macOS and Windows. A cross-platform workflow can be achieved by running a matrix of parallel jobs, each one running the build on a different OS:

    ```yaml
    jobs:
      build:
        runs-on: ${{ matrix.os }}
        strategy:
          matrix:
            os: [ ubuntu-20.04, macos-11, windows-2019 ]
        steps:
          - run: deno test --allow-all --coverage=cov/
    ```

    If you are working with experimental or unstable Deno APIs, you can include a matrix job running the canary version of Deno. This can help to spot breaking changes early on:

    ```yaml
    jobs:
      build:
        runs-on: ${{ matrix.os }}
        continue-on-error: ${{ matrix.canary }} # Continue in case the canary run does not succeed
        strategy:
          matrix:
            os: [ ubuntu-20.04, macos-11, windows-2019 ]
            deno-version: [ v1.x ]
            canary: [ false ]
            include:
              - deno-version: canary
                os: ubuntu-20.04
                canary: true
    ```

    Reducing repetition

    In cross-platform runs, certain steps of a pipeline do not necessarily need to run for each OS. For example, generating the same test coverage report on Linux, macOS and Windows is redundant. You can use the `if` conditional keyword of GitHub Actions in these cases. The example below runs the coverage generation and upload steps only on the Ubuntu (Linux) runner:

    ```yaml
    - name: Generate coverage report
      if: matrix.os == 'ubuntu-20.04'
      run: deno coverage --lcov cov > cov.lcov
    - name: Upload coverage to Coveralls.io
      if: matrix.os == 'ubuntu-20.04'
      # Any code coverage service can be used, Coveralls.io is used here as an example.
      uses: coverallsapp/github-action@master
      with:
        github-token: ${{ secrets.GITHUB_TOKEN }} # Generated by GitHub.
        path-to-lcov: cov.lcov
    ```

    Caching dependencies

    As a project grows in size, more and more dependencies tend to be included. Deno will download these dependencies during testing and if a workflow is run many times a day, this can become a time-consuming process. A common solution to speed things up is to cache dependencies so that they do not need to be downloaded anew.

    Deno stores dependencies locally in a cache directory. In a pipeline the cache can be preserved between workflows by setting the DENO_DIR environment variable and adding a caching step to the workflow:
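    One way to do this is with the `actions/cache` action. This is a sketch: the directory name and cache key below are placeholders, not required values.

    ```yaml
    env:
      # Tell Deno to store its cache in a known location inside the workspace.
      DENO_DIR: my_cache_directory

    steps:
      - name: Cache Deno dependencies
        uses: actions/cache@v3
        with:
          path: ${{ env.DENO_DIR }}
          key: my_cache_key
    ```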

    The first time this workflow runs, the cache is still empty and commands like `deno test` will still have to download dependencies, but when the job succeeds the contents of DENO_DIR are saved, and any subsequent runs can restore them from the cache instead of re-downloading.

    To make sure the cache is refreshed whenever your dependencies change, you can base the cache key on a hash of the project's lockfile:

    ```yaml
    key: ${{ hashFiles('lock.json') }}
    ```

    To make this work, you will also need to have a lockfile in your Deno project, which is discussed in detail elsewhere in the documentation. Now, if the contents of lock.json change, a new cache will be created and used in subsequent pipeline runs.

    To demonstrate, let’s say you have a project that uses the logger from deno.land/std:
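    As a sketch, the project's entry point might contain an import like this (the file name, module path and version pin are all illustrative):

    ```typescript
    // main.ts (hypothetical entry point)
    import * as log from "https://deno.land/std@0.150.0/log/mod.ts";

    log.info("Hello from CI");
    ```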

      In order to increment this version, you can update the import statement and then reload the cache and update the lockfile locally:
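      With Deno 1.x, one way to do this (assuming an entry point named `main.ts` and a lockfile named `lock.json`; both names are placeholders) is:

      ```shell
      # Re-download dependencies and rewrite the lockfile with the new hashes.
      deno cache --reload --lock=lock.json --lock-write main.ts
      ```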

      You should see changes in the lockfile’s contents after running this. When this is committed and run through the pipeline, you should then see the hashFiles function saving a new cache and using it in any runs that follow.

      Clearing the cache

      Occasionally you may run into a cache that has been corrupted or malformed, which can happen for various reasons. There is no option in the GitHub Actions UI to clear a cache yet, but you can force a new cache simply by changing the name of the cache key. A practical way of doing so without having to forcefully change your lockfile is to add a variable to the cache key name. It can be stored as a GitHub secret and changed whenever a new cache is needed:

      ```yaml
      key: ${{ secrets.CACHE_VERSION }}-${{ hashFiles('lock.json') }}
      ```