Step 7: Setting up a Database

A comment is best described by a fixed data structure: an author, their email, the text of the feedback, and an optional photo. This is the kind of data that is best stored in a traditional relational database engine.

PostgreSQL is the database engine we will use.

On our local machine, we have decided to use Docker to manage services. Create a docker-compose.yaml file and add PostgreSQL as a service:

docker-compose.yaml
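A minimal compose file along these lines does the job (the exact image tag is an assumption; any PostgreSQL 13 image works, and the main credentials simply match the values used later in this step):

version: '3'

services:
    database:
        # PostgreSQL 13, the same major version we will provision on SymfonyCloud
        image: postgres:13-alpine
        environment:
            POSTGRES_USER: main
            POSTGRES_PASSWORD: main
            POSTGRES_DB: main
        # expose the container port 5432 on a random local port
        ports: [5432]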

This will install a PostgreSQL server and configure some environment variables that control the database name and credentials. The values do not really matter.

We also expose the PostgreSQL port (5432) of the container to the local host. That will help us access the database from our machine.

Note

The pdo_pgsql extension should have been installed when PHP was set up in a previous step.

Starting Docker Compose

Start Docker Compose in the background (-d):

$ docker-compose up -d

Wait a bit to let the database start up and check that everything is running fine:

$ docker-compose ps
        Name                      Command               State            Ports
---------------------------------------------------------------------------------------
guestbook_database_1   docker-entrypoint.sh postgres    Up      0.0.0.0:32780->5432/tcp

If there are no running containers or if the State column does not read Up, check the Docker Compose logs:
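For instance, to inspect the output of the database service (the service name is the one defined in docker-compose.yaml):

$ docker-compose logs database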

Using the psql command-line utility might prove useful from time to time. But you need to remember the credentials and the database name. Less obviously, you also need to know which local port on the host the database is exposed on. Docker chooses a random port so that you can work on more than one project using PostgreSQL at the same time (the local port is part of the output of docker-compose ps).

If you run psql via the Symfony CLI, you don’t need to remember anything.

The Symfony CLI automatically detects the Docker services running for the project and exposes the environment variables that psql needs to connect to the database.

Thanks to these conventions, accessing the database via symfony run is much easier:

$ symfony run psql

If you don’t have the psql binary on your local host, you can also run it via docker-compose:

$ docker-compose exec database psql main

Dumping and Restoring Database Data

Use pg_dump to dump the database data:
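One possible invocation, assuming the pg_dump binary is installed locally and picks up the PG* variables exposed by symfony run (drop --data-only if you also want the schema in the dump):

$ symfony run pg_dump --data-only > dump.sql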

And restore the data:

$ symfony run psql < dump.sql

Warning

Never call docker-compose down if you don’t want to lose data. If you must, make a backup first.

For the production infrastructure on SymfonyCloud, adding a service like PostgreSQL should be done in the currently empty .symfony/services.yaml file:

.symfony/services.yaml

db:
    type: postgresql:13
    disk: 1024

The db service is a PostgreSQL database (same version as for Docker) that we want to provision on a small container with 1GB of disk.

We also need to “link” the DB to the application container, which is described in .symfony.cloud.yaml:

.symfony.cloud.yaml

relationships:
    database: "db:postgresql"

The db service of type postgresql is referenced as database on the application container.

The last step is to add the pdo_pgsql extension to the PHP runtime:

.symfony.cloud.yaml

runtime:
    extensions:
        - pdo_pgsql
        # other extensions here

Here is the full diff for the .symfony.cloud.yaml changes:

patch_file

--- a/.symfony.cloud.yaml
+++ b/.symfony.cloud.yaml
@@ -4,6 +4,7 @@ type: php:7.4
 runtime:
     extensions:
+        - pdo_pgsql
         - apcu
         - mbstring
         - sodium
@@ -21,6 +22,9 @@ build:
 disk: 512

+relationships:
+    database: "db:postgresql"
+
 web:
     locations:
         "/":

Accessing the SymfonyCloud Database

PostgreSQL is now running both locally via Docker and in production on SymfonyCloud.

As we have just seen, running symfony run psql automatically connects to the database hosted by Docker thanks to environment variables exposed by symfony run.

If you want to connect to PostgreSQL hosted on the production containers, you can open an SSH tunnel between the local machine and the SymfonyCloud infrastructure:

$ symfony tunnel:open --expose-env-vars

By default, SymfonyCloud services are not exposed as environment variables on the local machine. You must explicitly do so by using the --expose-env-vars flag. Why? Connecting to the production database is a dangerous operation. You can mess with real data. Requiring the flag is how you confirm that this is what you want to do.

Now, connect to the remote PostgreSQL database via symfony run psql as before:

$ symfony run psql

When done, don’t forget to close the tunnel:

$ symfony tunnel:close

Tip

To run some SQL queries on the production database instead of getting a shell, you can also use the symfony sql command.
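For instance, something along these lines should work while the tunnel is open (the exact argument syntax is an assumption; check symfony sql --help):

$ symfony sql 'SELECT 1'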

Docker Compose and SymfonyCloud work seamlessly with Symfony thanks to environment variables.

Check all environment variables exposed by symfony by executing symfony var:export:

$ symfony var:export
PGHOST=127.0.0.1
PGPORT=32781
PGDATABASE=main
PGUSER=main
PGPASSWORD=main
# ...

The PG* environment variables are read by the psql utility. What about the others?

When a tunnel is open to SymfonyCloud with the --expose-env-vars flag set, the var:export command returns the remote environment variables:

$ symfony tunnel:open --expose-env-vars
$ symfony var:export

Describing your Infrastructure

You might not have realized it yet, but having the infrastructure stored in files alongside the code helps a lot. Docker and SymfonyCloud use configuration files to describe the project infrastructure. When a new feature needs an additional service, the code changes and the infrastructure changes are part of the same patch.
