I experiment with Drupal a lot. Enough that I created a template project a while back to get me up and running quickly. The thing is, I included the composer.json file in the repo, and that quickly got out of date. So I'd end up doing a lot of resetting of things every time I needed a new instance of Drupal.
I finally got tired enough of that that I decided to do a better job.
Thus, I created Waterdrop.
The quick summary is that I scripted out all the main bits of configuring a new project with the folder structure I use.
So, what does Waterdrop actually do?
It sets a well-defined directory structure
For me, the application is separate from the Docker configuration. I mostly write PHP applications, and they all run just fine on plain-Jane LAMP-stack servers, so I've never liked having Docker-related code directly in my application code. It especially doesn't make sense when your Docker setup requires configuring containers for a database, a key/value store, and so on. I mean, do you really want your my.cnf file for MariaDB to live in the code of an application that might actually use PostgreSQL instead?
So I have a project directory structure that reflects that separation.
- app/
  - docker-config/
    - < any needed s6-overlay files >
    - < any needed varnish related files >
  - src/ (The application code.)
- app2/ (If I need to build a different image, it gets its own directory.)
- scripts/ (Any bash helper scripts.)
- secrets/ (Where I stick secrets files for dev.)
- tmp/ (Always handy to have a git-ignored tmp dir.)
Of course the actual structure varies a bit between projects. If I'm not using Varnish, for example, I'd not have the s6 and Varnish directories in app/docker-config.
This structure also offers the option to make app/src a git submodule if I need even more separation.
Waterdrop doesn't have everything I showed above, but it follows the same logic.
It installs the drupal/recommended-project skeleton
Drupal.org recommends starting with their drupal/recommended-project skeleton. I happen to agree with them.
So the scripts/init.sh script in Waterdrop will run the Composer commands needed to install that skeleton to app/src. You can also set a specific version of the skeleton via the --drupal flag.
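As a rough sketch of what that flag handling presumably boils down to (the function name and the --no-interaction flag are my assumptions; only drupal/recommended-project, app/src, and --drupal come from Waterdrop):

```shell
#!/bin/sh
# Hypothetical sketch of the Composer call scripts/init.sh makes.
drupal_create_project_cmd() {
  version="$1"   # value passed via --drupal; may be empty
  pkg="drupal/recommended-project"
  # Pin the skeleton version only when one was requested.
  [ -n "$version" ] && pkg="$pkg:$version"
  echo "composer create-project $pkg app/src --no-interaction"
}
```

With no version it installs the latest skeleton; with `--drupal ^10` it pins the constraint onto the package name.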
It installs my most used Drupal projects
There are certain modules and projects that I use on almost every Drupal site I throw up. So, by default, I have Waterdrop install those during initialization.
You can reset the list with the --reset-extra-projects flag.
You can add your own projects with the --extra-project flag.
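Conceptually, the two flags combine like this (a minimal sketch; the default package names and the function are illustrative, not Waterdrop's actual list or code):

```shell
#!/bin/sh
# Sketch of how --reset-extra-projects and --extra-project might combine
# into one `composer require` call. Defaults shown are examples only.
build_require_cmd() {
  reset="$1"; shift   # "yes" drops the built-in defaults
  defaults="drupal/admin_toolbar drupal/pathauto"
  [ "$reset" = "yes" ] && defaults=""
  # Squeeze the double space left behind when defaults is empty.
  echo "composer require $defaults $*" | tr -s ' '
}
```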
It configures Asset Packagist
So, to be honest, I don't know why this isn't part of the recommended-project skeleton. Drupal.org includes it in their installing via Composer docs. So it is obviously needed by a lot of projects.
In any case, to make adding NPM and Bower libraries easy, Waterdrop adds the Asset Packagist configuration to app/src/composer.json by default. You can skip it with the --skip-asset-packagist flag.
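The repository entry being added is the standard one from the Drupal.org Composer documentation, alongside the packages.drupal.org entry the skeleton already has:

```json
{
    "type": "composer",
    "url": "https://asset-packagist.org"
}
```

With that in place, composer require npm-asset/... and bower-asset/... packages resolve like any other dependency.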
It configures settings.php to use your project name when looking for Docker secrets
So, long story, but using just a generic secret name in settings.php is not a good idea. You should prefix all of them with your project name. Waterdrop does that automatically.
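To illustrate the idea (the naming scheme here is my assumption, not Waterdrop's exact code; /run/secrets is where Docker mounts secrets):

```shell
#!/bin/sh
# Hypothetical illustration of prefixing Docker secret names with the
# project name so two projects on one host never collide.
secret_path() {
  project="$1"; secret="$2"
  echo "/run/secrets/${project}_${secret}"
}
```

So settings.php would read from something like /run/secrets/mysite_db_password instead of a generic /run/secrets/db_password.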
It configures dev.run.yml to use your project name
So, another long story, but generic identifiers in your Docker Compose files are not a good idea. It gets messy. So Waterdrop makes sure to prefix everything with your project name.
It configures dev.run.yml with the correct network and ports
Running docker compose up and getting a "port already in use" message is frustrating. So, rather than have all Waterdrop-based projects use the same ports, Waterdrop will set the -app and -db containers to bind to the ports passed in via the --host-port and --db-host-port flags.
I also have a habit of putting all my Docker Compose projects on their own Docker Network. Waterdrop does that via the --docker-network flag.
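Put together, the relevant slice of dev.run.yml presumably ends up looking something like this (service names, ports, and the network name are examples, not Waterdrop's literal output):

```yaml
services:
  mysite-app:
    ports:
      - "8080:80"      # from --host-port
  mysite-db:
    ports:
      - "33060:3306"   # from --db-host-port

networks:
  default:
    external: true
    name: mysite-net   # from --docker-network
```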
It creates networks, volumes, and files for secrets
I learned the hard way that docker compose down -v will wipe out data you want to keep if you use local volumes in your Compose file.
So I always use externally created volumes in my project.
Same for networks.
It also gets tedious creating new secrets files for every project.
Waterdrop does all that for you when the init script is run. Just set the --docker-network and --project-name flags.
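As a dry-run sketch of what gets created (the flags come from the post; the volume and secret naming scheme shown is my assumption):

```shell
#!/bin/sh
# Emits the docker/touch commands an init step like Waterdrop's might run.
provision_cmds() {
  project="$1"    # --project-name
  network="$2"    # --docker-network
  echo "docker network create $network"
  echo "docker volume create ${project}_db-data"
  echo "touch secrets/${project}_db_password"
}
```

Because the network and volume exist outside the Compose file, docker compose down -v can no longer take your data with it.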
It also sets some default values in the secrets/< project name >_* files.
It sets up a .env file to make building easier
Another tedious process is remembering how to build your local dev image properly. To fix that, Waterdrop sticks a few useful variables in .env, and has the scripts/build-dev.sh script that will build the dev image for you.
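For illustration, the kind of thing that lands in .env (these variable names are my guesses, not necessarily what Waterdrop writes):

```
PROJECT_NAME=mysite
DEV_IMAGE_TAG=mysite-app:dev
HOST_PORT=8080
```

With those in place, scripts/build-dev.sh can run docker build with a consistent tag every time.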
It uses multi-stage building for the Docker image
With multi-stage building, Docker (via BuildKit) can run independent build stages in parallel. When compiling multiple PHP extensions, or copying a lot of files, that can speed up the build significantly.
The tricky part is figuring out what files you need to copy from each stage into the final image.
Waterdrop uses this technique for both PHP extensions and copying Drupal code into the image.
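A hedged sketch of the shape of such a Dockerfile; the stage names, extension list, and paths here are illustrative, not Waterdrop's exact file:

```dockerfile
# Stage 1: compile extensions against the same base as the final image.
FROM php:8.1-apache AS ext-build
RUN docker-php-ext-install -j"$(nproc)" opcache pdo_mysql

# Stage 2: install the Drupal codebase with Composer.
FROM composer:2 AS app-build
COPY src/ /app/
RUN composer install --no-dev --ignore-platform-reqs --working-dir=/app

# Final image: copy only the results out of each stage.
FROM php:8.1-apache
COPY --from=ext-build /usr/local/lib/php/extensions/ /usr/local/lib/php/extensions/
COPY --from=ext-build /usr/local/etc/php/conf.d/ /usr/local/etc/php/conf.d/
COPY --from=app-build /app/ /var/www/html/
```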
Do take a look at the Dockerfile to understand exactly what is going on. Especially pay attention to how you can copy custom modules and themes into the image.
It provides examples of adding PHP extensions in the Dockerfile
A long time ago I got tired of wondering exactly how to add an extension to my PHP Docker image. So I sat down and figured out how to install the dependencies and properly add all the extensions supported by docker-php-ext-install. That work is present in the Dockerfile. So it should be as simple as un-commenting the lines for the extension you need.
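As an example of the pattern, here is what enabling one extra extension (zip) looks like on the Debian-based php images, where the system library has to be installed first:

```dockerfile
RUN apt-get update \
 && apt-get install -y --no-install-recommends libzip-dev \
 && docker-php-ext-install -j"$(nproc)" zip \
 && rm -rf /var/lib/apt/lists/*
```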
Note that I did not do this for PECL extensions.
I do have the extensions required by Drupal un-commented. Note that a lot of the requirements listed on Drupal.org are enabled by default in the PHP 8.1 image.
For the record, here's a list of added extensions:
- imagick (This is via PECL.)
It provides s6-overlay and Varnish
Another long story, but running Varnish in its own container is a pain in the rear.
It is much simpler to run it in the same container as Apache.
Which means starting multiple processes in the same container. Not something that Docker is really good at.
Which is where s6-overlay comes in. It operates as PID 1 and manages running as many processes as you need.
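To give a feel for that implementation, here is a hedged sketch of declaring Varnish as an s6-overlay v3 "longrun" service (the directory layout follows the s6-overlay docs; the varnishd flags are illustrative, and the root prefix is only there so you can try it outside an image build):

```shell
#!/bin/sh
# Writes the s6-rc.d files that tell s6-overlay to supervise varnishd.
declare_varnish_service() {
  root="$1"   # "" in a real image build; a scratch dir for experimenting
  svc="$root/etc/s6-overlay/s6-rc.d/varnish"
  mkdir -p "$svc" "$root/etc/s6-overlay/s6-rc.d/user/contents.d"
  printf 'longrun\n' > "$svc/type"
  printf '#!/command/execlineb -P\nvarnishd -F -a :80 -b localhost:8080\n' > "$svc/run"
  chmod +x "$svc/run"
  # An empty file here registers the service to start at boot.
  touch "$root/etc/s6-overlay/s6-rc.d/user/contents.d/varnish"
}
```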
The cost is a rather un-intuitive implementation, but the benefit is that it does its job well.
Hopefully, if my project structure makes sense to you, Waterdrop can help save you some time. It was a fun little project to build, and I'll keep tweaking as I use it.
If you do decide to use it for production applications, I'll just point out that Waterdrop's structure works really well on a Docker Swarm/Traefik/Portainer stack. I have multiple applications, both Drupal and not, using it.
If you have questions, feel free to comment here, or just post an issue over on GitHub.