Docker is supposed to not only transform how you run your applications in production, but also improve your local development environment. Yet I have not seen much written about actually using Docker for development.
In Your IDE
There are now plugins for various IDEs that make use of Docker for development. IntelliJ’s Python plugin can now run the Python executable from inside a Docker container. You could always manually configure a build to run inside Docker, but this new approach is integrated deeply enough that errors are underlined right in the editor. It no longer feels like a separate process.
Likewise, Visual Studio Code is getting pretty good at running things through Docker containers, although I haven’t yet seen the same depth of integration that IntelliJ offers.
Command Line
Of course, you can always use Docker via the command line. I have found this to be the most useful way to keep environments separate. While some tools make it easy to switch between versions of Java or Node, others just don’t support that. Sometimes I need specific versions of Gradle and Java, and don’t want to bother with SDKMan.
This is exactly the type of thing Docker excels at. You just mount in your current directory, and run your commands inside the container.
docker run -it --rm --name myThing -v $(pwd):/app --entrypoint /bin/sh node:11-alpine
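The same pattern works for pinning a build toolchain. As a rough sketch, running a build with a specific Gradle and JDK pairing might look something like this (the image tag and the build task are illustrative assumptions, not from a real project):
docker run -it --rm -v $(pwd):/app -w /app gradle:5.6-jdk11 gradle build
The official Gradle images bundle a matching JDK, so nothing about the host’s Java setup matters.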
By mounting your directory into the container, you can keep making changes in your local IDE, and those changes are immediately visible inside the Docker container. Of course, you can also expose container ports to reach your running app and all that good stuff too if needed.
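For instance, a sketch of publishing a port while mounting the source might look like this (the port number and the npm start script are assumptions about the app, not a given):
docker run -it --rm -p 3000:3000 -v $(pwd):/app -w /app node:11-alpine npm start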
Those Pesky Tools
Docker also comes in handy for running those pesky tools that you need to upgrade but where the upgrade feels risky: things like Jenkins, GitLab, SonarQube, and so on. We’ve run into many cases where upgrading something in this chain wreaked havoc on our build pipeline: the new git client plugin in Jenkins has a bug, or a GitLab upgrade removes a piece of the API other tools were relying on.
We can deploy these tools in containers locally, and run through some tests to make sure everything still seems to work. We can gain confidence in an upgrade before we actually attempt it in real life. This brings our supporting tools closer in line with how we treat our production code. How many of us have integration tests checked in around our build pipeline? Yet how impactful is it if your build pipeline is broken for a couple of days?
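As a rough sketch, spinning up a throwaway Jenkins instance to poke at an upgrade might look something like this (the container name and volume name are illustrative; jenkins/jenkins:lts is the official LTS image):
docker run -d --rm --name jenkins-upgrade-test -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts
Point it at a copy of your job configuration, run a few representative builds, and throw the container away when you’re done.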