I’m running a Docker-based homelab that I manage primarily via Portainer, and I’m struggling with how to handle container updates. At first, I had all containers pulling latest, but that seemed like a bad idea since I could end up updating a container without intending to. So I circled back and pinned every container image in my docker-compose files to a specific version.
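
For reference, here’s roughly what one of my pinned services looks like now (the image name and version are just placeholders):

```yaml
services:
  nginx:
    # was: image: nginx:latest
    image: nginx:1.25.3   # pinned, so redeploying the stack won't pull a new version
    restart: unless-stopped
```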

Then I started looking into how to handle updates. I’ve heard of Watchtower, but I noticed the Linuxserver.io images all recommend not running Watchtower and instead using Diun. Looking into Diun, I learned it notifies you of updates based on the tag you’re tracking for each container, meaning it will never do anything for my containers pinned to a specific version. This made me think maybe I’ve taken the wrong approach.
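
For context, a typical Diun setup seems to look something like this (I haven’t actually run it; the schedule and settings are lifted from examples, so treat them as assumptions):

```yaml
services:
  diun:
    image: crazymax/diun:latest
    command: serve
    volumes:
      - ./diun-data:/data
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - TZ=America/Chicago
      - DIUN_WATCH_SCHEDULE=0 */6 * * *   # check for new images every 6 hours
      - DIUN_PROVIDERS_DOCKER=true        # watch containers running on this host
    restart: always
```

As far as I can tell, it watches the exact tag each container is running by default, which is why my pinned containers would never trigger a notification.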

What is the best practice here? I want to generally keep things up to date, but I don’t want to accidentally break things. My biggest fear about tracking latest is that I make some other change in a docker-compose file and update the stack, which pulls latest for all the containers in that stack and breaks some of them with unintended updates. Is this a valid concern, and if so, how can I overcome it?

  • Frederic
    12 months ago

    @RadDevon While using latest in a production environment is not considered a good idea, I’ve been using Watchtower in my homelab for years to keep running images up to date without any issue.
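
    For reference, a minimal Watchtower service can look like this (the schedule and cleanup settings are just an example, adjust as needed):

    ```yaml
    services:
      watchtower:
        image: containrrr/watchtower
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock
        environment:
          - WATCHTOWER_CLEANUP=true          # remove superseded images after updating
          - WATCHTOWER_SCHEDULE=0 0 4 * * *  # check once a night at 04:00
        restart: unless-stopped
    ```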

    Some apps also provide major version tags (e.g. Postgres), so you avoid breaking changes (as long as they adhere to semver).
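
    For example, instead of latest or an exact patch release you can pin to the major tag (tags below are only illustrative):

    ```yaml
    services:
      db:
        # follows 16.x patch releases automatically,
        # but won't jump to 17 on its own
        image: postgres:16
        restart: unless-stopped
    ```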

    • Frederic
      12 months ago

      @RadDevon You can also use tools like Renovate or Dependabot to open a pull request whenever an image referenced in your docker-compose files has a new version available (works on GitHub, GitLab, Gitea, Forgejo, etc.).

      That leaves you with running tests in your CI pipeline and setting up a deployment step afterwards.
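
      As a starting point, a Renovate config for a repo holding docker-compose files could look something like this (the preset and rules are just a sketch, not a drop-in recommendation):

      ```json
      {
        "$schema": "https://docs.renovatebot.com/renovate-schema.json",
        "extends": ["config:recommended"],
        "packageRules": [
          {
            "matchDatasources": ["docker"],
            "matchUpdateTypes": ["minor", "patch"],
            "automerge": true
          }
        ]
      }
      ```

      Renovate should pick up docker-compose.yml files out of the box; the rule above auto-merges minor and patch image bumps (assuming your CI is green) and leaves major updates for you to review.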