I’m trying to find a thing, and I’m not turning up anything in my web searches, so I figured I’d ask the cool people for help.

I’ve got several projects, tracked in Git, that rely on a set of command line tools being installed in order to work on them locally - as an example, one requires Helm, Helmfile, sops, several Helm plugins, Pluto, Kubeval and the Kubernetes CLI. Because I don’t hate future me, I want to ensure that I’m installing specific versions of these tools rather than just grabbing whatever happens to be the latest. I also want my CI runner to grab the same versions, so I can be reasonably sure that what I’ve tried locally will actually work when I go to deploy it.

My current solution is a big ol’ Bash script, which works but is kind of a pain to maintain (there’s a stripped-down sketch of the sort of thing it does below, after the list). What I’m trying to find is a tool where I:

  • Can write a definition, ideally somewhere shared between projects, of what it means to “install tool X”
  • Include a file in my project that lists the tools and versions I want
  • Run the tool on my machine and let it go grab the platform- and architecture-specific binaries from wherever, and install them somewhere that I can add to my $PATH for this specific project
  • Run the tool in CI and do the same - if it can cache stuff then awesome

Linux support is a must; other platforms would be nice as well.
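To give an idea of what I mean, here’s a stripped-down sketch along the lines of what my script does - the tool list, the manifest format and the URL patterns here are just illustrative, not the real thing:

    #!/usr/bin/env bash
    # Sketch only - not the real script. Installs pinned tool versions into a
    # project-local bin directory. The manifest format ("<tool> <version>" per
    # line in tool-versions.txt), the tool list and the URL patterns are
    # illustrative.
    set -euo pipefail

    BIN_DIR="$PWD/.bin"        # add this directory to PATH for the project
    mkdir -p "$BIN_DIR"

    OS="$(uname -s | tr '[:upper:]' '[:lower:]')"    # e.g. linux, darwin
    ARCH="$(uname -m)"
    case "$ARCH" in x86_64) ARCH="amd64" ;; aarch64) ARCH="arm64" ;; esac

    while read -r tool version; do
      case "$tool" in
        helm)
          # Helm publishes per-platform tarballs on get.helm.sh
          tmp="$(mktemp -d)"
          curl -fsSL -o "$tmp/helm.tgz" \
            "https://get.helm.sh/helm-v${version}-${OS}-${ARCH}.tar.gz"
          tar -xzf "$tmp/helm.tgz" -C "$tmp"
          mv "$tmp/${OS}-${ARCH}/helm" "$BIN_DIR/helm"
          rm -rf "$tmp"
          ;;
        sops)
          # sops publishes single binaries on its GitHub releases page
          curl -fsSL -o "$BIN_DIR/sops" \
            "https://github.com/getsops/sops/releases/download/v${version}/sops-v${version}.${OS}.${ARCH}"
          ;;
        *)
          echo "don't know how to install ${tool}" >&2
          exit 1
          ;;
      esac
      chmod +x "$BIN_DIR/$tool"
    done < tool-versions.txt

Every new tool means hand-writing another case branch and figuring out its release URL scheme, which is exactly the part I’d rather a proper tool handled for me.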

Basically I’m looking for Python’s pip + virtualenv workflow, but for prebuilt tools like helm, terraform, sops, etc. Anyone know of anything? I’ve looked at Homebrew (seems to want to install system-wide) and VSCode dev containers (doesn’t solve the CI need, and I’d still need to solve installing the tools myself).
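To make the analogy concrete, this is the per-project, pinned-version workflow I mean (standard pip + venv commands):

    # Python's answer to this problem, which I want for prebuilt binaries instead:
    python -m venv .venv              # per-project install location
    . .venv/bin/activate              # put it on PATH for this shell
    pip install -r requirements.txt   # exact versions pinned in a checked-in file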

  • ericjmorey@programming.dev · 11 months ago

    If you’re using the Debian “no Frankenstein” approach, you’re limited to what is packaged, but if you do your own thing, you can have it build/install whatever you want.