This is great stuff.
My comment from the peanut gallery today is just that there's no law that CI/CD can't be kept under control and run in ten seconds.
Given the choice between a slow, out-of-control CI/CD mess and a shell script, I too will take the shell script every time.
But I am living my best life today, and have a simple shell script in my CI/CD pipeline.
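(For anyone wondering what "simple" means here: the whole deploy step can be a handful of lines. A minimal sketch, assuming an rsync-able target and a systemd-managed service; the host, paths, and service name are made-up placeholders, not my actual setup.)

```sh
#!/bin/sh
# Minimal CI deploy step: sync the build output, restart the service.
# Host, paths, and service name are hypothetical placeholders.
set -eu

TARGET="deploy@app.example.com"

rsync -az --delete ./build/ "$TARGET:/srv/app/"
ssh "$TARGET" "sudo systemctl restart app.service"
```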
Hah. I was just ranting about "modern" deployment pipelines and how ridiculously complicated they've become just to push some code.
I’m a big fan of scripts. Push the button, get the cheese.
For our lower environments we use rsync like the author, but skip the pipeline altogether. The servers have a watch script that restarts the app when files are rsynced over, and we have a local watch script that rsyncs on file changes.
Relatively instant deploys (2-5s) whenever a file is saved.
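Roughly like this, if anyone wants to copy the idea. A sketch assuming inotify-tools on both ends; the hosts, paths, and service name are hypothetical.

On the local machine:

```sh
# Local watch script: rsync the tree whenever a file changes.
while inotifywait -r -e modify,create,delete,move ./src; do
    rsync -az --delete ./src/ dev@staging.example.com:/srv/app/src/
done
```

On the server:

```sh
# Server watch script: restart the app whenever the synced tree changes.
while inotifywait -r -e modify,create,delete,move /srv/app/src; do
    systemctl restart app.service
done
```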
I frequently amaze new colleagues when I show them that deploying an update for our backend application is a sub-second affair. Our pipeline keeps track of which git tag was deployed last, diffs between that tag and the new release, and uploads the changed files to each of the deployment targets. It takes longer for the pipeline agent to spin up from cold on a Monday morning than it does to actually deploy.
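(In case that sounds like magic, the mechanism is tiny. A sketch of the idea, assuming the last deployed tag is recorded in a file on the agent; the tag names, host, and paths are hypothetical, and a single target stands in for the real list.)

```sh
#!/bin/sh
# Diff-and-upload deploy sketch: ship only the files that changed
# between the previously deployed tag and the new release.
set -eu

LAST=$(cat .last-deployed-tag)   # e.g. "v1.4.2", written by the last run
NEW="$1"                         # tag being released now

# --diff-filter=d skips deleted files, which rsync couldn't copy anyway.
git diff --name-only --diff-filter=d "$LAST" "$NEW" |
    rsync -az --files-from=- ./ deploy@target.example.com:/srv/app/

echo "$NEW" > .last-deployed-tag
```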
The core of the application is just PHP scripts, and those are either picked up on the very next call, or swapped in the next time that component finishes a processing cycle.
Docker containers are nice, but nothing beats the cause of a stack trace being fixed, tested, and deployed to the acceptance environment within minutes of that trace arriving.