So, this is a rather odd request for a backup solution, but it’s kinda what I want right now.

I’m still relatively new to Linux and self-hosting in general.

A few years ago, my cousin and I were hosting our own Minecraft server. It had a mod that would create backups of the world folder. It zipped it up, named it “yyyy-mm-dd.zip” and placed it in a backups folder somewhere on the server.

The most important feature that I want is actually the next part. It let us specify how many backups we wanted to keep, and also how frequently we wanted the backup to run.

We set it to back up daily and keep 14 days of backups. After that, it would delete the oldest one and make a new backup.

I would like to replicate that functionality! Specify the frequency, but ALSO how many backups to keep.

Idk if it’s asking too much. I’ve tried doing some research, but I’m not sure where to start.

Ideally I’d like something I can host on docker. Maybe connect to a Google account or something so it can be off-site.

I only want to use it for docker config files, compose files, container folders, etc.

I’ve looked into restic, but it seems it encrypts the backups, and you NEED a working copy of restic to restore? I’d like something simple like a .zip file instead or something, to be able to just download, unzip, and spin up the compose file and stuff.

Sorry for the wall of text, thanks in advance if you have any suggestions!

P.S. I’m pretty sure the upload to Google or some other service would have to be a separate program, so I’m looking into that as well.

Update: I want to thank everyone for your wonderful suggestions. As of right now, I have settled on a docker container of Duplicati, backed up to my Mega.nz account. Last I checked they lowered the storage limit, but I was lucky to snag an account when they were offering 50GB free when you joined, so it’s working out well so far. I did have to abandon my original idea, and decided to look for something with deduplication (now that I know what it is!) and encryption.

  • hitagi@ani.social · 6 points · 1 year ago

    I do something similar with rclone. Most server software has some way of creating backups. Have that software create a backup and use rclone to move the file over to some cloud storage. Rclone also has the option to delete older stuff (rclone delete --min-age 7d). Do all that with a shell script and add it to the crontab.
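    A minimal sketch of that flow, wrapped in a function so the paths are easy to swap out. Everything here is an example, not a standard: "gdrive:backups" stands in for whatever rclone remote you have already configured with "rclone config".

```shell
#!/bin/sh
# Sketch of the backup-then-upload flow described above.
# Paths and the remote name are examples, not a convention.
backup_and_upload() {
    src=$1; dest=$2; remote=$3
    mkdir -p "$dest"
    stamp=$(date +%Y-%m-%d)
    # Create a dated archive of the source directory.
    tar -C "$src" -czf "$dest/backup-$stamp.tar.gz" .
    # Upload and prune only if a remote was given and rclone exists.
    if [ -n "$remote" ] && command -v rclone >/dev/null 2>&1; then
        rclone copy "$dest/backup-$stamp.tar.gz" "$remote"
        # Delete remote copies older than 14 days, as in the comment above.
        rclone delete "$remote" --min-age 14d
    fi
}
# Example: backup_and_upload /opt/server/data /home/backups gdrive:backups
```

    Dropping a call like the example line into a script and adding that script to the crontab covers both halves: creating the archive and shipping it off-site.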

    • beatle@aussie.zone · 3 points · 1 year ago (edited)

      I think this is the best solution. rclone has built-in crypt too.

      Edit: built in crypt if you configure it for use

    • 子犬です@lemmy.world (OP) · 2 points · 1 year ago

      That sounds like the 2nd part of what I want! The uploading to off-site part! Awesome, I’ll def look into it, thank you!

      • DataDreadnought@lemmy.one · 3 points · 1 year ago

        If you look at my recent post history I gave out my script using rclone to backup my server. It’s in NixOS but you can ignore it as it is bash scripting at its core. It has everything you need like using rclone to delete older backups.

  • sczlbutt@lemmy.pubsub.fun · 5 points · 1 year ago

    What you want is a bash script and a cron job that calls it. Most of what you need is likely already installed for you.

    “crontab -e” will pull up your crontab editor. Check out “man crontab” first to get an idea of the syntax…it looks complicated at first but it’s actually really easy once you get the hang of it.
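    For instance, a crontab line like this (the script path is a placeholder) runs a backup every day at 3 AM:

```
# min hour day-of-month month day-of-week  command
0 3 * * * /home/user/bin/backup.sh
```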

    Your script will call tar to create your backup archive. You’ll need the path to the folder where your files to backup are and then something like: tar -C PATH_TO_FILES -czf PATH_AND_NAME_OF_BACKUP.tgz .

    That last dot tells it to tar up everything in the current folder. You can also use backticks to call programs in line…like date (man date). So if your server software lives in /opt/server and your config files you want to backup are in /opt/server/conf and you want to store the backups in /home/backups you could do something like:

    tar -C /opt/server/conf -czf /home/backups/server_bkup.`date +%Y%m%d`.tgz .
    

    Which would call tar, tell it to change directory (-C) to /opt/server/conf and then create (-c) +gzip (-z) into file (-f) /home/backups/blah.tgz everything in the new current directory (.)
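    Since the original ask was count-based retention (keep the newest N, like the Minecraft mod), that tar command can be extended with a rotation step. This is a sketch; the paths in the example call are the same hypothetical ones as above:

```shell
#!/bin/sh
# Sketch: dated tar backup that keeps only the newest $keep archives.
rotate_backup() {
    src=$1; dest=$2; keep=$3
    mkdir -p "$dest"
    tar -C "$src" -czf "$dest/server_bkup.$(date +%Y%m%d).tgz" .
    # List archives newest-first, skip the first $keep, delete the rest.
    ls -1t "$dest"/server_bkup.*.tgz | tail -n +"$((keep + 1))" | xargs -r rm -f
}
# Example: rotate_backup /opt/server/conf /home/backups 14
```

    The "ls -1t | tail | xargs rm" pipeline is the whole retention policy: sort by modification time, keep the top of the list, remove everything past it.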

    I don’t know if that’s what you’re looking for but that would be the easiest way to do it…sorry for potato formatting but I’m on mobile

    • 子犬です@lemmy.world (OP) · 2 points · 1 year ago

      No honestly, this was very helpful!

      This, in combination with the solutions some others have suggested here already, would be pretty much what I want, just in multiple different parts, instead of 1 program/utility.

      I’ll def look into this, and honestly see if I can find a docker image for something like this as well!!

      Thank you so much!!!

  • Zoe Codez@lemmy.digital-alchemy.app · 5 points · 1 year ago (edited)

    Borg Backup works great for that; it’s exactly its use case. It’s a command-line tool, but you can use Vorta as a UI if you want one. If you have a NAS, it can back up directly to that.

    I have a second cronjob in my setup that syncs the encrypted archive to B2 nightly. Works great

    • Voroxpete@sh.itjust.works · 2 points · 1 year ago

      But… The Vorta were the spokespeople for the Dominion. Why would they be working for the Borg?? Why would you do this to me???

  • ventrix@lemm.ee · 5 points · 1 year ago

    Duplicati does this and it’s one of the best backup solutions imo

    • 子犬です@lemmy.world (OP) · 3 points · 1 year ago

      Awesome, I’ll add it to the list of software to look into! Actually, if it does everything, then it’s gonna be the first one I try! Thank you!

    • 子犬です@lemmy.world (OP) · 4 points · 1 year ago

      I’m trying this out right now with Kopia for docker, and I’m not the biggest fan of (seemingly) not being able to turn off the obfuscation, and making it do just a single .zip file or .tar or whatever. Also, having a hard time setting up drive integration with the GUI, but that’s just my fault. I’m not familiar with rclone or Kopia at all.

      • randomname01@feddit.nl · 2 points · 1 year ago

        You can mount the complete backup as a local file system, which I think would suit your needs. I’m not familiar with their various integrations either, I just backup over SFTP.

        But to reassure you, I also needed a bit of trial and error with Kopia, as its GUI isn’t the easiest to get used to. But I’ve got it running now, and I’m very happy with it. I’ve also restored multiple backups to test it, and they all worked.

    • 子犬です@lemmy.world (OP) · 2 points · 1 year ago

      I’ll have to look more into this, because I think I misunderstood, but it seems that it is half of the backup solution, right? It won’t actually MAKE the backups, but it’ll allow me to “rotate” and only keep the last “x” files?

      • MaungaHikoi@lemmy.nz · 2 points · 1 year ago

        Yep that’s the one. If you can make a cron job to make the zip file, logrotate could handle keeping the last x files.

        It might sound complicated, but the cool thing about *nix environments is that everything is made up of a combo of little tools. You can learn one at a time and slowly build something super complicated over time. First thing would be figuring out the right set of commands to make a zip file from the directory I reckon. Then add that to cron so it happens every day. Then add logrotate into the mix and have that do its thing every day after the backup runs.
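        As a sketch, a logrotate rule for that last step might look like the snippet below. The archive path and drop-in filename are hypothetical, and note logrotate will happily rotate any file, not just logs: it renames world.zip to world.zip.1, shifts the older copies down, and drops the 15th.

```
# /etc/logrotate.d/minecraft-backup  (hypothetical drop-in file)
/home/backups/world.zip {
    daily
    rotate 14
    nocompress
    missingok
}
```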

        • 子犬です@lemmy.world (OP) · 2 points · 1 year ago

          So I think I’ll try Duplicati for docker next, and if that fails, then I’ll try scripting and cronjobs.

          I’m so happy with all the support, thank you! :)

  • MangoPenguin@lemmy.blahaj.zone · 4 points · 1 year ago (edited)

    I would say since you want simple .zip archives, this could be something to script yourself since it would be fairly easy.

    Basically:

    • Zip the server into a dated zip file
    • Check for old zip files and delete
    • Upload zip files using rclone to remote storage (gdrive, etc)
    • Optionally send a notification to discord, telegram, healthchecks.io, or something like that

    The downside of zipping backups like this is obviously storage space, every backup takes up the full amount of space of the server, since there’s no deduplication or incremental versioning happening.
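    The first three steps are plain tar/rclone work; the optional notification step could be sketched like this, using healthchecks.io’s ping-URL convention. The UUID is a placeholder for your own check:

```shell
#!/bin/sh
# Sketch of the optional notification step: ping healthchecks.io once
# the backup finishes. The UUID is a placeholder, not a real check.
notify_done() {
    # -f: fail on HTTP errors; -m 5: 5-second timeout. "|| true" keeps
    # a failed ping from ever breaking the backup job itself.
    curl -fsS -m 5 "https://hc-ping.com/$1" > /dev/null 2>&1 || true
}
# Call at the end of your backup script:
# notify_done "your-check-uuid"
```

    healthchecks.io then alerts you if the ping stops arriving, which catches the failure mode where the backup silently stops running at all.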

    • 子犬です@lemmy.world (OP) · 2 points · 1 year ago

      I think this might be the way I have to go!

      I’m really liking Kopia. Nice GUI and some pretty nice settings. But I don’t like the obfuscation. Like you said, I just want the zip files. I think I’ll try Borgbackup, then rclone to drive, but I’ll also look into just scripting it myself!

      Thank you so much :)

  • wwwwhatever@lemmy.omat.nl · 2 points · 1 year ago

    You can look at BackupPC; it has served us well for years now. Offsite, manages incremental and full backups, file deduplication, etc.
    So on your Minecraft server, do a daily backup and add the day of the week to the name (whatever.7.gz); this way you always have 7 backups on the server and it auto-rotates. Add that folder to BackupPC and the backup server will automatically decrease the number of backups as they get older.
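    That day-of-the-week naming can be sketched in a couple of lines; date +%u gives 1-7 (Monday through Sunday), so each day’s archive simply overwrites last week’s copy. The Minecraft path is hypothetical:

```shell
#!/bin/sh
# Day-of-week rotation: the filename cycles through world.1.tar.gz
# through world.7.tar.gz, overwriting the copy from a week ago.
name="world.$(date +%u).tar.gz"
# tar -C /srv/minecraft -czf "/home/backups/$name" world   # hypothetical paths
echo "$name"
```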