Docker docs:

Docker routes container traffic through the nat table, which means packets are diverted before they reach the INPUT and OUTPUT chains that ufw uses. The packets are routed before those firewall rules can be applied, effectively bypassing your firewall configuration.
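The Docker docs accordingly recommend putting host firewall rules in the DOCKER-USER chain, which Docker consults before its own forwarding rules. A minimal sketch, assuming a placeholder external interface `ext_if` and a placeholder LAN subnet:

```shell
# Rules in DOCKER-USER are evaluated before Docker's own forwarding rules,
# so they survive Docker's iptables manipulation (unlike ufw's chains).
# "ext_if" and the subnet below are placeholders -- substitute your own.

# Allow established return traffic so outbound connections still work:
iptables -I DOCKER-USER -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT

# Drop traffic to published container ports unless it comes from the LAN:
iptables -I DOCKER-USER -i ext_if ! -s 192.168.1.0/24 -j DROP
```

Note that `-I` inserts at the top of the chain, so the DROP rule added second is evaluated first; established connections are still matched before new external ones are dropped.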

  • peoplebeproblems@midwest.social · 4 hours ago

    Ok

    So, confession time.

    I don’t understand docker at all. Everyone at work says “but it makes things so easy.” But it doesn’t make things easy. It puts everything in a box, executes things in a box, you have to pull other images to use in your images, and it’s all spaghetti in the end anyway.

    If I can build an Angular app the same on my Linux machine and my Windows PC, and everything works identically on either, and the only thing I really have to make sure of is that the deployment environment has Node and the Angular CLI installed, how is that not simpler than everything you need to do to set up a goddamn container?

    • FuckBigTech347@lemmygrad.ml · 1 hour ago

      I pretty much share the same experience. I avoid using docker or any other containerizing thing due to the amount of bloat and complexity that this shit brings. I always go out of my way to get software running w/o docker, even if there is no documented way. If that fails, then the software just sucks.

    • null_dot@lemmy.dbzer0.com · 1 hour ago

      Sure, but that’s an Angular app, and you already know how to manage its environment.

      People self-host all sorts of things, with dozens of services on their home server.

      They don’t need to know how to manage the environment for these services because docker “makes everything so easy”.

    • qaz@lemmy.world (OP) · edited · 3 hours ago

      This is less of an issue with JS, but say you’re developing a C++ application. It relies on several dynamically linked libraries, so to run it you need to install all of them and make sure the versions are compatible and don’t cause weird issues that didn’t happen with the versions on the dev’s machine. These libraries aren’t available in your distro’s package manager (only as RPMs), so you have to clone them from git and install them all manually. This quickly turns into a hassle, and it’s much easier to just prepare one image and ship it, knowing the entire environment is the same as when it was tested.
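      A multi-stage Dockerfile is one way to pin that environment down; this is a hypothetical sketch (the app name, library choice, and versions are made up for illustration):

      ```dockerfile
      # Build stage: compile against known library versions.
      FROM debian:bookworm AS build
      RUN apt-get update && apt-get install -y --no-install-recommends \
              g++ cmake make libboost-system-dev
      COPY . /src
      WORKDIR /src
      RUN cmake -B build && cmake --build build

      # Runtime stage: ship only the binary plus the runtime libs it links against,
      # so every deployment runs with the exact same shared libraries.
      FROM debian:bookworm-slim
      RUN apt-get update && apt-get install -y --no-install-recommends \
              libboost-system1.74.0 \
          && rm -rf /var/lib/apt/lists/*
      COPY --from=build /src/build/myapp /usr/local/bin/myapp
      CMD ["myapp"]
      ```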

      However, the primary reason I use it is because I want to isolate software from the host system. It prevents clutter and allows me to just put all the data in designated structured folders. It also isolates the services when they get infected with malware.

      • peoplebeproblems@midwest.social · 3 hours ago

        Ok, the sandboxing makes sense, and for a language like C++ it makes sense. But every other language I’ve used it with is already portable to every OS I have access to, so it feels like that defeats the benefit of using a portable language.

        • hitmyspot@aussie.zone · 13 minutes ago

          I think it’s less about making it portable to different OSes, and more about different versions of the same OS on different machines, with different configurations, libraries, and software versions.

    • sidelove@lemmy.world · 4 hours ago

      The only thing I really have to make sure of is that the deployment environment has Node and the Angular CLI installed

      I have spent so many fucking hours trying to coordinate the correct Node version to a given OS version, fucked around with all sorts of Node management tools, ran into so many glibc compat problems, and regularly found myself blowing away the packages cache before Yarn fixed their shit and even then there’s still a serious problem a few times a year.

      No. Fuck no, you can pry Docker out of my cold dead hands, I’m not wasting literal man-weeks of time every year on that shit again.

      (Sorry, that was an aggressive response and none of it was actually aimed at you, I just fucking hate managing Node.js manually at scale.)

    • miss phant@lemmy.blahaj.zone · 3 hours ago

      I put off docker for a long time for similar reasons but what won me over is docker volumes and how easy they make it to migrate services to another machine without having to deal with all the different config/data paths.
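      For example, a named volume can be archived and moved with nothing but docker run and tar; this is the pattern from the Docker volume docs (the volume name "appdata" is a placeholder):

      ```shell
      # On the old machine: archive the volume's contents into the current directory.
      docker run --rm -v appdata:/data -v "$(pwd)":/backup alpine \
          tar czf /backup/appdata.tgz -C /data .

      # On the new machine: recreate the volume and unpack the archive into it.
      docker volume create appdata
      docker run --rm -v appdata:/data -v "$(pwd)":/backup alpine \
          tar xzf /backup/appdata.tgz -C /data
      ```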

    • llii@discuss.tchncs.de · 4 hours ago

      You’re right. As an old-timey Linux user I find it more confusing than running the services directly, too. It’s another abstraction layer that you need to manage, and it has its own pitfalls.

    • TrickDacy@lemmy.world · 3 hours ago

      Even when it seems like an app runs identically on every platform, you can easily run into issues down the road. If you have a well-configured docker image, that issue is just solved ahead of time. Hell, I find it worth messing with even just for moving a Node.js app between Linux boxes, which would experience the fewest issues I can think of.

    • Lightfire228@pawb.social · edited · 3 hours ago

      Think of it more like pre-canned build scripts. I can just write a script (a Dockerfile) which tells docker how to prepare the environment for my app. Usually this is just pulling the pre-canned image for the app, maybe with some extra dependencies pulled in.

      This builds an image (a non-running snapshot of your environment), which can be used to run a container (the actual running app).
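      A minimal Dockerfile in that spirit might look like this (the base image tag, port, and entry file are hypothetical):

      ```dockerfile
      # Start from a pre-canned Node image so the environment is already prepared.
      FROM node:20-alpine
      WORKDIR /app
      # Copy the manifests and install dependencies first, so Docker can
      # cache this layer between builds when only app code changes.
      COPY package*.json ./
      RUN npm ci
      COPY . .
      EXPOSE 3000
      CMD ["node", "server.js"]
      ```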

      Then I can write a config file (docker-compose.yaml) which tells docker how to configure everything about how the container talks to the host:

      • shared folders (volumes)
      • other containers it needs to talk to
      • network isolation and exposed ports
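
      Those three bullets map directly onto a docker-compose.yaml; a hypothetical sketch (service names, ports, and paths are made up):

      ```yaml
      services:
        app:
          build: .
          ports:
            - "8080:3000"        # exposed port: host 8080 -> container 3000
          volumes:
            - ./data:/app/data   # shared folder between host and container
          depends_on:
            - db                 # another container it needs to talk to
        db:
          image: postgres:16
          environment:
            POSTGRES_PASSWORD: example
          volumes:
            - db-data:/var/lib/postgresql/data
      volumes:
        db-data:
      ```

      By default, compose also puts both services on their own private network, so only the explicitly published ports are reachable from outside.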

      The benefit of this is that I don’t have to configure the host in any way to build or host the app (other than installing docker). Just push the project files and docker files, and docker takes care of everything else.

      This makes for a more reliable and dependable deployment.

      You can even develop the app locally without having any of the devtools installed on the host.

      It also makes your app platform-agnostic: as long as the host has docker, you don’t need to touch your build scripts to deploy to a new host, regardless of OS.


      A second benefit is process isolation. Should your app rely on an insecure library, or should it get compromised, you have a buffer between the compromised process and the host (like a lightweight VM).

    • MangoPenguin@lemmy.blahaj.zone · 3 hours ago

      The only thing I really have to make sure of is that the deployment environment has Node and the Angular CLI installed

      That’s why Docker is popular. Making sure every single system running your app has the correct versions of Node and Angular installed is a royal pain in the butt.