There was a time when this debate was bigger. It seems the world has shifted towards architectures and tooling that either don't allow dynamic linking or make it harder. This compromise makes life easier for the maintainers of the tools / languages, but it takes choice away from the user / developer. But maybe that's not important? What are your thoughts?
In what context? On Linux, dynamic linking has always been the norm.
We could argue semantics here (I don't really want to), but tools like Docker / containers, Flatpak, Nix, etc. essentially use a sort of soft static link: the software is compiled dynamically, but the shared libraries are not actually shared at all beyond the boundary of the defining scope (the image, the sandbox, the store path).
So while it's semantically true that dynamic libraries are still used, the execution environments are becoming increasingly static, which defeats much of the point of shared libraries.
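To make that concrete, here's a minimal sketch (glibc-specific; the container setup itself is assumed rather than shown). The program is dynamically linked, yet if you run it on the host and then inside a container built from a different base image, it will typically report two different glibc versions, because each environment carries its own private copy of the "shared" library:

    /* Dynamically linked, but each container image resolves its own libc.
     * Build: gcc -o libcver libcver.c (dynamic linking is the default). */
    #include <stdio.h>
    #include <gnu/libc-version.h>   /* glibc-specific header */

    int main(void) {
        /* Prints the version of whichever libc.so.6 the dynamic loader
         * found: the host's copy on the host, the image's copy inside a
         * container. Same binary, different "shared" library. */
        printf("glibc: %s\n", gnu_get_libc_version());
        return 0;
    }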
This garbage practice is imported from Windows.
That may well be, but it doesn’t really change anything, does it?
Hot take: this is only still the case because GNU libc cannot easily be statically linked.
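For anyone who hasn't run into this, a minimal example of where glibc pushes back. Name resolution goes through the name-service switch (NSS), whose modules glibc loads with dlopen() at runtime, so linking the program below with gcc -static still produces a linker warning along the lines of "getaddrinfo in statically linked applications requires at runtime the shared libraries from the glibc version used for linking":

    /* Trivial DNS lookup that trips over glibc's NSS when linked with
     * `gcc -static`: the "static" binary still wants the shared glibc
     * at runtime for its dlopen()'d NSS modules. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/types.h>
    #include <sys/socket.h>
    #include <netdb.h>

    int main(void) {
        struct addrinfo hints = {0}, *res = NULL;
        hints.ai_family = AF_UNSPEC;
        hints.ai_socktype = SOCK_STREAM;

        int err = getaddrinfo("example.com", "80", &hints, &res);
        if (err != 0) {
            fprintf(stderr, "getaddrinfo: %s\n", gai_strerror(err));
            return EXIT_FAILURE;
        }
        freeaddrinfo(res);
        puts("resolved");
        return 0;
    }

Which is a big part of why truly static Linux binaries usually get built against musl instead.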