also if you “need” the data on SCSI.
2 backups are recommended anyway, but especially before fdisking around with partitions.
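If you're about to fdisk around, it can also be worth dumping the partition table itself - a rough sketch, with the device name as a placeholder:
sudo sfdisk -d /dev/sdX > sdX-partition-table.txt
That dump can be fed back into sfdisk later to restore the layout - it doesn't back up any data, just the table, so it's a complement to the backups above, not a substitute.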
Good cross-platform too.
I’ve used it from win, osx, linux, android.
It just finds the DLNA and CIFS shares from my NAS and puts them naturally in the library - better than Thunar.
I just wish my “smart” TV had it.
Yeah, though previously you did have the K-Lite codec pack and Media Player Classic (I'm talking Win 2k / XP days).
VLC did just dominate though.
Because you chose Canonical over Debian.
give stock debian a try
just go stock debian xfce, keep it simple.
It's what my 70-year-old mother has been perfectly happy with for several years, since I told her to drop Lubuntu.
Install Flatpak + Flathub if you want even more app convenience.
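On Debian that's roughly the standard Flathub setup (these are the commands from the Flathub docs; adjust for your release):
sudo apt install flatpak
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
Then log out and back in so the Flatpak paths get picked up, and apps installed with flatpak install should show up in the menu.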
In case it's not clear from the comments (and sorry for repeating if it is), this >> thing is a really useful terminal feature to know in many cases.
>>
will redirect a command's output, appending it to a file instead of printing it to the terminal.
So consider any old command and its output:
echo abc
This invokes the echo command, and echo outputs "abc" to the terminal.
If we add on >> we can catch and redirect the output:
echo abc >> blah.txt
Will capture the output “abc” into the file.
Note this is an APPEND operation, so run it twice to the same output file and you’ll add more and more output to new lines at the end of the same file.
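A quick worked example (the file name is just a throwaway):
echo abc >> blah.txt
echo def >> blah.txt
cat blah.txt
The cat at the end prints abc and then def on separate lines, because each >> appended to the file. A single > would overwrite the file each time instead.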
+1 to this.
You can reduce likelihood of any known risk with a preventative measure, in this case the permissions and ownership structure. That is good.
Backup does not reduce likelihood of risk.
It does something more wide-reaching: it mitigates the bad outcome of loss (from most causes). So it defends against many unknown risks as well as known ones, and against unexpected failures of preventative measures. It sort of protects you from your own ignorance and complacency.
Shit - i’m off to do some more work on backup.sh.
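For anyone wondering what a backup.sh can look like, here's a minimal sketch - the paths are placeholders, and it assumes rsync is installed and an external drive is mounted at /media/backupdrive:
#!/bin/sh
# mirror the home directory to the backup drive and keep a dated log
DEST=/media/backupdrive/home-backup
echo "backup started $(date)" >> "$HOME/backup.log"
rsync -a --delete "$HOME/" "$DEST/"
echo "backup finished $(date)" >> "$HOME/backup.log"
Note that --delete makes the copy a true mirror (deletions propagate too), so pair it with a second backup somewhere else, as above.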
Yeah, I paid a lot for an Apple laptop in 2008 (more than the hardware was worth - but the form factor was good).
It was okay, and OSX was OK for most stuff for a few years.
But they cut support for updates well within 10 years and the version I was stuck on eventually just got too far behind on security updates and couldn’t even get firefox updates and stuff.
So they forced me back to linux full time - thankfully dual booting macos+linux was really easy on the old x86 ones.
It seems you have to keep shipping them big buckets of dollars every 5 years or so - fuck that.
I’d much rather just give the odd bit of pay-what-you-can/ tip jar to a few linux projects than chuck out perfectly good hardware every few years.
There’s always tinycorelinux for hardcore minimalists.
I can't say much about package support either - I've not used it enough - but there's a "dCore" extension that lets you access Debian repos.
I’ve installed it on a potato easily enough - and I did find it to be astonishing for how small it is.
But I don’t use it day to day, or much at all, so i’m not going to endorse it.
It's not necessarily the most user friendly, and some people might call the GUI slightly dated - personally I did like that.
So this is just to make you aware of one of the lightest distros I know of (that is sort of usable out of the box).
Recommended spec is 128 MB RAM and a Pentium II; minimum spec is 46 MB RAM (maybe that's without the GUI desktop environment).
It's possibly a bit lighter than antiX - for some reason I never quite got on with either antiX or MX, not sure why.
Steam Deck? I wonder how many full-time staff Valve devotes to testing and pushing regular updates.
I think a lot of Arch people want the bleeding edge updates, so it seems like a lot of them go btrfs and set up snapshots or something if they want a safety net.
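Something like this is the usual idea, assuming / is a btrfs subvolume and a /.snapshots directory already exists (tools like snapper or Timeshift automate the same thing):
sudo btrfs subvolume snapshot -r / /.snapshots/pre-update-$(date +%F)
A read-only snapshot taken just before an update gives you something to roll back to if the bleeding edge bleeds.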
TLDR;
Doom was massively popular in its day because it was, and still is, an awesome game played on IBM PC compatibles.
Its popularity had basically nothing to do with ports to other OSes or hardware.
Doom is an "MS-DOS game", not a "Windows game".
It had a brilliant shareware (free) version containing 1/3rd of the game - that spread like wildfire.
It had great multiplayer network deathmatch and coop modes.
It maybe gained a bit of notoriety from some morons (who probably didn't know what a BBS or shareware was) calling for it to be banned as a "video game nasty" - but it'd have been insanely popular without that, because of how many light years it was ahead of the previous gen - say Wolfenstein or Catacomb Abyss - in basically every way.
It also grew a network of BBS communities who shared user-created WADs with levels and mods and stuff, extending the game's content and longevity - and creating a subculture of Doom-obsessed tech geeks. Competitive home "speedrunning" and stuff became possible, as you could basically "record" a level run and share it on a BBS, and people could effectively validate each key-press to check for cheating.
It's true that it was ported to Mac and Linux and a few other OSes fairly soon after release, but the vast majority of home gamers would have been on MS-DOS. Probably there were a bunch of workplace deathmatches on networks of Solaris terminals or something like that - but if you had a PC at home, you were playing DOOM on MS-DOS.
Back in 1993/1994, and for years after, linux was just nowhere near MS-DOS in popularity, stability, usability, compatibility etc. Debian was literally only just born the same year - but if you think Arch or Gentoo is hard to get up and running... that's peanuts to what a 1993-era linux user would be doing. In fact "linux programmer" is likely what you were - I don't believe there was such a thing as a "linux user" until years later - and it was still very painful and unstable.
Back then MS-DOS with its CLI was stable, simple and fairly efficient - massively more so than the "windows GUIs" that would follow.
DOS was fairly cheap - and there were “other” ways to get it anyway - I don’t think MS cared about home user piracy much - they just wanted B2B deals (and pre-installs with pc sellers).
“Windows” was just not relevant for gaming in 1993 - even in win’95 and win’98 days windows was not really an “operating system”.
Windows 3.x/95/98 was just a program that you could choose to run after booting into MS-DOS - and you'd only start up that mess if you wanted the GUI or some WYSIWYG programs like desktop publishers or something - of course Macintosh was still the no. 1 choice for most pro GUI stuff.
Even when Windows 95/98 and so on came out, for most gaming I'd have been booting into DOS anyway. Everyone had a few DOS 6.2 boot disks lying around. Going into the naked DOS CLI meant you could access the large contiguous chunks of extended memory that games typically needed - starting Windows always RAMmed you somewhere uncomfortable.
It wasn't really until 3D graphics drivers became packaged into DirectX that Windows became a real thing for gaming.
From memory something like Grand Theft Auto (1) in about 1997 would have been the first game I would have actually started windows for.
Doom was basically 4 years old and pretty ancient by then. But it was still the number 1 multiplayer game in my house - since by that time we had a couple of PCs capable of Doom, plus maybe a laptop or one brought over from a friend's... and a bloody unreliable BNC coaxial bus network. We couldn't get enough PCs that could run Quake well enough to be a fair fight.
However, I could imagine a lot of people wanting to get up to four networked devices going for deathmatch at home. So that may well have been a driver for porting.
I didn't install it on weird devices like the Sony Ericsson P800 or my iPod until much later - for example, not until those devices were invented and cheap enough.
And all that was just a gimmick - or geeks fucking around "because they can" - the control interface of the P800 touchscreen was just nowhere near the proper keyboard experience. If you can't simultaneously sidestep + sprint + turn and run backwards, you can't play Doom.
DOOM on an iPod click-wheel - just fucking stupid - surprisingly slightly better than the P800 though.
haha.
Similar - I skim them, don't really know exactly what they mean, but often some terms and phrases are just scary.
Is there any youtube channel or something where someone knowledgeable goes through them and points out what the different parts mean?
I think that’d be quite interesting or at least useful.
explainingcomputers on youtube.
But really he just shows how there’s nothing to it these days.
Probably easier than a windows install.
Especially if you try to force your brain to read the Windows user agreement - I tried to do a Microsoft virtual machine install recently, and got stuck at the EULA. My mouse just refused to click yes.
If you want a non-terminal OS based on linux you just have to make something like Android or ChromeOS or SteamOS.
Those are pretty popular, so I don't know who can claim linux is "terminal obsessed" - it's just a kernel, and there is a wide diversity of OSes based on it.
Debian, Fedora, SUSE etc. might all be "obsessed" with the terminal.
For me that's just the obvious economical way to offer features. A decent GUI costs a lot more to develop and document - so you have to have fewer features for a given amount of dev time. Or you have Google/Valve/Microsoft type amounts of resources to spend.
I always thought this "year of linux" thing was a meme to make fun of Canonical or idiotic tech journalists.
Is anyone realistically interested in volunteering their time to win over legions of Microsoft fanboys? Fuck me, sounds like hell.
And frankly the use of terminal is going to be far from the first blocker to linux adoption for those who don’t even know they’re using windows or mac.
A lot of trickle down economics fans in this thread.
GTA 1 is the best.
Gouranga!
Volvo probably trying to cast off their reputation for being "safe and boring" and take on a more edgy image.
Ditching Internal combustion in favour of steam power is also a major shift for them.
I'd go Raspberry Pi for kids - GPIO projects are fun, and they link the computer to the physical world.
The newer ones are a bit pricey for what they are though.
Devil's advocate - you might be getting an extra layer of testing from the "derived" distro's testing community.
I mean if they do any, it may be more focussed on the combo of setup and software you prefer.
So a small reduction in risk of bugs?
I think Ubuntu did have a purpose in 2004 or whenever - it was a step forward in ease of install and out-of-the-box experience, esp. for noobs.
Now most have that, including stock Debian. Even Arch comes with the idspispopd script these days.