
Apparently it's called “Innovis”. I tried to find a few articles but they all read like press releases or AI slop.
There is a blurb on Wikipedia fortunately:
https://en.wikipedia.org/wiki/Credit_bureau#Consumer_reporting_agency
Yeah, I’ve been fortunate enough to be offered those multiple times as well. I froze my credit with the big three agencies after the third or fourth breach. Recently learned there’s apparently a fourth agency now? Cool. And there’s hundreds of data broker sites…
As a settlement for the wrongful death of your parents you are entitled to 12 months of LifeLock’s DataScrub™ service!
I had a tricky time getting hardware encoding to work, and it ultimately turned out I needed to expose the GPU to the Docker container. The yaml config needed:
devices:
- /dev/dri/renderD128:/dev/dri/renderD128
- /dev/dri/card0:/dev/dri/card0
Note this was on a low-end Synology NAS with some sort of crappy Intel GPU, but it actually works now; I was surprised. I only mention it because before this I spent lots of time messing around with the Jellyfin settings, and only the logs tipped me off. Jellyfin seems to love falling back silently to CPU transcoding, which I guess is good, but it makes troubleshooting unintuitive. Searching for the log errors online gave me this solution.
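For context, here's a minimal docker-compose sketch of the shape that worked for me. The service name and image tag are just illustrative; the `devices:` lines are the part that matters:

```yaml
services:
  jellyfin:
    image: jellyfin/jellyfin   # illustrative; use whatever image/tag you actually run
    devices:
      # Expose the Intel GPU's DRM nodes so VAAPI/QSV hardware
      # transcoding is visible inside the container:
      - /dev/dri/renderD128:/dev/dri/renderD128
      - /dev/dri/card0:/dev/dri/card0
```

Without those device mappings the container simply can't see the GPU, and Jellyfin quietly falls back to CPU transcoding.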
If you can find it, I keep a small bag of straight-up wheat gluten and I add a spoonful or two when I want to make stronger flour. A small bag lasts forever and a little goes a long way.
It’s not just tech. Gardening, DIY, cooking, and similar popular subjects have been completely destroyed by this crap. If I see an AI generated header image or thumbnail I immediately backpedal now because I assume that means the text is bullshit too.
The example stuck in my memory now is when I was trying to read about watermelon growing times and the article said they flower a week after germination. There's now frequently this “oh GOD DAMN IT *close tab*” moment when you realize it's actually total slop. Like, “oh, so this article is BULLSHIT bullshit.”
I found the original blog post more educational.
Looks like these may be typosquats, or at least “namespace obfuscation”, imitating more popular packages. So hopefully not too widespread. I think it’s easy to just search for a package name and copy/paste the first .git files, but it’s important to look at forks/stars/issue numbers too. Maybe I’m just paranoid but I always creep on the owners of git repos a little before I include their stuff, but I can’t say I do that for their includes and those includes etc. Like if this was included in hugo or something huge I would just be fucked.
I’m starting to think it’s something super specific to the particular hugo theme I’m using and how it wants users to insert custom js/css to get it all baked down into the right place in the final output. I’ll keep bashing on it, thanks for your help!
Thanks, fixed. Interesting that Jerboa and the web version of Lemmy are developed by the same person, but using the “code” button in the web frontend only inserts one backtick. That might be worth a bug report.
I’m actually trying to get away from github also, so maybe codeberg pages instead? This is a part of the process I haven’t done enough research into, I wanted to get the static site working locally first then “shop around” for hosts.
OK, looks like the image paths are correct. It’s something about the JS that fades them in. If I toggle the opacity property on/off then suddenly it works fine, until I refresh, so something funky is going on there. At least I know the structure is correct hugo-wise so it’s just a matter of tracking down the fade-in issue.
The issue seems to be with how hugo renders everything down into a /public directory. Somehow this is breaking the static images Lightbox uses to do prev/next/close. It’s a small issue and I’m sure the fix is something dumb, it just wasn’t obvious to me (the images appear to be correct). But sounds like it’s worth just debugging it…
Something about how hugo is cooking everything down into a /public directory is breaking the overlay images (like the next/prev arrow). I’m sure I can track it down but since I’m pretty inexperienced this will take me some time (at cursory glance all the paths seemed good, so I’m not sure why it’s broken).
I would also prefer to host it myself so maybe I should just do this…
Yep, this. Chisel, glue up, or carve a recess into a larger piece of scrap plywood to use as a jig to hold the work piece while you plane it. Make the recess exactly as deep as the thickness you need and you can plane the piece perfectly to thickness.
It’s the same as learning anything, really. A big part of learning to draw is making thousands of bad drawings. A big part of learning DIY skills is not being afraid to cut a hole in the wall. Plan to screw up. Take your time, be patient with yourself, and read ahead so none of the potential screw-ups hurt you. Don’t be afraid to look foolish, reality is absurd, it’s fine.
We give children leeway to fail because they have everything to learn. Then, as adults, we don't give ourselves permission to fail. But why should we be any better than children at new things? Many adults have forgotten how fraught the process of learning new skills is, and when they fail they get scared and frustrated and quit. That's just how learning feels. Kids cry a lot. Puttering around on a spare computer is an extremely safe way to become reacquainted with that feeling, and that will serve you well even if you decide you don't like Linux and never touch it again. Worst case you fucked up an old laptop that was collecting dust. That is way better than cutting a hole in the wall and hitting a pipe.
Kdenlive is improving; however, Resolve is still more powerful and mature. That said, DaVinci's business model seems precarious. It feels like they could, at any moment, enshittify Resolve and force users into a subscription just to maintain access to old edits. I think for that reason Kdenlive is better for almost all users. If you're a professional filmmaker then the color and VFX workflows of Resolve are probably worth paying for, but in that case it's probably a Final Cut vs. Resolve question anyway.
This reminds me of the Blaster Worm back in the early 2000s. Infected users had to patch their PC without the internet, because connecting is what would cause you to reboot (so many PCs were infected it was basically instant). I worked at a computer store and we burned a bunch of patch CDs and were giving them out like hotcakes. My boss decided to slap a price tag on them for a day or two but we convinced him the good will was worth the cost and he eventually made it free again. People were fucking pissed off and handing out the free CD made them very grateful.
Coming from Python I feel like it’s my partner and best friend. In fact the whole damn tool chain is amazing.
Ubiquitous in the games industry unfortunately, for at least the art side but often code as well.
That’s one kind, and Rust’s “ownership” concept does mean there’s built-in compile time checks to prevent dangling pointers or unreachable memory. But there’s also just never de-allocating stuff you allocated even though it’s still reachable. Like you could just make a loop that allocates memory and never stops and that’s a memory leak, or more generally a “resource leak”, if you prefer.
Rust is really good at keeping you from having a reference to something that you think is valid but it turns out it got mutated way down in some class hierarchy and now it's dead, so you have a dangling pointer, or you double free, or whatever. But it can't stop the case where your code is technically valid but the resource leak is caused by bad “logic” in your design, if that makes sense.
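A minimal sketch of the point above: safe Rust happily compiles code that never frees memory, as long as the allocations stay reachable (or are deliberately leaked). The function and numbers here are just illustrative:

```rust
// Safe Rust permits "leaks": the borrow checker only prevents
// invalid access (dangling pointers, double frees), not growth.
fn leak_some() -> usize {
    let mut log: Vec<String> = Vec::new();
    for i in 0..10_000 {
        // Every iteration allocates; nothing is dropped while `log` lives.
        // Make this loop unbounded and you have a classic memory leak.
        log.push(format!("event {i}"));
    }
    // Box::leak is even more explicit: the allocation becomes 'static
    // and is never reclaimed, yet this is 100% safe Rust.
    let leaked: &'static str = Box::leak(String::from("leaked").into_boxed_str());
    log.len() + leaked.len()
}

fn main() {
    println!("{}", leak_some());
}
```

The compiler has no opinion here because nothing is ever accessed after being freed; whether keeping all of it alive was a good idea is a design question, not a safety question.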
Used it a ton in the art departments of vfx and game dev. I'm talking about the tools that make assets, not the game engine or a runtime scripting language. More like the stuff launching and running in Maya, or Houdini, or Substance, etc.
Most of this is already highly OO, and there's a lot of interaction with C++. Python is the perfect language for this. There's a lot of rapid iteration and gluing many different services and data together. Also you're waiting on file IO or some massive scene graph update all the time, so having the tools be slightly slower doesn't matter. Also, at least in vfx, there's a mix of Linux/Windows/Mac and it's great for that. ALSO, art teams (unlike the programming team) have people who may not be super technical, and Python lets them write tools and scripts more easily. They don't even have to understand OO, but you can say “copy this class template and implement these two methods” and they can write tools that “work” in the pipeline.
It’s honestly a godsend. Before the industry settled on Python, every program had its own proprietary scripting language and some were quite limited. Their C++ APIs are all different, of course. So now everyone just ships with a Python interpreter, you manage launching each app so you can control PYTHONPATH and you’re golden.