Same, I didn’t realize the directory I was deleting had a symlink to some root directory, at least until my mouse stopped working…
The closest the Bible comes to arguing for a work week is the first two chapters of Genesis: God created the world in 6 days and rested on the 7th.
… That’s it. That’s the whole reason our work week is the way it is. Jewish tradition really ran with that, and Christianity started as a Jewish sect. And of course for-profit business tried to jam as much work as possible into that framework. You can thank unions for the second day in your weekend.
Everything else here, the “10 hours a day” and whatever else, is all just embellishment, possibly citing other parts of the Bible to make it sound more plausible.
Alright, fair enough. The brand-new AAA graphical showcase doesn’t run above 40FPS if you insist on native 4K from a 6800XT. I’m not sure that qualifies as “runs like ass” the way your original comment claimed, but it’s a fair clarification.
I will add, however, that there’s no mention of XeSS issues on the “known issues” page, so I’m unsure what you’re referring to. There’s only an issue with FSR frame generation and manual window resizing, and frankly I wouldn’t recommend frame generation in any circumstance anyway. Perhaps the issue has already been resolved?
… am I supposed to be impressed by that?
It’s better than what you’re getting on the card one tier up from yours, from the exact same generation, so… it pretty clearly indicates something is going wrong on your end.
And that’s with the forced ray tracing. Regarding FSR, DF recommends using XeSS, which I’ve had no problems with, even using performance mode on a 4K display.
It’s only really fair to judge the performance cost of the ray tracing if you’re running the game at reasonable settings. If you’ve maxed out every setting to ultra nightmare at native 4K or something to get that “can’t go above 40FPS” figure, then I have no sympathy for you or your performance complaints.
I think you may want to look into DF’s recommended settings (just skip to the table and read from there if you aren’t interested in the details). Touching base with the friend I sold my previous 6700XT to, he reports a rock-solid 60FPS targeting native 1080p.
That said, they don’t claim a performance increase that drastic, so you may have some other performance issues?
Oh, and DF stands for Digital Foundry, often considered the best source for benchmarking new games these days. They have several recent videos on Doom: The Dark Ages; graphics nerds always take an interest in a new idTech title.
Dude, what are you on about? Sure, it’s not as easy to run at 300 FPS, but it’s a new boundary-pushing game, and for what it’s doing it runs astoundingly well.
It’s absolutely gorgeous, and it must rely on black magic, because even DF reports no stutter whatsoever, traversal or shader, despite massive levels with ridiculous fidelity and no shader precompilation step. Hell, I can’t even understand how they got Denuvo to not introduce stutter.
Not to mention it’s somehow fairly light on the CPU despite huge enemy counts with good AI, raytracing, the best destruction physics I’ve seen in ages, and the streaming demands of massive levels. I’m completely GPU limited with a decent CPU and a 7900XTX.
Hell, it even hits 60 on consoles while doing all of this; the game’s performance is witchcraft. I’m eager to see the path tracing, and to see how far we’ll be able to push this game a decade from now, like how I can run Eternal at native 4K/120 today.
Haha, dang it. Seems I got confused; turns out that was just a Bedrock thing. Could’ve noticed that if I’d looked more closely at my own link 🙄
To add to this, Minecraft ~~Java~~ Bedrock used to ship its code with all the debug symbols included, making modding easy. These were recently removed, though, much to the displeasure of the modding community. Everyone should throw a vote at this feedback issue to request them back, btw:
Eh, there’s not much nefarious you can do just by pushing data around. Eating a lot of CPU/GPU? Certainly, you can do a lot of evil with distributed computing. But bandwidth?
It costs a lot to host all that data and to stream it out to so many people, all for them to just… throw it away? Users certainly don’t have enough storage to even hold a constant 100Mb/s of sneaky evil data, let alone do any compute with it, since the game’s CPU/GPU usage isn’t particularly out of the ordinary.
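For scale, a quick back-of-envelope sketch (Python, using my rough 100Mb/s figure as an assumption):

```python
# Back-of-envelope: storage needed to hoard a constant 100 Mb/s stream.
mbit_per_s = 100                          # assumed sustained rate (megabits/s)
bytes_per_s = mbit_per_s * 1_000_000 / 8  # = 12.5 MB/s

gb_per_hour = bytes_per_s * 3600 / 1e9    # ~45 GB every hour
tb_per_day = gb_per_hour * 24 / 1000      # ~1.08 TB every day

print(f"{gb_per_hour:.0f} GB/hour, {tb_per_day:.2f} TB/day")
```

Over a terabyte of writes a day; nobody is quietly hoarding that.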
So there’s not much you could do here. Ockham’s razor just says… planes are fast, MSFS is a high-fidelity game, and they’ve got to load a lot of high-accuracy data very quickly, probably without the spare CPU for terribly complicated decompression.
I think it is a problem. Maybe not for people like us, who understand the concept and its limitations, but “formal reasoning” is exactly how this technology is being pitched to the masses. “Take a picture of your homework and OpenAI will solve it”, “have it reply to your emails”, “have it write code for you”. All reasoning-heavy tasks.
On top of that, Google and Bing have it answering user questions directly, it’s commonly pitched as a “tutor” or an “assistant”, the OpenAI API is being shoved everywhere under the sun for all kinds of tasks, and nobody is attempting to clarify its weaknesses in their marketing.
As it becomes more and more common, we’ll see more and more users who don’t understand that it’s fundamentally incapable of doing these things reliably.
Honestly, that makes sense; the active-voice version is just… more efficient and easier to parse quickly.
Eh, you’re applying generalizations universally. Just these last couple of weeks, I literally bought Zelda: Echoes of Wisdom day one, and even bought NSO vouchers so I could buy the next big game that comes out. That same day, I dumped the game from my own modded Switch and started playing it on Ryujinx rather than on the Switch itself.
I modded the game to always render at native 1080p, removed the double-buffered vsync to smooth out the framerate and let VRR work, boosted LODs for better distant assets, and swapped the UI prompts to Xbox controls. Ryujinx also let me play at 2x internal resolution, so I could run the game at native 4K.
My game looks sharper, runs smoother, and is a lot of fun to tinker with. I’ve had a blast checking GameBanana every day for new mods or Ryujinx patches. I love it, and I prefer that experience to the Switch. I’m also getting fun extras like Discord rich presence, recording with my GPU driver, streaming my gameplay to Discord, playing with a controller I prefer over the Pro Controller, everything.
I’m also looking forward to tinkering with mods for unlimited echoes once I beat the game. I’m more than happy to pay for the game, but this is much more fun for me, and trades blows with the real hardware experience well enough that I’d much rather play here than on Switch.
Yeah… even worse, it appears the admin didn’t even announce it; this is just one of the developers clarifying what the admin probably did.
As someone who uses Ryujinx, I literally spent the afternoon puzzled by an updater error saying the build server was down, “probably because it’s building a new version, check back in a few minutes”, only to find a Twitter screenshot of this linked in Slack that evening.
Eh, this is a thing: large companies often have internal rules and caps on how much they can pay any given job title. For example, on our team everyone we hire is given the title “senior full stack developer”, not because they’re particularly senior (in some cases we’re literally hiring straight out of college), but because it lets us pay them better within the company’s internal politics.
I’m inclined to agree! That’s awesome, adding that to my following immediately.
Ooh, this is very interesting. I’m a sucker for emulator progress reports; they’re a fascinating intersection of programming, graphics, and gaming. My personal RSS feeds right now (which I’d love to add Lemmy discussion to) are:
https://dolphin-emu.org/blog/feeds/
https://pcsx2.net/blog/rss.xml
https://www.libretro.com/index.php/feed/
https://blog.ryujinx.org/rss/
https://xenia.jp/feed.xml
I don’t necessarily disagree that we may figure out AGI, and even that LLM research may help us get there, but frankly, I don’t think an LLM will actually be any part of an AGI system.
Because fundamentally, it doesn’t understand the words it’s writing. The more I play with and learn about it, the more it feels like a glorified autocomplete/autocorrect. I suspect hallucinations, “Waluigis”, and “jailbreaks” are fundamental problems for a language model trying to complete a story, as opposed to an actual intelligence acting with a purpose.
Eh, that’s a mixed bag. Absolutely, one could federate delete requests, but it would be a bit of a lie, as anyone could simply… update their instance to ignore them.
For now, simply not having a delete feature is more honest about the realities of the fediverse. There’ll never be a “true” delete, even if they do eventually support one that’s “good enough”.
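As a rough illustration of why any federated delete is best-effort only, here’s a hypothetical handler sketch (not any real instance’s code; `store` is an assumed interface):

```python
# Hypothetical sketch of an ActivityPub-style inbox handler. Nothing in
# the protocol *forces* a receiving instance to honor a Delete activity.
def handle_inbox_activity(activity: dict, store) -> None:
    if activity.get("type") == "Delete":
        # A well-behaved instance removes its cached copy of the object...
        store.remove(activity.get("object"))
        # ...but a patched instance could swap the line above for `pass`
        # and keep the content forever; the sender can't tell the difference.
```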
I’d be very surprised if anything functional actually comes out of this. Far more likely, they’ll get scammed out of the money by garbage like the current “AI writing detection” methods, whose terrible success rates cause more societal problems than they solve.
Earnest question, how is this actually legally viable?
Obviously the decompiled code is open source, but decompilations are usually distributed without assets, as some kind of builder that requires your own copy of the game (roughly the pattern sketched at the end of this comment). And clearly the original game isn’t open source, or else this decompilation wouldn’t need to exist.
So… has the game been released free to the public without the source code? Has Lego or the original developer blessed this project? Or is the game just… in legal limbo or something where they feel comfortable taking the risk?
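(For anyone unfamiliar with that builder pattern, a hypothetical sketch of how these projects typically keep copyrighted assets out of the repo; the checksum, paths, and steps are placeholders, not this project’s actual tooling.)

```python
# Hypothetical decomp "builder" sketch: the repo ships only reimplemented
# code; assets are pulled from the user's own legally acquired copy.
import hashlib
import pathlib
import sys

EXPECTED_SHA1 = "0123456789abcdef..."  # placeholder: checksum of a retail image

def build(game_image_path: str) -> None:
    data = pathlib.Path(game_image_path).read_bytes()
    if hashlib.sha1(data).hexdigest() != EXPECTED_SHA1:
        sys.exit("Please provide your own original copy of the game.")
    # 1. Extract textures, audio, and models out of the image.
    # 2. Compile the open-source reimplemented code.
    # 3. Link both into a playable build, all locally on the user's machine.
```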