So after one of my recent comments about whether Linux is ready for gaming, I decided to pick up a new Intel-based wifi adapter (the old one was Broadcom, and the drivers on Fedora sucked and would drop the connection every few minutes).
So far everything is great! Performance-wise, I can usually run every game about one tier higher graphically (med -> high) with the same or better performance than on Windows. This is on an RX 5700 and an ultrawide.
Bazzite is running great as always. Still getting used to the immutability of the system as I usually use Arch btw, but there are obviously workarounds to that.
Overall I’m still getting used to Steam’s “Processing Vulkan shaders” step running pretty much every time a game updates, but it’s worth it for the extra performance. Now I’m 100% Linux for my gaming between my Steam Deck and PC.
Aah okay that makes sense. I wouldn’t mind the extra space.
If that’s turned off, do you know if the game generates and caches shaders as you play? If so, does that also apply to games run outside of Steam?
Currently I’m just playing games like Hunt Showdown and Helldivers.
For Hunt Showdown specifically, I have tried skipping pre-caching before and the load into a level took so long that I got disconnected from the match. I recommend keeping it enabled for multiplayer games for that reason.
It’s usually disabled outside of Steam. However, you can use environment variables:
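For example, something like this when launching a non-Steam game. This is a sketch for a typical Mesa (RADV) + DXVK setup; the variable names are real driver knobs, but the path and size here are just example values:

```shell
# Example launch environment for a non-Steam game.
# Variable names are real Mesa/DXVK settings; the path and cap are illustrative.
export MESA_SHADER_CACHE_DIR="$HOME/.cache/mesa_shader_cache"  # where Mesa stores compiled shaders
export MESA_SHADER_CACHE_MAX_SIZE=10G                          # raise the on-disk cache cap
export DXVK_STATE_CACHE=1                                      # keep DXVK's state cache on (its default)
# then start the game from this same shell, e.g.:
# ./start-game.sh
```

The caches build up as you play, so the first run of each area still stutters a bit; subsequent runs reuse the cached shaders.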
Well, I do have this one game I’ve tried to play, Enshrouded, and it does do the shader compilation on its own, in-game. The compiled shaders seem to persist between launches, reboots, etc., but not driver/game updates. So it stands to reason they are cached somewhere. As for where, not a clue.
And since it’s the game doing the compilation, I would assume non-Steam games can do it too. Why wouldn’t they?
But, ultimately, I don’t know - just saying these are my observations and assumptions based on those. :P
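For what it’s worth, these are the usual spots I’d check on a Mesa/DXVK setup. The paths are assumptions and vary by distro and Steam library location:

```shell
# Print the size of each shader cache directory, if it exists.
# Paths are typical defaults, not guaranteed on every setup.
for d in "$HOME/.cache/mesa_shader_cache" \
         "$HOME/.local/share/Steam/steamapps/shadercache"; do
  if [ -d "$d" ]; then
    du -sh "$d"   # Mesa's driver-level cache / Steam's per-game pre-cache
  fi
done
# DXVK's own state caches (*.dxvk-cache files) usually sit next to the game's executable.
```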