It is. But every week I see they've been doing work on it.
Apparently there's been an issue where exiting some fullscreen programs causes the cursor framerate to fall out of sync with the content on the display, resulting in cursor flicker.
I think Plasma also had this issue, but pushed the feature anyway and then ironed out the kinks while it was in production. Now it works pretty well.
Gnome, sometimes frustratingly, doesn't really release things until they think they're perfect and as bug-free as possible.
And I get it, and agree, it's led to Gnome being ridiculously polished and about as bug-free as an up-to-date DE can get.
But sometimes I'm just like, damn Gnome, this must be slowing down your development.
Plasma never had this issue in any release; VRR has worked exactly the same from when it was introduced until Plasma 6 (where there are some improvements for the "always" mode).
Adjusting the refresh rate to match the performance of the desktop is one.
That's the definition, isn't it? Why is this better than a fixed refresh rate? Can the monitor scale the rate down to consume less power or something?
I also heard it would make it easier to manage multiple monitors sporting different refresh rates, although I haven't had issues with that personally.
I heard that too and got similarly confused. I work with two monitors with different refresh rates (75 and 60) on Mint and it seems fine. Is X downgrading my 75 Hz monitor to 60 silently? I don't know how to check that.
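If you're on X11, one way to check is just reading xrandr's output: it marks the currently active mode of each output with an asterisk. A quick throwaway sketch (assuming xrandr is installed, nothing more official than that):

```python
# Print the mode line X is actually driving each connected output at.
# xrandr marks the active refresh rate with '*' (and the preferred one with '+').
import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout
current = None
for line in out.splitlines():
    if " connected" in line:
        current = line.split()[0]        # output name, e.g. HDMI-1
    elif "*" in line and current:
        print(current, line.strip())     # the rate marked '*' is the active one
```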
To avoid having to skip or repeat frames: the monitor's refresh rate follows the content instead of the other way around, which makes the desktop look more fluid.
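Rough illustration of the skipping/repeating I mean, just the arithmetic (illustrative numbers, not tied to any particular compositor):

```python
# 24 fps content on a fixed 60 Hz display: each source frame has to be held for
# a whole number of refresh cycles, so some frames get 2 cycles and some get 3,
# and frame pacing ends up uneven (the classic 3:2 pulldown judder).
fps, hz = 24, 60
cycles = hz / fps                                            # 2.5 cycles per frame
held = [int((i + 1) * cycles) - int(i * cycles) for i in range(4)]
print(held)                                                  # [2, 3, 2, 3] -> uneven

# With VRR the monitor refreshes whenever the next frame is ready, so every
# frame stays on screen for the same amount of time.
print(1000 / fps)                                            # ~41.7 ms, every frame
```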
I think the whole desktop runs at the higher refresh rate when you have mismatched monitors? Not sure. Wayland and X11 might also differ in how they handle this.
I guess, but you're usually not rapidly rotating models while you're designing them. At least in the workloads I'm familiar with, movements are much more deliberate, something even a fixed 50 Hz laptop monitor can handle.
It's also good for video, as it can play videos at the highest possible Hz multiple of the video's FPS. So for example 24 FPS video could be played back at 144 Hz, 25 FPS at 125 Hz, etc. VRR isn't technically required for this, as many non-VRR monitors support different video modes with different fixed Hz as well, but with VRR the transition between refresh rates is seamless (no need to change video mode).
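The "highest possible multiple" bit is just this kind of arithmetic; a small sketch with a made-up 144 Hz cap (the real limit depends on the monitor's VRR range):

```python
def best_refresh(fps, vrr_max=144):
    """Highest whole multiple of the content's frame rate that fits under the
    monitor's maximum refresh rate (vrr_max is just an example value here)."""
    return fps * (vrr_max // fps)

print(best_refresh(24))   # 144 (24 * 6)
print(best_refresh(25))   # 125 (25 * 5)
print(best_refresh(30))   # 120 (30 * 4)
```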
You lost me here now. Why would you want to repeat the same frame four or five times in video? Is that to add post-processing effects like motion blur between them?
It's not redrawing the frame; it's more about aligning the monitor's refresh rate to the frame rate of the content being displayed. Alignment means your monitor doesn't refresh the screen when the frame is only partially rendered (aka screen tearing).
Right, it doesn't need to be multiples then, it could be the exact same refresh rate as the movie. Even those weird 25.xx refresh rates some are distributed in. Thanks for answering.
I wasn't offended. I was just using a metaphor to demonstrate how your question sounded to me.
As far as I'm personally aware, VRR tech as a whole was invented for and is used for gaming pretty much exclusively. I'm not the person you asked, nor did I up or down vote your question.
Tbh I always disable VRR because I find the flicker in games and full screen video way too distracting. At first I thought it was my previous VA monitor but the exact same thing happens on my OLED.