Okay, so all they're saying is that they won't certify any monitor under 144Hz for FreeSync. That basically changes nothing. If you're using a 60Hz monitor, basic VSync is all you need.
That's news to me. I haven't had a gaming machine for some time now; my last PC used an AMD FX-6300 and a Radeon 285 slotted into an ASUS board. I didn't have the budget to buy any monitor beyond 1080p 60Hz. Ah, lots of Skyrim, GTA V and Witcher 3 memories..
I really miss gaming on MacOS, the Switch is fun but nowhere close. A Steam Deck might scratch the itch?
Probably FreeSync is mostly ineffective at 60Hz but works better at higher refresh rates, and the marketing department just put a good spin on that.
1440p high refresh rate gamers unite. I have an Alienware AW3423DWF and boy are those new OLED panels beautiful. Expensive, but beautiful. I still remember playing Left 4 Dead right after I got it, and even without HDR I was blown away by the credits at the end of the match. Just white text floating in nothingness.
They also recently released the AW3225QF, which is 4K@240Hz.
I have a 2.5K ultrawide 144Hz monitor. Even when this PC was new it struggled with the games of that era :(. We need better graphics cards that don't cost the price of a mortgage.
FreeSync is for variable refresh rates, which 60Hz monitors generally don't support anyway. So this headline is nothing but clickbait.
Also, I don't know of any sub-120Hz VRR monitors that are still being made, but if they exist, they're not aimed at anyone who cares about FreeSync branding.
So this whole article is a pointless waste of time.
Provided the 60Hz monitors still work, who cares. If they pull some arbitrary bullshit to stop stuff from working just for profit, then get fucked. I personally don't care about their certification or claims.
Open source is the best. That doesn't mean the recommendation to move off 60Hz isn't profit-motivated, especially when driving displays above 60Hz means selling more graphics cards, since your older one may not go far beyond 60.
Maybe I'm weird because anything over 20 FPS looks smooth to me (and I know it doesn't to other people) but what's the point of going over 60 FPS? Can anyone actually see the difference or is this just a matter of "bigger numbers must be better"?
There's a huge difference once you use it for long enough. I have a 144Hz monitor and love playing games at that refresh rate; they're so smooth! If you play long enough, the difference becomes night and day.
Lucky you. Seriously. I wish I didn't care because it means displays are more expensive for me.
I definitely thought it was all hype, but once I saw games at 120+ fps, even 60fps looks choppy to me. I also very much notice the difference between 30fps and 60fps video, but 120fps (at full speed) didn't do much for me.
For what it's worth, I was a professional video editor for years, so I'm a bit more inclined to notice than the average person.
I'm kind of in that boat - I do digital art and so on. I never understood buying a computer monitor over about 22" at 1080p resolution. I want decent colour reproduction - I get it, it won't be perfect unless you spend a fortune, but it should at least be decent.
120Hz with good HDR support is fantastic for content that supports it, and 240Hz is just buttery smooth. Variable refresh is pretty much a must for modern gaming.
It's very easy to tell the difference when you see them in person. I have a 60Hz monitor and a 144Hz monitor on the same PC, and if you drag a window across the desktop from one to the other, the drop in animation frames going from 144 to 60 makes the movement look choppy on the slower one. In games, the animation becomes smooth to the point of being lifelike and visually vibrant when your framerate can reach 90-100 FPS or more.
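To put rough numbers on that choppiness, here's a back-of-the-envelope sketch. The 2000 px/s drag speed is just an assumed example value, not a measurement of anything:

```python
# Rough numbers for why higher refresh rates look smoother when something
# moves across the screen: at a fixed on-screen speed, lower refresh rates
# mean bigger positional jumps between frames.

DRAG_SPEED_PX_PER_S = 2000  # assumed example: window dragged across a screen in ~1-2 s

for hz in (60, 144, 240):
    frame_time_ms = 1000 / hz            # time between refreshes
    step_px = DRAG_SPEED_PX_PER_S / hz   # how far the window jumps each frame
    print(f"{hz:>3} Hz: new frame every {frame_time_ms:.1f} ms, "
          f"window jumps ~{step_px:.1f} px per frame")

# Roughly: 60Hz refreshes every ~16.7 ms (~33 px jumps), 144Hz every ~6.9 ms
# (~14 px jumps), 240Hz every ~4.2 ms (~8 px jumps).
```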
It really depends on what and how you play. If reaction time is important, then you'll feel the difference in refresh rates more than see it. If none of your games require sub-second reaction accuracy, then it's much more of a nice-to-have luxury than a game changer.
Also, frametime pacing matters a lot. If your system very consistently puts out 30 fps, you'll have more accurate keypresses than if you normally get 50 fps but it hangs on a few frames and dips to 30. Your nervous system adapts pretty well to a consistent delay, but it's much harder to compensate for delay that varies a lot.
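A toy sketch of that idea. The frame-time traces below are invented numbers, not measurements, just to show how a higher average fps can hide worse hitching:

```python
# Toy illustration of frame pacing: one trace holds a rock-steady 30 fps,
# the other averages higher fps overall but hitches badly every so often.
import statistics

steady = [33.3] * 120                  # 120 frames, each taking a constant 33.3 ms
spiky  = [20.0] * 110 + [90.0] * 10    # mostly 20 ms frames, with ten 90 ms hitches

def summarize(name, frame_times_ms):
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    worst_ms = max(frame_times_ms)
    jitter_ms = statistics.pstdev(frame_times_ms)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst_ms:.0f} ms, "
          f"frame-time stddev {jitter_ms:.1f} ms")

summarize("steady", steady)
summarize("spiky ", spiky)
# The spiky trace wins on average fps (~39 vs ~30), but its 90 ms hitches and
# ~19 ms of jitter are what your hands actually feel when timing an input.
```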
I don't really play first person shooters so resolution matters more to me than framerate.
For 3d games where the whole screen is moving and changing as the camera moves, I've noticed a big difference between 60 and 144. It just makes the game feel absurdly smooth.
For smaller games with more static views it doesn't really make much difference.
I don't play competitive games, so I don't need the extra shooting accuracy. What I have found is that the higher refresh rate makes panning maps in RTS games or looking around quickly in an FPS much smoother. It's an overall nicer experience, but not really better gaming than at 60Hz.
Occasionally my fps gets set to 60, and as soon as I start playing Rocket League I can tell something is off. I went to a friend's house and asked why everything was so choppy, checked his monitor settings, and it was set to 60 instead of 144. There are people who can see the difference.
To be "FreeSync certified", a monitor has to have certain minimum specs and must pass some tests regarding its ability to handle Variable Refresh Rate (VRR). In exchange for meeting the minimum spec and passing the tests, the monitor manufacturer gets to put the FreeSync logo on the box and include FreeSync support in its marketing. If a consumer buys an AMD graphics card and a FreeSync certified monitor then FreeSync (AMD's implementation of VRR) should work out of the box. The monitor might also be certified by Nvidia as GSync compatible, in which case another customer with an Nvidia graphics card should have the same experience with Gsync.
What does this mean for standard TVs that people use for gaming? LG/Sony/Samsung OLEDs tend to be able to do 4K@120, having native 120Hz panels. Maybe this only covers "monitors" getting FreeSync certified.
Rtings says that the LG TVs (the B2 at least) support VRR via several standards: HDMI 2.1, FreeSync, and G-Sync. I have a console hooked up, but no GPU good enough in a PC.