Always the first thing I turn off, but surely there are some people out there who actually like it. If you're one of those people, is there a particular reason?
Only for very specific games, and only because I don’t have a high refresh rate monitor.
If I’m in Forza driving 200 km/h I shouldn’t be able to see the bricks I’m flying past. With my low refresh rate monitor I can, so adding just a hint of motion blur really helps add that flourish of immersion that I can’t get with my setup. But that’s again very specific games and only because I cap out at 60fps.
Because at lower frame rates your eyes don’t add motion blur. So you use the processing power to add it. If I had a higher refresh rate monitor I wouldn’t need motion blur.
I also have the impression that motion blur causes frame drops. Then again, some games do seem to hiccup when turning regardless of whether motion blur is enabled.
Now I'm wondering if it's causation or just correlation. Intuition suggests that additional post-processing would at the very least exacerbate frame drops even if it doesn't cause them itself, but I've never done a deep dive to find out.
Motion blur off looks like those high-shutter-speed fight scenes from the Kingsman movies. Good for a striking action scene but not pleasant to look at in general. Motion blur blends the motion that happens between frames, like how anti-aliasing blurs stairstepping.
Motion blur in film does that, but with video games, in every implementation I've seen, you don't get a blur that works the same way. Movies will generally blur 50% of the motion between frames (a "180 degree shutter"), a smooth blur based on motion alone. Video games generally just blur multiple frames together (sometimes more than two!) leaving all of the distinct images there, just overlayed instead of actually motion blurred. So if something moved from one side of the screen all the way to the other within a single frame, you get double vision of that thing instead of it just being an almost invisible smear across the screen. To do it "right" you basically have to do motion interpolation first, then blur based on that, and if you're doing motion interpolation you may as well just show the sharp interpolated mid frames.
On top of that, motion blur tends to be computationally very expensive and you end up getting illegible 30fps instead of smooth 60+.
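A toy sketch of that double-vision problem, in Python/NumPy with a made-up single-dot "frame" (nothing here is from a real engine):

```python
import numpy as np

H, W = 64, 256

def frame_with_dot(x):
    """One hypothetical 'frame': a bright 4x4 dot at horizontal position x."""
    img = np.zeros((H, W))
    img[H // 2 - 2 : H // 2 + 2, x - 2 : x + 2] = 1.0
    return img

# The object jumps most of the way across the screen in one frame.
prev, curr = frame_with_dot(20), frame_with_dot(220)

# Frame-overlay "blur": average the two frames. Both copies survive
# at half brightness -- that's the double-vision artifact.
overlay = 0.5 * (prev + curr)
print("ghost pixels:", np.count_nonzero(overlay >= 0.5))  # two crisp dots

# A shutter-style blur spreads the same energy along the whole path,
# leaving a dim smear instead of two distinct images.
shutter = np.mean([frame_with_dot(x) for x in range(20, 221, 10)], axis=0)
print("smear peak brightness:", round(shutter.max(), 3))  # far dimmer
```

Averaging the two frames leaves both copies fully intact; only spreading the energy along the path reads as an actual smear.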
This is not how motion blur works at all. Is there a specific game you're talking about? Are you sure this is not monitor ghosting?
Motion blur in games costs next to no performance. It does use motion data, but not to generate in-between frames; it uses it to smear the pixels of the existing frame.
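For what it's worth, a minimal sketch of that single-frame smear, assuming a hypothetical per-pixel velocity buffer (the function and names are made up, not any engine's API):

```python
import numpy as np

def velocity_smear(frame, velocity, samples=8):
    """Post-process motion blur sketch: for each pixel, average a few
    taps along that pixel's screen-space motion vector. Only the
    current frame is read -- no in-between frames are generated."""
    h, w = frame.shape
    out = np.zeros_like(frame)
    ys, xs = np.mgrid[0:h, 0:w]
    for i in range(samples):
        t = i / (samples - 1)  # 0..1 along the motion vector
        sy = np.clip((ys - velocity[..., 1] * t).astype(int), 0, h - 1)
        sx = np.clip((xs - velocity[..., 0] * t).astype(int), 0, w - 1)
        out += frame[sy, sx]
    return out / samples

# Toy scene: one bright dot, with everything moving 30 px/frame rightward
# (as if the camera were panning).
frame = np.zeros((32, 64))
frame[16, 40] = 1.0
velocity = np.zeros((32, 64, 2))
velocity[..., 0] = 30.0
print(velocity_smear(frame, velocity).max())  # dot smeared into a dim streak
```

Because every output pixel only re-reads the already-rendered frame a handful of times, the cost is a few texture taps per pixel, which is why it's cheap compared to rendering extra frames.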
DoF is hit or miss depending on the game, for me. I turn it off in games that have rather poor context sensitivity for what it blurs, but I'm okay with it in games where it only applies to, like, ADS. The former I hate because there are so many times I'm trying to get a good look at something, and it constantly blurs what I'm looking at because it's too close, or too far, or the cross hair isn't exactly on the right pixel, etc.
Playing MGS5 again recently, and it annoys me that I can't turn DoF off (at least on PS5), because it works the way I dislike.
Perhaps the phrasing is wrong, but you could give OP the benefit of the doubt and think about what you like about it, since turning it off is the de facto standard. For example, you could say "it makes me feel like I'm actually going faster, but also I just like it and your question is dumb". Informative and mean at the same time!
If a gay man asked you "what do you find attractive about women" or the N other combos of that question would you helpfully say "get lost weirdo, I like what I like and there is no point in discussing it"?
Note that while you're shitting on OP, OP at no point said your opinion is wrong, just that they wished to understand. You're the bad guy here, with unnecessary hostility in response to a question.
Motion blur in video games doesn't really work for many people. For example, it induces nausea for me. For others, it makes it difficult to identify and analyze a scene properly.
The OP's question asks you why you leave it on. Your answer could very well have ended at "Because I like it", but you chose to read it in bad faith and proceeded to make it about preference bashing, which it's clearly not.
OP's title, and similarly phrased ones for other commonly disliked settings, aren't actually looking for dialogue... they're just "hey guys, light mode, amirite?" jokes phrased as questions.
Motion blur is a win if it's done correctly. Your visual system can make use of that blur to determine the movement of objects; it expects it. Move your hand quickly in front of your eyes -- your fingers are a blur.
If you've ever seen something filmed at a high frame rate and then played back at a low frame rate without any sort of interpolation, it looks pretty bad. Crystal-clear stills, but jerky.
A good approximation -- if a computationally expensive one -- is to keep ramping FPS higher and higher.
But... that's also expensive, and your head can't actually process 1000 Hz or whatever. What it's getting is just a blur of multiple frames.
It's theoretically possible to have motion blur approaches that are more efficient than fully rendering each frame, slapping it on a monitor, and letting your eye "blur" it. That being said, I haven't been very impressed by what I've seen so far in games. But if done correctly, yeah, you'd want it.
EDIT: A good example of a specialized motion blur that's been around forever in video games has been the arc behind a swinging sword. It gives the sense of motion without having to render a bazillion frames to get that nice, smooth arc.
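That brute-force "ramp the FPS and let your eye blend it" idea has an old software analogue, the accumulation buffer: render several sub-frames inside one display interval and average them. A toy sketch, with a hypothetical render(t) standing in for a real renderer:

```python
import numpy as np

def render(t):
    """Hypothetical renderer: the scene at time t (here, a fast-moving dot)."""
    img = np.zeros((32, 128))
    img[16, int(8 + 3000 * t) % 128] = 1.0
    return img

def display_frame(frame_start, frame_len=1 / 60, subframes=16):
    """Accumulation-buffer motion blur: instead of one instantaneous
    sample per display frame, average sub-frames spread across the
    frame's whole exposure window."""
    ts = frame_start + frame_len * np.arange(subframes) / subframes
    return np.mean([render(t) for t in ts], axis=0)

blurred = display_frame(frame_start=0.0)
print("streak pixels:", np.count_nonzero(blurred))  # the dot becomes a smear
```

Each display frame now costs 16 renders, which is the expensive part; post-process smears like the velocity-buffer trick mentioned elsewhere in the thread exist to fake this result from a single render.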
One other factor that I think is an issue with motion blur: the modeling of shifting gaze in video games often isn't fantastic, due to input and output device limitations.
So, say you're just looking straight ahead in a game. Then motion blur might be fine -- only moving objects are blurred.
But one very prominent place where motion blur shows up is when the direction of your view is changing.
In a video game, especially if you're using a gamepad, it takes a while to turn around. And during that time, if the game is modeling motion blur, your view of the scene is blurred.
Try moving your eyeballs from side to side for a bit. You will get a motion-blurred scene. So that much is right.
But the problem is that if you look to the side in real life, it's pretty quick. You can maybe snap your eyes there, or maybe do a head turn plus an eye movement. It doesn't take a long time for your eyes to reach their destination.
So you aren't getting motion blur of the whole surrounding environment for long.
That is, humans have eyes that can turn rapidly and independently of our heads to track things, and heads that can turn independently of our torsos. So we often can keep our eyes facing in one direction or snap to another direction, and so we have limited periods of motion blur.
Then on top of that, many first-person shooters and other games have a crosshair centered on the view, so aiming involves moving the view too. That is, the twin-stick video game character is basically an owl: eyes fixed in position relative to their head, head fixed relative to their torso (at least in terms of yaw), a gun strapped to their face, and a limited rate of turn on top of it all. A real-life person like that would probably find motion blur more prominent too, since a lot of the time they'd have to keep moving their view relative to what they want to be looking at.
Might be that it'd be better if you're playing a game with a VR rig, since then you can have -- given appropriate hardware -- eyetracking and head tracking and aiming all separate, just like a human.
EDIT: Plus the fact that monitors usually cover a smaller FOV than human vision, so you have to move your direction of view more for situational awareness.
Human field of view is around 210 degrees horizontally. Each eye has about 150 degrees, with about 110 degrees common to the two and 40 degrees visible only to that eye.
A typical monitor takes up a considerably smaller chunk of one's viewing arc. My recollection from past days is that PC FPS FOV is traditionally rendered at 90 degrees. That's usually a fisheye-lens effect, though -- the actual visible arc of the screen is usually lower, like 50 degrees, if you wanted an undistorted view. IIRC, true TV FOV is usually even smaller, as TVs are larger but viewers sit a lot further away, so console games might be lower. So you're working with this relatively small window into the video game world, and you need to move your view around more to maintain situational awareness; again, more movement of your direction of view. A VR rig also might help with that, I suppose, due to the wide FOV.
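If you want to sanity-check those numbers, the undistorted arc a screen subtends is just trigonometry. The widths and distances below are made-up examples, not measurements:

```python
import math

def subtended_fov_deg(screen_width, viewing_distance):
    """Horizontal angle the screen occupies in your field of view:
    2 * atan((width / 2) / distance). Same units for both arguments."""
    return math.degrees(2 * math.atan(screen_width / (2 * viewing_distance)))

# Hypothetical setups (cm):
print(round(subtended_fov_deg(60, 65)))    # ~27" monitor at desk distance: ~50 deg
print(round(subtended_fov_deg(120, 250)))  # big TV across the room: ~27 deg
```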
Move your hand quickly in front of your eyes - your fingers are a blur.
Actually, it depends on the lights you're under whether it'll look smooth or not. The ones at my house make it slightly flickery, like there's no motion blur. If you have lights where you can control brightness, it'll look choppier the dimmer it is.
However, some lights are different; the ones I'm under right now on my work break look smooth.
It depends on the implementation. Properly implemented motion blur can look rather pleasing. Also, with new frame generation tech, I've found motion blur really helps smooth out the in-between frames.
70% of the time, bloom is garbage; 25% of the time, it's garbage and covering up other graphical issues; 5% of the time, it gives some nice depth to light and emphasizes brightness differences, even without HDR.
I wouldn’t say I particularly prefer it, but a lot of the time I don’t mind it or notice it enough to turn it off. There have been a few games where it’s been egregious enough to disable it as soon as I can, though.
When I enable it, it makes everything so blurry that I can only properly see stuff when I stop moving my mouse. Is that because of low framerate? (It happens in nearly every game I try to enable it in, even when setting motion blur to the lowest amount.)
Some games are designed with motion blur in mind. Elden Ring, for example, looks very unpleasant to me in 60 FPS without motion blur. But I disable it when using a mod that unlocks the FPS.
I usually turn on a light motion blur in games where I don't get above 40-ish fps, because the motion blur masks the stuttering. I prefer no motion blur and stuttering to too much or bad motion blur, though.
I couldn't play Horizon Zero Dawn on the PS4 Pro, because the motion blur was really intense, even in performance mode, and there was no way to turn it off.
I really like it when games give you an intensity slider instead of just on or off. Spider-Man on the PS4, for example, runs at 30fps. It looks like a stuttery mess with motion blur off. With motion blur at the highest setting (which is the default, I think), you cannot see a thing when moving. But putting it at ~20% or so masks the stuttering very well without being a complete eyesore.
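A slider like that plausibly just blends the blurred result back over the sharp frame; a sketch under that assumption (the function and values are hypothetical, not the game's actual code):

```python
import numpy as np

def apply_blur_intensity(sharp, blurred, intensity=0.2):
    """Hypothetical intensity slider: linear blend between the sharp
    frame (intensity 0.0) and the fully blurred one (intensity 1.0)."""
    return (1.0 - intensity) * sharp + intensity * blurred

# A bright pixel keeps most of its punch at ~20% blur,
# while still picking up some of the smear around it.
sharp = np.zeros((4, 4)); sharp[2, 2] = 1.0
blurred = np.full((4, 4), 0.1)
print(apply_blur_intensity(sharp, blurred)[2, 2])  # 0.82
```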
I also like object based motion blur a lot, like the Jedi games have. Instead of blurring the camera movement, it only blurs the movement of objects that are actually moving (quickly), which has a nice effect, in my opinion.
In general though, I prefer having better performance and a clear image, but motion blur is a useable band-aid solution if performance is a limiting factor.
I have similar opinions about the likes of DLSS, FSR & Co.
I vastly prefer running games at native resolution, but when my GPU can't keep up, FSR it is.
I'm not yet convinced of frame generation as an alternative to motion blur for getting 30fps to feel a little closer to 60, but I haven't gotten around to testing that yet either. I'm not categorically against it in games, unlike in movies. Motion smoothing in TVs is a pest.
It smooths out the framerate, and it also just looks better to me 🤷‍♀️. I've been playing games since I was little, so I don't really get nauseous from it like others in this thread.
I have a pretty high-end computer, but I also keep it on when playing games on my Steam Deck.
I use it occasionally; in some games it looks better. Particularly games where the camera doesn't swing around as wildly, meaning NO FPS GAMES! Or any game where you're manually moving the camera all the time. I have yet to see an FPS where motion blur doesn't fucking blind me for every split second I move.
There is no reason a Ryzen 5 4000 and a GTX 1650 with 16 GB of RAM shouldn't be able to run a game at 60 fps at 1080p native resolution, or at 1440p (the resolution of the monitor I use now) with upscaling, and still look decent. That's not even an opinion thing; Cyberpunk runs at a good framerate at 1440p looking absolutely gorgeous with FidelityFX 3, but I shouldn't even need that. Also, "just upgrade your PC" is like telling a homeless guy to just buy a house, because 1) PC parts are expensive and 2) I have a laptop, so I can't just upgrade bits and pieces.