I use a 1080p monitor, and what I've noticed is that once creators start uploading 4K content, the 1080p version that I watch in fullscreen has more artifacting than when they only uploaded in 1080p.
Did you notice that as well?
Watching in 1440p on a 1080p monitor results in a much better image, at the cost of a theoretically less sharp image and a lot higher CPU usage.
YouTube automatically generates lower-resolution versions of every uploaded video.
So when you watch a 4K video and switch to 1080p, you are no longer watching the original upload but one re-encoded by YouTube itself, which can have more artifacts since it has been resized and compressed again.
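To make that concrete, here's a rough sketch of what "generate a lower-resolution rendition" means in practice. YouTube's actual pipeline, per-rendition codecs and bitrate targets aren't public, so the ffmpeg settings below are purely illustrative guesses (and assume ffmpeg with VP9/Opus support is installed):

```typescript
// Purely illustrative: YouTube's real transcoding pipeline and settings are not public.
// This just shows what re-encoding an upload into a smaller rendition looks like,
// calling ffmpeg (assumed to be installed) from Node.
import { execFileSync } from "node:child_process";

function makeRendition(input: string, height: number, bitrateKbps: number): string {
  const output = `${input}.${height}p.webm`;
  execFileSync("ffmpeg", [
    "-i", input,                 // the original upload, e.g. a 4K master
    "-vf", `scale=-2:${height}`, // resize down, keeping the aspect ratio
    "-c:v", "libvpx-vp9",        // VP9, one of the codecs YouTube serves
    "-b:v", `${bitrateKbps}k`,   // bitrate target -- a guessed value, not YouTube's
    "-c:a", "libopus",
    output,
  ]);
  return output;
}

// e.g. makeRendition("upload_2160p.mp4", 1080, 2500);
```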
I dunno the exact specs (like bit rate, etc.), someone will probably add them in another reply.
I believe YouTube always re-encodes the video, so the video will contain (extra) compression artefacts even if you’re watching at the original resolution. However, I also believe YouTube’s exact compression parameters aren’t public, so I don’t believe anyone outside of YouTube itself knows for sure which videos are compressed in which ways.
What I do know is that different content also compresses in different ways, simply because some video is easier or harder to compress. IIRC, shows like Last Week Tonight (mostly a static camera looking at a host) are way easier to compress than higher-paced content, which (depending on the previously mentioned unknown parameters) could have a large impact on the amount of artefacts. This makes it more difficult to compare different videos uploaded at different resolutions.
Just to be clear it is probably a good thing that YouTube re-encodes all videos. Videos are a highly complex format and decoders are prone to security vulnerabilities. By transcoding everything (in a controlled sandbox) YouTube takes most of this risk on and makes it highly unlikely that the resulting video that they serve to the general public is able to exploit any bugs in decoders.
Plus YouTube serves videos in a variety of formats and resolutions (and now different bitrates within a resolution). So even if they did try to preserve the original encoding where possible you wouldn't get it most of the time because there is a better match for your device.
YouTube compresses the shit out of 1080p content. Any video that has a lot of movement will look like trash at 1080p. Even if you're on a lower resolution monitor, the higher bit rate of higher resolution videos will look better. It's all very stupid on our end, but I assume it saves them a ton on bandwidth.
I'm pretty sure that YouTube has been compressing videos harder in general. This loosely correlates with their release of the "1080p Enhanced Bitrate" option. But even 4k videos seem to have gotten worse to my eyes.
Watching at a higher resolution is definitely a valid strategy. Optimal video compression is very complicated, and while compressing at the native resolution is more efficient, you can only go so far with fewer bits. Since the higher-resolution versions have higher bitrates, they just fundamentally have more data available and will give an overall better picture. If you are worried about possible fuzziness, you can try using 4K rather than 1440p, as it is a clean doubling of 1080p, so you won't lose any crisp edges.
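To put rough numbers on "more data available": YouTube's real per-rendition bitrates aren't public (see above), so the figures below are made-up ballpark values, but the arithmetic shows why a higher-resolution stream squeezed onto the same 1080p screen works out to more bits per displayed pixel:

```typescript
// Back-of-the-envelope only: the bitrates are guessed ballpark figures, not
// YouTube's actual numbers, which vary per video and aren't published.
const screenPixels = 1920 * 1080; // every rendition ends up on the same 1080p screen
const fps = 30;

const renditions = [
  { name: "1080p", kbps: 2_500 },
  { name: "1440p", kbps: 6_000 },
  { name: "2160p", kbps: 12_000 },
];

for (const r of renditions) {
  // More total bits per second spread over the same displayed pixels
  // means more data per pixel you actually see.
  const bitsPerScreenPixelPerFrame = (r.kbps * 1000) / (screenPixels * fps);
  console.log(`${r.name}: ~${bitsPerScreenPixelPerFrame.toFixed(2)} bits per screen pixel per frame`);
}
```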
There's something else that hasn't been mentioned yet: video games in particular have been so detailed since the eighth console generation (XB1/PS4) that 1080p, with its significant compression artifacts on YouTube, swallows too many of those fine moving details, like foliage, sharp textures, lots of moving elements (like particles) and full-screen effects that modify nearly every pixel of every frame.
And no, you will not get a less sharp image by downsampling 1440p or even 4K to 1080p; on the contrary. I would recommend you take a few comparison screenshots and see for yourself. I have a 1440p monitor and prefer 4K content - it definitely looks sharper, even down to fine-grained detail - and I did the same when I had a 1200p screen, preferring 1440p content then (at least as soon as it was available - the early years were rough).
If you are noticing high CPU usage at higher video resolutions, it's possible that your GPU is too old to handle the latest codecs - or that your operating system (since you're on Linux based on your comment history) doesn't have the right drivers to take advantage of the GPU's decoding ability and/or is struggling with certain codecs. Under normal circumstances, there should be absolutely no increased CPU usage at higher video resolutions.
It may be worth right-clicking the video and choosing "Stats for Nerds"; this will show you the video codec being used. For me, 1080p is typically VP9 while 4K is usually AV1. Since AV1 is a newer codec, it is quite likely that you don't have hardware decoding support for it.
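If you'd rather check from the browser itself, the Media Capabilities API reports whether decoding a given codec is expected to be "power efficient", which in practice usually means hardware decoding. A sketch you could paste into the console - the codec strings are just example AV1/VP9 profiles, not necessarily the exact ones YouTube serves:

```typescript
// Quick check: does this machine report efficient (typically hardware) decoding
// for AV1 vs VP9 at 4K30? The codec strings and bitrate are example values,
// not necessarily what YouTube actually delivers.
async function checkCodec(contentType: string): Promise<void> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType,
      width: 3840,
      height: 2160,
      bitrate: 12_000_000, // rough guess, bits per second
      framerate: 30,
    },
  });
  console.log(contentType, info); // { supported, smooth, powerEfficient }
}

checkCodec('video/webm; codecs="av01.0.08M.08"'); // AV1
checkCodec('video/webm; codecs="vp09.00.50.08"'); // VP9
```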
The one I've noticed is that for videos with the 1080p "Enhanced Bitrate" option, the free 1080p video looks like a blurry mess compared to normal 1080p content.
From my experience it doesn't matter if there is an "Enhanced Bitrate" option or not. My assumption is that around the time that they added this option they dropped the regular 1080p bitrate for all videos. However they likely didn't eagerly re-encode old videos. So old videos still look OK for "1080p" but newer videos look trash whether or not the "1080p Enhanced Bitrate" option is available.
About the “much higher CPU usage”: I’d recommend checking that hardware decoding is working correctly on your device, as that should ensure that even 4K content barely hits your CPU.
About the “less sharp image”: this depends on your downscaler, but a proper downscaler shouldn’t make higher-resolution content any blurrier than the lower-resolution version. I do believe integer scaling (e.g. 4K -> 1080p) is a lot less dependent on having a proper downscaler, so consider bumping the resolution up even further if the video, your internet, and your client allow it.
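For what it's worth, here is a tiny sketch of why an integer downscale is so forgiving: with a clean 2x factor, every output pixel is just the average of one exact 2x2 block of source pixels, so no fractional resampling kernel has to interpolate between samples. (Grayscale only and purely illustrative - real players do this on the GPU.)

```typescript
// Minimal 2x box-filter downscale on a grayscale frame: each output pixel is
// the average of exactly one 2x2 block of source pixels, which is why an
// integer-factor downscale barely depends on the quality of the scaler.
function downscale2x(src: Uint8ClampedArray, width: number, height: number): Uint8ClampedArray {
  const outW = width / 2;
  const outH = height / 2;
  const dst = new Uint8ClampedArray(outW * outH);
  for (let y = 0; y < outH; y++) {
    for (let x = 0; x < outW; x++) {
      const i = 2 * y * width + 2 * x; // top-left of the 2x2 source block
      dst[y * outW + x] =
        (src[i] + src[i + 1] + src[i + width] + src[i + width + 1]) / 4;
    }
  }
  return dst;
}

// e.g. downscale2x(fourKFrame, 3840, 2160) -> a 1920x1080 frame
```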
YouTube pushes the AV1 "format" heavily these days, which is hard to decode using hardware acceleration, given that a lot of devices out there still do not support it.
"which is hard to decode using hardware acceleration"
This is a little misleading. There is nothing fundamental about AV1 that makes it hard to decode, support is just not widespread yet (mostly because it is a relatively new codec).
Good point, though I believe you have to explicitly enable AV1 in Firefox for it to advertise AV1 support. YouTube on Firefox should fall back to VP9 by default (which is supported by a lot more accelerators), so not being able to decode AV1 shouldn’t be a problem for most Firefox users (and by extension most Lemmy users, I assume).
I haven’t noticed anything. Would you do me a disservice and explain what I’m missing in my blissful ignorance. Make me see something that can never be unseen.
I can only imagine that they (OP) have the quality setting on [auto]. That way YT might be constantly lowering the bitrate/resolution on them.
I do not have any issues either, but I use fixed quality settings.
No, that's not what they are talking about. Even if you set the video to 1080p and make sure that YouTube isn't lowering it to a lower resolution, it still won't look very good.
Whether you notice or not depends on how perceptive you are, the quality of your eyesight, and also the size and quality of your display. It's hard to notice on a low-grade laptop screen (or anything smaller), or on a cheap TN panel monitor, but go beyond around 20" with a decent IPS panel and those blocky compression artifacts are hard to miss.
I sit quite close to a large 1080p monitor. That's why I notice when the bitrate is low and the video I'm seeing lacks true 1920x1080 detail. Basically it's compressed so much that the image is noticeably worse than what my monitor could display. That's why, when I pick a higher-resolution stream like 1440p, the compression problems don't show as badly on a screen that will only display 1080p anyway. That's what I'm talking about. On a phone or a laptop screen it will probably be less noticeable. I guess that's why YouTube does it: it probably saves them a huge amount of bandwidth, and people who want really good quality video might already have 4K displays, which then get a much higher-bitrate video feed anyway.
I guess 1080p monitors are starting to become a niche. More and more of the people watching at that resolution are on smartphones, I guess, so it really makes sense to give it a very low bitrate.
Turns out, I have an old dumb FullHD TV that should be ideal for this experiment. So, if I watch a YT video on 1080p, I should be able to see compression artefacts that are invisible when using a higher resolution. How is that supposed to work anyway, given that the browser knows the output resolution? Will it just download a higher resolution video, drop every other pixel, and display the rest?