Intel researchers create a method for AI-generating frames in games without added input latency

This is great, and I hope this technology can be implemented on older hardware that just barely misses today's high system requirements.
I hope this is not used as a crutch by developers to hide genuinely bad optimization and performance, as they have already been doing with upscalers like FSR/DLSS.
No, I fucking hope not. Older games rendered an actual frame. Modern engines render a noisy, extremely ugly mess and rely on temporal denoising and frame generation (which is why most modern games only show you scenes with static scenery and a very slowly moving camera).
Just render the damn thing properly in the first place!
It depends on what you want to render. High-fps scenarios with fast movement, where the human eye is the bottleneck, are a perfect case for interpolation: the imperfect generated frames aren't really noticed.
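For anyone unfamiliar with why plain interpolation adds input latency (which is what makes Intel's no-added-latency result interesting): an interpolated frame can only be built once the *next* real frame exists, so display has to lag behind the game simulation. A toy sketch, assuming NumPy arrays as frames; real frame generation uses motion vectors and neural networks, not a linear blend:

```python
import numpy as np

def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Naive linear blend between two already-rendered frames.

    Note the dependency on next_frame: the in-between frame cannot be
    shown until the following real frame has been rendered, which is
    where interpolation's added input latency comes from. Extrapolating
    from past frames only (as the article's method does) avoids that
    wait. This blend is purely illustrative, not how real frame
    generation works.
    """
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Two tiny 2x2 grayscale "frames" for demonstration.
a = np.zeros((2, 2), dtype=np.uint8)
b = np.full((2, 2), 100, dtype=np.uint8)
mid = interpolate_frame(a, b)  # halfway blend: every pixel is 50
```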
I think you are misunderstanding me, because I agree with you when the game's minimum hardware requirements are met.
I am saying I hope this technology can be used so that hardware below the minimum requirements could still get decently playable framerates in newer titles, with decreased visual quality as the obvious drawback. I agree that upscaling, and particularly TAA and its related effects, should not be used to lower system requirements when developers have simply designed their game poorly or leaned on ugly effects. But I think this can be useful for old systems, or perhaps for integrated graphics chips, depending on how the technology works. That was what I meant; sorry I was not clear enough initially.