What's the speed? Do you have a shitty 10 Mbps connection like my parents? Then WiFi, because you're easily saturating that line either way.
Do you have gigabit? Then Ethernet, but then again getting like 600 Mbps wirelessly is good enough.
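Back-of-the-envelope math for why the gap barely matters until you're pushing big files (the speeds below are just placeholder assumptions, not benchmarks):

```python
# Rough transfer time for a 5 GB download at different link speeds.
# The speeds are illustrative assumptions, not measurements.
file_megabits = 5 * 8 * 1000  # 5 GB (decimal) is about 40,000 megabits

links_mbps = {
    "10 Mbps DSL": 10,
    "600 Mbps WiFi": 600,
    "1 Gbps Ethernet": 1000,
}

for name, mbps in links_mbps.items():
    minutes = file_megabits / mbps / 60
    print(f"{name:>16}: {minutes:6.1f} min")
```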
Biggest thing is having GOOD coverage. My house has multiple access points so that my connection is great everywhere. People with a shitty ISP router shoved in the cupboard in their basement make no sense lol.
Packet loss, really, and the latency and jitter that said loss can contribute to.
Radio waves travel faster (at the speed of light) than signals do through a medium (copper).
Not that it matters at such a small scale, but it's helpful to have a good picture of the elements at work here. The further you are from the receiving point, the more obstacles (matter) there are to obstruct the signal. But in ideal conditions WiFi is better than most people think. Replicating those ideal conditions though...
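If you want to play with the "distance matters" part, the free-space path loss formula gives the best-case signal drop with no walls at all; real obstacles only make it worse. Quick sketch (the distances and frequencies are just examples):

```python
import math

# Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55
# with d in metres and f in Hz. Ignores walls and floors, which add
# a lot more loss, so these are best-case numbers for a given distance.
def fspl_db(distance_m: float, freq_hz: float) -> float:
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

for d in (1, 5, 10, 30):
    loss_24 = fspl_db(d, 2.4e9)
    loss_5 = fspl_db(d, 5.0e9)
    print(f"{d:>3} m: {loss_24:5.1f} dB @ 2.4 GHz, {loss_5:5.1f} dB @ 5 GHz")
```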
Radio waves travel faster (at the speed of light) than signals do through a medium (copper).
Except that copper Ethernet is baseband, so it's not radio waves. WiFi is still faster than copper AFAIK (there was a huge debate about this between YouTubers not that long ago), at least for how fast the signal propagates, but the difference is smaller than you think. Light through glass (which is EM, the same kind of wave as radio/WiFi) travels at about 2/3 of c (aka the speed of light), so fiber's propagation delay is actually slower than WiFi's and roughly on par with copper Ethernet's.
However, WiFi must use CSMA/CA as well as other tricks to make sure it doesn't step on itself or on other sources of radio interference (microwave ovens, wireless controllers like Xbox's, Bluetooth, Zigbee, etc. on 2.4 GHz, and stuff like radar on 5 GHz). It's half-duplex, so only one station can transmit at a time, hence CSMA/CA being required, whereas Ethernet doesn't need any collision avoidance or detection except in the rare case of 10/100 half duplex; all gigabit is full duplex. Half duplex on wireline networks is basically eliminated at this point, so it's little more than a footnote.
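For scale, here's the propagation-delay arithmetic over a typical in-home run; the velocity factors are ballpark assumptions, and the point is that everything lands in the tens of nanoseconds either way:

```python
# Propagation delay over 30 m for different media. Velocity factors are
# rough assumptions (twisted pair ~0.65c, fibre ~0.67c, air ~c).
C = 299_792_458  # speed of light in vacuum, m/s
DISTANCE_M = 30

media = {
    "WiFi (air, ~1.0c)": 1.00,
    "Copper Ethernet (~0.65c)": 0.65,
    "Fibre (~0.67c)": 0.67,
}

for name, vf in media.items():
    delay_ns = DISTANCE_M / (C * vf) * 1e9
    print(f"{name:>26}: {delay_ns:6.1f} ns")
```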
Factoring all this in, when it comes to actually getting data down the line, WiFi loses in almost every case because of all the coordination overhead it has to deal with.
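Rough toy model of that coordination overhead, using the usual DCF timing constants (9 µs slots, 34 µs DIFS, CWmin 15); it ignores frame airtime and ACKs, so it understates the real cost. Gigabit Ethernet, being full duplex, just transmits immediately:

```python
import random

# Toy model of 802.11 DCF channel access: every frame waits a DIFS plus a
# random backoff, and the contention window doubles on each retry.
# Constants are the common OFDM values; frame airtime and ACKs are ignored.
SLOT_US, DIFS_US = 9, 34
CW_MIN, CW_MAX = 15, 1023

def access_delay_us(retries: int) -> int:
    cw = min((CW_MIN + 1) * 2 ** retries - 1, CW_MAX)
    return DIFS_US + random.randint(0, cw) * SLOT_US

for r in range(4):
    samples = [access_delay_us(r) for _ in range(10_000)]
    avg = sum(samples) / len(samples)
    print(f"after {r} retries: avg {avg:6.0f} us before the frame even starts")
```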
What crap are you doing that's so intensive WiFi causes latency? It's essentially a negligible difference unless you are saturating the signal. We're talking less than 3 ms for a reliable round trip.
There are lots of factors that can cause jitter on WiFi, and it's mostly outside of your control if you're living somewhere more densely populated. My apartment randomly gets a lot of noise, and as a result my WiFi starts to get unacceptable amounts of packet loss and jitter. It doesn't happen often enough to motivate the effort for me to go around signal analyzing, but still...
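If anyone wants a low-effort way to put numbers on that without breaking out a spectrum analyzer, something like this works; it assumes a Linux/macOS `ping` and that 192.168.1.1 is your gateway, so adjust both:

```python
import re
import statistics
import subprocess

# Quick loss/jitter check against the router. Assumes a Linux/macOS `ping`
# and that 192.168.1.1 is the gateway -- change both for your setup.
TARGET, COUNT = "192.168.1.1", 50

out = subprocess.run(
    ["ping", "-c", str(COUNT), TARGET],
    capture_output=True, text=True,
).stdout

rtts = [float(m) for m in re.findall(r"time=([\d.]+)", out)]
if rtts:
    loss = 100 * (1 - len(rtts) / COUNT)
    jitter = statistics.stdev(rtts) if len(rtts) > 1 else 0.0
    print(f"loss {loss:.0f}%  avg {statistics.mean(rtts):.1f} ms  jitter (stdev) {jitter:.2f} ms")
else:
    print("no replies at all -- that's your answer")
```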
The picture, to me, says: would you rather have a desktop and a console plugged in via Ethernet plus 15 other devices connected wirelessly through your house, or just one single device plugged in?
No more "why's it down now"; no deauth attacks; no weird outages when highway traffic spikes from nav/music-streaming users getting tower timeouts that cause their WiFi to aggressively cry out for every known SSID.
With wired connections, I set it up once & it keeps working.
With WiFi, it's a constant shouting match version of the Telephone game, with openly malicious actors literally headquartered a few blocks away.
Wi-Fi has constant retransmissions. This adds perceptible latency because the checksum check, turnaround, and packet transmission add a lot of time compared to the speed of light through air across 3 meters.
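Some rough numbers on that (the PHY rate and overhead figures are guesses, just to show the orders of magnitude):

```python
# Scale comparison: flight time over 3 m of air vs the cost of one retransmission.
C = 299_792_458
prop_ns = 3 / C * 1e9                 # ~10 ns one way across the room

phy_mbps = 150                        # assumed WiFi PHY rate
airtime_us = 1500 * 8 / phy_mbps      # ~80 us to put one 1500-byte frame back on the air
overhead_us = 34 + 16 + 50            # DIFS + SIFS + rough ACK/backoff allowance
retry_us = airtime_us + overhead_us

print(f"propagation over 3 m: {prop_ns:9.1f} ns")
print(f"one retransmission  : {retry_us * 1000:9.0f} ns (~{retry_us:.0f} us)")
```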
I had 100 Mbps Ethernet because of an incompetent ISP worker who crimped only two of the four pairs (100BASE-TX only needs two, but gigabit needs all four). And AFAIR I had a 150 Mbps plan! Don't know what to wish upon that idiot.