It'll happen to you!
Remember DVI?
Hell, remember dot-matrix printer terminals?
VGA and DVI honestly were both killed off way too soon. Both will perfectly drive a 1080p 60 Hz display, and really the only reason to use HDMI or DisplayPort with such a display is if that matches your graphics output.
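For anyone who wants the numbers behind that claim, here's a rough back-of-the-envelope sketch in Python. The timing figures are the usual CEA-861 1080p60 numbers and the 165 MHz cap is the single-link DVI pixel clock limit; exact timings vary by monitor, so treat it as a ballpark, not gospel.

```python
# Back-of-the-envelope: does 1080p at 60 Hz fit within single-link DVI?
# Timing values are the common CEA-861 1080p60 figures (assumed, not from this thread).

active_h, active_v = 1920, 1080   # visible pixels
total_h, total_v = 2200, 1125     # totals including blanking intervals
refresh_hz = 60

pixel_clock_mhz = total_h * total_v * refresh_hz / 1e6   # ~148.5 MHz
single_link_dvi_cap_mhz = 165                            # single-link DVI TMDS pixel clock cap

print(f"Mode: {active_h}x{active_v} @ {refresh_hz} Hz")
print(f"Required pixel clock: {pixel_clock_mhz:.1f} MHz")
print("Fits single-link DVI?", pixel_clock_mhz <= single_link_dvi_cap_mhz)
```

So a plain single-link DVI cable (and analog VGA, which has no hard digital cap at all, just signal-quality limits) has the headroom for 1080p60; it's higher resolutions and refresh rates where you genuinely need dual link, HDMI, or DisplayPort.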
The biggest shame is that DVI never took off with its dual-monitor or USB features. Seriously, there was a DVI Dual Link with USB spec, so you could legitimately use a single cable (with screws to prevent accidental disconnects) to connect your computer to all of your office peripherals. Instead we had to wait for Thunderbolt to recreate those features, but worse and more likely to drop out.
DVI -- sure, but if you think 1080p over VGA looks perfect, you should probably get your eyes checked.
I wouldn't be surprised if it varies by monitor, but I've encountered plenty of monitors professionally where I could not tell the difference between the VGA and HDMI inputs. I can absolutely tell when a user has one of the cheap adapters in the mix, though, and I generally make a point of getting those adapters out of production as much as possible: not only do they noticeably fuck with the signal (most users don't care, but I can see it at least), they also tend to fail and create unnecessary service calls.
We are still using VGA on current-gen installations for the primary display on the most expensive patient monitoring system products in the world. If there's a second display, it gets DisplayPort. 1080p is still the max resolution.
My dude, VGA is ANALOG. It absolutely cannot run an LCD and make it look right; it is for CRTs.
The little people inside the LCD monitor can absolutely turn the electricity into numbers. I've seen them do it.
This is incorrect. Source: do it all the time for work.
Remember? I still use it for my second monitor. My first interaction with DVI was also on that monitor, probably 10-15 years ago at this point. Going from VGA to DVI-D made everything much clearer and sharper. I keep using this setup because the monitor has a great stand and doesn't take up much space with its 4:3 aspect ratio. 1280x1024 is honestly fine for having voice chat, Spotify, or some documentation open.
Hell yeah. My secondary monitor is a 1080p120 shitty TN panel from 2011. I remember the original stand had a big “3D” logo; remember those NVIDIA shutter glasses?
Connecting it is a big sturdy DVI-D cable that, come to think of it, is older than my child, my cars, and any of my pets.
Remember it? I work on PCs with DVI connected monitors every day.
Hell, I still use VGA for my work computer. I have the DisplayPort connected to the gaming laptop, and VGA connected to the work CPU. (My monitors are old, and I don't care)
My monitors are old, and I don't care
Sung to the tune of Jimmy crack corn.
My monitor is 16 years old (1080p, and that's enough for me) and it can take DVI or HDMI. The HDMI input is not great when driven from a computer on that specific model.
So I've been using DVI for 16 years.
I ran DVI for quite a while, until my friend's BenQ was weirdly green over HDMI and no amount of poking through the monitor menu would fix it. So we traded cords and I never went back to DVI. I ran DisplayPort for a while when I got my 2080 Ti, but for some reason the proprietary Nvidia drivers (I think around v540) on Linux would cause weird diagonal lines across my monitor with certain colors/windows.
However, the previous driver version didn't do this, so I downgraded the driver on Pop!_OS, which was easy because it keeps both the newest and previous drivers on hand. Then I distro-hopped to a distro that didn't have an easy way to roll back drivers, so my friend suggested HDMI and it worked.
I do miss my HDMI-to-DVI cable though. I was weirdly attached to that cord, but it'd probably just sit in my big box of computer parts that I may need... someday. I still have my 10+ VGA cords though!
Yeah, it was a weird standard; just like today's USB-C, it could support different things.