I watched this happen in a research institution in the early 2000s. Scientists who had been heavy SGI, Sun, or HP customers realized that they could get a lot more bang for their buck with beige-box PCs running Linux (or occasionally FreeBSD). Aside from up-front costs, hardware upgrades and replacements were much cheaper and easier to get for PCs.
The big Unix vendors did not help their reputation when they started selling Windows machines — which all of them except Sun did in that era. It became increasingly clear that commercial Unix for scientific computing no longer had a future.
SGI was able to sputter along while graphics cards caught up. Still, its large systems had some incredible hardware, like the interconnects, that I'm afraid got lost to humanity.
Yes, commercial Unix cost money, but companies didn't switch to Linux; they switched to Windows.
So the cost of Linux doesn't have anything to do with this.
The reason Unix workstations died was that Windows became good enough and PCs were getting good enough.
Windows PCs were cheaper than Unix workstations, and if they were good enough, there was no reason for a company to pick anything else.
Then we have the last nail in the coffin: Itanium.
Itanium was supposed to be the future. Intel marketed it hard enough that companies that had previously developed their own CPUs decided to switch to Itanium and abandon their in-house designs.
Then, when Itanium turned out to be a dud, they had nowhere to go.
I didn't mention Windows because I don't remember any serious workstations using it. PCs did become good enough, but they still weren't at the professional workstation level. Though if we're talking about computing in general, Windows has been the king ever since 95 or even 3.0.