We all think we know why Stadia died. Here's Google's official take.
A statement from Google employee Dov Zimring has been released as part of the FTC v. Microsoft court case (via 9to5Google). Only lightly redacted, the statement gives us a rundown of Google's position leading up to Stadia's closure and why, ultimately, Stadia was in a death spiral long before its actual demise.
"For Stadia to succeed, both consumers and publishers needed to find sufficient value in the Stadia platform. Stadia conducted user experience research on the reasons why gamers choose one platform over another. That research showed that the primary reasons why gamers choose a game platform are (1) content catalog (breadth and depth) and (2) network effects (where their friends play).
...
"However, Stadia never had access to the extensive library of games available on Xbox, PlayStation, and Steam. More importantly, these competing services offered a wider selection of AAA games than Stadia," Zimring says.
According to the statement, Google also offered to pay some, or all, of the costs of porting a game to Stadia's Linux-based streaming platform to try to get more games onto it. Still, in Google's eyes, this wasn't enough to compete with platforms that were easier to develop for, such as Nvidia's GeForce Now.
Because everything ran locally in a datacenter, the real killer app for Stadia would have been a super-massively multiplayer game. There wouldn't have been any latency problems between game states; the only lag would have been between the server and the player's screen. Imagine massive wars or medieval battles with thousands of participants. They never developed games that took advantage of what was unique about the platform.
AFAIK, MMOs already keep all the game state on the servers. The difference is what they send to the client: deltas to the game state, which the client then renders itself. Stadia-type services instead render on the datacenter side and send the client images.
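To make the contrast concrete, here's a minimal sketch of the two models. The message shapes and function names are invented for illustration; they aren't taken from any real MMO engine or Stadia API.

    from dataclasses import dataclass

    @dataclass
    class Entity:
        entity_id: int
        x: float
        y: float

    # Traditional MMO model: the server sends small state deltas,
    # and each client renders the world on its own hardware.
    def mmo_server_tick(world: dict, moved_ids: set) -> list:
        """Build a delta packet containing only the entities that changed this tick."""
        return [
            {"id": e.entity_id, "x": e.x, "y": e.y}
            for e in world.values()
            if e.entity_id in moved_ids
        ]

    def mmo_client_apply(local_world: dict, deltas: list) -> None:
        """The client applies deltas to its local copy, then renders with its own GPU."""
        for d in deltas:
            local_world[d["id"]] = Entity(d["id"], d["x"], d["y"])

    # Stadia-style model: the datacenter both simulates and renders,
    # then streams compressed video frames to a thin client.
    def streaming_server_tick(world: dict) -> bytes:
        """Render the frame server-side and return encoded video for the client to decode."""
        frame = render_frame(world)      # GPU work happens in the datacenter
        return encode_video(frame)       # stand-in for a real H.264/VP9 encoder

    def render_frame(world: dict) -> bytes:
        # Placeholder renderer; a real one would rasterize the scene.
        return f"frame with {len(world)} entities".encode()

    def encode_video(frame: bytes) -> bytes:
        # Placeholder encoder; a real one would compress the frame.
        return frame

Either way the authoritative simulation lives on the server; the difference is whether the client receives a few kilobytes of deltas to render locally or a multi-megabit video stream to decode.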
With their expertise in networking and so on, Google might have been able to get a slight advantage in server-to-server communication, but it wouldn't have enabled anything on a whole different scale, AFAIK.
IMO, their real advantage was that they could have handled platform switching seamlessly. Take an addictive turn-based game like Civilization. Right now someone might play 20 turns before work, commute in, think about it all day, then jump back in when they get home. With Stadia, you could keep playing on your phone as you take the train into work, play a few turns on a smoke break, maybe play in a web browser on your work computer if it's a slow day, then play again on the commute home and on the TV at home; and if someone wanted to watch a show, you could move to a PC, pull out your phone, or play on a laptop...
Larger-scale multiplayer was one of the features Google touted at Stadia's launch:
Over time, Buser [Google’s director of games] says we should not only see additional exclusive games on Stadia, but also cross-platform games doing things on Stadia “that would be impossible to do on a console or PC.” Instead of dividing up virtual worlds into tiny "shards" where only 100 or 150 players can occupy the same space at a time because of the limitations of individual servers, he says Google’s internal network can support living, breathing virtual worlds filled with thousands of simultaneous players.
https://www.theverge.com/2019/6/6/18654632/google-stadia-price-release-date-games-bethesda-ea-doom-ubisoft-e3-2019
Sure, they claimed that, but it's telling that no developer ever took them up on it.
Google's internal network may be good, but it's not going to be an order of magnitude better than what you can get in any other datacenter. If getting thousands of people into the same virtual space were just a matter of networking, an MMO would have done it already.
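A rough back-of-envelope (assuming a naive broadcast model, which real engines avoid with interest management and sharding) shows why raw network quality isn't the bottleneck:

    # Back-of-envelope: naive broadcast cost for one battle instance.
    # All numbers are illustrative assumptions, not measurements.
    players = 5_000
    tick_rate = 20            # simulation ticks per second
    bytes_per_update = 50     # one entity's position/orientation, roughly

    # Every tick, each player's update must reach every other player.
    messages_per_second = players * (players - 1) * tick_rate
    bandwidth_bytes = messages_per_second * bytes_per_update

    print(f"{messages_per_second:,} messages/s")            # ~500 million
    print(f"{bandwidth_bytes / 1e9:.1f} GB/s of fan-out")    # ~25 GB/s

That fan-out grows quadratically with player count, so a faster backbone only moves the wall a little; it doesn't remove it.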
A shard stores the position, orientation, and velocity of key entities (players, vehicles, etc.) in memory; if they're accessed frequently enough, they'll sit in the processor's cache. There's no way networking speeds can compare with the speed of accessing that data locally.
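For a sense of scale, here are the commonly cited ballpark latencies (order-of-magnitude figures only, not benchmarks of any particular hardware):

    # Ballpark latency figures (order of magnitude only).
    LATENCY_NS = {
        "L1 cache reference":               1,
        "Main memory reference":          100,
        "Round trip within a datacenter": 500_000,     # ~0.5 ms
        "Round trip between regions":     50_000_000,  # tens of ms
    }

    in_memory = LATENCY_NS["Main memory reference"]
    for name, ns in LATENCY_NS.items():
        print(f"{name}: {ns:,} ns  (~{ns / in_memory:,.0f}x a memory access)")

Even inside one datacenter, a cross-shard hop costs thousands of times more than touching the same data in RAM, which is why splitting one dense melee across shards hurts so much.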
That doesn't mean there couldn't have been some kinds of innovation. Take a game like Star Citizen, where there are space battles. In theory you could store the position and orientation of everything inside a ship in one shard, and the positions and orientations of the ships themselves in a second shard. Since people inside a ship aren't going to interact directly with things outside the ship except through the ship itself, you could maybe afford a bit of latency and inaccuracy there. But if you're just talking about a thousand-on-thousand melee, I think the latency between shards would be too great.
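A sketch of that ship-within-a-fleet idea (the names, coordinates, and update scheme are invented for illustration): the interior shard owns everything inside the hull and only periodically pulls the ship's transform from the fleet shard, tolerating a little staleness.

    from dataclasses import dataclass, field

    @dataclass
    class FleetShard:
        """Authoritative for ship positions in the overall battle space."""
        ship_positions: dict = field(default_factory=dict)   # ship_id -> (x, y, z)

        def move_ship(self, ship_id: str, pos: tuple) -> None:
            self.ship_positions[ship_id] = pos

    @dataclass
    class InteriorShard:
        """Authoritative for crew and objects inside one ship's hull."""
        ship_id: str
        crew_positions: dict = field(default_factory=dict)   # local coords inside the hull
        cached_ship_pos: tuple = (0.0, 0.0, 0.0)              # possibly a few ticks stale

        def sync_from_fleet(self, fleet: FleetShard) -> None:
            # Called at a lower rate than the interior tick; a little latency and
            # inaccuracy is acceptable because crew only interact with the outside
            # world through the ship itself.
            self.cached_ship_pos = fleet.ship_positions[self.ship_id]

        def world_position(self, crew_id: str) -> tuple:
            sx, sy, sz = self.cached_ship_pos
            cx, cy, cz = self.crew_positions[crew_id]
            return (sx + cx, sy + cy, sz + cz)

    # Usage sketch
    fleet = FleetShard()
    fleet.move_ship("idris-01", (1_000.0, 2_000.0, 300.0))
    interior = InteriorShard(ship_id="idris-01", crew_positions={"pilot": (2.0, 0.0, 1.5)})
    interior.sync_from_fleet(fleet)
    print(interior.world_position("pilot"))   # (1002.0, 2000.0, 301.5)

The design only works because the interaction between shards is funneled through one object (the ship); a flat thousand-on-thousand melee has no such natural boundary to cut along.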
You'd only be able to play with people local to you, in the same Stadia datacenter. And the more Stadia wanted to minimize latency, the more datacenters it would need, which means fewer people per instance.