Massive battery deployment stabilizes America's electrical infrastructure, allowing for renewable energy growth and preventing blackouts.
According to data from the Energy Information Administration (EIA), more than 20 gigawatts (GW) of battery capacity have been added to the US electric grid in the last four years. This rapid expansion is equivalent to the output of roughly 20 nuclear reactors and is crucial for averting power disruptions, especially in states that rely heavily on intermittent renewable energy sources such as wind and solar.
How can an article reliably convey information when the core measurement unit it uses is invalid? I'm not talking about slightly wrong numbers, but about the foundation of the article.
Storage does have two relevant metrics: how fast it can charge/discharge, in GW, and how much energy it holds, in GWh. For batteries the two numbers are typically equal, while other storage technologies can usually discharge a large number of GWh, but only at a slow rate. The discharge rate is often limited by the available line capacity as well.
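Quick back-of-the-envelope sketch of how those two figures relate, assuming a constant-rate discharge and made-up numbers (not from the article):

```python
# Power (GW) says how fast you can discharge; energy (GWh) says how much you hold.
# Hypothetical figures for illustration only.
power_gw = 1.0     # maximum discharge rate
energy_gwh = 4.0   # total stored energy

hours_at_full_power = energy_gwh / power_gw
print(f"{power_gw} GW / {energy_gwh} GWh battery: sustains full output for {hours_at_full_power} h")
```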
They're only equal if the battery runs at a 1C discharge rate, i.e. it empties its full capacity in one hour. LFP cells, which are stable and good for safety, can support higher discharge rates, from 5C up to 25C, which would mean the energy capacity is much smaller than the GW figure suggests. To compare apples to apples, it'd be much better if they gave both the GW and the GWh numbers.
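To make the C-rate point concrete, here's a toy sketch with a hypothetical 2 GWh pack: discharge power is just the C-rate times the energy capacity, so the two numbers only match at 1C.

```python
# C-rate links the two numbers: power (GW) = C-rate * energy (GWh),
# and a full discharge takes 60 / C-rate minutes. Hypothetical pack size.
energy_gwh = 2.0

for c_rate in (1.0, 5.0, 25.0):
    power_gw = energy_gwh * c_rate   # deliverable power at that C-rate
    minutes = 60.0 / c_rate          # time to drain the pack completely
    print(f"{c_rate:>4.0f}C: {power_gw:>4.0f} GW for {minutes:>4.1f} min from a {energy_gwh} GWh pack")
```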
Turns out nobody cares about the energy capacity, only about the discharge rate, which is why you'll often hear how many gigawatts a given energy storage project has and nothing about GWh.
If you think about it for a bit, it does make sense. A lot of storage solutions would take days to fully discharge, so you might think "oh, we have an entire city's energy consumption for a week in storage", but in reality you could maybe power one neighborhood with it because of the discharge rate.
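Rough numbers for that last point (all made up): even a week of a city's consumption in storage is only as useful as the rate you can pull it out at.

```python
# Big GWh, small GW: the discharge rate caps how much load you can actually serve.
# All figures hypothetical.
city_demand_gw = 2.0                          # average city load
stored_energy_gwh = city_demand_gw * 24 * 7   # "a week of the city's consumption"

max_discharge_gw = 0.1                        # slow-discharge storage, or a thin line
print(f"Stored: {stored_energy_gwh:.0f} GWh")
print(f"But only {max_discharge_gw / city_demand_gw:.0%} of the city can be served at once")
```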