On the other hand, if we move away from larger and larger models trained on as much data as they can gather, and toward less generic, more specific high-quality datasets, I have a feeling there's still a lot to gain. But quality over quantity takes a lot more effort to maintain.
The video is more about the diminishing returns from increasing the size of the training set. The gains follow a logarithmic curve, so at some point just “adding more data” won’t do much, because the cost becomes too high compared to the gain in accuracy.
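To make that concrete, here's a toy sketch of what a logarithmic return curve looks like. The exact formula and the constants are assumptions for illustration only, not anything from the video: the point is just that each 10x increase in data buys roughly the same absolute gain, so the gain per extra example keeps shrinking.

```python
import math

# Assumed toy model: accuracy grows roughly with log10 of dataset size,
# capped at 1.0. The constants a and b are made up for illustration.
a, b = 0.40, 0.05

def accuracy(n_examples: int) -> float:
    """Hypothetical accuracy as a function of training-set size."""
    return min(1.0, a + b * math.log10(n_examples))

# Every 10x more data adds only ~b in accuracy, while the cost of
# collecting and training on that data keeps growing.
for n in (10_000, 100_000, 1_000_000, 10_000_000):
    print(f"{n:>12,} examples -> accuracy ~ {accuracy(n):.2f}")
```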