“But perhaps the most meaningful use case for a language replicator has to do with the specific language it works for: the language of the internet from which its training data was scraped. As Timnit Gebru and her research colleagues correctly pointed out in the stochastic parrots paper, these LLMs train mostly on, and perform best on, the linguistic tropes of the internet, whose writers skew well-off, well-connected, and, in many datasets, largely anglophone.”
https://chelseatroy.com/2024/06/23/how-do-we-build-the-future-with-ai/