Machine translators have made it easier than ever to create error-plagued Wikipedia articles in obscure languages. What happens when AI models get trained on junk pages?
Sure, there are limitations. The point still stands: an imperfect machine translation is better than no translation, as long as people understand that's what it is.
Can we afford to let a high bar for quality deprive people of knowledge just because of the language they speak?
The article complains about the effect of poor machine translations on languages, but the effect of no translations is worse. Yes, those Greenlanders should be able to read all of Wikipedia without learning English, even if the project has no human translators.
Wikipedia already has a button where you can go to another language’s version of that page where you can then machine translate it yourself.
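Those sidebar links are also exposed programmatically through the MediaWiki API's langlinks property, so you can check which languages a given article actually exists in before reaching for machine translation at all. A rough Python sketch (the function name and the "Greenland" example are just illustrative, not anything Wikipedia ships):

    import requests

    # Query the public MediaWiki API for a page's interlanguage links,
    # i.e. the same data behind the language-switcher in the sidebar.
    API_URL = "https://en.wikipedia.org/w/api.php"

    def other_language_versions(title, limit=500):
        """Return a dict mapping language codes to the page's title in that language."""
        params = {
            "action": "query",
            "format": "json",
            "prop": "langlinks",
            "titles": title,
            "lllimit": limit,
        }
        resp = requests.get(API_URL, params=params, timeout=10)
        resp.raise_for_status()
        pages = resp.json()["query"]["pages"]
        links = {}
        for page in pages.values():
            for link in page.get("langlinks", []):
                links[link["lang"]] = link["*"]
        return links

    if __name__ == "__main__":
        versions = other_language_versions("Greenland")
        # "kl" is the code for Kalaallisut (Greenlandic); None means no article exists
        print(versions.get("kl"))

If the language code you want (say "kl" for Kalaallisut) isn't in the result, there's simply no article in that language to read, machine-translated or otherwise.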
I didn’t know that. I guess my “English privilege” is showing
Again, you're assuming a high level of accuracy from these tools. If LLM garbage leaves it unreadable, is that actually better?