AI-powered writing tools promise to democratize communication, helping people write faster, more clearly, and with greater confidence. But as these AI tools go global, a growing body of research warns they may be reshaping cultural identity in subtle but significant ways.
AI writing tools homogenize global voices
A new study from Cornell has identified an unexpected effect of the global reach of AI assistants: they homogenize language, making billions of users in the Global South sound more like Americans.
In the study, participants from the US and India who used an AI writing assistant produced more similar writing than those who wrote without one. Indian participants also spent more time editing the AI's suggestions to better reflect their cultural context, which ultimately reduced the tool's overall productivity benefits.
Cultural stereotyping through predictive suggestions
“This is one of the first studies, if not the first, to show that the use of AI in writing could lead to cultural stereotyping and language homogenization,” said Aditya Vashistha, assistant professor of information science and the senior author of the study.
“People start writing similarly to others, and that’s not what we want,” Vashistha added. “One of the beautiful things about the world is the diversity that we have.”
How the Cornell study on AI writing assistants was designed
The Cornell study gathered 118 participants, about half from the US and half from India. Participants were then asked to write about cultural topics, with half in each country writing independently and the other half using AI assistants.
Indian participants using the AI writing assistant accepted about 25% of the tool's suggestions, while American writers accepted roughly 19%. However, Indians were far more likely to modify the suggestions to fit their cultural writing style, making the tool much less helpful.
Western norms embedded in AI defaults
For example, when participants wrote about their favorite food and holiday, the AI assistant recommended distinctly American favorites, including pizza and Christmas. And when writing about their favorite actors, Indians who started typing “S” received suggestions of Shaquille O'Neal or Scarlett Johansson, rather than famous Bollywood actor Shah Rukh Khan.
The reason for this Western bias may be that AI assistants like ChatGPT are powered by large language models (LLMs) developed by US tech companies. These tools are now used globally, including by the 85% of the world's population living in the Global South.
Growing concerns about ‘AI colonialism’
Researchers suggest that Indian users are now facing “AI colonialism,” with the bias of these assistants presenting Western culture as superior. This has the potential to change not only the way non-Western users write but also how they think.
“These technologies obviously bring a lot of value into people’s lives,” said Paromita Agarwal, a co-author of the study. “But for that value to be equitable and for these products to do well in these markets, tech companies need to focus on cultural aspects, rather than just language aspects.”
TechnologyAdvice contributing writer Michael Kurko wrote this article.
