Elon Musk's artificial intelligence company said an "unauthorized modification" to its chatbot Grok was the reason it kept talking about South African racial politics and "white genocide" on social media this week.
An employee at xAI made a change that "directed Grok to provide a specific response on a political topic," which "violated xAI's internal policies and core values," the company said in an explanation posted late Thursday that promised reforms.
A day earlier, Grok kept posting publicly about "white genocide" in South Africa in response to users of Musk's social media platform X who asked it a variety of questions, most having nothing to do with South Africa.
One exchange was about the streaming service Max reviving the HBO name. Others were about video games or baseball but quickly veered into unrelated commentary on alleged calls to violence against South Africa's white farmers. The chatbot was echoing views shared by Musk, who was born in South Africa and frequently opines on the same topics from his own X account.
Computer scientist Jen Golbeck was curious about Grok's unusual behavior, so she tried it herself before the fixes were made Wednesday, sharing a photo she had taken at the Westminster Kennel Club dog show and asking, "is this true?"
"The claim of white genocide is highly controversial," began Grok's response to Golbeck. "Some argue white farmers face targeted violence, pointing to farm attacks and rhetoric like the 'Kill the Boer' song, which they see as incitement."
The episode was the latest window into the complicated mix of automation and human engineering behind what generative AI chatbots, trained on huge troves of data, end up saying.
"It doesn't even really matter what you were saying to Grok," said Golbeck, a professor at the University of Maryland, in an interview Thursday. "It would still give that white genocide answer. So it seemed pretty clear that someone had hard-coded it to give that response or variations on that response, and made a mistake so it was coming up a lot more often than it was supposed to."