Some iPhone users recently reported a strange bug in Apple’s automatic dictation feature: when they dictated the word “racist,” the word “Trump” briefly appeared on screen before correcting itself.
The issue, replicated by The New York Times, quickly went viral on TikTok, sparking debates about Apple’s artificial intelligence capabilities.
An Apple spokeswoman attributed the glitch to phonetic similarities between the words and assured users that the company was working on a fix.
John Burkey, founder of Wonderrush.ai and a former Apple Siri team member, suggested that the issue might stem from software code rather than AI training data. “This smells like a serious prank,” he said. “The only question is: Did someone slip this into the data or the code?”
This is not the first AI-related mishap for Apple. Last month, the company disabled its news summarization feature after it inaccurately summarized headlines. In 2018, Siri was embroiled in controversy when it displayed a nude image in response to “Who is Donald Trump?”—a problem linked to rogue Wikipedia edits.
The dictation issue emerged just a day after Apple announced a $500 billion investment in U.S. manufacturing, including a new AI server facility in Houston. The investment followed multiple meetings between Apple CEO Tim Cook and President Trump, highlighting their ongoing business relationship.