Thursday, February 27, 2025

 

No, Apple isn’t subliminally calling Trump a racist via dictation


Voice Memos showing the audio waveforms for the spoken words ‘Trump’ and ‘racist’

Conspiracy theorists believe Apple may have coded speech-to-text to briefly show "Trump" when the word "racist" is said. That's not true, but there's a reason it happens.

Machine learning algorithms can be fickle things. Apple's on-device dictation is still run by machine learning models trained on massive data sets of user input.

The problem with teaching any algorithm anything is that it leans heavily on pattern matching. In the decade Donald Trump has been in the public eye, the word "racist" has likely appeared near "Trump" with increasing frequency in the kind of text these models learn from.

If the word association occurs often enough, it may tilt an algorithm one way. While the waveforms in the image above show distinct differences between the two words, there are enough phonetic similarities to confuse an algorithm further.
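Apple hasn't published how its dictation decoder works, but the tilt described above can be illustrated with a toy decoder that combines an acoustic score (how well the audio matches a word) with a language-model prior (how often the word appears in context). Every number, word, and function name below is hypothetical, made up purely for demonstration:

```python
import math

def decode(candidates, lm_prior, lm_weight=1.0):
    """Pick the candidate word with the best combined log score:
    acoustic evidence plus a weighted language-model prior."""
    best_word, best_score = None, -math.inf
    for word, acoustic_logp in candidates.items():
        score = acoustic_logp + lm_weight * math.log(lm_prior.get(word, 1e-9))
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# Hypothetical acoustic scores: the audio is slightly closer to "racist".
acoustic = {"racist": math.log(0.6), "trump": math.log(0.4)}

# If training text contains "trump" far more often in similar contexts,
# the skewed prior can outweigh the small acoustic difference...
skewed_prior = {"racist": 0.05, "trump": 0.30}
print(decode(acoustic, skewed_prior))    # -> trump

# ...while a balanced prior lets the acoustic evidence win.
balanced_prior = {"racist": 0.20, "trump": 0.20}
print(decode(acoustic, balanced_prior))  # -> racist
```

The point of the sketch is only that a strong enough prior can momentarily override what the microphone actually heard, which is consistent with the brief flash of the wrong word before the correction lands.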

Apple told The New York Times that the issue is related to a phonetic overlap between the two words. The report also shared a statement from John Burkey, founder of Wonderrush.ai, who said it may have been a prank inserted into the system somewhere.

When I first tried to trigger it, the result was instant. But as I continued to say “racist,” the occurrences of “Trump” appearing briefly decreased. It’s as if the ML was figuring out that I meant to say racist instead of Trump, which is what algorithms do.

It is highly unlikely Apple or any individual did this on purpose. The behavior points to an ML issue and, as Apple said, phonetic likeness, not an intentional change.

There are only so many sounds a human can make, so many words have bits and pieces that are identical. That’s why a seemingly very different word like “Trump” can be mistaken by a computer algorithm for the word “racist.”
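The "bits and pieces that are identical" idea can be sketched by treating each word as an approximate phoneme sequence and measuring how much the sequences share. This is an illustration of the concept, not Apple's actual recognizer, and the ARPAbet-style transcriptions are simplified guesses:

```python
from difflib import SequenceMatcher

def phonetic_similarity(phones_a, phones_b):
    """Fraction of phonemes two sequences share, in matching order."""
    return SequenceMatcher(None, phones_a, phones_b).ratio()

trump  = ["T", "R", "AH", "M", "P"]        # "Trump"
racist = ["R", "EY", "S", "IH", "S", "T"]  # "racist"
hello  = ["HH", "EH", "L", "OW"]           # unrelated control word

# "Trump" and "racist" share an R sound and a T sound, so their
# similarity is small but nonzero; "Trump" and "hello" share nothing.
print(phonetic_similarity(trump, racist))
print(phonetic_similarity(trump, hello))
```

Even a modest overlap like this can matter when, as in the toy decoder above, a language-model prior is also pushing the recognizer toward the wrong word.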

Accents and pronunciations will alter how that waveform looks as well. It’s part of why some people are more likely to trigger the temporary “Trump” appearance and some aren’t.

These phonetically similar words get uttered near each other because Donald Trump is a controversial figure. Between his attitude toward the Black Lives Matter movement and his recent push to eliminate DEI programs, a lot of people are saying "racist" very close to the name "Trump."

The algorithm issue is the most likely culprit, not some conspiratorial nonsense about a deep state controlling people's minds with subliminal messaging, or whatever Alex Jones was proposing.

Apple's aggressive algorithms have made mistakes before. One example occurred in 2024, when emoji suggestions surfaced a Palestinian flag for users typing "Jerusalem," which Apple corrected.

Apple says a fix is coming to correct the dictation mistake.



This story originally appeared on AppleInsider
