Update: Apple fixed it. Original story below. Siri uses Wikipedia results to feed much of its knowledge base. This can of course backfire as Wikipedia is editable by anyone, including internet vandals. In this case, Siri is returning a very Not Safe For Work image when asked ‘Who is Donald Trump?’.


The problem was first discovered by The Verge; its redaction of the image is much lighter than our blur above, but it’s pretty obvious what is being shown. It’s not a picture of Donald Trump.

I have independently verified this so you don’t have to. If you want to see it for yourself, ask Siri right now ‘Who is Donald Trump?’ — but consider yourself warned. Other queries, like ‘How old is Donald Trump?’, return the same image.

Obviously, Apple has not selected that particular appendage to be the face of the US President. The image was likely picked up by Siri’s algorithms at a moment when the source Wikipedia entry had been vandalised, and Siri has yet to refresh to the corrected profile photo.

Now that this story is gaining some publicity, Apple will likely push out a fix promptly, as the company has very strict policies against pornography on its devices.

It doesn’t look like this is happening in all regions, but it is definitely happening in some — including mine.

This is not the first time that Siri has been a gateway to such indiscretions. In the past, people have found ways to make Siri say NSFW words by querying terms whose dictionary definitions contain swear words. Apple quickly patches these avenues when they are found.

Update: Wikipedia informed 9to5Mac that the vandalism affected the Donald Trump page as well as some other articles. The accounts involved have been locked or banned as appropriate.

Apple initially fixed the problem by removing the Siri Knowledge listing entirely. The (correct) Wikipedia data now appears to be showing again.
