Searching for Taylor Swift on X, formerly known as Twitter, returned an error message over the weekend after pornographic, AI-generated images of the singer circulated across social media last week.
X's search function displays results for Swift only under its "Media" and "List" tabs. However, Swift remains searchable using certain Boolean operators: entering "Taylor Swift" in quotation marks, or "Taylor AND Swift," yields normal results under all of X's search tabs.
The search function error message does not appear on either Instagram or Reddit.
The fake images of Swift — which show the singer in sexually suggestive and explicit positions — were predominantly circulating on X, and were viewed tens of millions of times before being removed from social platforms.
Like most major social media platforms, X's policies ban the sharing of "synthetic, manipulated, or out-of-context media that may deceive or confuse people and lead to harm."
"This is a temporary action and done with an abundance of caution as we prioritize safety on this issue," the company told CNN in a statement.
Digitally manipulated pornographic images of celebrities are nothing new on the internet and have been circulating online since the advent of software like Photoshop. But the rise of mainstream artificial intelligence software has heightened concerns because of its ability to create convincingly real and damaging images.
The incident comes as the United States heads into a presidential election year, prompting fears that misleading AI-generated images and videos could be used in disinformation efforts.