
Images created by Eliot Higgins with the use of artificial intelligence show a fictitious skirmish with Donald Trump and New York City police officers posted on Higgins’ Twitter account, as photographed on an iPhone in Arlington, Va., Thursday, March 23, 2023. The highly detailed, sensational images, which are not real, were produced using a sophisticated and widely accessible image generator. (AP Photo/J. David Ake)
Trump deepfakes and TikTok’s troubling algorithm reveal our deepest need
While news of Donald Trump’s impending arrest dominated headlines earlier this week, the former president remains a free man as of this writing. That reality might come as a surprise, however, for the millions of people who’ve already seen pictures of him thrown to the ground and dragged off by police.

It turns out, those “deepfake” pictures that went viral across social media were the work of an AI art generator following the suggestions of Eliot Higgins, the founder of an open-source investigative outlet called Bellingcat.

As Higgins described, “I was just mucking about. I thought maybe five people would retweet it.” More than 5.5 million views later—not counting all those who have shared the images across other platforms—it’s safe to say that the images have surpassed his initial expectations. And while the original post included the caption “Making pictures of Trump getting arrested while waiting for Trump’s arrest” to clarify that the images were fake, that disclaimer was quickly lost as the pictures spread.

Senator Mark Warner (D-Va.) noted that “while it took a few years for the capabilities to catch up, we’re now at a point where these tools are widely available and incredibly capable.” And the more famous the person at the center of an image, the more realistic the result tends to be, since the AI gets better at portraying someone the more often it attempts to do so.

Sam Gregory, the executive director of the human rights organization Witness, warns that a time could be fast approaching when realistic but false images made for fun are the least of our concerns: “There’s been a giant step forward in the ability to create fake but believable images in volume. And it’s easy to see how this could be done in a coordinated way with an intent to deceive. . . . The aim may not be to convince people that a certain event happened but to convince people that they can’t trust anything and to undermine trust in all images.”

However, if that outcome were to become a reality, it would not necessarily be the fault of the AI but rather of the people who use it. And we don’t have to look far to see how those decisions are already yielding potentially devastating consequences.

Ten minutes to guns loaded

In a recent study by the group EKO, researchers set up nine new TikTok accounts, each with a birthday identifying the account holder as a thirteen-year-old, the youngest age at which a user can join the service. Their goal was to see how easy it would be for a child to find explicit videos related to suicide, incel and “manosphere” culture, and drugs.

After establishing an account for each of those subjects, the researchers liked and bookmarked, but did not share or comment on, ten videos related to its assigned topic. That sample proved sufficient for TikTok’s algorithm to flood the account’s For You Page with videos promoting increasingly explicit content on that topic.

The results on suicide were particularly troubling.

As the researchers relate, it only took ten minutes of basic viewing for TikTok to begin recommending videos “with guns being loaded and text suggesting suicide, alongside hundreds of comments in agreement and some listing exact dates to self-harm or attempt suicide. Beyond videos explicitly pushing suicide, TikTok’s For You Page was filled with videos promoting content that pushes despondent and hopeless commentary.”

The study’s authors caution that “looking at these videos in isolation might not raise concern. . . . [but] the algorithm seemed to be chasing our researcher with content to keep them on the platform. In this case, the content fed by TikTok’s algorithm was overwhelmingly depressing, nihilistic and otherwise hopeless.” They go on to describe how “even employees at TikTok have been disturbed by the app’s push towards depressive content, that could include self-harm.”

And these issues are hardly limited to TikTok. Most social media platforms have AI-driven algorithms designed to promote increasingly engaging content in whatever areas a user shows interest.

The true problem with AI

It would be easy to look at the findings of the EKO study or the chaos created by the fake images of Donald Trump’s arrest and conclude that the problem is the technology.

We must remember, however, that AI is not inherently evil. After all, if you go looking for funny animal videos, cooking tips, or sports highlights, it can fill your feed with content that brings happiness and laughter. But if you go with a darker purpose in mind, it can easily exacerbate those intentions as well. And those darker intentions have been around since humanity first left Eden.

Ultimately, the problem with AI is the degree to which it makes feeding our sinful impulses so much simpler. And it can do so in a way that is so subtle that we hardly even notice it’s happening. Again, though, the foundational problem is and always will be our sin.

We can get mad at TikTok or other forms of social media—and such anger or hesitance is by no means unwarranted—but even if they went away tomorrow, we would still create new ways to satisfy those same desires.

At the end of the day, people just need Jesus. And as clichéd or preachy as that may sound, it’s the truth.

So be mindful of the power wielded by social media and the artificial intelligence baked into its algorithms, but don’t forget that you are ultimately responsible for its influence in your life. And be sure that when it comes to evaluating that influence, you remember to include God in the conversation.

He is the only One who can save us from the sin that resides at the heart of these problems, both eternally and in the present moment.

Will you seek his help today?



Ryan Denison, PhD, is the Senior Editor for Theology at Denison Forum, where he contributes writing and research to many of the ministry’s productions. He holds a PhD in church history from BH Carroll Theological Institute after having earned his MDiv at Truett Seminary. Ryan has also taught at BH Carroll and Dallas Baptist University.

