Student Blog Series: TikTok and the Growing Media Exploitation of Minors

Posted: February 28, 2024

TikTok has emerged as one of the world’s most popular apps in recent years and is currently used by over a billion people per month. Roughly 68% of its users are under the age of twenty-four, and in the United States, TikTok is reported to be used by about two-thirds of the country’s teenagers. As of September 2023, TikTok had removed over twenty million accounts suspected of belonging to children under the age of thirteen, suggesting that a large population of underage users goes unaccounted for in the platform’s official figures.

Like many other social media platforms, TikTok operates on a system of views designed to encourage creators to attract as many eyes and as much engagement as they can in order to turn a profit. TikTok’s algorithm tracks what content a user engages with and circulates similar videos to their feed. Through TikTok’s “live” feature, creators can broadcast their actions in real time and receive “gifts” from those watching. Gifts are converted into “diamonds,” which can then be exchanged for real currency. These interactions can quickly become exploitative, with minors performing sexually explicit acts in exchange for financial gain. Under this scheme, minors are encouraged to create content of themselves dancing or behaving in ways that might otherwise be viewed as sexually suggestive, all in the name of “likes” and “diamonds.” Once a user views these types of videos, they will be shown more and more content featuring minors. As Jon Rouse, a police veteran who currently leads a team targeting child sex offenders for Interpol, states, “Child sex offenders will gravitate toward where there are children,” and TikTok represents what is likely the world’s largest community of minors.

This platform, then, is a prime place for the commercial exploitation of minors, as money can be made from both participation in and engagement with such content. Since the open-access platform has been shown to be a hotspot for potential traffickers and buyers, children who post or are posted on the app risk coming into contact with dangerous individuals. There have been a number of reported cases of adults using the platform to prey on minors, ranging from inappropriate messages to physical assault. In March of 2022, an interaction between a 42-year-old man and a 14-year-old girl escalated from comments on a post to criminal charges after the man took a bus to visit the minor he claimed to be “in love” with. In other instances, adults have used their ability to communicate with children via the app to request and exchange child sexual abuse material, even from children as young as 8 years old.

While TikTok maintains that security measures are in place to prevent harmful or inappropriate content, these measures are far from foolproof. Between July and September of 2023, a total of 136,530,418 videos were removed by the platform, though the exact number of removals related to youth safety and well-being is not specified. TikTok’s latest Community Guidelines Enforcement Report, however, does reveal that, of the removed videos depicting youth exploitation, abuse, or underage nudity, only about 60% were removed before receiving any views. The numbers surrounding sexually suggestive content of minors are even more alarming: roughly 40% of such videos were removed before receiving any views, and only 66.7% were removed within 24 hours.

TikTok’s system often fails to block content quickly or to recognize the methods users have begun to employ to circumvent restrictions. One such method is to create and exchange private accounts containing sexually explicit content hidden under the app’s “only me” feature. Since the videos are hidden, other users are unable to report them for inappropriate content unless they are suddenly made public again. Even when these videos are reported, TikTok’s screening system often fails to remove them, claiming no violations were found. And even when videos are removed, new accounts circulate faster than the platform’s moderators can keep up. Those seeking out such content are connected through a series of code words that go undetected by TikTok’s security features. These codes can be placed in comments, tags, or profiles so that traffickers know where to look.

The private accounts community is a thriving sector of TikTok, and it does not take much effort to receive an invitation to access such content. Those who organize these groups advertise the types of creators, mainly girls, they hope will make content for them. A number of these girls are minors, typically under the age of 15. Passwords to these accounts are then shared, allowing individuals access to a private account where they can view or upload explicit content visible only to those who also hold the password. Additionally, once a minor engages with such an invitation, they are thrust into the world of child sexual abuse material (CSAM) and are at greater risk of being persuaded to help make content for the sake of popularity and promises of “going viral.”

The continued existence and popularity of such content demonstrates the growing need to strengthen security measures and restrictions on platforms that are easily accessible to, if not outright marketed toward, children. With investigations initiated by the attorneys general of several states and a growing volume of content being flagged and reported to the platform every year, TikTok has recently attempted to strengthen its policies regarding child sexual abuse material. However, based on the recent 2023 data, many changes still need to be made before the platform can effectively address the growing threat of child exploitation.

The CSE Institute believes that companies that benefit from the sexual exploitation of children and the proliferation of child sexual abuse materials must be held accountable. The CSE Institute further condemns platforms that fail to implement effective policies to protect their users from exploitation.

This piece is part of our first-year law student blog series. Congratulations to author Alex Taylor on being chosen!

All views expressed herein are personal to the author and do not necessarily reflect the views of the Villanova University Charles Widger School of Law or of Villanova University.
