Over the last decade, the internet and social media have transformed daily life, offering global reach and new financial opportunities. While these platforms empower users to build audiences and monetize content, they also expose vulnerable individuals, particularly those from lower socio-economic backgrounds, to exploitation. In a society marked by financial insecurity, these platforms exacerbate class inequalities and give traffickers new avenues to prey on the vulnerable. This exploitation has led to legal action against major platforms, including TikTok, for enabling such harm.
In October 2024, District of Columbia Attorney General Brian Schwalb filed a lawsuit accusing TikTok of deceiving users about the safety of its platform. The suit claims TikTok operates an unlicensed virtual economy in violation of money transmission and consumer protection laws. Specifically, TikTok pairs its live-streaming feature, TikTok LIVE, with a virtual currency called “TikTok Coins,” which users can buy to send “gifts” to streamers. Streamers can then exchange these “gifts” for real money, with TikTok taking a commission of up to 50%. The complaint alleges that TikTok has failed to register as a licensed money transmitter and has knowingly created an environment that exposes vulnerable populations, including children, to sexual exploitation for financial gain.
Although TikTok has implemented age restrictions, it has been ineffective at preventing minors from engaging in these monetized interactions. In 2022, TikTok launched an internal investigation named Project Meramec, which revealed that hundreds of thousands of minors had bypassed TikTok’s age restrictions and hosted LIVE sessions, despite the platform’s guidelines requiring users to be at least 18. Despite this knowledge, TikTok took no further action to enforce compliance with these age restrictions. According to the D.C. lawsuit, TikTok falsely claims its LIVE features are age-restricted and safe, while exposing children to significant risks, including exploitation for explicit content.
Further supporting these findings, on January 6, 2025, the Utah Division of Consumer Protection released previously redacted information from its 2024 case against TikTok in the Unsealed Redaction Highlights, which revealed that “TikTok employees recognized . . . [the] Live Recommendation algorithm prefers feeds with gifts, so [it] incentivizes sexual content.” After the release, TikTok claimed to have implemented measures to improve the safety of minors on its platform, but the company faced significant criticism for failing to take more substantial action in response to the findings of Project Meramec. The unsealed report also revealed that TikTok knew its livestream feature encouraged sexual content and the exploitation of minors but failed to act because it “profited significantly from ‘transactional gifting’ involving nudity and sexual activity, all facilitated by TikTok’s virtual currency system.”
Trafficking victims, often marginalized by gender, race, or poverty, are viewed through the lens of social stigma, which makes them easier targets: these biases allow traffickers to exploit them with fewer societal repercussions. For instance, women of color are often hypersexualized, while those from disadvantaged backgrounds are viewed as naive, increasing their susceptibility to traffickers’ false promises. As the digital economy grows, traffickers frequently hold more power than their victims, perpetuating a dangerous cycle of commercial sexual exploitation.
The internet has become a marketplace where power, class, and exploitation intersect in complex ways. The anonymity of social media, combined with its global reach, allows traffickers to conceal their actions while preying on vulnerable individuals. Traffickers use these platforms to lure victims into false relationships under the guise of romance, financial support, or career opportunities, and these online connections quickly turn coercive, drawing victims into commercial sexual exploitation.
Online platforms have a responsibility not only to moderate harmful content but also to implement proactive safeguards that prevent exploitation. The failure of companies like TikTok to address these risks underscores the urgent need for stronger regulations and legislative action. The Kids Online Safety Act (KOSA), for instance, was introduced to hold platforms accountable for the well-being of minors by requiring more comprehensive safety measures. Although the bill passed in the Senate, it stalled in the House, leaving a regulatory gap that continues to put children at risk. Even if reintroduced, KOSA would primarily protect minors, failing to extend necessary safeguards to vulnerable adults—many of whom also face online exploitation.
Another legislative effort, the Take It Down Act, recently passed the Senate unanimously and provides an important mechanism for removing explicit content involving minors from the internet, demonstrating bipartisan recognition of the dangers these platforms pose. However, these efforts alone are insufficient. Holding platforms accountable through stronger regulations, liability measures, and oversight is essential to combating online exploitation.
This piece is part of our first-year law student blog series. Congratulations to author Amisha Mirchandani on being chosen!
All views expressed herein are personal to the author and do not necessarily reflect the views of Villanova University Charles Widger School of Law or of Villanova University.