With the rise of traffickers using online video games to groom and exploit vulnerable persons, Microsoft has taken major steps to address the problem. Specifically, Microsoft launched “Project Artemis,” an artificial intelligence program aimed at identifying common language, terminology, and trends indicative of grooming individuals for trafficking. Grooming is the process a human trafficker uses to identify and eventually control someone for the purpose of trafficking. Project Artemis strives to recognize, address, and report online predators as they attempt to lure victims. Built on Microsoft’s patented AI technology and developed in collaboration with The Meet Group, Roblox, Kik, and Thorn, the software is available free of charge to qualified online service providers that offer chat functionality. Microsoft stated that as a technology company, “we must innovate and invest in tools, technology, and partnerships to support the global fight needed to address online child sexual exploitation.”
The development of Project Artemis began in November 2018, when Microsoft entered a partnership with Thorn, a nonprofit that builds technology to defend children from sexual abuse. In conjunction with the WePROTECT Global Alliance and the Child Dignity Alliance, Microsoft hosted multiple cross-industry competitions for algorithm development and collaborative coding, known as “hackathons.” These industry-wide contests focused on technology and engineering development as well as legal and policy implementation. Along with Project Artemis, the teams developed “PhotoDNA,” a free tool used by companies and organizations throughout the world to detect, disrupt, and report millions of child sexual exploitation images.
Drawing on the feedback provided by these nonprofits and the results of the collaborative coding, Microsoft developed Project Artemis’s groundbreaking algorithm. Based on conversation characteristics, Project Artemis calculates a probability rating of abuse and assigns a score. Each company implementing the technique sets its own threshold for this rating, determining when a flagged conversation should be sent to human moderators for review. Human moderators can then identify imminent threats as well as incidents of suspected exploitation and refer these situations to law enforcement or to the National Center for Missing and Exploited Children (NCMEC). NCMEC, along with ECPAT International, INHOPE, and the Internet Watch Foundation (IWF), provided valuable feedback throughout the collaborative process. As Microsoft has stated, Project Artemis is a significant step forward, but it is by no means a complete solution. Sexual exploitation and abuse online are pervasive problems. However, Microsoft has made these tools available and has invited engagement from other technology companies and organizations to help prevent trafficking.
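To illustrate the score-and-threshold pattern described above, the following is a minimal, purely hypothetical sketch. Project Artemis’s actual model and scoring are proprietary and not public; the function names, the toy keyword heuristic, and the 0.5 threshold here are all illustrative assumptions, not Microsoft’s implementation.

```python
# Hypothetical sketch of threshold-based conversation flagging.
# The real Project Artemis scores conversations with a trained model;
# this toy version substitutes a simple keyword heuristic for illustration.

def risk_score(conversation: list[str]) -> float:
    """Return an illustrative 0.0-1.0 risk rating for a conversation.

    A production system would evaluate many conversation characteristics;
    here we merely count lines containing example phrases of concern.
    """
    flagged_terms = {"secret", "don't tell", "how old are you"}
    hits = sum(
        any(term in line.lower() for term in flagged_terms)
        for line in conversation
    )
    return min(1.0, 2 * hits / max(len(conversation), 1))

def needs_human_review(conversation: list[str], threshold: float = 0.5) -> bool:
    """Each service provider would set its own escalation threshold."""
    return risk_score(conversation) >= threshold

chat = ["hey", "this is our secret, ok?", "how old are you"]
if needs_human_review(chat):
    print("escalate to human moderator")
```

The key design point the paragraph describes is the division of labor: the automated score only decides what reaches a human moderator, and the moderator, not the software, decides whether to refer a case to law enforcement or NCMEC.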
This piece is part of our first-year law student blog series. Congratulations to author Sarah Urie on being chosen!
All views expressed herein are personal to the author and do not necessarily reflect the views of the Villanova University Charles Widger School of Law or of Villanova University.