A former FBI contractor has pleaded guilty to charges of sexually exploiting boys and producing child sexual abuse material (CSAM), according to the Department of Justice.
Beginning in February, Brett Janes, 33, of Arlington, Virginia, allegedly enticed minors to produce CSAM, according to a Nov. 7 press release from the DOJ.
In August, he was charged with child exploitation offenses.
On Tuesday, Mr. Janes “pleaded guilty to one count of sexual exploitation of children, including using children to produce CSAM, and one count of receipt of child pornography. He is scheduled to be sentenced on Feb. 27, 2024, and faces a mandatory minimum of 15 years in prison.”
According to court documents, Mr. Janes is alleged to have contacted roughly a dozen minor boys over Discord and Snapchat. He is accused of grooming the boys by telling them that he worked for U.S. intelligence and of threatening to kill himself if they stopped communicating with him.
The former FBI contractor is also said to have purchased hundreds of CSAM videos and images through Telegram.
Mr. Janes was arrested on May 31 and appeared in court the same day. Because he did not have a lawyer, the court appointed a public defender.
On Aug. 23, a federal grand jury in the Eastern District of Virginia charged him with two counts of sexual exploitation of children and production of CSAM, one count of attempted coercion and enticement, and one count of receipt of child pornography.
In addition to the mandatory minimum of 15 years in prison, Mr. Janes faces a maximum penalty of life imprisonment. Actual sentences for federal crimes are typically less than the maximum penalties.
“A federal district court judge will determine any sentence after considering the U.S. Sentencing Guidelines and other statutory factors,” according to the release.
Grooming Victims
One of the victims in the case was a 13-year-old boy who met Mr. Janes while playing an online shooter game. After the two began messaging each other, the contractor asked the boy to play a version of the game in which the loser would remove a piece of clothing on video chat. He gave the boy around $500. The two also allegedly engaged in sexual acts.
Mr. Janes used “tactics consistent with ‘grooming’ to entice [the minor] to play strip video games,” according to FBI agent Paul Fisher, who authored an affidavit in support of an arrest warrant.
In one case, Mr. Janes allegedly tried to get a boy to visit his house despite knowing that the boy was only 14 years old. In another instance, he allegedly threatened to commit suicide after a boy took the money he offered but then stopped responding to him.
“if you ever need [expletive] i’m here. But it’s still [expletive] up that you took money while I’m hammered. Guess im happy that it happened sooner rather than later and im just going to kill myself cause no one likes me,” Mr. Janes allegedly wrote in a message after the boy stopped responding to his messages.
Child Exploitation Laws
Mr. Janes’s arrest comes as calls for stronger child exploitation laws have gathered momentum in recent times. One such call, a letter to Congress, warned about AI-generated CSAM. “AI tools can rapidly and easily create ‘deepfakes’ by studying real photographs of abused children to generate new images showing those children in sexual positions,” it said. “Deepfakes can also be generated by overlaying photographs of otherwise unvictimized children on the internet with photographs of abused children to create new CSAM involving the previously unharmed children.”
“Whether the children in the source photographs for deepfakes are physically abused or not, creation and circulation of sexualized images depicting actual children threatens the physical, psychological, and emotional wellbeing of the children who are victimized by it, as well as that of their parents.”
The letter called for expanding restrictions on CSAM to “explicitly cover” AI-generated material. It also asked Congress to establish an expert commission to study how AI can be used to exploit children and to identify solutions to prevent such actions.
The legislation makes it mandatory for online platforms to include information that would help identify and locate a child depicted in CSAM, as well as the individuals behind such content. Without that information, the National Center for Missing and Exploited Children (NCMEC) and law enforcement are left unable to locate and rescue the child, according to a summary of the legislation.
Last year, the NCMEC CyberTipline received more than 32 million reports of online CSAM, an 89 percent jump from 2019. In over half of these cases, the reports did not contain enough data to aid law enforcement in investigating, the summary said.