A US-based clearinghouse for reports of child sexual abuse material, the National Center for Missing & Exploited Children (NCMEC), has revealed that child sexual exploitation is on the rise online and is taking new forms, such as images and videos generated by artificial intelligence.
According to the organization's annual CyberTipline report, reports to the NCMEC of online child abuse rose by more than 12% in 2023 compared with the previous year, reaching a total of more than 36.2 million.
The majority of these reports were associated with the distribution of child sexual abuse material (CSAM), including photos and videos. Additionally, there was a rise in reports concerning financial sexual extortion, where online predators entice children into sharing explicit images or videos and subsequently demand money.
The NCMEC reported that predators utilized AI-generated CSAM to extort money from certain children and families for financial gain.
A spokesperson for the center said it received 4,700 reports of images or videos of the sexual exploitation of children made by generative AI, a category it only started tracking in 2023.
“The NCMEC is deeply concerned about this quickly growing trend, as bad actors can use artificial intelligence to create deepfaked sexually explicit images or videos based on any photograph of a real child or generate CSAM depicting computer-generated children engaged in graphic sexual acts,” the NCMEC report states.
“For the children seen in deepfakes and their families, it is devastating.”
AI-generated child abuse content also impedes the identification of real child victims, according to the organization.
Creating such material is illegal in the United States, as making any visual depictions of minors engaging in sexually explicit conduct is a federal crime, according to a Massachusetts-based prosecutor from the Department of Justice, who spoke on the condition of anonymity.
Cybertipline Record Of Child Abuse
In total in 2023, the CyberTipline received more than 35.9 million reports that referred to incidents of suspected CSAM, more than 90% of it uploaded outside the US. Roughly 1.1 million reports were referred to police in the US, and 63,892 reports were urgent or involved a child in imminent danger, according to Tuesday’s report.
Companies That Submitted Reports To NCMEC
A total of 245 companies submitted reports to the NCMEC through the CyberTipline, out of the 1,600 companies globally that are part of the CyberTipline reporting program. Internet service providers based in the US, including social media platforms, are required by law to report any instances of CSAM to the CyberTipline upon becoming aware of them.
Summary
The NCMEC has identified a gap between the quantity of reports it receives and their quality. Both the center and law enforcement face legal constraints that prevent them from acting on certain reports, including those generated by content moderation algorithms, without human review. This can hinder the ability of police to access reports of possible child abuse.
“The relatively low number of reporting companies and the poor quality of many reports marks the continued need for action from Congress and the global tech community,” the NCMEC report states.