Description: An investigation revealed that Instagram's recommendation algorithms promote accounts that facilitate and sell child sexual abuse material (CSAM). The investigation, conducted by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst, found that Instagram's algorithms not only allow such accounts to be discovered through keyword searches but also actively recommend them to users within the network. The issue is especially concerning given Instagram's popularity among teenagers.
Incident Stats
Incident ID: 583
Report Count: 1
Incident Date: 2023-06-07
Applied Taxonomies
CSETv1 Taxonomy Classifications
Incident Number
The number of the incident in the AI Incident Database.
583
Notes (special interest intangible harm)
Input any notes that may help explain your answers.
4.2 - Child sexual abuse content is illegal
Special Interest Intangible Harm
An assessment of whether a special interest intangible harm occurred. This assessment does not consider the context of the intangible harm, whether an AI was involved, or whether there is a characterizable class or subgroup of harmed entities. It also does not assess whether an intangible harm occurred in general; it asks only whether a special interest intangible harm occurred.