Author(s): Beauty Ilayira and Precious Nkomo
Paper Details: Volume 3, Issue 4
Citation: IJLSSS 3(4) 17
Page No: 176 – 190
ABSTRACT
This article provides an overview of the role of Artificial Intelligence (AI) in addressing Child Sexual Abuse (CSA). CSA represents a widespread and profoundly traumatic infringement upon a child’s physical and psychological integrity. It is a significant form of violence that impacts children from all societies, cultures, and socioeconomic backgrounds. The abuse can take place in both physical and virtual environments, and its effects are enduring, influencing the emotional, social, and legal well-being of victims.
Child sexual abuse (CSA), often referred to as child molestation, is a type of child abuse where an adult or older adolescent exploits a child for sexual gratification. Forms of child sexual abuse encompass engaging in sexual activities with a child (whether through solicitation, coercion, or other methods), indecent exposure, child grooming, and child sexual exploitation, including the use of a child to create child pornography.
As stated by the World Health Organization (WHO): “Child sexual abuse refers to the engagement of a child in sexual activities that they do not fully understand, are incapable of providing informed consent for, or are not developmentally ready to participate in, or that contravene the laws or social norms of society.”
Under the Protection of Children from Sexual Offences (POCSO) Act, 2012: “Any individual who, with the intent of sexual gratification, touches the vagina, penis, anus, or breast of a child, or compels the child to touch the vagina, penis, anus, or breast of themselves or another individual, or engages in any other act with sexual intent that involves physical contact without penetration, is considered to have committed sexual assault.”
With the advent of the internet and digital platforms, the reach of Child Sexual Abuse (CSA) has significantly broadened. Children are at risk not only in physical environments but also in the online realm.
KEY WORDS: CHILD, ABUSE, ARTIFICIAL INTELLIGENCE, SEXUAL ASSAULT, RAPIST, PREVENTION.
UNDERSTANDING CHILD SEXUAL ABUSE (CSA)
Child Sexual Abuse (CSA) can lead to significant and enduring physical, emotional, and psychological consequences for its victims. It is essential to comprehend CSA for effective prevention, identification, and support for those affected.
CSA represents one of the gravest infringements on a child’s rights, impacting their physical, emotional, and psychological growth. This abuse includes various sexual acts forced upon children through coercion, manipulation, or violence. As the internet and digital technologies become more pervasive, CSA has expanded beyond physical environments into the online sphere, complicating its detection and necessitating urgent preventive measures.
Under Indian law, CSA is addressed by the Protection of Children from Sexual Offences (POCSO) Act, 2012.
SECTION 7 OF THE POCSO ACT, 2012
Sexual assault: Whoever, with sexual intent, touches the vagina, penis, anus, or breast of the child or makes the child touch the vagina, penis, anus, or breast of such person or any other person, or does any other act with sexual intent which involves physical contact without penetration, is said to commit sexual assault.
WHAT IS CHILD SEXUAL ABUSE?
Child sexual abuse constitutes a type of child maltreatment that involves sexual interactions with a minor. A child is, unequivocally, incapable of consenting to any type of sexual activity. When an offender interacts with a child in this manner, they are perpetrating a crime that can result in enduring consequences for the victim over many years. It is important to note that child sexual abuse does not necessarily require physical contact between the offender and the child. It can include any act in which an individual, with the intent of sexual gratification, touches the vagina, penis, anus, or breast of a child, compels the child to touch the vagina, penis, anus, or breast of themselves or another individual, or engages in any other act with sexual intent that entails physical contact without penetration.
It can also take the following forms:
- Exhibitionism, which involves exposing oneself to a minor
- Fondling
- Intercourse
- Masturbation in the presence of a minor or compelling the minor to masturbate
- Obscene discussions, phone calls, text messages, or digital communications
- Creating, possessing, or distributing pornographic images or films featuring children
- Any form of sexual activity with a minor, including vaginal, oral, or anal sex
- Sex trafficking
- Any other form of sexual contact that involves a minor
TYPOLOGIES OF CHILD SEXUAL ABUSE
Child Sexual Abuse (CSA) can take on various forms, which include:
- Intrafamilial abuse: This refers to abuse perpetrated by family members or caregivers.
- Extrafamilial abuse: This type of abuse is committed by individuals who are not part of the family, such as acquaintances or strangers.
- Online abuse: This encompasses abuse that takes place through digital platforms, including grooming, sextortion, and the distribution of child sexual abuse material.
- Institutional abuse: This form of abuse occurs within institutions, such as schools, religious organizations, or sports clubs.
- Commercial sexual exploitation: This involves abuse that is carried out for financial or material gain.
Recognizing these categories can aid in identifying potential risks and in formulating targeted prevention strategies.
DIFFERENCES BETWEEN CHILD SEXUAL ABUSERS AND RAPISTS
Although both child sexual abusers and rapists engage in severe sexual offenses, they vary significantly regarding victim characteristics, psychological motivations, behavioral patterns, and legal repercussions. It is crucial to recognize that not every rapist is a child sexual abuser, nor is every child sexual abuser a rapist in a legal context. Child sexual abuse (CSA) can encompass non-penetrative actions such as fondling, exposure, and online grooming, whereas rape typically pertains to penetrative acts. Offenders of CSA frequently reoffend due to the clandestine nature of their actions and prolonged access to their victims. The trauma experienced by victims in both scenarios is intense; however, CSA often results in developmental and enduring psychological harm stemming from abuse at a young age. Here is a comparative overview:
| ASPECT | CHILD SEXUAL ABUSER (CSA OFFENDER) | RAPIST (ADULT VICTIM) |
| --- | --- | --- |
| Victim Profile | Minors (under 18 years, frequently much younger), including prepubescent children | Adults or teenagers (typically above the age of consent) |
| Consent | Legally and developmentally impossible (children are incapable of giving consent) | The victim may resist; consent may be forcibly obtained or manipulated |
| Motivation | Often involves pedophilic attraction, a desire for control, grooming, and at times opportunistic behavior | Power, aggression, sexual gratification, revenge, or dominance |
| Method of Approach | Frequently non-violent; involves grooming, manipulation, and building trust | Often entails physical force, threats, or coercion |
| Relationship with Victim | Typically known to the child as a family member, educator, caregiver, or neighbor | May or may not be acquainted with the victim; includes strangers, acquaintances, or intimate partners |
| Psychological Profile | May exhibit a persistent sexual interest in children (pedophilia or hebephilia); some may lack impulse control | Driven by anger, antisocial tendencies, or a sense of entitlement |
| Crime Dynamics | May occur repeatedly over time; can take place in private or domestic settings | Can be a singular or repeated offense; more likely to happen in public or secluded areas |
| Social Concealment | CSA is often concealed due to social stigma, fear, family pressure, or the child’s silence | Rape may be reported more swiftly due to physical violence or evidence |
| Legal Framework (India) | The POCSO Act, 2012 exclusively governs sexual offenses against minors | The Bharatiya Nyaya Sanhita (BNS), Sections 63-64, governs rape |
| Punishment | The POCSO Act prescribes stringent penalties: imprisonment of not less than three years, which may extend to five years, along with a fine | Under the BNS, rigorous imprisonment for a term of not less than 10 years, which may extend to life imprisonment, along with a fine |
| Rehabilitation Challenges | CSA offenders may require specialized treatment for deviant sexual inclinations | Rapists may require behavior-correction therapy focused on aggression and control |
Classifying child sexual abusers has proven to be challenging due to their diverse backgrounds, which include variations in economic status, gender, marital status, ethnicity, and sexual orientation.
Typically, child sexual abusers are identified by their lack of social skills, feelings of inadequacy or isolation, increased sexual issues, or a passive approach to relationships. Unlike rapists, they exhibit distinct thought processes and emotional responses, often perceiving their criminal behaviors as uncontrollable, consistent, and intrinsic; in contrast, rapists tend to blame their actions on external, variable, and manageable factors.
ARTIFICIAL INTELLIGENCE-POWERED DETECTION AND PREVENTION OF CHILD SEXUAL ABUSE (CSA)
AI IN DETECTION OF CHILD SEXUAL ABUSE MATERIAL (CSAM)
The advancement of artificial intelligence (AI) has rapidly accelerated, leading to a significant expansion in the availability of this technology. The development of AI technologies has also reached the domain of child sexual abuse (CSA), where the magnitude of the issue, especially online, has become too extensive for manual, human-led methods to handle effectively. Nevertheless, the application of AI in this area has gone beyond prevention alone. In early 2023, the US National Center for Missing & Exploited Children (NCMEC) reported instances of ‘fake’ child sexual abuse material (CSAM) that offenders created with the help of generative AI tools. Likewise, Australia’s eSafety Commissioner has received reports of children utilizing AI to produce sexually explicit images of other minors, indicating a potentially more pervasive problem.
The Internet Watch Foundation (IWF) has consistently been at the leading edge of recognizing the misuse of emerging technologies, and artificial intelligence (AI) is no exception. AI technologies are increasingly taking on various roles in the production of child sexual abuse material (CSAM), including the creation of deepfake CSAM, the alteration of images to ‘undress’ clothed children, and the manipulation of photos or videos to portray both known and unknown children in sexually exploitative situations. Existing CSAM has been utilized to train AI models, indicating that offenders are leveraging AI to generate new representations of previously victimized children. Additionally, reports suggest that perpetrators are employing AI to modify images sourced from victims’ social media accounts and other online content, subsequently using these manipulated images for sexual extortion of the victims.
The IWF has recognized a significant and escalating threat in which AI technology is being misused to create CSAM. Its initial report, published in October 2023, disclosed the existence of over 20,000 AI-generated images on a dark web forum within a single month, with more than 3,000 of these images depicting criminal activities related to child sexual abuse. Since that time, the situation has intensified and continues to develop.
AI IN PREDICTING AND PREVENTING ONLINE GROOMING
Online grooming, the method through which perpetrators establish trust with minors for sexual exploitation, represents an increasing danger in the digital era. Artificial Intelligence, especially Natural Language Processing (NLP) and behavioral analysis algorithms, has emerged as an essential resource for forecasting and thwarting grooming efforts before their progression into abuse.
Online platforms frequently lack adequate monitoring systems to identify grooming activities in real-time. AI can be incorporated into these platforms to automatically flag grooming attempts by examining textual interactions, voice patterns, and even visual content. By employing predictive algorithms, AI can assist in forecasting grooming behavior prior to its escalation, facilitating timely interventions.
AI is currently being researched and utilized to anticipate and avert online grooming, mainly by analyzing communication patterns and recognizing potentially harmful content. AI-driven tools can scrutinize text, images, and videos to uncover warning signs and risky interactions, potentially flagging grooming attempts before significant harm is inflicted. Nevertheless, ethical considerations, including potential biases and privacy issues, are essential to address during the development and implementation of these technologies.
Natural Language Processing (NLP): Natural language processing (NLP) is a branch of computer science and artificial intelligence (AI) that employs machine learning techniques to allow computers to comprehend and interact with human language. NLP empowers computers and digital devices to identify, interpret, and produce text and speech by integrating computational linguistics, which involves rule-based modeling of human language, along with statistical modeling, machine learning, and deep learning.
Research in NLP has facilitated the advent of generative AI, enhancing the communication capabilities of large language models (LLMs) and enabling image generation models to interpret requests. For many individuals, NLP has become an integral part of daily life, driving search engines, activating chatbots for customer service through spoken commands, operating voice-activated GPS systems, and powering question-answering digital assistants on smartphones, including Amazon’s Alexa, Apple’s Siri, and Microsoft’s Cortana.
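To make the mechanism concrete, the following is a minimal, illustrative sketch of an NLP-based grooming-risk classifier built with scikit-learn. The toy messages, labels, and threshold are invented for demonstration; production systems are trained on large, vetted corpora and use far more sophisticated language models.

```python
# Minimal sketch of an NLP grooming-risk text classifier (illustrative only).
# The toy messages and labels below are invented assumptions; real systems
# rely on large, vetted corpora and modern language models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "this is our little secret, don't tell your parents",  # secrecy demand
    "you can trust me, send me a photo of yourself",       # image request
    "let's move this chat to a private app",               # isolation attempt
    "you're so mature for your age",                       # flattery tactic
    "did you finish the maths homework?",
    "the match starts at five, see you there",
    "mum says dinner is ready",
    "can you share the class notes?",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]  # 1 = grooming-like, 0 = benign

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Score an incoming message; a platform might warn the user or escalate
# to human review above a chosen threshold.
risk = model.predict_proba(["promise you won't tell anyone about us"])[0][1]
print(f"grooming risk score: {risk:.2f}")
```

In practice, such a text score would be only one signal among many, combined with the behavioral indicators discussed below.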
Behavior Pattern Analysis: Artificial Intelligence (AI) models are essential in detecting online grooming by examining user interaction patterns. These models are specifically designed to identify behavioral warning signs that are frequently linked to predatory behavior; a simplified scoring sketch follows the list of indicators below.
KEY BEHAVIORAL INDICATORS ANALYZED
- Frequency of Messaging: An excessive or rapid exchange of messages, particularly from adults to minors.
- Time of Interaction: Regular communication during late hours, which is often regarded as a grooming strategy.
- Age Discrepancies: Adult users reaching out to significantly younger profiles.
- Escalation of Content: A gradual transition from neutral to sexually suggestive or explicit language.
- Isolation Attempts: Efforts to shift conversations to more private or encrypted platforms.
- Request Patterns: Repeated solicitations for personal information, images, or confidentiality.
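The sketch below turns these indicators into a toy rule-based risk score. Every threshold and weight is an assumption chosen for illustration; deployed systems learn such parameters from labelled interaction data rather than hard-coding them.

```python
# Hypothetical rule-based scorer for the behavioral indicators listed above.
# Thresholds and weights are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Interaction:
    messages_per_hour: float
    late_night_ratio: float    # share of messages sent between 23:00 and 05:00
    age_gap_years: int
    explicitness_trend: float  # 0..1, growth of suggestive language over time
    asked_to_switch_platform: bool
    requests_for_photos: int

def grooming_risk(x: Interaction) -> float:
    score = 0.0
    if x.messages_per_hour > 30:        score += 0.20  # messaging frequency
    if x.late_night_ratio > 0.5:        score += 0.15  # time of interaction
    if x.age_gap_years >= 8:            score += 0.20  # age discrepancy
    score += 0.20 * x.explicitness_trend               # content escalation
    if x.asked_to_switch_platform:      score += 0.15  # isolation attempt
    if x.requests_for_photos > 0:       score += 0.10  # request patterns
    return min(score, 1.0)

risk = grooming_risk(Interaction(45, 0.7, 12, 0.6, True, 2))
print(f"risk = {risk:.2f}")  # flag for human review above, e.g., 0.6
```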
AUTOMATED PREVENTIVE MEASURES ENABLED BY AI
- Real-time Warnings: Individuals suspected of grooming or minors at risk may receive automated notifications or safety prompts (e.g., “Are you sure you want to share this?”).
- User Flagging and Reporting: Accounts deemed high-risk are automatically flagged for human review or referral to law enforcement.
- Temporary Restrictions or Bans: Platforms can automatically disable accounts involved in predatory actions while an investigation is conducted.
This proactive strategy enables platforms to act before abuse takes place, ensuring a quicker response and enhanced protection for at-risk users, particularly children. Nevertheless, these systems require regular updates to reduce false positives and uphold user privacy.
AI IN VICTIM AND PERPETRATOR IDENTIFICATION IN CHILD SEXUAL ABUSE CASES
Artificial Intelligence (AI) has emerged as a revolutionary instrument in aiding law enforcement and child protection organizations in the identification of both victims and offenders of Child Sexual Abuse (CSA), especially when such abuse is shared online through images, videos, or chat communications. By scrutinizing multimedia content and digital traces, AI significantly improves the efficiency and precision of investigations.
FACIAL RECOGNITION TECHNOLOGY (FRT)
Victim Identification: AI-driven facial recognition systems analyze faces present in abusive content and compare them with databases of missing children, educational records, or social media profiles. This technology is utilized to rescue victims who may not have been reported as missing but are found in CSAM (Child Sexual Abuse Material). Additionally, these tools can identify partial facial features, account for aging changes, or reconstruct obscured visuals using sophisticated techniques.
Perpetrator Identification: Facial analysis of individuals suspected of CSAM or grooming conversations is cross-referenced with law enforcement databases or open-source intelligence (OSINT). This process aids in revealing offenders who may be operating anonymously or attempting to conceal their identities.
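As an illustration of the underlying matching step, the sketch below uses the open-source `face_recognition` Python package. This is an assumption made for demonstration: agencies use specialised, audited systems, and the file names here are hypothetical.

```python
# Sketch of face matching with the open-source `face_recognition` package.
# File names are hypothetical placeholders; this assumes each image
# contains at least one detectable face.
import face_recognition

# Encoding of a face from a reference database (e.g., a missing-child record).
reference = face_recognition.load_image_file("reference_photo.jpg")
ref_encoding = face_recognition.face_encodings(reference)[0]

# Encodings of faces extracted from seized material.
candidate = face_recognition.load_image_file("evidence_frame.jpg")
cand_encodings = face_recognition.face_encodings(candidate)

for enc in cand_encodings:
    match = face_recognition.compare_faces([ref_encoding], enc, tolerance=0.6)[0]
    distance = face_recognition.face_distance([ref_encoding], enc)[0]
    print(f"match: {match}, distance: {distance:.3f}")  # lower = more similar
```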
Background and Object Recognition: AI models possess the capability to examine the surroundings in CSAM images and videos, facilitating the identification of room types (for instance, a hotel, basement, or bedroom); furniture, electronics, or toys; and geographic indicators such as street signs, text on posters, or recognizable landmarks. This functionality empowers investigators to geolocate the scene of the abuse, thereby refining search parameters and connecting various cases.
Image Metadata Analysis: AI tools meticulously analyze metadata (EXIF data) from digital content, which encompasses date and time stamps, device IDs, and GPS coordinates (provided that location services were activated). This process enables law enforcement to trace the source of the content, associate it with particular devices, and establish digital chains of custody.
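For illustration, the sketch below extracts basic EXIF fields with the Pillow library. The file name is a hypothetical placeholder, and real forensic workflows go much further (file-system artefacts, container metadata, error-level analysis).

```python
# Sketch of basic EXIF extraction with Pillow; the file name is hypothetical.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

img = Image.open("evidence.jpg")
exif = img.getexif()

# General tags, e.g., DateTime, Make, Model (device identification).
for tag_id, value in exif.items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# GPS IFD is present only if location services were active when shot.
gps = exif.get_ifd(0x8825)
for tag_id, value in gps.items():
    print(f"{GPSTAGS.get(tag_id, tag_id)}: {value}")
```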
AI-POWERED PREVENTION CAMPAIGNS AGAINST CHILD SEXUAL ABUSE (CSA)
In addition to detection and investigation, Artificial Intelligence (AI) is being increasingly utilized in preventive education and awareness initiatives designed to safeguard children from sexual abuse, both in physical settings and online. These AI-driven strategies aim to empower children, inform caregivers, and prevent abuse before it happens by utilizing predictive analytics, interactive platforms, and adaptive learning.
AI-Driven Educational Tools for Children: AI is employed to customize learning modules and recreate risky situations in a non-threatening, gamified manner. These resources instruct children on making safe choices by engaging with role-play scenarios (e.g., someone requesting a photo), quizzes on digital conduct, and interactive videos that teach how to recognize grooming tactics. These tools are tailored to a child’s age, language, and understanding, ensuring inclusivity.
Parental Guidance and Monitoring Tools: AI tools also assist parents and guardians by offering behavioral alerts if a child interacts with suspicious individuals, insights into the use of risky applications, websites, or chat content, and educational prompts to encourage offline discussions about safety. Notable tools such as Qustodio, Bark, and Net Nanny utilize AI to oversee children’s device usage for indications of grooming or abuse. Alerts are generated for potentially harmful interactions, aiding in the prevention of escalation.
Social Media and Platform-Based Awareness Campaigns: AI algorithms are employed by digital platforms to: target awareness messages to at-risk users (e.g., children exhibiting risky behaviors or language), deliver tailored safety reminders or educational content when specific keywords or actions are identified, and flag accounts displaying predatory behavior while directing victims to support resources. For instance, Instagram and Facebook use AI to identify grooming behavior and provide in-app warnings or links to safety education for users.
ADVANTAGES AND LIMITATIONS OF USING AI IN CHILD SEXUAL ABUSE CRIMES
ADVANTAGES OF USING AI IN CHILD SEXUAL ABUSE CRIMES
Speed and Scalability: AI systems possess the capability to process extensive datasets within seconds, a task that would require human investigators days, weeks, or even months.
Example: AI can examine millions of images and videos uploaded daily on platforms such as Facebook or Google to identify Child Sexual Abuse Material (CSAM). This ensures quicker detection and alleviates backlogs in investigations.
Real-Time Detection and Intervention: AI technologies, including natural language processing (NLP) and computer vision, can identify grooming behaviors or CSAM as they are being uploaded or transmitted. If an AI system detects that a user is sending sexually explicit messages to a minor, it can block the message and notify moderators immediately. This enables preventive measures before abuse escalates.
Victim Identification: AI facial recognition and image-matching technologies can compare faces in CSAM with databases of missing children or social media profiles. INTERPOL and NCMEC utilize AI to identify unknown victims appearing in abusive content. This aids in rescue operations, sometimes even before public reports of abuse.
Perpetrator Tracking and Profiling: AI can scrutinize digital footprints, including metadata, writing styles, and behavioral patterns, to assist law enforcement in tracking offenders who operate anonymously. AI may identify an offender using multiple aliases across various platforms by connecting behavioral similarities. This improves the capacity to unmask and apprehend repeat offenders.
Minimizing Human Exposure to Harmful Content: The manual review of CSAM can inflict psychological trauma on moderators and investigators. AI assists in automatically categorizing or filtering the most egregious content. Tools such as Microsoft’s PhotoDNA or Google’s CSAI Match pre-sort images and videos, allowing human reviewers to focus solely on flagged content. This lessens the emotional burden on child protection personnel.
Early Grooming Detection and Prevention: AI technology can identify grooming behaviors by analyzing the progression of conversations, for instance, through flattery, demands for confidentiality, and inappropriate inquiries. Instagram employs AI to alert minors when they receive messages from adults displaying predatory tendencies. This approach helps avert abuse by intervening prior to any physical interaction or explicit content sharing.
Automation of Reporting and Moderation: AI systems streamline the process of flagging and reporting Child Sexual Abuse Material (CSAM) and suspicious activities to authorities and organizations such as NCMEC. Services like YouTube and Facebook automatically notify relevant legal entities about suspected CSAM uploads. This enhances the effectiveness and scope of global networks for reporting child sexual abuse.
Global Coordination and Hash Matching: AI enables the distribution of hash databases with unique digital identifiers for known CSAM among various countries and organizations. PhotoDNA hashes allow companies like Microsoft and Google, along with NGOs, to automatically prevent the upload of recognized CSAM. This fosters international collaboration in identifying and eliminating abusive content that circulates globally.
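PhotoDNA itself is proprietary, but the matching principle can be sketched with the open `imagehash` package: compact perceptual fingerprints of known material are shared and compared, so the images themselves never need to be redistributed. File names here are hypothetical placeholders.

```python
# Sketch of perceptual-hash matching with the open `imagehash` package.
# PhotoDNA is proprietary; this only illustrates the matching principle.
from PIL import Image
import imagehash

# Hashes of known, verified material are shared as databases; the images
# themselves are never redistributed. File name is hypothetical.
known_hashes = [imagehash.phash(Image.open("known_item.png"))]

def is_known(path: str, max_distance: int = 5) -> bool:
    """Flag an upload whose perceptual hash is near a known hash."""
    h = imagehash.phash(Image.open(path))
    # Hamming distance: small values survive re-encoding, resizing, minor crops.
    return any(h - k <= max_distance for k in known_hashes)

print(is_known("upload.png"))
```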
Empowerment Through Child-Focused AI Tools: AI drives the development of chatbots, interactive games, and safety applications that instruct children on recognizing abuse and seeking assistance. AI chatbots such as “Wysa” or “ReThink” provide real-time education and counseling to children regarding online safety. This encourages digital literacy and proactive self-defense among young users.
Enhanced Resource Allocation: AI assists in determining which cases require urgent human intervention by assigning risk scores based on the severity or frequency of incidents. Law enforcement can prioritize cases involving live-streamed abuse or multiple victims, as identified by AI. This allows for a more strategic distribution of limited investigative resources.
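A minimal sketch of such triage follows, using Python’s `heapq` to pop the highest-risk case first. The weighting factors are invented assumptions, not an operational scoring model.

```python
# Sketch of risk-based case triage. heapq is a min-heap, so scores are
# negated to pop the highest-priority case first. Weights are illustrative.
import heapq

def risk_score(case: dict) -> float:
    score = 0.0
    if case.get("live_streamed"):  score += 0.5   # ongoing abuse: most urgent
    score += 0.1 * case.get("victim_count", 0)    # multiple victims
    score += 0.05 * case.get("prior_reports", 0)  # repeat-offender signal
    return score

cases = [
    {"id": "A", "live_streamed": True, "victim_count": 3},
    {"id": "B", "victim_count": 1, "prior_reports": 1},
    {"id": "C", "live_streamed": True, "victim_count": 2},
]

queue = [(-risk_score(c), c["id"]) for c in cases]
heapq.heapify(queue)
while queue:
    neg_score, case_id = heapq.heappop(queue)
    print(f"case {case_id}: priority {-neg_score:.2f}")
```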
LIMITATIONS OF USING AI IN CHILD SEXUAL ABUSE CRIMES
While Artificial Intelligence provides substantial capabilities in identifying and preventing child sexual abuse, its implementation also introduces intricate challenges and limitations across ethical, technical, legal, and operational aspects.
Privacy and Surveillance Concerns: AI systems employed to analyze user-generated content (such as chats, photos, and videos) may violate individual privacy, particularly when applied to encrypted or personal communications. Apple’s plans (currently on hold) to scan pictures on users’ devices for CSAM raised significant concerns regarding surveillance and potential misuse. This raises constitutional and legal issues surrounding the right to privacy, consent, and data protection, particularly in democratic societies.
False Positives and Misidentification: AI models, particularly those that depend on pattern recognition, can erroneously flag innocent content or users as abusive. An AI tool might incorrectly identify medical images, family photos, or conversations containing misunderstood slang as CSAM or grooming. This may result in unwarranted account bans, reputational damage, legal complications, and unnecessary distress for users.
Inability to Access Encrypted Platforms: AI is unable to monitor messages or content shared over end-to-end encrypted platforms such as WhatsApp, Signal, or Telegram. A significant amount of online child sexual abuse, including grooming and CSAM distribution, occurs in these secure environments, restricting AI’s reach and effectiveness.
Evasion Tactics by Offenders: Perpetrators adapt to AI detection by utilizing code words, emojis, or foreign languages; altering images to bypass hash-based detection; and employing ephemeral or private platforms. AI systems require continuous updates to keep pace with evolving tactics, or they risk becoming ineffective over time.
Ethical and Legal Dilemmas: The application of AI in CSA investigations presents significant ethical challenges, including the use of facial recognition technology on minors, predictive policing models that could unjustly label behaviors as criminal, and the deployment of AI tools without appropriate judicial oversight or public transparency. These practices can undermine public confidence, particularly if AI-generated decisions lack clarity or are disproportionately directed at marginalized communities.
Bias and Limited Training Data: AI systems developed using datasets that are predominantly Western or exclusively in English may not perform adequately in multilingual or culturally varied environments. An AI designed to identify grooming behaviors in English may not recognize similar actions in languages such as Hindi, Tamil, or Arabic. This results in unequal protection across different regions and demographic groups, thereby exacerbating digital inequalities.
Resource and Infrastructure Demands: High-performing AI systems require extensive datasets, robust computational infrastructure (such as cloud computing and GPUs), and qualified personnel for training, auditing, and ongoing maintenance. Agencies with limited resources, particularly in developing nations, may find AI technologies unattainable or impractical without external assistance.
Over-Reliance on Technology: An exclusive dependence on AI could diminish the role of human judgment, which is crucial in the nuanced and context-sensitive cases of CSA. This may result in mechanical decision-making that fails to account for emotional, cultural, or situational nuances in cases of abuse.
CONCLUSION
AI is reshaping the worldwide approach to Child Sexual Abuse, particularly in the identification and elimination of online threats. Although it cannot substitute human judgment, when applied ethically and responsibly, AI greatly enhances protective systems and bolsters a child-centered justice response.
AI technologies such as PhotoDNA and NLP-based grooming detectors have transformed child safety in the digital realm. Nevertheless, technology must function within a robust legal and ethical framework to safeguard both children and individual rights.
AI-driven detection and prevention tools are changing the global response to CSA. While they do not replace human oversight, AI significantly enhances the speed, scale, and accuracy of protective measures. The responsible and ethically sound application of these technologies can play a crucial role in ensuring that both digital and physical environments are safer for children.
AI’s capacity to comprehend language and identify behavioral patterns renders it a formidable preventive instrument against online grooming. Although it is not a standalone solution, it considerably fortifies efforts to protect children in online environments by detecting threats before the occurrence of abuse.
AI has transformed the capability to identify and combat Child Sexual Abuse Material. It offers speed, precision, and scalability, allowing authorities and platforms to respond promptly to offenders. However, effective implementation must be accompanied by strong ethical safeguards, comprehensive legal frameworks, and international collaboration.
AI-powered identification tools have greatly improved the ability to rescue victims and apprehend offenders. By integrating facial recognition, metadata analysis, and behavioral tracking, AI empowers authorities to act quickly, often converting anonymous or concealed offenders into identifiable entities. However, these tools must be deployed with stringent legal oversight, data protection measures, and a child-centric focus to ensure justice while safeguarding individual rights.
AI-driven prevention initiatives represent a proactive, scalable, and child-focused strategy for addressing Child Sexual Abuse (CSA). By integrating interactive education, digital safety resources, and predictive analytics, these systems empower children, support parents, and foster safer online environments. Nonetheless, they must operate under robust ethical guidelines, child rights protections, and inclusive design principles to achieve the greatest effectiveness.
Artificial Intelligence provides significant tools for addressing Child Sexual Abuse by improving detection, investigation, prevention, and education efforts. However, its application must be governed by stringent legal frameworks, data privacy regulations, and ethical oversight to ensure that the safety of children is prioritized without infringing on human rights. AI should serve to enhance, rather than replace, human expertise in the battle against CSA.
The 2024 report indicates a rise in uploads of child sexual abuse material to the dark web, including AI-generated images. Offenders are utilizing AI to create more lifelike images that qualify as child sexual abuse material, and they can manipulate existing pornographic videos from various websites, employing AI to superimpose the faces of children, particularly those who are well-known.
SUGGESTION
AI cannot function in isolation; whatever result it produces derives from data fed into it by a human, whether its manufacturer or its user, and it produces its results according to that data.
There is no doubt that AI is helping to solve criminal cases and assisting the criminal justice system, but at the same time it is creating room for more offences and crimes to be committed. One way to curb the negative uses of AI is to stop feeding it sensitive personal information and to limit the platforms on which its algorithms rely for such data.
Since AI is presumed to be an intelligent machine, it should be programmed in such a way that it can function without being fed personal information before it produces a result, because the moment it stores any information, that information cannot easily be erased from its database; it remains there throughout the system’s existence, creating a standing breach of individuals’ right to privacy.
In our opinion, AI has done damage to society to an extent that can be remedied only if its manufacturers make this “artificial intelligence” a machine that will reject information bearing on the morals and reputation of individuals when users attempt to feed it in.
REFERENCES
Collin-Vézina, D. (May 2019, rev. ed.). Child Sexual Abuse: An Overview. Encyclopedia on Early Childhood Development.
Wolbers, H., & Cubitt, T. (January 2025). Artificial intelligence and child sexual abuse: A rapid evidence assessment. Trends & Issues in Crime and Criminal Justice, No. 711. Australian Institute of Criminology.
Internet Watch Foundation (IWF). (October 2023). How AI is being abused to create child sexual abuse imagery.
Thorn. (July 19, 2024). Introducing Safer Predict: Using the Power of AI to Detect Child Sexual Abuse and Exploitation Online.
Simons, D. A. (July 2015). Adult Sex Offender Typologies. U.S. Department of Justice, Office of Justice Programs, Office of Sex Offender Sentencing, Monitoring, Apprehending, Registering, and Tracking.
Independent Inquiry into Child Sexual Abuse (IICSA). (October 2022). The Report of the Independent Inquiry into Child Sexual Abuse.
Vera-Gray, F. (March 2023). The impacts of child sexual abuse. Child and Woman Abuse Studies Unit, London Metropolitan University.