Child sexual abuse is a devastating crime that can happen anywhere, to any child, and be committed by anyone. Alarmingly, the documented number of cases continues to increase year after year.
It’s Not “Pornography,” It’s Evidence of Abuse
Child Sexual Abuse Material (CSAM) refers to any visual depiction of a child engaged in sexually explicit conduct, whether real or digitally altered. We use this term to replace the outdated and misleading phrase “child pornography” because no sexual content involving a child is ever pornographic. It is always evidence of abuse.
According to UNICEF, millions of new CSAM files are reported globally each year, with the Internet Watch Foundation identifying over 250,000 child sexual abuse web pages in 2023 alone. Technology has made the production and sharing of CSAM alarmingly easy. Smartphones, encrypted messaging, and cloud storage mean that exploitation can occur anywhere and often by people children know. A 2022 Interpol report warns that self-generated sexual content by minors, often created under coercion or deceit, now makes up a growing portion of CSAM.
Children are not always aware that they are victims. Predators groom them emotionally, offering gifts or attention before demanding images. Once shared, these images can spread uncontrollably, leaving victims with lifelong trauma and a crippling fear of exposure. In Nigeria, NAPTIP has documented a worrying rise in online child exploitation, particularly through messaging apps and social media platforms.

The fight against CSAM intersects with broader societal challenges: poverty, limited digital literacy, and weak enforcement. The National Cybercrime Centre (NCC) has warned that online child exploitation is rising, particularly in urban centres where mobile access is high.
Despite laws like the Cybercrimes (Prohibition, Prevention, etc.) Act 2015 and the Violence Against Persons (Prohibition) Act (VAPP), enforcement remains slow. Many cases go unreported due to stigma, fear, and a lack of awareness about what constitutes online abuse. We must also ensure confidentiality and protection for those who report abuse, so that more people can step forward without fear.
What Parents and Guardians Must Do
Children affected by CSAM or online grooming may show sudden behavioural changes, ranging from withdrawal and secrecy about online activity to fear of certain people or unexplained anxiety. Parents and guardians must stay alert without being intrusive. Conversations about online safety should begin early, normalising topics like consent, privacy, and boundaries.
Crucial Actions for Protection:
- Educate early: Teach children that no one should ever request or share their private images.
- Supervise gently: Maintain open communication and set healthy boundaries around device use.
- Build digital literacy: Adults, including parents, teachers, and leaders, must educate themselves on how technology works, including the specifics of encrypted apps, privacy settings, and cloud storage, to supervise effectively.
- Use parental controls: Simple settings on phones and apps can limit risky exposure.
- Report immediately: In Nigeria, CSAM or suspected grooming can be reported confidentially to NAPTIP, Action Against Child Sexual Abuse Initiative (ACSAI) or the Nigeria Police Force Cybercrime Unit.
- Support survivors: Psychological support and counselling help children reclaim confidence and dignity.
Educators and community leaders play an equally vital role. Schools must integrate digital citizenship education and create safe reporting spaces. Journalists, too, must handle stories involving minors with extreme care, avoiding sensationalism and protecting victims’ identities.
Social media platforms and tech firms have a legal and moral duty to act. While companies like Meta, Google, and TikTok use AI to detect and remove CSAM, gaps remain, especially in localised contexts like Nigeria, where moderation systems may not fully recognise regional languages or local platforms. Collaboration with local NGOs and regulators is key.
Ending child sexual exploitation online requires collective vigilance. Governments must enforce laws rigorously. Communities must educate and protect. And every internet user must recognise that clicking, sharing, or ignoring abusive material contributes to harm.
As UNICEF notes, “Every image represents a real child and every view repeats the abuse.”
We can all make the internet safer. Report suspicious accounts. Educate your community. Support organisations that protect children. If you suspect a child is at risk, act immediately; silence only empowers abusers.