Podcast – Episode 13 Companion

Written by Deanna D.
Updated April 7, 2022
 

Episode background: The episode centres on a young girl who feels helpless as a result of the bullying and extortion she has been enduring in secret. Her pain stems from an initial choice to send one explicit image to a stranger who made her believe she was special and attractive.

Mentality – The mindset of gaining attention and attracting as many eyeballs as possible is common among corporations, organizations, institutions and individuals. In some cases it can lead to fame and fortune. More often, for school-aged kids, it is an endless popularity contest measured by the likes and views a young person gets – and don’t worry, kids keep a close count. To a teenager, a camera on a cell phone or laptop is an innocent yet necessary piece of equipment, an important means of communication. Young girls and boys record themselves and post innocent material online with ease. Teenagers may focus on instant messages with their peers, but criminals troll for these kinds of photos and/or videos, especially of young girls. The criminal turns their focus not to the post, but to the child featured in the photos and/or videos.

The soon-to-be victim gains a criminal’s attention. This criminal sometimes poses as a teenager when in fact they are usually much older. The new-found friend hands out compliments and flirts with their victim, appearing wholesome, sweet and nice. They can provide attention morning, evening and night, in real time, from anywhere in the world. The relationship starts with simple greetings and check-ins, then escalates to flirting and telling each other they missed one another. Everyone wants attention; it feels good. If a teenager pushes back and objects to the conversations and seduction, a predator drums up the cultural construct that girls cannot be rude or mean but must be nice, polite and obedient. Not everyone falls for this trap. Teenagers may be told to stay away from these sites, but many cannot resist and cannot stop. To make things worse, most teenagers are tech-savvy and excellent at skirting around any parental controls.

Sextortion – This new type of criminal is everywhere. The blackmailers, extortionists or sextortionists are usually adults preying on younger users, especially teenage girls. Often, offenders do not know how to deal with the stress of work and family. They progress from legal material to illegal Child Sexual Abuse Material (CSAM) through avenues such as pop-up advertisements. Predators often lack social connections, so they turn to the computer for comfort and for making faux connections. The CSAM provides an escape from their pain, allowing them to enter a sick fantasy world in which they fail to connect the images to reality and see their behaviour as a victimless crime. Social accountability is absent, as the predator habitually tells no one about their behaviour. So, the criminal has only their own internal point of view on what they are doing. Besides, many feel they are merely watching the CSAM, not making the material.

Most recently, young boys and college-aged men have increasingly become targets of sextortion. Typically, the male is contacted on social media by someone appearing to be an attractive female; he sends explicit pictures of himself and then becomes a target for extortion. On March 25, 2022, Jordan DeMay, a 17-year-old high school student from Michigan, committed suicide as a result of blackmail. He had sent sexual images of himself to what he believed to be a young woman. Scammers demanded money from him and threatened to send the images to his family and friends; he told the fraudsters: “You win, I’m going to kill myself.”

Once one sexually explicit photo is captured, the criminal can use extortion and threats to get more material or demand money. A threat to send private explicit material to all of the teenager’s contacts or friends can feel life-altering. The criminal can hack the victim’s various accounts and emails, and may use other tactics to get at private files if the target is uncooperative.

Child Sexual Abuse Material (CSAM) – Offenders who create, distribute and watch CSAM act under the belief that they will not get caught and that non-offending caregivers and parents will never know or see what is happening to their children. The victim may appear cooperative, if only for a split second; when this is the case, it often means the victim was tricked or coaxed into performing an inappropriate act captured in photos or on video. CSAM can also be created when the victim is unaware of the recording; for instance, a child is changing, and a babysitter cunningly captures the moment on video or in photos. Other times, a criminal takes over a device’s camera with malware and then works diligently behind the scenes to capture incidental sexual material. The victim is completely unaware, or they are too young to understand the situation.

In communities around the globe, survivors of CSAM and extortion live with the debilitating fear that the photos and/or videos memorializing their abuse, once shared on the Internet, will remain online forever for anyone to see. Many of these children are victimized again and again as their images continue to be shared, often well into adulthood, even decades later. They constantly worry that someone who has seen their images will recognize them in public.

If the photos or videos are permanently online, the distribution of the material is never-ending. The reality is that the platforms individuals use daily to connect and share information, including social media, online gaming, apps and email, are continuously being used to spread and collect CSAM. It can be found in any online realm and on many devices.

There is a glimmer of hope for the problem that an image, once out there, cannot be taken back: technological tools have been developed to help companies detect inappropriate material on their servers so they can immediately remove it. Several organizations actively scour the Internet for these images and notify online platforms so they can be promptly removed. Regrettably, these efforts cannot undo the harm already done; memorializing and distributing the abuse through images and videos creates additional layers of victimization and trauma.

Random Communication – Random video chat sites are usually filled with adults looking to be sexual and obscene with anyone, including kids. Platforms, apps or websites that allow random communication between two strangers, or allow any user to contact another with a direct message, such as Instagram, WhatsApp or Omegle, are often filled with predators of all kinds. Omegle, a website known for having few or very lax safety features, does not require registration and has no way of verifying a user’s age; up until 2020, its main page actually carried a warning about predators on the site. Sexual imagery, in the form of ads or live persons, often pollutes the landscape of these kinds of sites. Stated directly, such a site is filled primarily with men masturbating and/or looking to get any female nude. Similar sites have come under fire in recent months for their lack of oversight and monitoring. Often, men who seek to get females nude do so under the guise that nobody else will ever know or see.

Self-Harm – In many countries, self-harm is a major public health problem among children and adolescents. It is the act of deliberately hurting oneself (for example, by burning or cutting), often as an emotional coping mechanism. In adolescents, the presence of an anxiety disorder or depression is highly correlated with self-harm. Sadly, the strongest risk factor for suicide is a history of non-fatal self-harm. A major obstacle is that self-harm is hidden from caregivers and the health care system, so it is considerably under-reported. A UK study of primary care records from 2001 to 2014 found a high incidence of self-harm, including a statistically significant 68% increase among girls aged 13 to 16. The researchers concluded: “Children and adolescents were at noticeably increased risk of dying at a young age following self-harm compared with their peers of the same age and sex without a history of self-harm, particularly by suicide and acute alcohol or drug poisoning” (Morgan et al., 2017, Conclusion section). Therefore, primary care for children and adolescents who engage in self-harm is critical.

Advice for Parents and Caregivers

Communicate Openly

Discuss with children, in an age-appropriate way, that bad people sometimes use mobile phone cameras to take photos and videos of kids that they should not take. Such photos can have detrimental consequences for a child’s well-being. Parents and guardians need to build a comfort level with their children so they can discuss modern-day issues. Ideally, a child would go to their parents or guardians if they are uncomfortable with any interaction involving photos or video.

 

Show Support

Be reassuring and compassionate: believe the child’s disclosure, demonstrate emotional support, show disapproval of the perpetrator, and take action to remove or report the offender. Remember, research indicates that this support helps a child’s well-being.

 

Avoid Oversharing and Maintain Privacy

Children (and adults) need to be careful not to give out any identifying information, such as an address or phone number, to strangers. Keep social profiles private, but remember that simply having social media leaves a trail behind for someone to piece together. Bad people may spend time grooming and building a relationship with a child; then, as the relationship grows, they may exploit the child by capturing sexualized images of them in any manner they can. The young person should never fulfil any inappropriate request.

 

Reach Out to NCMEC or C3P to Help Get Images/Media Removed

Both NCMEC (National Center for Missing and Exploited Children) and C3P (Canadian Centre for Child Protection) run programs through which they may act as a broker for requesting that images be taken down. NCMEC operates the CyberTipline in the US and C3P operates Cybertip.ca in Canada; these are clearinghouses for CSAM and other reports of online harm against children.

References

 

Merriam-Webster. (n.d.). Self-Harm. In Merriam-Webster.com dictionary. Retrieved March 31, 2022, from https://www.merriam-webster.com/dictionary/self-harm

Morgan, C., Webb, R. T., Carr, M. J., Kontopantelis, E., Green, J., Chew-Graham, C., et al. (2017). Incidence, clinical management, and mortality risk following self-harm among children and adolescents: Cohort study in primary care. BMJ, 359, j4351. https://doi.org/10.1136/bmj.j4351

Morgan, S., & Lambie, I. (2019). Understanding men who access sexualised images of children: Exploratory interviews with offenders. Journal of Sexual Aggression, 25(1), 60–73. https://doi.org/10.1080/13552600.2018.1551502

National Center for Missing and Exploited Children. (n.d.). Child Sexual Abuse Material (CSAM): Overview. https://www.missingkids.org/theissues/csam

Educate. Advocate. Protect.