The Other Pandemic


***Trigger warning: Contains explicit language on the subject of child sexual abuse***


At times one stands before a great abyss: a great expanse wrought of dark materials, into which one must decide whether to gaze or from which to turn away. Why should we feel the urge to brand ourselves with the horrors emanating from the worst of what is distressingly familiar, the human condition? “If you stare into the abyss long enough, the abyss stares back at you.” The slime dredged from the ugliest depths of human depravity inflicts literal and spiritual trauma, and corrupts hope for redemption.

The doctrines of good versus evil are riddled with complacency, and glib intellectuals would claim that the problem of evil is that we are inextricably bound to it. We are either lashed to evil’s mast, driven momentarily insane by its song, or we shut it out, plug our ears, and go about staying our course. But the problem of evil runs deeper.

Preachers provide no satisfactory answer for evil’s prevalence. Theodicies sideline evil’s sinister human face, yet those that do embed it chime like the sermons of a sententious apologist. “The line dividing good and evil cuts through the heart of every human being.” Yes, but real, visceral evil disfigures the heart with a thousand cuts. It is a narcosis from which a deep spiritual disorientation follows. Some will never find their way out of its depths. Some don’t want to. Others belong there.

The Damned

Meet Benjamin Faulkner. Born in 1991 in North Bay, Ontario, Canada, approximately 350 kilometers north of Toronto. According to his parents, he was well liked and not a “partier by any means” but more of a gamer, obsessed with computers. He worked as a lifeguard and swim instructor at the local YMCA and held a merit of distinction at his post, praised and appreciated by colleagues and the children he instructed alike. In his mid-twenties he shared an apartment with a roommate who described him as pleasant to live with: not noisy, kept to himself, mainly because he was on his laptop all the time. In 2016, he was arrested and charged with the rape of a four-year-old girl and with hosting the largest darknet website for child sexual abuse content: Childs Play.

At Faulkner’s sentencing hearing he delivers a grand speech: “For the first time in my life I am speaking in front of the people that I love about the wrongs I’ve committed. Living with pedophilic disorder is a life of perpetual anxiety, fear, and debilitating depression. […] I know that people were hurt and I am sincerely sorry. I’m sorry for who I’ve hurt and I’m sorry for the lives I’ve altered. I’m sorry for how things turned out. If I could go back, things would be different.” The sentencing proceeds with rather benign testimony from his parents attesting to Faulkner’s character: “patient and kind.”

Later, a chief investigator is called on to lay out the facts. Faulkner had built and run Childs Play, administered another major child exploitation site, The GiftBox Exchange, and run a site called Private Pedo Club, with access reserved strictly for content creators with access to minors. He had been the “tech guy” for Peter Scully’s international child sexual abuse ring, offering pay-per-view video streams of the most heinous content imaginable: content in which minors are abused, tortured, and killed.

While the investigator reads out the list of Faulkner’s activities on the darknet, Faulkner’s hand is covering his face, his shoulders shaking… but he’s not crying. He’s chuckling. Laughing quietly to himself. The grin takes a while to fade from his face.

Faulkner ran what was most likely the largest online network of child abuse material in the world. Yet, at the time of the arrest, none of this got much attention in the media.

The Problem

The awful truth about child sexual abuse material, or CSAM, is that it covers everything: the abuse of babies, even newborns; toddlers; schoolchildren through to teenagers up to the age of 18. It includes children being raped by adults, or adults directing a child to be abused in another country. It also covers grooming, in which an adult establishes an emotional connection with a child, and sometimes with the child’s family, to lower the child’s inhibitions with the objective of sexual abuse, as well as live streaming of abuse. About one third of this material is so-called self-generated: either children who believe they are sharing a private moment, unaware that they are being recorded and that the material will be shared, or children who are frightened or coerced into performing sexual acts in front of a webcam.

The scale of CSAM online is enormous, and it appears to be growing. In 2019 the Internet Watch Foundation (IWF) removed approximately 132,000 webpages, accounting for millions of images and videos. Of the children depicted, around 95% were girls, and 47% were aged under 10. One percent, a large number in absolute terms, were aged under two. Simon Bailey, the National Police Chiefs’ Council (NPCC) Lead for Child Protection and Abuse Investigations, notes that the younger the child, the worse the abuse tends to be: “So naught to two is generally the worst level of abuse.”

The Damage

Calling this abuse child pornography is misleading and conceals the scope of the crime being committed. Pornography is commonly associated with the adult industry, where consent is given and the actors know what they are engaging in. None of that can apply to a child. A child cannot consent to being raped; nobody can consent to being raped. What you are seeing in such an image is a child being sexually abused. Using the word “pornography” lends a sense of legitimacy to a criminal act. Child sexual abuse is a serious crime, and it is important not to minimize its effect on the child.

Child sexual abuse is extremely harmful, especially in its long-term consequences. A survey by the Canadian Centre for Child Protection shows that 70% of people whose abuse had been shared over the internet lived in constant fear of being recognized, and for good reason: 30% of them had been recognized by someone who had seen images of their abuse. In addition to high rates of anxiety, depression, eating disorders, sleep problems, and relationship issues, 60% of respondents had attempted suicide.

The Abusers

Abusers come in many kinds. The worst are those who feel there is nothing wrong with engaging in child sexual abuse. These people can be impossible to treat. According to Michael Bourke of the U.S. Marshals Service, the task with such offenders is to motivate them and bring them to a place where they can at least recognize that what they have done was harmful and had negative consequences. Most sex offenders who enter the prison system are eventually going to leave: 85–90% will serve their sentences and return to their communities.

Bourke likens the behavior to substance abuse. There’s no cure for alcoholism or opioid addiction. There’s no cure for sex offenders either. No way to change their fantasies. “We never take anything away in psychology without replacing it with something healthy. Their crimes are how offenders got their needs met. They were the means through which they coped with stress, sadness, anger, and all the other negative emotions. If you are to take away this coping mechanism—maladaptive and harmful as it is—what do we replace that with?” If there are no readily available ways to assuage the offender’s response when confronted with stressors, they will go back to their tried and true means of relief—the website, playground, water park, whatever it may be.

By conservative measures, about 1% of the adult population has some form of pedophilic attraction, mostly males. Measured against the total global male population, that makes roughly 35 million people. Perhaps only a small fraction of these will ever act against a real child, but even excluding the non-participants, this is an enormous number of people across the world who might find pleasure in viewing child sexual abuse content. Arresting and consigning all of them to life behind bars is a naïve project if investment and labor are wholly devoted to what amounts to chasing the horizon. So, what can be done?

The Platforms

In 2018, tech companies reported over 45 million online photos and videos of children being sexually abused to the National Center for Missing and Exploited Children (NCMEC)—more than double what they found the previous year. In 2019, it had reached 70 million, and for the first time there were more videos of abuse than photos. And this year, child abuse reports have spiked during COVID-19.

“For the last 10-20 years the industry has been saying that it’s been doing everything they can to combat the proliferation of CSAM online. They clearly aren’t. If tens of millions of pieces of content are going through your services every year, there’s clearly a problem with the way you are approaching this problem,” says Hany Farid, a professor of electrical engineering and computer science at the University of California, Berkeley, and a leading expert in the analysis of digital images.

At Microsoft, Farid helped develop PhotoDNA, which has been widely successful in combating CSAM and is made freely available to other tech companies. It creates a digital fingerprint of known CSAM images, which is then used to find duplicates and stop them from being shared. Google has also used analytics to ban certain search terms, a list that changes constantly as offenders try to stay ahead. Facebook was one of the earliest adopters of PhotoDNA and uses it across all its platforms: Instagram, Facebook, and the unencrypted spaces of WhatsApp.
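To make the fingerprinting idea concrete: PhotoDNA’s actual algorithm is proprietary, but the general approach of hashing an image into a compact signature and comparing signatures against a database of known material can be sketched with a simple perceptual “average hash.” This is a minimal illustration only, not PhotoDNA itself; the stored hash value and matching threshold below are hypothetical placeholders.

```python
# Minimal perceptual-hash sketch (average hash, or "aHash"), illustrating
# the fingerprint-and-match idea behind tools like PhotoDNA. PhotoDNA's
# real algorithm is proprietary and far more robust to edits; this is a
# toy demonstration. Requires Pillow: pip install Pillow
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 grayscale grid and encode each pixel
    as one bit: 1 if brighter than the grid's mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Count differing bits between two hashes; a small distance
    suggests the images are near-duplicates."""
    return bin(h1 ^ h2).count("1")

# Hypothetical workflow: compare an uploaded image against hashes of
# known illegal images held by a clearinghouse.
KNOWN_HASHES = {0x81C3E7FF7E3C1800}  # placeholder value, not real data

def is_known_match(path: str, threshold: int = 5) -> bool:
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

Unlike this naive sketch, production systems are engineered so that resizing, re-encoding, and small crops or edits do not change the fingerprint enough to evade a match.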

Yet it is important to note that the tools described are effective only in unencrypted space. The groundwork for encrypting online services, such as implementing so-called DNS over HTTPS (DoH), has already been laid. The aim is to increase privacy and security by encrypting data so that it cannot be eavesdropped on or manipulated. “There are really good reasons to have end-to-end encryption, but we have to acknowledge it comes with trade-offs,” says Farid. One consequence is that parental controls, the filters used by the IWF, and other internet block lists, the means by which companies keep millions of images from ever reaching the public eye, would be bypassed. As a result, potentially millions of internet users would be exposed to CSAM.
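For a rough sense of what such filtering looks like at the network layer, here is a toy sketch of resolver-side blocking. A browser using DoH sends its lookups inside HTTPS to a remote encrypted resolver instead, so a local filter like this one never sees the query. The blocklist entry is a hypothetical placeholder, not a real list.

```python
# Toy sketch of resolver-side DNS filtering of the kind IWF-style
# blocklists rely on. With DoH enabled, queries travel inside HTTPS to
# a remote resolver, bypassing any local filter like this one.
import socket
from typing import Optional

BLOCKLIST = {"blocked.example"}  # hypothetical placeholder entry

def filtered_resolve(hostname: str) -> Optional[str]:
    """Refuse to resolve blocklisted names; otherwise do a normal lookup."""
    if hostname in BLOCKLIST:
        return None  # the filter drops the query
    return socket.gethostbyname(hostname)

print(filtered_resolve("blocked.example"))  # None: the filter applies
print(filtered_resolve("example.com"))      # resolves normally
```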

Among the imagery reported by tech companies, Facebook’s share overshadows the rest: in 2019, Messenger alone was responsible for over 80% of all reports made, and in 2018 the company accounted for more than 90% of that year’s reports, according to law enforcement officials. The numbers are, however, partly a reflection of which companies have put more effort into finding and removing the material from their platforms. Nevertheless, many people have expressed concern over Facebook’s plan to encrypt its Messenger service, which would mean that many of the detection tools that have been so successful in finding and removing CSAM will no longer work.

Facebook says that encryption will provide better security and privacy for its users. Antigone Davis, the Global Head of Safety at Facebook, puts it this way: “One of the things that we really see on Facebook is that more people are using our services to have very personal and private conversations […] and one of the things that we want to do is to fundamentally ensure the data security and privacy for those kind of interactions. That’s where the market is headed and I think one thing to keep in mind is that 85% of the market—the messaging market—is already end-to-end encrypted.”

Fernando Ruiz Pérez, head of operations for cyber crimes at Europol, said Facebook was responsible for a “very high percentage” of reports to the European Union. He said that if Facebook moved to encrypt messaging, the “possibility to flag child sexual abuse content will disappear.”

Hany Farid argues that there are two options. “One such option is that we’re going to encrypt your messages so that we can’t see it, the government can’t see it, nobody can see it. The cost of that will be the hundreds of millions of pieces of sexual abuse of children, roughly from the age of two months old to 12 years old, can, without any possible chance of being caught, come through the services. Which one of those would you like?”

Encryption will create a blind spot. Baroness Joanna Shields, who served as the UK Minister for Internet Safety and Security and previously worked as EMEA VP & managing director at Facebook, says that she does not understand the decision to encrypt Messenger. “It doesn’t make any rational, logical, or business sense. The micro-targeting that is done on these platforms relies on information that people share and if you go to an encrypted message between two people then you can no longer leverage the business model of the companies. So, it makes you ask the question as to why? To me, the only answer […] is that the companies are [handling reports of child sexual abuse] and the problem is that once those are reported then it’s an acknowledgment that the problem is rampant on the platform. If you take away the ability to report it, then they can say that it’s increasing or decreasing and no one will know.”

The Way Forward

It seems as if the tech giants are each using their own tools rather than working together. Every company has its own engineering, business model, and intellectual property, so there is no single technical solution that would work across every platform. You can set an outcome, a goal for a company to remove or block the images in question, but you cannot specify the technology it should use to deliver it, because the companies are engineered differently. That engineering sits at the heart of each business model, and hence of each company’s success, so they cannot simply share the way they build their platforms.

By insisting on total anonymity, we have created a platform on which total ideational anarchy thrives, and the taint it has wrought sometimes trickles through the cracks in society’s veneer. The rot runs deep, but we need not be condemned to despair. To accept despair would be defeatist: we would resign ourselves to promulgating a culture of antipathy. We must allow discourse that addresses the profusion of the worst crimes humans can commit to enter the public sphere. The sewers are overflowing. How high will this pollution be allowed to rise before we save ourselves from drowning in it?

The internet is a technology. It doesn’t make people do anything; people do things. What the internet contributed was two things. First, it allowed pedophiles with an interest in sexual images of children to contact each other and find a sense of community in which they could normalize their behavior; it spurred them on and emboldened them. Second, it made these images available with apparent anonymity. People who might otherwise never have bothered to seek out such images suddenly found them readily available.

To achieve meaningful change, everyone has to play their part, not just the tech industry. There are some things against which there is no defense and for which privacy is not a balancing factor; child abuse is one of them. We need a mixture of technical standards, policy, and legal work to make very clear that, as technology changes, we do not accidentally undermine the protections that have been put in place for children. Behind every report there are children crying for help, and without those reports they go without it: out there on their own, with no one even knowing that they are being harmed.

Engage:

Sign a petition to pressure social media companies to report the spread of CSAM on their platforms

Find out more:

CBC podcast Hunting Warhead

New York Times podcast The Daily – A Criminal Underworld of Child Abuse

IWF’s podcast – Pixels from a Crime Scene

Internet Watch Foundation’s website

WePROTECT Global Alliance’s website and Global Threat Assessment Report 2019

 

Photo credits:

When Kids Are Silent by Zhenya Oliinyk (CC BY-NC-ND 4.0)

Child Sexual Exploitation by Alina Tauseef (CC BY-NC-ND 4.0)

