An article I recently published, “App Fetishizing Forcible Transitioning of Kids Available on Google Play”, highlights a transgender pornography application that uses photographs of children. One of those children was Molly Russell, a 14-year-old girl who committed suicide after viewing self-harm content on Instagram. As a result of my research, Kathleen Richardson contacted the family’s solicitor, and the photograph has since been removed from the app.
“Once we have surrendered our senses and nervous systems to the private manipulation of those who would try to benefit from taking a lease on our eyes and ears and nerves, we don't really have any rights left.”
— Marshall McLuhan,
Understanding Media: The Extensions of Man (1964)
In recent years there has been a rapid deterioration of individual privacy, aided in large part by emerging media technologies. The boundaries between the public self and the private self have blurred to near transparency as we are encouraged to share information about our day-to-day lives online. What this means for the rights of women and girls, whose bodies were already treated as public property prior to this ongoing erosion of privacy, is a myriad of new methods for turning female flesh, and even the very identity of women ourselves, into intellectual property.
Through pornography, spy cam voyeurism, deepfakes, revenge porn, ‘female’ AI assistants, sex dolls, gender ideology, and the trafficking of women and girls, the bodies and images of women are being copyrighted and sold by men for a profit. The real woman has been disassociated, split from her own humanity and reduced to a hyper-sexualized facsimile, or even an experience that men can try on for themselves. Advances in technology facilitate abuse against women with remarkably little oversight, and the speed of its development ensures that women’s rights advocates are consistently on the defensive.
Campaigners in various countries have recently made major gains by using social media to draw attention to the breaches of women’s human rights enabled by the internet. However, unless the fundamental issue is confronted, namely the treatment of women as commodities by men, women and girls will remain vulnerable, forced to react in self-defense to each new violation as emerging media technologies promote our dehumanization and motivate physical abuse.
Deepfakes, for example, are digitally constructed videos that are overwhelmingly pornographic in nature, and over 90% of their victims are women and girls. Encrypted applications are used to host and spread this type of revenge porn, allowing users to anonymously request that a pornographic video or image be constructed from a single photograph. In 2020, a deepfake bot on Telegram was found to be “undressing” women, and some of the victims were underage girls. At least 104,000 women were targeted by this new form of sexual terrorism. The actual number is likely much higher: deepfake detection company Sensity AI, which reported the figure, was only able to count images that were shared publicly, while the bot also gives users the option to generate photos privately.
“Most of the interest for the attack is on private individuals,” says Giorgio Patrini, CEO and chief scientist at Sensity. “The very large majority of those are for people that we cannot even recognise.” Seventy percent of the targets were private individuals whose photos were taken either from social media or from private material.
The Telegram bot was powered by a version of DeepNude software. Vice first reported on DeepNude in June 2019. The original creator deleted the app over concerns about how it could be abused, but not before it reached 95,000 downloads within the span of a few days.
The code was quickly copied and replicated. The DeepNude software uses deep learning and generative adversarial networks to generate what it predicts victims’ bodies should look like. The AI is trained on a set of images of clothed and naked women and synthesises a representation of body parts in the final image.
Unlike other non-consensual explicit deepfake videos, which have racked up millions of views on porn websites, these images require no technical knowledge to create. The process is automated and can be used by anyone – it’s as simple as uploading an image to a website or messaging service. In 2019, tech giant Samsung developed the ability to create video from a single image; previously, deepfakes required a video or hundreds of photos to achieve similar results.
In addition to the threat of having one’s likeness stolen from social media and edited into explicit content, there is also the escalating threat of voyeurism and spy cam pornography. Courts and officials in the US and in Japan have already begun decriminalizing the non-consensual recording of women in public spaces. In September this year, a Tokyo court ruled that filming a woman without her consent or knowledge was lawful, so long as she was wearing pants rather than a skirt. The case involved a woman who complained of being filmed by a man who, when questioned, told the police, “I love the bums of women wearing pants.”
In November 2020, the Indiana State Attorney General’s Office overturned a criminal case and ruled that the law against revenge porn was unconstitutional and violated free speech rights. In May of last year, a Tennessee court overturned the conviction of a man who followed women in public, filming their breasts and buttocks and groping them. David Eric Lambert was charged with unlawful photography and attempted sexual battery. However, three judges, all men, ruled in an appeals court that women do not have a right to privacy or any claim to images of themselves taken by men and used for sexual gratification or profit, saying:
“Exposure to the capture of our images by cameras has become, perhaps unfortunately, a reality of daily life in our digital age. When nearly every person goes about her day with a hand-held device capable of taking hundreds of photographs and videos and every public place is equipped with a wide variety of surveillance equipment, it is simply not reasonable to expect that our fully clothed images will remain totally private.”
Even as Western countries are removing women’s boundaries and rights to privacy, often under the guise of ‘gender inclusivity’, the spy cam pornography, or ‘molka’, epidemic in South Korea is having a devastating impact on women, with some committing suicide as a result of the emotional trauma. In 2019, Lee Yu-jung became the first known casualty of South Korea’s spy cam pornography epidemic when she took her own life after a co-worker filmed her in the changing room of the hospital where they worked. Though the footage of Lee was part of a larger cache of illicit recordings of women, the perpetrator received only a ten-month sentence.
According to a survey by the Korean Women’s Development Institute, the mental toll of digital sex abuse is devastating: nearly one in four women who has been secretly filmed has considered suicide.
“Any violation of a woman's body can become sex for men; this is the essential truth of pornography.”
— Andrea Dworkin, Intercourse (1987)
On July 7, 2018, tens of thousands of Korean women gathered in the streets of Seoul, wearing masks to hide their identities. It was the largest women-only protest in the nation in recent recorded history; demonstrators claimed about 55,000 women took part. Many wore red T-shirts which said, “Angry women will change the world,” and held signs that read, “My life is not your porn.”
The issue that drove tens of thousands of women to gather in the streets is the current escalation of spy cam pornography in South Korea, known as molka, an abbreviation of the Korean term for a hidden camera. Though digital sex crimes are increasing globally, South Korea has been described as the “global epicenter of spy cam,” with more than 6,000 cases of illegal filming reported in 2017. That figure jumped to 6,800 in 2018 and continues to increase; however, only one-third of the cases reported that year were taken to trial. In 2019, 5,500 people were arrested for spy camera offenses, 97% of whom were men.
The motivation for filming women without their knowledge is an increased demand for the recorded violation of women’s privacy as a form of pornography; those who stream women in public places of undress can earn money from paid subscription sites. Indeed, a new industry capitalizing on secretly recording women has been abetted by the recent explosion in streaming pornography. In 2019, two South Korean men were arrested for an organized spy cam syndicate that filmed and streamed the private activities of an estimated 1,600 female hotel guests. They would then broadcast the footage on a website with thousands of members, charging a $44.95 monthly fee.
The technology is already having a profound impact on democracy: in Myanmar, where women have emerged as leaders in the protest movement against February’s military coup, deepfakes and revenge porn are being weaponized against those demonstrating against the male-led military dictatorship.
In September 2020, the UK’s government-funded revenge porn helpline reported a 22% increase in cases, with numbers surging during lockdown restrictions imposed due to coronavirus. A 60% increase is predicted by the end of 2021, and employees of the helpline service fear the trend will become “the new normal.” In the US, at least one in ten women has been threatened with the posting of explicit photos, placing them at risk of losing their jobs or being stalked, according to a 2016 survey. Women in Malaysia who infiltrated secret Telegram chat groups discovered illegal pornography ranging from upskirt photos, to child sexual abuse, to ex-boyfriends revealing victims’ contact information. One woman involved in the sting operation said, “There was so much child porn being traded openly. There was a father who secretly filmed his own daughter and sent it to the group. I’ve seen boyfriends participating in a competition by sending their own significant other’s photos to the group. I’ve seen Photoshopped photos of women made to look naked or contact information shared with men claiming they are sex workers.”
One particularly shocking report revealed that across Australia, New Zealand, and the UK, one in three people had been a victim of image-based sexual abuse, with women overrepresented as victims and men making up the majority of offenders. Revenge porn is on the rise among youth, as well, with over 500 cases occurring in England and Wales during 2019. Though the average age of victims overall was 15, there were instances of children as young as 8 years old coming forward.
In addition, reports indicate that digital sexual abuse crimes are not being taken seriously by law enforcement. Data from the state of Victoria, Australia, found a correlation between revenge porn crimes and domestic violence: three-quarters of all image-based sexual abuse charges were brought to court with at least one other offense. Despite this, those who perpetrated revenge porn crimes alone did not receive prison sentences.
Alongside this assault on women’s privacy, the field of VR pornography is predicted to be worth $26 billion by 2026. It is reasonable to assume that in the near future, women will have their likenesses stolen and uploaded into interactive pornographic programs — not only without their consent, but potentially without them ever becoming aware of the crime committed against them.
Gender Identity Ideology: Fantasy Made Flesh
Women as a class are seeing their humanity reduced to pornography in real time; men are appropriating womanhood via gender identity ideology at the very same time that technology is being implemented to construct artificial, objectified women, both digitally and physically.
In the digital realm, there are applications for building an AI girlfriend, sites dedicated to hosting deepfake pornography, and private, encrypted groups yet to be discovered where men request pornographic videos generated from a single photograph of a woman or child.
In the physical realm, gender ideology is paving the way for men to enter, unchallenged, women’s public spaces of undress, where the vast majority of spy cam pornography is recorded.
The gender identity campaign also aggressively advocates a legal redefinition of womanhood that turns the very existence of women into products — plastic surgery, makeup, clothing, hormones — the digital avatar made flesh.
The hijacking of a female likeness and of women’s identity has become one of the most widespread forms of identity theft today, if not the most widespread. It is not a coincidence that as women’s likeness is reproduced to chilling effect in robots and deepfake pornography, there is a concurrent movement to remove women’s legal and physical boundaries, to redefine women ourselves according to male sexual fantasies.
It is clear that the gender identity movement is the legal aspect of a multi-faceted coup, driven by pornography, profits, and paraphilias. The intent of gender ideology is to defang any laws which serve as barriers that block men from laying claim to women’s bodies or likeness. The point of gender identity ideology is to grant physical, social, and legal support to the digital colonization of women’s identity, to reduce women to the intellectual property of men. When women are reduced to male creations, they can be said to be possessions: the product belongs to its inventor, and the idea belongs to its originator — to be used by, and to generate profits for, those who possess them.
A program can be made to do whatever is desired of it; similarly, reducing women to an idea in a man’s head negates their autonomy. Gender ideology works alongside pornography to groom women back into a subordinate role as a resource for male consumption and use.
If we concede that women are an idea in a man’s head, then we assert that women are the intellectual property of men. This belief system permits complete ownership of women — ownership of our bodies, but also control over our vital potential for self-determination: an idea lacks autonomy.
To believe that women, half the population, are a social construct is to suggest that women were created by men, who have for centuries dominated and controlled culture. Men are made in the bodies of women; women are not made in the minds of men. Now more than ever, we women must assert our existence as an independent class, and declare that we have a right to our image, identity, and likeness, lest we lose our claim to our own humanity.
Women Fighting Back Against Image-Based Abuse
Noelle Martin is a law graduate and activist in Australia. When she was 17, she discovered that photographs of herself had been photoshopped into pornographic images and distributed across porn sites. She chose to speak out, which led to more online attacks. Her activism has become a major factor behind recently enacted laws that make the circulation of non-consensual intimate images illegal in Australia.
Danielle Keats Citron is the Jefferson Scholars Foundation Schenck Distinguished Professor in Law at the University of Virginia School of Law, where she teaches information privacy, free expression, and civil rights law. Citron is the author of Hate Crimes in Cyberspace (2014).