So the internet’s banning revenge porn. Or is it just a stunt by Google and Reddit?
Brilliant news! No images of a nude woman will ever appear on the internet again without her consent! Would someone please explain how this will work?
I clicked on Jennifer Lawrence's nude photos. I'm not proud of it, but there you have it. I was sitting in the airport and I was tired and intrigued. Were they from behind the scenes on a photo shoot? Were they on set? Was it an "intentional" leak? Without really thinking, I clicked.
As soon as I saw the photos I knew the answer – and I felt sick. They were taken in the privacy of her own home and she looked so horrifyingly young as she pouted naively into the camera in varying states of undress. I spent the next 24 hours guiltily telling everyone I knew that they shouldn't click on any of the links they saw online.
There was as much widespread condemnation of the leak of the pictures as there was praise for Lawrence's reaction to their publication. She publicly lambasted platforms that housed the pictures and we all fervently agreed.
Now Reddit has announced that:
No matter who you are, if a photograph, video, or digital image of you in a state of nudity, sexual excitement, or engaged in any act of sexual conduct, is posted or linked to on Reddit without your permission, it is prohibited.
Meanwhile, Google says that from 23 March, users on its Blogger platform "won't be able to publicly share images and videos that are sexually explicit or show graphic nudity". But it notes that it will "still allow nudity if the content offers a substantial public benefit. For example, in artistic, educational, documentary, or scientific contexts."
Brilliant news! No images of a nude woman, taken or published without consent, will ever appear on the internet again!
Or, potentially, this is a stunt to show the platforms care. Either way, it will do very little to protect most women from the emotional and professional carnage of having invasive, private photos put into the public sphere for no other reason than to titillate and humiliate.
No one has any interest in seeing me in my tighty-whities, but I'm not Jennifer Lawrence. Top-tier celebrities of Lawrence's magnitude are likely to fall prey to high-profile leaks, but they can also afford world-class lawyers who can fire off cease-and-desist letters faster than you can say "don't cloud it, baby".
The rest of us have to rely on the avenues for complaint which, to their credit, most publishing and social media platforms provide. But have you ever tried to use them? On the whole, they're difficult to navigate, almost completely automated and slow. Google Images usually has plenty of time to crawl and index whatever content it is you're trying to get removed – and never forget that a screenshot only takes one click or keystroke.
A 24-year-old friend of mine tried to get sexual photos of herself removed from a number of platforms, including Facebook, Twitter, Reddit and Google Images. They had been posted by an aggrieved ex-boyfriend. Over and above the invasive and humiliating angle, she was concerned they would affect her job prospects, because he'd attached her full name to the data behind the pictures. The task proved impossible. She ended up paying someone to manipulate Google search results so that, at the very least, when potential employers Googled her name, which was distinctive, they wouldn't see photos of her naked, 16-year-old body at the top of the search results page. It's a technique employed by many celebrities to suppress less flattering images that appear online.
Then there are the technical limitations: many of the offensive images on Reddit aren't posted directly on the platform, but on external image-sharing or hosting sites. Even if you accept the assumption that Reddit will be able to control the dissemination of non-consensual images on its subreddits (channels, of which there are over half a million), will it block links to offending content? How quickly can it have images removed? What sort of resourcing will it put towards dealing with complaints? What proof will it require that the images are non-consensual?
The issue of consent in itself is hugely problematic. Will the person in the images need to prove the photos were taken without consent? Or that they were posted without consent? Will the onus be on the poster to prove consent? There needs to be a clear definition of what non-consensual means and how it will work in this context.
On the Blogger side, the new conditions stipulate that the restrictions – which are not for non-consensual images but for all images of a sexual nature – will apply to "public" blogs. So on a private blog, which may still have thousands of subscribers, I can still post whatever I want, and those subscribers can save or screenshot those images and share them elsewhere. We have a hole in the bucket, dear Liza!
Blogger is owned by Google. Google Images crawls the 60tn (and growing) web pages that make up the internet to find the results of your image search. While Blogger is in the top 100 most popular websites in the world, Google won't tell us how many accounts it has or what the current split is between private and public. Also, the all-encompassing "sexually explicit or show graphic nudity" tag means that, despite the stipulated exceptions, it is likely to impinge on users' creative licence. As Zoe Margolis writes, it will be harder to share consensual images.
On a positive note, we can control education. We can educate – including those kids who grow up to be celebrities – that images are not safe anywhere.
Let me be clear that I am not in any way shaming or placing blame on the victims of photo leaks. I am saying we need to empower people to understand the risks, which, whether we like it or not, exist. Photos are not safe if they're in the cloud, they're not safe on Polaroid or (have mercy) Snapchat. I wish that wasn't the case, but time and time again we've been shown that it is, with leaks, shaming sites and now the horrendous advent of "revenge porn".
The announcements by Reddit and Google, though good in theory, won't make the least bit of difference to the issue at hand. I wish we lived in a wonderful world where people's privacy mattered to each and every one of us, but we don't. If people want to share content or images, they will find a way.
Managing a community the size of Reddit is almost impossible – it would take an army of content monitors to keep an eye on all the posts and test whether or not inappropriate material is going up, let alone judge whether certain content has informed consent. This means they will ultimately have to rely on a complaints system.
Facebook uses this method, and we've seen enough instances of its failure to harbour serious concerns. The reality is that on platforms of this size the task is so mammoth, it's hard to envisage how a (mostly automated?) complaints system can work effectively. That is, unless these platforms, which specialise in audience engagement, care to put their money where their mouths are – and please explain.