Woman takes legal action against Google for leaked explicit content
In a landmark case, a woman referred to as Laura is suing Google in Ireland after intimate images and videos of her were stolen from her private cloud storage and uploaded to porn sites. Laura discovered the images when she searched for her own name online, and despite her efforts to have them removed, they continued to appear in Google search results.
The case, supported by the advocacy organization HateAid, aims to build on the European Court of Justice's 2014 ruling on the 'right to be forgotten.' HateAid views Laura's case as an example of image-based sexualized violence against women, the misuse of intimate images, and the profit that companies such as Google derive from their spread.
Laura's identity documents were also stolen, meaning the content could be found via a simple Google image search of her name. At issue is reverse image search, a feature offered by companies including Google: a user uploads an image and asks the search engine to find similar ones. The feature is not fully reliable, however, and often returns incorrect results.
Google did not respond to a request for comment on the case. HateAid argues that Google must be legally required to ensure that all previously reported and essentially identical intimate images are permanently removed from its search results. The organization contends that Google bears joint responsibility for the spread of such stolen private images, and that by ruling on the case under data protection law, including the GDPR's 'right to be forgotten,' a court can compel Google to implement effective measures to prevent the images from reappearing.
The campaign, Our nudes are #NotYourBusiness, also addresses the problem of AI-generated deepfakes. Deepfakes are manipulated media that can misrepresent and harm their subjects, particularly in the context of image-based sexual violence. HateAid's goal is to force Google to provide better protection for victims of image-based sexual violence and to make the non-consensual creation of deepfakes a criminal offense.
Such attacks can target people who are not in the public eye, not just celebrities; notable victims of image-based sexual violence and deepfake videos include US singer Taylor Swift and Italian Prime Minister Giorgia Meloni. Laura, who has since moved homes, changed jobs, and suffers from post-traumatic stress disorder, wishes to remain anonymous.
Obligations for global search engine providers arise under the EU's General Data Protection Regulation (GDPR) and the fundamental right to data protection. HateAid's case is a significant step toward ensuring that victims of image-based sexual violence receive the protection they deserve, and that companies like Google are held accountable for their role in the spread of such content.