“Google closed my account for ‘sexual content.’ But they won’t tell me what it is and I’ve lost everything”

Five years ago, after the death of a friend and bandmate, David Barbera decided to pay for a Google Drive cloud account. He wanted to save their music files so that his friend’s children could one day hear how their father played. “So I signed up for Google Drive,” he says. “It was the surest thing I could think of to make sure Javi’s music wasn’t lost, because the kids were so young at the time.”

Barbera, a 46-year-old high school teacher from Valencia, in eastern Spain, was unaware of a key detail: behind Google’s terms of service lies a system that deactivates accounts when it detects prohibited content, including sexual material involving children or terrorism. “The only thing I can think of is that I might have downloaded something I shouldn’t have, like the movies I downloaded back in the day with eMule [a peer-to-peer file-sharing program]. Could there have been child pornography or terrorism in there? It’s possible,” Barbera explains in a long phone conversation.

At first, Barbera had no idea why his account had been suspended. He started connecting the dots only after reading online forums and news articles. He describes a desperate, helpless experience of trying to reach a human being at Google to find out how he had violated the company’s abuse policy.


In July of this year, Barbera needed the music files he kept on his old hard drives. To better organize the material, he started uploading everything to his Google Drive account, for which he still pays a monthly fee for two terabytes of cloud storage. Within minutes of starting the upload, Google canceled his account, saying that “malicious content” had been found.

He filed several complaints, answered emails from apparent Google employees asking for more details (they gave their names as Nahuel, Rocío and Laura), and called every company phone number he could find, without ever managing to speak to a human. At that point, he asked for help from a relative who works in journalism, and finally managed to talk to a supposed Google employee, who asked him to “be patient.”

Sexual content

From this entire process, Barbera received only one specific response: a message sent to his wife’s email address (which he had added as a secondary account). The message read: “We believe your account contained sexual content that may violate Google’s terms of service and may also be prohibited by law.” But then it added: “We have removed this content” and “If you continue to violate our policies, we may terminate your Google Account.” The message arrived on August 26, and although it reads like a warning, the account is still suspended.

“I had everything from the last 14 years on there, and I’d only had the account for five,” Barbera says, pointing out that he doesn’t keep copies on external drives. Losing a Google account doesn’t just mean that your photos and videos are gone. Barbera also lost his class materials, the blog he kept and his YouTube account, not to mention all the other services he had signed up for with that email address, from Amazon to Netflix to a German music app.

In August, The New York Times published an article about two similar cases in the United States. Google told the reporter that the problematic images were photos of children’s genitals that two parents had taken for a pediatrician because of a skin problem. When EL PAÍS asked about Barbera’s case, Google replied that it could not provide this information due to privacy laws, because the user involved is European. The company says it will share the information only with the affected party. But Barbera has yet to receive any details.

Google offered this newspaper access to employees on the condition that they not be identified or quoted verbatim. According to the company, which declined to go into this specific case, “sexual content” emails are sent only in cases involving child abuse, not adult pornography. Why, then, has the account not been reinstated? Google didn’t specify, beyond saying that it all depends on what was in the report. A Google employee asked whether the newspaper would name the affected user, but did not say why they wanted to know.

EL PAÍS found three other cases similar to Barbera’s: two involving Google accounts and one involving Microsoft. All of them are from 2022, and in only one has the account been restored. That case did not involve alleged sexual images of children, but a password problem. The reason for the restoration was never explained either.

Big downloads

Another victim, who asked to remain anonymous because Google may be among his company’s clients, reached out to a “close friend” who works for the company in Spain. This friend doesn’t work in a department related to content moderation, but he asked around internally, and the answer was less than optimistic: these cases were being handled overseas, and he had no idea whether anyone was actually reading the complaints.

This user’s account was terminated after he uploaded 40 gigabytes of photos, videos and WhatsApp conversations that he had kept on his hard drive. The transfer was so large that his company’s cybersecurity staff called him to ask what was going on. Google does not specify when or how it analyzes its users’ files. But in the two Spanish cases, as in those documented by The New York Times, it happened when large file movements were detected.

A third victim is suing Microsoft, desperate because he lost data not only from his personal life but also from work: “his master’s degree work, his tax forms, his children’s birth photos and his professional databases. He is suffering,” explains his lawyer, Marta Pascual. “The judge could find that his right to privacy was violated, but I haven’t been able to find any precedent.”

Pascual’s client believes the suspect files came from WhatsApp groups whose content was downloaded automatically. All three victims have children, and although none of them recall taking photos like the ones sent to the pediatrician, they did have typical images of their children in the bath, in bed or at the pool.

Microsoft provides even less information than Google. It offers only a few general statements about how it combats child pornography on its systems: “First, we fund research to better understand how criminals abuse technology. Second, we develop technologies, such as PhotoDNA, to detect cases of child sexual exploitation. Third, our staff promptly investigates reports and removes offensive content. And fourth, we work with other technology companies and law enforcement agencies to prevent crime.”

In this newspaper’s conversations with Google, as with Microsoft, the companies’ confidence in their detection systems was striking. Yet Google’s software is flagging more and more accounts: between July and December 2021, it suspended 140,868 accounts, nearly double the number in the first half of 2020.

Google checks for child sexual abuse material with two technologies. For known images, each file has a numerical code (a hash) that identifies it; if the systems find images matching these codes, the account is disabled. This is PhotoDNA, a system developed by Microsoft.
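To illustrate the general idea, here is a minimal sketch of perceptual-hash matching in Python. PhotoDNA itself is proprietary, so this uses a simple “average hash” instead; the hash database, distance threshold and function names are illustrative assumptions, not Google’s or Microsoft’s actual implementation.

```python
# Illustrative sketch of hash-based image matching, the general idea behind
# systems like PhotoDNA. PhotoDNA is proprietary; this uses a simple
# "average hash" and a Hamming-distance threshold instead. All names and
# values here are hypothetical.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > avg else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known prohibited images.
KNOWN_HASHES: set[int] = set()

def matches_known(path: str, max_distance: int = 5) -> bool:
    """Flag a file whose fingerprint is close to any known one."""
    h = average_hash(path)
    return any(hamming(h, k) <= max_distance for k in KNOWN_HASHES)
```

The key design point is that matching is approximate: tolerating a small Hamming distance lets the system recognize a known image even after it has been resized or re-encoded, which an exact file checksum would miss.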

The problem is new photos. For these, Google has built a second system that interprets images and assigns each one a probability of being child pornography. Images that score high enough then go, in theory, to human reviewers, who decide whether or not the photo crosses the line.
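The article does not describe how this pipeline works internally; the sketch below only shows the general score-then-review pattern it outlines. The classifier stub, the threshold value and the queue are hypothetical stand-ins, not Google’s real components.

```python
# Minimal sketch of the score-then-review pattern described above.
# Google's actual classifier and thresholds are not public: the model,
# REVIEW_THRESHOLD value, and queue below are all hypothetical.
from dataclasses import dataclass, field
from typing import Callable

REVIEW_THRESHOLD = 0.8  # assumed cutoff; scores above it go to a human


@dataclass
class ReviewQueue:
    """Holds files awaiting a human decision."""
    pending: list[str] = field(default_factory=list)

    def submit(self, path: str) -> None:
        self.pending.append(path)


def triage(path: str, classify: Callable[[str], float], queue: ReviewQueue) -> None:
    """Score a file and, if the score is high enough, route it to human review."""
    score = classify(path)  # classify returns P(prohibited), per the article
    if score >= REVIEW_THRESHOLD:
        # The final call belongs to a human reviewer, not the model.
        queue.submit(path)


# Usage with a dummy classifier standing in for a trained model:
if __name__ == "__main__":
    queue = ReviewQueue()
    triage("photo.jpg", classify=lambda p: 0.93, queue=queue)
    print(queue.pending)  # ['photo.jpg']
```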

Google has also consulted pediatricians so that the software can learn to distinguish images taken for medical purposes from other images. But despite its laudable aim, the system can ensnare many innocent people, who may even end up the subject of police investigations.

“I have a friend in the national police, and I called him to tell him about the case. He told me he had asked colleagues who specialize in computer crimes, and they said they didn’t know of a case like mine,” Barbera explains.

In the United States, companies such as Google and Microsoft must report any suspicious findings to the National Center for Missing and Exploited Children (NCMEC), which in turn notifies the police. NCMEC sent 33,136 reports to Spain in 2021. Most of these cases are never investigated, and in no case do the police tell Google or Microsoft that a given person is not a suspect. As a result, the companies make their own decisions, and the burden falls on the affected user to justify why the material was legitimate.
