Thomas Claburn

Google says it’s having problems following the European censorship decision that lets users seek removal of embarrassing search results.
The right to be forgotten also functions as a gag order: Google is trying to be transparent about removals by informing websites when their pages have been removed from Google’s index, but it cannot be specific about why it has removed the information, because that could violate the individual’s privacy rights under the court’s decision.
There is no easy way to balance one person’s right to privacy with another’s freedom of expression and right to truthful information.

One thought on “Thomas Claburn”

  1. shinichi Post author

    Google Struggles With ‘Right To Forget’

    by Thomas Claburn

    http://www.informationweek.com/mobile/mobile-applications/google-struggles-with-right-to-forget/d/d-id/1297238

    Since the Court of Justice of the European Union affirmed in May that Europeans enjoy a “right to be forgotten,” Google has received 70,000 search result removal requests related to 250,000 webpages.

    In an op-ed piece published in British, French, German, and Spanish newspapers, as well as on the Google website, Google SVP and chief legal officer David Drummond describes the difficulty Google faces trying to implement the court’s decision.

    “[The] challenge is figuring out what information we must deliberately omit from our results, following a new ruling from the European Court of Justice,” he wrote.

    Evaluating whether information the company has been asked to hide from the public should really be made inaccessible has proven difficult. To make that determination, Google now has a team of people who individually review content removal requests, and it has formed an advisory council that includes experts from academia, the media, data protection authorities, civil society, and the tech sector.

    Previously, Google removed search results that included information deemed to be illegal, as well as malware and sensitive personal information such as bank details. Unlawful content varies around the world but includes defamatory, copyrighted, and obscene material.

    But in Europe, Google now must also judge whether information is “inadequate, irrelevant or no longer relevant, or excessive,” while balancing public interest. As Drummond points out, this is a vague and subjective test.

    Those asking Google to remove information include former politicians seeking the removal of posts criticizing their policies while in office; criminals seeking the elimination of articles about their crimes; architects and teachers seeking to hide bad reviews; and individuals who published comments online that they now regret.

    The right to be forgotten also functions as a gag order: Drummond says that Google is trying to be transparent about removals by informing websites when their pages have been removed from Google’s index. “But we cannot be specific about why we have removed the information because that could violate the individual’s privacy rights under the court’s decision,” he says.

    Drummond suggests that there are instances when removing information seems like the right thing to do, such as a man seeking to have Google omit a news article stating that he had been questioned in connection with a crime for which he was never charged. But he stresses that there is no easy way to balance one person’s right to privacy with another’s freedom of expression and right to truthful information.

    Although the EU’s mandate to forget has been criticized by news and advocacy organizations, it doesn’t appear to be particularly effective: information removed from search results in Europe remains accessible on Google’s non-European search sites. As Google explains on its support site, those in Europe can access the US version of Google by visiting google.com/ncr. “Ncr” stands for “no country redirect.”
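
    As a rough illustration of that tip, the sketch below (not from the article; it assumes google.com/ncr still behaves as described and uses the third-party Python “requests” package) fetches the local Google homepage and the /ncr URL and prints where each request ends up:

    import requests

    def final_url(url: str) -> str:
        # Follow any redirects and return the URL the request finally lands on.
        response = requests.get(url, timeout=10)
        return response.url

    if __name__ == "__main__":
        # From a European connection the first request may redirect to a
        # country-specific domain (e.g. google.fr); the /ncr request should not.
        print("google.com     ->", final_url("https://www.google.com/"))
        print("google.com/ncr ->", final_url("https://www.google.com/ncr"))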

    Earlier this week, the Electronic Frontier Foundation argued that the European Court’s ruling is fundamentally flawed. “The real problem is the impossibility of an accountable, transparent, and effective censorship regime in the digital age, and the inevitable collateral damage borne of any attempt to create one, even from the best intentions,” the EFF said.
