“Right to be Forgotten” (RTBF)
is a landmark European ruling that governs the delisting of personal information from search results. This ruling establishes a right to privacy, whereby individuals can request that search engines delist URLs across the Internet that contain “inaccurate, inadequate, irrelevant or excessive” information uncovered by queries containing the name of the requester. What makes this ruling unique and challenging is that it requires search engines, when contemplating the requested delisting of URLs, to decide whether an individual’s right to privacy outweighs the public’s right to access lawful information.
During the first five years after this ruling came into effect (May 2014 to May 2019), Google received ~3.2 million requests to delist URLs, from ~502,000 requesters. Google determined that 45% of those URLs met the criteria for delisting. Each delisting decision requires careful consideration of the balance between respecting user privacy and ensuring open access to information via Google Search.
Google RTBF transparency report: a brief history
To be as transparent as possible about this removal process, and to help the public understand how RTBF requests affect Search results while preserving requesters’ privacy, Google has documented its removal process since 2014 as part of its Transparency Report.
This initial RTBF transparency report was a great first step toward detailing how the RTBF is used in practice. However, inside Google we’re constantly looking for ways to improve our transparency reports and make more information available.
In the case of the RTBF, providing additional information is challenging because we must respect users’ privacy and not reveal any details that could lead to de-anonymization.
So in January 2016, in preparation for providing better statistical analysis, our RTBF reviewers started manually annotating each requested URL with additional data, including the site category, the type of page content, and the requesting entity.
By December 2017, with three full years of carefully categorised data, we had the means to deliver an improved transparency dashboard, which we made publicly available in February 2018.
This new data enabled us to conduct a longitudinal analysis of how Europe’s right to be forgotten (RTBF) is being applied in practice, and how Google is implementing the European Court’s decision. The results of this study were published in a research paper. This blog post details the paper’s key findings, and replaces my 2018 blog post on the subject with updated numbers and additional insights.
How many requests does Google get per month?
Overall, after an initial period of intense activity during the first year, the number of delisting requests has stabilized at ~47,000 per month over the last four years. During the same period, the time taken to reach a decision on whether or not to delist a URL fell from 85 days in 2014 to 6 days in 2019.
Who uses the right to be forgotten?
In total, 84% of the RTBF requests were made by private individuals, which is our default label when no other special category is applied. That being said, in the last four years, non-government public figures such as celebrities requested the delisting of 76,602 URLs; politicians and government officials requested the delisting of another 65,933 URLs.
Aggregating delisting requests by requesters reveals that a small minority of the requesters make heavy use of the RTBF: The top 10,000 requesters (out of 502,000) are responsible for 34% of the URL requests.
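To make the "top requesters" aggregation concrete, here is a minimal sketch of how such a concentration figure can be computed from per-requester URL counts. The data below is entirely hypothetical (a few heavy users alongside one-off requesters), not the actual dataset:

```python
from collections import Counter

# Hypothetical per-requester URL counts; the real dataset has ~502,000
# requesters and ~3.2 million requested URLs.
requests_per_requester = Counter({
    "reputation-firm-a": 5000,  # a heavy user, e.g. a reputation-management firm
    "agency-b": 1200,
    "person-c": 2,
    "person-d": 1,
    "person-e": 1,
})

def top_k_share(counts, k):
    """Fraction of all requested URLs attributable to the k heaviest requesters."""
    totals = sorted(counts.values(), reverse=True)
    return sum(totals[:k]) / sum(totals)

print(f"top-2 share: {top_k_share(requests_per_requester, 2):.1%}")
```

With real data, the same calculation over the 10,000 heaviest requesters yields the 34% figure cited above.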
Turning to usage per country, normalizing by the number of internet users reveals that the volume of requests varies greatly from one country to another. For example, as can be seen in the chart, French Internet users requested on average 12 URL delistings per 1,000 users, whereas in Italy there were 7 such requests, and in Greece only 3. This variation highlights disparities in attitudes toward privacy, media norms, and perhaps awareness of the RTBF process across Europe.
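The per-capita normalization behind the chart is straightforward: divide each country's request count by its number of internet users and scale to 1,000. The counts below are rough illustrative figures chosen to reproduce the ratios quoted above, not the exact numbers from the paper:

```python
# Illustrative request and internet-user counts (not the actual dataset).
delisting_requests = {"France": 600_000, "Italy": 280_000, "Greece": 21_000}
internet_users = {"France": 50_000_000, "Italy": 40_000_000, "Greece": 7_000_000}

def per_thousand_users(requests, users):
    """Normalize raw request counts to requests per 1,000 internet users."""
    return {c: round(requests[c] / users[c] * 1000, 1) for c in requests}

print(per_thousand_users(delisting_requests, internet_users))
# {'France': 12.0, 'Italy': 7.0, 'Greece': 3.0}
```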
What is the RTBF used for?
Breaking down removal requests by site type revealed two dominant intentions.
- Removing personal information (including, among other things, the requester’s home address, contact information, images, and videos) comprised 29% of all delisting requests (16% targeting directory sites and 13% social media).
- Removing legal history comprised 21% of all delisting requests (19% targeting news sites and 2% government-related sites).
For more detail on the various categories and their prevalence, you can refer to Table 3 of the paper.
Looking at longitudinal trends (as can be seen in the chart above) reveals that the number of delisting requests targeting social media URLs has dropped slightly, whereas the number targeting news sites has increased significantly.
We also observe that since the General Data Protection Regulation (GDPR) went into effect, only 55% of the top 500 requested directory sites remain online. As a result, we have seen a sharp decrease in the number of delisting requests targeting this type of site since 2018.
What type of information is targeted?
Breaking down requested URL delistings by type of information targeted reveals that professional information is the most targeted type, accounting for 24% of all requests. Another 8% of the requests target personal information, and only 2% are related to sensitive personal information. Further, 9% of the requests are related to a crime, 8% to professional wrongdoing, and 4% to criticism of political or governmental activities. For a detailed description of what each of those categories entails, you can check out the transparency dashboard FAQ.
Looking at the delisting rates, requests targeting political activities have the lowest rate, with only 3.4% of such requests resulting in a delisting. The relatively low rate for requested professional information delistings (20.7%) is best explained by the fact that many of those requests pertain to information that is directly relevant to the requester’s current profession, and is therefore in the public interest to be indexed by Google Search.
Different countries, different usages
We observe that the way in which the RTBF is exercised across Europe varies from country to country. Variations in regional attitudes toward privacy, local laws, and media norms seem to strongly influence the type of URLs requested for delisting. Requesters from Italy and the United Kingdom are the most likely to target news media, in contrast to France and Germany, where the major concern is personal information exposed via social networks and directory aggregators.
RTBF requests mostly target local content
Finally, we observe that delisting requests mostly target local content, as illustrated by the table above, which shows the origin of requests targeting the top news sites of various European countries.
Looking back at the first five years of RTBF delisting requests shows that privacy is not viewed in the same way everywhere, and that the many nuances stem from regional attitudes toward privacy, media norms, and other underlying factors. Even at the individual level we observe wide disparities: some people request the delisting of 5,000 URLs, whereas others make only one or two requests. This complexity highlights the need to carefully evaluate tradeoffs and needs when designing privacy-related systems, in order to accommodate all users’ sensitivities.
Thank you for reading this post till the end! If you enjoyed it, don’t forget to share it on your favorite social network so that your friends and colleagues can enjoy it too and learn how the Right To Be Forgotten is used across Europe.
*** This is a Security Bloggers Network syndicated blog from Elie on Internet Security and Performance authored by Elie Bursztein. Read the original post at: http://feedproxy.google.com/~r/inftoint/~3/9HPMgr28AC8/