Challenging automated filtering systems – The case of Yelp

Before choosing a restaurant, you will probably seek advice on TripAdvisor or Yelp. These platforms employ filtering algorithms to ensure the reliability and authenticity of their reviews. As a business’s livelihood largely depends on its digital reputation, the correct functioning of these filters is fundamental: if the algorithm assesses reviews incorrectly, it damages the reputation of the businesses concerned. Now put yourself in the position of the entrepreneur: how could you challenge this filtering system to protect your business? A recent ruling of the German Federal Court of Justice (BGH) brings Yelp back into the spotlight.

An article by Francesca Palmiotto

Yelp’s filtering algorithm…

“Nowadays markets run on information” (Patterson 2017). Buyers make decisions by relying on their knowledge of a product and turn to sources that act as intermediaries for such information. Yelp is an information provider that offers a review forum for businesses. A business’s digital reputation on the platform may determine its success in the real world: even a half-star change in a restaurant’s rating can increase the likelihood of filling its seats by up to 49% (Eslami et al. 2019). It is therefore crucial to ensure that the information given is accurate, reliable and understandable, in order to protect users and business owners from fake, shill or malicious reviews.

For this purpose, Yelp uses algorithms that filter out false reviews and aggregate data, with the aim of displaying only relevant and trustworthy content. On its website, a video explains why this software exists and how it works. Overall, about 75% of reviews are ‘recommended’; those filtered out (as ‘non-recommended’) can still be seen via a link at the bottom of each profile page, but they do not factor into the business’s overall star rating or review count.
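To make concrete what it means for filtered reviews not to factor in, here is a minimal sketch in Python of how a displayed rating could be computed from ‘recommended’ reviews only. The data structure and the function `displayed_rating` are my own assumptions for illustration, not Yelp’s actual code.

```python
from statistics import mean

# Toy data: each review carries a star rating and the filter's verdict.
reviews = [
    {"stars": 5, "recommended": True},
    {"stars": 4, "recommended": True},
    {"stars": 1, "recommended": False},  # filtered out: still viewable, but ignored below
]

def displayed_rating(reviews):
    """Average the stars of 'recommended' reviews only."""
    recommended = [r["stars"] for r in reviews if r["recommended"]]
    return round(mean(recommended), 1) if recommended else None

print(displayed_rating(reviews))  # 4.5, while the average over all three reviews is ~3.3
```

On this toy data, a single filtered one-star review moves the displayed rating from roughly 3.3 to 4.5, which illustrates why the filter’s verdicts matter so much to business owners.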

…and its opacity

While filtering information is necessary, it is unclear how it is done. The specific criteria on the basis of which reviews are sorted as ‘recommended’ or ‘non-recommended’, and their relative weights, are not disclosed to the public, so as to prevent malicious users from gaming the system. This is particularly worrisome, as the opacity of these algorithms may hide biased or even arbitrary criteria, leading to an incorrect overall assessment that can harm the reputation of business owners.

The secrecy of Yelp’s algorithm has caused many controversies among users, who have accused the platform of damaging their businesses. In the US, the complaints escalated into almost 700 lawsuits, all of which have been dismissed. In Germany, the owner of several fitness studios filed a complaint against Yelp, arguing that the filtering system was unfair because it was based on arbitrary criteria. In January 2020, the German Federal Court of Justice (BGH) ruled in favor of Yelp.

LG Munich, judgment of 12.2.2016, case no. 25 O 24646/14;

OLG Munich, judgment of 13.11.2018, case no. 18 U 1280/16;

BGH, judgment of 14.01.2020, case no. VI ZR 496/18

The case in Germany: balancing conflicting rights

In the lawsuit brought by the fitness studio operator against the rating portal, the owner complained that the selection of reviews created a distorted and incorrect overall picture, infringing her general right of personality (Art. 1(1) and Art. 2(1) Basic Law) and her right to conduct her business (Art. 12(1) Basic Law). As a result of the poor rating, she suffered a loss of customers. She argued that Yelp should be held responsible, as the distinction between ‘recommended’ and ‘non-recommended’ reviews is arbitrary and made on the basis of incomprehensible criteria.

In response, the defendant argued that the assessment of contributions is a permissible expression of opinion (Art. 5(1), first sentence, Basic Law). The rationale is as follows: 1) users review the gym; 2) Yelp evaluates the reviews; 3) as a result, the overall assessment of the gym constitutes a judgment and thus an expression of opinion by Yelp. The fact that the average rating is based only on recommended contributions does not amount to abusive criticism (Schmähkritik).

In balancing the general right of personality against the freedom of expression, the BGH held that the legally protected interests of the plaintiff do not outweigh Yelp’s interests worthy of protection. In this specific case, the classification of reviews as ‘recommended’ and ‘non-recommended’ is protected by the freedom of opinion, which prevails over the rights of the plaintiff.

Algorithmic decision-making: a new probatio diabolica?

Everybody agrees that filtering is desirable. The main concern, however, relates to the validity and objectivity of its functioning. In cases involving automated decision-making, substantiating claims in court and proving damages has turned out to be extremely difficult, if not impossible (in legal terms, a probatio diabolica). The BGH ruling provides an opportunity for a threefold reflection.

  1. Freedom of speech prevents meaningful scrutiny of automated systems

As seen above, Yelp’s filter has been safeguarded as opinion. Granting this type of legal protection has precedents in several US cases. In Search King v. Google Technology, for instance, Google’s search results were considered opinions, thus granting full constitutional protection to the PageRank algorithm; the Court found that Search King had failed to state a claim and granted Google’s motion to dismiss. Nonetheless, using freedom of expression as a framework right can be problematic when applied to automated decision-making.

In German law, when assessing whether a statement is covered by the freedom of expression, the criterion of falsity applies, as false information is not worthy of protection. In my view, as far as algorithms are concerned, this test falls short. The choice to use certain criteria, or the way in which they are weighted, cannot be judged as true or false, but only as valid or invalid, and as objectively or arbitrarily applied to the specific case.

In this regard, the Munich Higher Regional Court (OLG) adopted the right approach at second instance, as it examined the list of criteria disclosed by Yelp. According to the Court, some criteria were doubtful (such as ‘user activity’), and their implementation was often contradictory. For instance, it was not clear why more than 95% of the reviews at issue were filtered out, considering that, according to Yelp’s video, only 25% of reviews are ‘non-recommended’. Additionally, there was no evidence that the filtered contributions were falsified.

In order to test how the system works, I performed a small experiment. According to the list disclosed by Yelp in the proceedings, one of the criteria is the so-called ‘501’: if a review has 5 stars and the user has 0 friends and has left only 1 review, it will probably be tagged as non-recommended. Interestingly, when I left a review matching this pattern (5 stars, 0 friends, only 1 review), my contribution was displayed as ‘recommended’ on the very first page of results. This probably means that other criteria applied to my review. Yet it is not clear why my contribution was displayed at the top of the business page.
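To picture the experiment, the sketch below encodes the ‘501’ criterion as a single rule. The type, function names and decision logic are assumptions made for illustration; Yelp’s actual software evidently weighs many more signals, which is presumably why a review matching the pattern can still end up ‘recommended’.

```python
from dataclasses import dataclass

@dataclass
class Review:
    stars: int         # rating given (1-5)
    friends: int       # friends on the reviewer's profile
    review_count: int  # total reviews written by the reviewer

def matches_501(r: Review) -> bool:
    """The '501' pattern from the disclosed list: 5 stars, 0 friends, 1 review."""
    return r.stars == 5 and r.friends == 0 and r.review_count == 1

def classify(r: Review) -> str:
    # A real system combines many weighted criteria; this sketch applies only
    # the single '501' heuristic, under which my test review should have been filtered.
    return "non-recommended" if matches_501(r) else "recommended"

# The test review from the experiment: 5 stars, 0 friends, 1 review.
print(classify(Review(stars=5, friends=0, review_count=1)))  # -> non-recommended
```

That the live system reached the opposite verdict underlines the point: without knowing the other criteria and their weights, the rule as disclosed cannot be verified from the outside.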

Notwithstanding that platforms like Yelp should be free to choose whether and how to filter user content as part of their business model, the validity and objectivity of the system should be verifiable when it negatively affects individuals. Freedom of expression, however, does not call for such scrutiny.

  2. Trade secrets should not prevent access to evidence

This situation is further aggravated by the lack of means to obtain evidence. Generally, algorithms and information on their functioning enjoy legal protection as trade secrets (Maggiolino 2019). In the German case, the defendant submitted that the individual evaluation criteria, and their weighting relative to one another in the software, are protected as a trade secret. On this basis, Yelp disclosed a list of criteria, albeit an incomplete one that gave no indication of their relative weights. As long as there is no obligation of disclosure, and trade secret protection generally prevails, the applicant can only trust the other party’s assertion that the model is objectively justified and applied without discrimination across reviews and companies.

While this protection is necessary, it should not be invoked as an absolute justification against any form of disclosure, as often happens in these proceedings. Directive (EU) 2016/943 on the protection of trade secrets, in fact, subordinates the protection of trade secrets to the protection of the public interest. In particular, Article 1(2)(b) states that trade secret protection should not affect the application of rules requiring disclosure, among others, to judicial authorities for the performance of their duties. After all, in the Google Shopping case (Case AT.39740, Google Search Shopping, 27 June 2017), the European Commission relied on internal, non-public information to prove that Google’s algorithm was violating competition law. The Commission could obtain all the relevant documentation, generally protected as a trade secret, through requests, inspections and interviews, as part of its powers as an administrative authority. Similar means of access to information should also be available to private parties, in order to support their claims and exercise their right to compensation.

  3. Transparency of internet platforms fulfills a public function

A final consideration relates to the public function of the transparency of internet platforms. As a general principle enshrined in the GDPR, individuals affected by automated decision-making should be able to understand the reasons for such decisions. In the case of Yelp, business owners should be able to understand how their rating has been assessed and how they might improve it; reviewers should be able to know why their review has been filtered out. After all, their contributions are also protected as opinions. This principle is reaffirmed in the Council of Europe’s Committee of Ministers Recommendation on filtering and blocking, which underlines the importance of transparency in respecting the freedom of expression and information with regard to Internet filters.

In this regard, it should be noted that judicial decisions involving online intermediaries are of crucial relevance to all the interests at stake, including those of parties not represented in the lawsuit. Moreover, as argued above, the protection of filtering algorithms under the freedom of expression and through trade secret law can hinder such transparency, with negative effects on the freedom of users and on the interests of entrepreneurs.

Published under licence CC BY-NC-ND.

This Blogpost was written by

Author

  • Francesca Palmiotto

    Francesca Palmiotto is a PhD researcher at the Law Department of the European University Institute in Florence. Her research focuses on the impact of technological changes on criminal procedural law.
