SIN vs Facebook

Online platforms such as Facebook, YouTube and Twitter increasingly control what you can see and say online. Algorithms track users’ activity, while filters and moderators address alleged breaches of terms of service. These measures are supposed to target harmful content, such as hate speech or incitement to violence. Unfortunately, there have also been a number of instances in which legal and valuable content was removed, including historical photos, war photography, publications documenting police brutality and other human rights violations, coverage of social protests, and works of art and satire.

Such unjustified and excessive removal of content by online platforms is often referred to as ‘private censorship’. It is particularly dangerous today, when the online environment is dominated by a handful of platforms with global reach which have become the key channels of communication as well as important sources of news.

As a result, they act as ‘gatekeepers’ to online expression, gaining tremendous power over the information circulated on the Internet – power which they wield without adequate accountability or responsibility.

Private censorship is further exacerbated by the lack of transparent rules on content moderation or effective appeal procedures, which makes it difficult to challenge a platform’s decision.

This needs to change! In the case SIN vs Facebook we are fighting to defend the rights of users whose freedom of expression was unduly restricted by arbitrary banning by the tech giant.

Update: We won before the court of first instance. Read more


SIN vs FACEBOOK

The ‘Civil Society Drug Policy Initiative’ (‘Społeczna Inicjatywa Narkopolityki’, or ‘SIN’) is a Polish NGO which has for many years conducted educational activities concerning the harmful consequences of drug use as well as provided assistance to people who abuse such substances, including harm reduction activities.

In 2018, without any warning or clear explanation, Facebook removed fan pages and groups run by SIN. The platform had characterized them as ‘in violation of Community Standards’.

In January 2019, one of SIN’s accounts on Instagram, a Facebook subsidiary, was also removed in similar circumstances.

On 7 May 2019 SIN, supported by the Panoptykon Foundation, filed a lawsuit against Facebook, demanding restoration of access to the removed pages and accounts as well as a public apology.

Background of the case

Since 2011 SIN has run a Facebook fan page, which it used to warn against the dangers of substance abuse. Suddenly, in 2018, Facebook’s moderators (or its algorithms) raised objections to SIN’s activity: first one of the SIN-moderated groups and then the entire fan page, followed by 16,000 users, were found “in violation of Community Standards” and removed. SIN attempted to use the mechanism provided by Facebook to challenge these removals, but to no avail. Members of SIN still do not know which particular content Facebook deemed a violation of its Community Standards, or why.

SIN’s target audience is young people, who are particularly at risk of experimenting with drugs and who are active on social media. Facebook was the organisation’s key communication channel, which SIN used to promote its activities and mission, to contact its volunteers, and to raise funds. Through Facebook, people using drugs could seek SIN’s help, while Instagram allowed SIN to reach younger users.

The removal of these pages, groups and accounts has made it considerably more difficult for the organisation to carry out its educational activities and other statutory tasks, and has reduced the reach of the information it publishes and its ability to communicate with a larger audience.

Given Facebook’s strategic role in SIN’s operations, towards the end of 2018 the organisation decided to set up a new fan page. So far the page has not been removed, despite the fact that SIN has not changed the way it operates it. The downside, however, is that SIN has had to rebuild the entire community of users interested in its mission (to date the new fan page has gathered only around 10% of the followers of the original page). Without knowing the reasons for the previous ban, SIN operates under constant uncertainty and fear that yet another fan page may be arbitrarily removed at any time and with no warning.

FAQ

Part 1. More about SIN vs Facebook

 What do you want to achieve by suing Facebook?
The story of SIN is a good example of the threats posed by private censorship online. This is why in the case SIN vs Facebook we are fighting for the court to recognize that non-transparent and arbitrary actions of the social network have led to an unjustified infringement of rights of that organisation. We hope that as a result of the case:
  • online platforms will improve the transparency of their decision-making process in cases of removal of content (in particular so that a banned user knows specifically which content was found inadmissible, exactly which clauses of the Community Standards it violated and why);
  • any bans imposed by the platforms will be proportionate to the infringement (while platforms should remove any harmful content, one problematic post should not, in principle, result in the removal of the entire page or account);
  • users whose rights have been infringed as a result of private censorship will find it easier to effectively challenge any removal decision which they believe to be wrong or unjustified. We want the online platforms to create an internal appeal mechanism which would permit an effective challenge of their decisions and respect banned users’ right to be heard. In addition, it should be possible to challenge the final decision of an online platform in court.

You will find more information about our long-term goals in the section: What needs to change to curb the risks of private censorship?

 Why do you believe that suing Facebook is the best way forward?
  • A favourable judgment would serve as a helpful precedent for other persons who believe that they have been unfairly banned by an online platform and would make it easier for them to assert their rights in court.
  • The lawsuit will hopefully help change the practices of online platforms and usher in much needed legal reform (more information in the section: What needs to change to curb the risks of private censorship?). Using lawsuits to fight for systemic changes is often referred to as “strategic litigation”. Today, there are a number of examples of NGOs using strategic litigation successfully to strengthen human rights protection in the context of new technologies, for example:
    • Digital Rights Ireland successfully litigated against the EU Data Retention Directive (which covered, e.g., phone records), convincing the Court of Justice of the EU to annul the Directive. In many EU countries the judgment sparked legal changes which boosted data protection and the right to privacy in the context of data storage;
    • Max Schrems (today a NOYB activist) filed a complaint against Facebook, which led to the Court of Justice of the EU annulling the European Commission’s decision on the Safe Harbour scheme. The scheme allowed American companies to transfer personal data of EU citizens onto their servers in the US without providing the necessary data protection safeguards. In fact, as demonstrated by the information released by Edward Snowden, that data could easily be accessed by American security agencies;
    • In a settlement with the American Civil Liberties Union, Facebook has promised to no longer allow advertisers of loans, job offers and real estate to target their ads in a discriminatory manner (e.g. on the basis of age or race).
  • The litigation in SIN vs Facebook complements other initiatives by Panoptykon (advocacy, education, research) which are aimed at promoting responsible and transparent operation of online platforms, including content moderation, for example:
    • Letter by NGOs from across the world (including Panoptykon) to Mark Zuckerberg concerning due process in content moderation;
    • Panoptykon's opinion on the proposal of the EU’s Regulation on preventing the dissemination of terrorist content online, submitted to the Polish Ministry of Digital Affairs;
    • Panoptykon’s complaints against Google and IAB concerning data protection in the context of digital advertising.
 What are your arguments against Facebook?

The arbitrary banning by Facebook of SIN’s fan pages, accounts and groups has infringed SIN’s personal rights (articles 23-24 of the Polish Civil Code).

What are personal rights?

In the case of legal entities (e.g. companies, organisations, etc.), personal rights are “non-pecuniary assets which allow [an entity] to carry out its statutory operations” (Judgment of the Polish Supreme Court of 14 November 1986, case no. II CR 295/86). To achieve its objectives, public interest organisations such as SIN need to reach the broadest audience possible. In practice this is not feasible without Facebook.

What personal rights of SIN were infringed?

  • freedom of speech (the ban imposed by Facebook prevented SIN from freely expressing its opinions, disseminating information and communicating with its audience);
  • reputation and recognition (banning SIN on Facebook suggests that the organisation’s activity was harmful and thus undermines SIN’s trustworthiness);
  • sense of security (Facebook did not provide any reasons behind its decision to ban SIN; this has left SIN disoriented and uncertain as to its ability to control its key communication channel).

Were Facebook’s actions unlawful?

Yes. Facebook infringed the principles of freedom of speech, due process and the right to an effective remedy. These principles should be respected not only by states, but also by private entities including, as is the case here, global tech giants.

Actions of online platforms cannot undermine users’ fundamental rights. Internal regulations which allow for arbitrary censorship of content are void, and a user’s consent (expressed for example when accepting the terms and conditions) is ineffective. This is especially the case where users are faced with monopolies and have no real choice but to agree to the Terms & Conditions that online platforms impose on them.

 Where will the court proceedings take place?
The case was brought before the Warsaw District Court.
 Who is SIN and what exactly does it do?

SIN (Civil Society Drug Policy Initiative) is a Polish NGO which, since its creation in 2011, has focused on drug abuse education and harm reduction, i.e. efforts aimed at reducing the negative consequences of substance abuse. SIN works primarily in clubs and at music festivals as well as online, where it provides assistance and raises awareness of the risks of particularly dangerous drugs.

Harm reduction focuses on protecting the health and life of drug users. According to this approach, while drug use should be discouraged, those who cannot be dissuaded from it should be encouraged to use in a manner least harmful to themselves and those around them.

Over the years, harm reduction has helped save countless lives around the globe. It is the approach recommended by the United Nations, the European Union (EU drugs strategy 2013-2020), the National Bureau for Drug Prevention, the Red Cross, Médecins du Monde, and hundreds of other institutions and organizations that work in the field of drug abuse prevention.

More information can be found at www.sin.org.pl.

 Social networks block users every day – why have you chosen SIN?
  • We wanted to support a public interest organisation that uses Facebook to advance its goals.
  • Another important factor was the role Facebook plays in SIN’s statutory work, given for example its target audience (predominantly young people). Removing the fan page considerably interfered with SIN’s everyday work. Research confirms that drug users are often “immune” to institutional, top-down education (from experts, in schools); direct communication and engagement is a more effective way of reaching this group. This makes Facebook crucial for NGOs conducting public health campaigns, especially when it comes to connecting with young people, who tend to be more difficult to reach via traditional media.
 What is the role of Panoptykon in SIN vs Facebook?

SIN is the plaintiff (the suing party) in these proceedings, and Facebook is the defendant (the sued party). As a banned entity, SIN has suffered direct damage and hence it is entitled to sue the perpetrator of the wrongdoing (Facebook).

Panoptykon decided to provide legal support to SIN. We have joined forces with Wardyński & Partners, a law firm which has agreed to represent SIN in court pro bono.

The case SIN vs Facebook is also part of our wider campaign aimed at protecting human rights in the online platform environment (check the Why do you believe that suing Facebook is the best way forward? section).

 Why do you stick your nose into a private company’s business? And why Facebook?

Facebook has over 2 billion users globally, 16 million in Poland alone. It is one of the most powerful companies in the world, and yet it has no real competition. In Poland it is the most popular social network, used by roughly 80% of internet users, giving it a greater reach than the two largest online information services in Poland, Onet and Gazeta.pl, combined. Facebook controls the online world not only because of its financial power and market dominance, but also thanks to the special role it has come to play in our society (as a crucial channel of communication, a vital forum of public debate, and a leading source of information). Consequently, the way it decides to moderate content shapes how we see the world.

It is time for Facebook to finally accept that with great power comes great responsibility. We firmly believe that a company with such an incredible influence over our freedom of expression has to respect it, and we have the right to hold it accountable should it fail.

Of course the issue of private censorship is not limited to Facebook – it affects other online platforms as well, in particular those that, together with Facebook, control an important part of the information circulated online (for example YouTube, Twitter and Google Search). A 2016 study analysed the Terms and Conditions of 50 internet platforms and concluded that as many as 88% of them have the right to ban an account without a warning or any means of challenging such decisions. While our lawsuit concerns Facebook, we hope that it will usher in systemic changes to also regulate other platforms (check the What needs to change to curb the risks of private censorship? section).

 Wouldn't it just be easier to stop using Facebook altogether?
We could simply ask you to quit Facebook, but we have chosen a different approach. Quitting Facebook may work for individuals, but from an NGO’s perspective it is not a very effective strategy. Why?
  • First, we are well aware that even if every person reached by our request actually removed their Facebook accounts, in view of the scale of the company’s operations this would have no practical effect on Facebook. Thus we wanted to do something that would have a real impact.
  • Secondly, we believe that Facebook can be used for the common good by organisations, journalists, artists etc. We hope that in the future the market for social networks will be more competitive, allowing everyone to choose the platform with the strongest ethics and best standards of service. Since this is not the case today, we want to help those for whom Facebook is an important tool in their research, charitable or professional activities, and who have no real alternative to it because of its dominant position.
  • Thirdly, the problem does not only concern Facebook, but also other internet platforms. So even if you are not a Facebook user, you are not safe from private censorship online.
 Why do you want to raise 9,000 PLN (2,000 EUR) in a crowdfunding campaign?

When we filed our lawsuit in May 2019, we knew that Facebook would not make our case easy. Still, we are surprised at how hard the company tries to avoid confronting our core arguments about private censorship on its platform. For instance, Facebook refused to accept our lawsuit because ‘there are no Polish-speaking employees in its litigation team’. Yes. A company with almost 20 million Polish users claims that it ‘does not understand Polish’.

As a result, the court commissioned an official translation of the case documents and ordered SIN to pay the costs: 8,841.56 PLN* (approx. 2,000 EUR). We launched a (successful) crowdfunding campaign to raise this amount and continue our struggle in court.

* Still, we believe that everyone has the right to argue their case in their own language in a dispute with global corporations such as Facebook. Thus, we lodged an appeal against the court decision to translate the documents. If our appeal is accepted, we will use the collected funds (and any surplus, should it occur) to cover further case-related costs.

Part 2. Find out more about private censorship online

 Who else has been a victim of private censorship online?
  • The Pulitzer-winning photo of a naked girl fleeing napalm bombing during the Vietnam War, known as the ‘Napalm girl’. When the 1972 picture by Nick Ut was republished by a Norwegian newspaper, it was promptly taken down by Facebook for allegedly promoting child nudity. The same happened to an archive photograph of Jewish children who had been stripped and starved by the Nazis during the Second World War, posted by the Anne Frank Centre. Ironically, the Centre’s post commented on dwindling Holocaust awareness in the US.
  • Pictures of the 2017 Marsz Niepodległości (the Independence March – an annual demonstration to mark the Polish Independence Day) taken by Chris Niedenthal. The photos depicted participants of the March: some of them were masked young men wearing symbols of extreme nationalist organizations and holding red burning flares. It seems that Facebook may have concluded that the pictures promoted totalitarian symbols.
  • Works of art such as The Origin of the World by Gustave Courbet or The Descent from the Cross by Peter Paul Rubens, both of which feature nude figures. Facebook found the art problematic because it allegedly “promoted nudity”.
  • Pictures of people whose appearance is “unusual”: a picture of plus-size model Tess Holliday promoting an event organized by an Australian feminist organization, and a picture of a severely burnt Swedish man, Lasse Gustavson, a former firefighter who lost his hair, eyebrows and ears in the line of duty. In the case of Holliday’s photo, Facebook apparently concluded that it violated its rules on the promotion of a healthy lifestyle. The reasons for removing the picture of Gustavson are unknown.
  • The fan page of VIZ, a popular British satirical magazine which has existed since 1979. The magazine, famous for its provocative humour, creates parodies of British comic books and tabloids. It is not clear which exact post led Facebook to ban the page. In Poland, YouTube removed an episode of the satirical show “Przy kawie o Sprawie” (loosely translated as “Discussions over coffee”) titled “Is it OK to hit men?”. The episode concerned violence and discrimination against women. It was blocked because it allegedly incited violence.

Many of these take-down cases have caused a public outcry, leading Facebook to admit that it was wrong and restore the removed content. Unfortunately, not every author of a removed post can count on the support of public opinion. We hope our lawsuit will change this and that every user will gain a real possibility to successfully challenge private censorship.

 What are the risks of private censorship online?

Arbitrary and non-transparent content moderation by online platforms such as Facebook limits our freedom of speech, including the right to information. Even if we find a decision to remove content unfair, wrong or harmful, we have no tools to question it.

This may have a number of negative consequences:

  • Quality of information

    A private company decides what a user can see or share. Such a company is driven by its own profits, not by public interest. Therefore, it may tend to promote content that is profitable (e.g. emotive posts which help boost advertisement revenue), rather than content of high quality and informative value (content that is not “click-bait” may be at a greater risk of being banned).

  • Informal government pressure

    Governments can restrict access to politically inconvenient content. Instead of following the required procedures (e.g. obtaining a court order), they can “choose the easy way” and use the self-regulation mechanisms of the online platforms to achieve the same result without the procedural restrictions. Such incidents have already been reported.

  • Pluralism in public debate

    Non-transparent rules of content moderation mean that online platforms, when deciding to remove certain posts or pages, can be influenced by their private beliefs and political opinions, e.g. to favour the right or the left of the political spectrum.

  • Discrimination

    Moderation criteria, which are introduced and enforced by online platforms in an arbitrary manner, may result in discrimination against certain communities (e.g. LGBT people or members of religious groups). Minorities are particularly exposed to so-called abusive flagging campaigns – coordinated efforts by large groups of their opponents to get minority-generated content banned. These communities already find it difficult to make their voices heard in the public debate; banning them on Facebook only worsens their marginalization.

  • Technical errors

    Tools used by online platforms to moderate content (filters, moderators) are not infallible. Errors can result from incorrect interpretation of the context of a publication (as in the case of the “Napalm girl” photo – check the Who else has been a victim of private censorship online? section for more examples). Facebook and YouTube are known to have removed publications documenting war crimes in Syria and violence against the Rohingya in Myanmar. The platforms thus obstructed the work of prosecutors and human rights organizations, who could have used such materials as evidence against the perpetrators in court proceedings.

 What needs to change to curb the risks of private censorship?
We want to put an end to dominant online platforms arbitrarily dictating the limits of freedom of expression without any accountability for their decisions. What needs to change to achieve this?
  • Online platforms should comply with the freedom of expression standards developed by, for example, the European Court of Human Rights on the basis of the European Convention on Human Rights.

    What does this mean?

    • If a particular statement is acceptable in public debate in accordance with the freedom of expression standards, it should be permitted also on online platforms.
    • Even if a particular post or video exceeds the limits of free speech, any “sanctions” imposed by the internet platforms should always be foreseeable and proportionate (e.g. a single abusive post should not result in removal of the whole page or account).
    • Freedom of expression is not an absolute right and therefore it can be restricted to prevent abuse. This is why online platforms have the duty to prevent harmful content (e.g. hate speech). Removal of such content is justified and cannot be considered an attack on freedom of speech (you can read more about this topic in the Isn’t the fight against excessive blocking tantamount to allowing more hate speech and other harmful content online? section).
  • Online platforms should create internal mechanisms (“due process”) which will ensure that their decisions are made in a transparent and non-arbitrary manner.

    What does this mean?

    • A user whose content was removed should receive an explanation stating the reasons for the decision, including:
      • Identification of the content that was deemed unacceptable;
      • A reasoned statement identifying the clause of the Community Standards that was breached;
      • Information on how the particular piece of content was identified as potentially abusive and how it was assessed (e.g. whether it was notified by another user, or whether it was “caught” by an automatic filter; whether the decision to remove the content was made by a human moderator or by an algorithm);
      • Information concerning what the user can do if he or she disagrees with the platform’s assessment.
This would allow each user not only to dispute the objections raised by a platform, but also to avoid similar situations in the future.
  • In order to be able to effectively challenge a decision, each user should have the opportunity to present arguments in their “defence”. The user’s appeal should be considered by persons who were not involved in the making of the original decision and it should be carried out within a clearly pre-determined timeframe.

All the moderation procedures mentioned above should be described in a transparent manner and be easily accessible to users (e.g. clearly resulting from the platforms’ terms of service), so that everyone can easily check them and enforce their “rights”.

More about due process on online platforms can be found in the Santa Clara Principles, a set of recommendations from academic researchers and representatives of civil society organizations dealing with digital rights.

  • Users should have the possibility to have the final decisions of platforms verified by an independent external body, such as a court of law.

    What does this mean?

    • If users believe that the removal of their content by a platform was wrong and that they had no real opportunity to defend themselves, they should be able to turn to the courts to examine the matter and order a revision of the decision (i.e. restoring the removed content or account). This has been recommended by, inter alia, the Council of Europe. It is the court, not a private company, that has the authority required to competently assess whether a particular statement exceeds the limits of free speech. This is why the courts should have the final say in these matters.
     Hasn’t Facebook itself recently introduced solutions aimed at improving the transparency of content moderation?
    Indeed, in recent months (already after banning SIN), Facebook has somewhat increased its transparency when it comes to content removal, for example:
    • in most of the categories of content that are not allowed on Facebook, the platform has introduced the possibility to “appeal” a decision to remove particular posts (previously only a decision to block an entire profile or page could be challenged);
    • it claims to provide more information about the reasons for its decisions to block;
    • it publishes statistics on the removal of certain content that breaches its Community Standards;
    • it has announced further changes, e.g. creating an “oversight board”, an “independent” body charged with deciding disputes concerning controversial content.

    However, these solutions remain insufficient, and they are not reflected in the Terms of Service of the portal. According to the Terms, Facebook may remove users’ accounts or user-added content whenever it determines that the content was shared, inter alia, in breach of its Terms, Standards or Policies. Facebook will only inform a user about the removal of his or her content “where appropriate”. The Terms still mention no reasoning for the decisions, no details of an appeal procedure and no guarantees protecting users from the arbitrary removal of content. Consequently, users will continue to struggle to find key information about the moderation process.

    We hope that SIN vs Facebook will incentivise the portal to make further changes and implement “due process”, thus also establishing standards for other platforms. In addition, with SIN vs Facebook we strive not only to persuade the platforms to create better internal procedures, but also to ensure that users who do not agree with their decisions can challenge them before an independent, external body, such as a court (check the What needs to change to curb the risks of private censorship? section).

     Isn’t the fight against excessive blocking tantamount to allowing more hate speech and other harmful content online?

    No! An Internet that respects users’ right to freedom of expression does not equal allowing hate speech, incitement to violence and other harmful content. Without a doubt, online platforms are an important link in the fight against this type of breach and they should remove such content. Private censorship, however, is not the solution to this problem.

    This has been clearly spelled out by the authors of the joint declaration of the UN Special Rapporteurs on freedom of expression and on violence against women, who stress the importance of combating cybercrime while at the same time avoiding excessive removal of legal content.

    They note that private censorship not only fails to effectively combat hateful content but, quite the opposite, results in greater discrimination (check the What are the risks of private censorship online? section). This is because, like hate crime, it tends to target communities that are particularly at risk of discrimination.

    The solutions that we are fighting for will not undermine the fight against harmful content on online platforms. Rather, they will ensure the necessary protection to users whose freedom of expression was wrongfully restricted (check the What needs to change to curb the risks of private censorship? section).

     The Polish Ministry of Digital Affairs has agreed with Facebook to create a point of contact for banned users. Hasn't this solved the problem?

    UPDATE: On August 25, 2023 the “point of contact” was closed.

    Towards the end of 2018, the Polish government announced that it had signed a Memorandum of Understanding with Facebook. In accordance with the MoU, Polish users gained an additional way to challenge the portal’s decisions to remove content, through a designated “point of contact”. Facebook was to reconsider each case and could even change its original decision. It is laudable that the Polish government noticed the issue of excessive content removal by Facebook. However, while the “point of contact” may have helped restore banned content in some individual cases, it failed to resolve the problem of private censorship at a systemic level.

    Why? The procedure is afflicted with the very flaws that lie at the root of the litigation in SIN vs Facebook.

    In particular, the portal continues to hold arbitrary power to decide what should be taken down, and it continues to exercise this power based on unclear criteria and without sufficient safeguards protecting users from abuse. Equally, the MoU does not impose any reporting duties on Facebook (e.g. to publish regular reports on how the point of contact operates) and it does not establish any independent, external control over the portal’s decisions.

    We explain what reforms are necessary to effectively boost users’ rights in the What needs to change to curb the risks of private censorship? section.

     Where can I find out more about private censorship?

    The Santa Clara Principles – a set of recommendations on “fair procedures” of moderation of content on online platforms. The Principles were established by academics and representatives of NGOs that work in the field of digital rights.

    Side-stepping rights: Regulating speech by contract – a policy brief by ARTICLE 19 which examines the compliance of dominant social media platforms – Facebook, Twitter, and YouTube – with international freedom of expression standards;

    Open letter to Mark Zuckerberg concerning due process and content moderation signed by over 100 organisations from across the world and Facebook’s response.

     The answer to my question is not on this list!
    You can contact us at: fundacja@panoptykon.org. We do our best to address every message; however, please bear in mind that it may sometimes take us a while to respond.
    Actors and allies

    SIN

    The ‘Civil Society Drug Policy Initiative’ (‘Społeczna Inicjatywa Narkopolityki’) is a Polish NGO which has for many years conducted educational activities concerning the harmful consequences of drug use as well as provided assistance to people who abuse such substances, including harm reduction activities. In 2018, without any warning or clear explanation, Facebook removed fan pages and groups run by SIN. In January 2019, one of SIN’s accounts on Instagram, a Facebook subsidiary, was also removed in similar circumstances.

    On 7 May 2019 SIN, supported by the Panoptykon Foundation, filed a lawsuit against Facebook, demanding restoration of access to the removed pages and accounts as well as a public apology.

    Panoptykon Foundation

    Panoptykon Foundation is a Polish civil society organisation working to protect freedom and human rights in the context of new technologies. We diagnose threats resulting from surveillance practices, intervene in cases of abuse, develop alternative legislative solutions, stimulate critical reflection and encourage action for change.

    Panoptykon provides legal support to SIN and runs communication activities around the case.

    Wardyński & Partners

    Wardyński & Partners, founded in 1988, is one of the largest independent law firms in Poland, committed to promoting civil society and the rule of law. The firm participates in non-profit projects and pro bono initiatives. Its lawyers are active members of Polish and international legal organisations.

    Wardyński & Partners, at the request of Panoptykon, represents SIN in the proceedings pro bono. The legal team involved in the case includes Łukasz Lasek and Piotr Golędzinowski (both representing SIN in court), Agnieszka Lisiecka and Bartosz Troczyński.

    The Digital Freedom Fund

    The Digital Freedom Fund (DFF) supports strategic litigation to advance digital rights in Europe. DFF provides financial support and seeks to catalyse collaboration between digital rights activists to enable people to exercise their human rights in digital and networked spaces.

    DFF has provided funding to Panoptykon to support the case SIN vs Facebook.

    What's new
     
    At the hearing, the court heard the last witness (the president of SIN at the time its Facebook account was blocked), closed the proceedings, and announced that it would deliver its verdict in a month.
    The court questioned a former member of SIN’s Board who was responsible for managing the organisation’s social media accounts at the time of the removals. The final witness in the trial, SIN’s President of the Board in 2018, will be questioned during the third (and most likely last) hearing, which will take place on February 13, 2024, at 10.00 (room 219). We expect the court to deliver the judgment soon after. More about the second hearing (in Polish).
    The court questioned the president of SIN, who spoke about the activities of the organization and the negative effects caused, in particular, by the removal of SIN’s main Facebook page. The court also admitted evidence from the questioning of two former board members of the organization who, at the time of the removals, managed SIN’s social media accounts. They will be questioned during the next hearing on the specific circumstances surrounding the removals of SIN’s Facebook and Instagram accounts and groups. More about the first hearing.
    The Society for Civil Rights (Gesellschaft für Freiheitsrechte e.V., or “GFF”), in its brief filed with the Warsaw District Court, summarised the recent case law of German courts, including the Federal Tribunal, in cases similar to SIN vs Facebook, i.e. regarding arbitrary censorship in social media. According to the German Federal Tribunal, online platforms cannot arbitrarily block content or remove social media accounts. It confirmed that users should be entitled to receive a justification of a platform’s content moderation decisions and should have an effective right to appeal. Moreover, if a platform intends to remove an entire account or page, in principle it should hear the user’s counterarguments before the removal. Meta requested that the Polish court refuse to accept the amicus curiae brief. More about the brief (in Polish).
    SIN is to cover the cost of the English translation of the lawsuit and the interim measures ruling (8,841.56 PLN, approx. 2,000 EUR) after the court dismissed SIN’s appeal in this regard. The court’s decision on this point is final. However, if Facebook (now Meta) eventually loses the case, it will be obliged to reimburse the fee. Meanwhile, we paid the fee thanks to individual donations we received for this purpose in 2020 (big thanks to everyone who contributed!).
    The Appellate Division of the District Court in Warsaw dismissed Facebook’s appeal, in which the company had questioned the interim measures ruling favourable to SIN delivered in June 2019. The decision is final and binding for Facebook! It means that for the duration of the proceedings:
    1) Facebook is prohibited from removing current fan pages, profiles, and groups run by SIN on Facebook and Instagram, as well as from blocking individual posts;
    2) SIN’s profiles, fan pages, and groups removed in 2018 and 2019 are to be securely backed up, so that – if SIN eventually wins the case – they can be restored together with the entire published content, comments by other users, as well as followers and people who liked the fan pages;
    3) the Court has also confirmed that it has jurisdiction to hear the case and that Polish law applies to it.
    Read more about the Court’s decision.
    SIN filed a response to Facebook’s appeal against the interim measures ruling. SIN addressed Facebook’s arguments and asked the court to dismiss the company’s appeal and uphold the first instance decision from 2019.
    Facebook filed a response to SIN’s lawsuit and an appeal against the interim measures ruling from 2019.
    Facebook questioned both the infringement of SIN’s personal rights and the jurisdiction of the Polish court. The company asked the appellate court to dismiss SIN’s lawsuit and to overturn the interim measures ruling.
    In less than 5 days we raised the amount needed to cover the costs of translating the lawsuit and the interim measures ruling, as ordered by the court: 9,305.80 PLN (the exact cost of the translation is 8,841.56 PLN). A big thank you to everyone who contributed to our campaign!
    Still, we believe that in a dispute with global corporations such as Facebook everyone should have the right to argue their case in their own language. Thus, we lodged an appeal against the court decision to translate the documents. If our appeal is accepted, we will use the raised funds to cover further case-related costs.
    According to the decision issued by the court reviewing our case, SIN must pay 8,841.56 PLN (approx. 2,000 EUR) for the English translation of the lawsuit and the interim measures ruling. The court also decided to deliver the translated documents to Facebook via the court in Ireland.
    Despite the fact that the court has not yet settled the dispute regarding the language of the proceedings, the lawsuit (including the interim measures ruling) was sent to a translator at the beginning of January. The work on translating the 28-page document has been going on for over two months, which is why SIN asked the court to urge the translator to speed up the work. The lack of progress in the case so far is a success for Facebook’s strategy, which aims to prolong the proceedings with procedural questions. Unfortunately, given the current epidemiological situation, this strategy has a good chance of succeeding.
    Facebook’s attorney argued that Facebook was entitled to refuse to accept the documents as “none of the Facebook Ireland employees on the litigation team are Polish speakers”. The court summoned SIN to make an advance payment for the translation of the court documents. On 6 August 2019 SIN sent the court its position as claimant, pointing out that Facebook has directed its services to Polish users in Polish, which proves that it can protect its rights without an English translation of the statement of claim. SIN requested the court to determine whether Facebook justifiably refused to accept the court documents. More information (in Polish) about the language of the case can be found here.
    The District Court in Warsaw, in its interim measures ruling delivered on 11 June 2019, temporarily (for the duration of the proceedings) prohibited Facebook from removing fan pages, profiles and groups run by SIN on Facebook and Instagram, as well as from blocking individual posts. The court furthermore obliged Facebook to store the profiles, fan pages and groups deleted in 2018 and 2019 so that – if SIN eventually wins the case – they can be restored together with the entire published content, comments by other users, as well as followers and people who liked the fan pages. The court also confirmed its jurisdiction and that Polish law applies to this case. The ruling is not final – Facebook has the right to appeal it. More information about the ruling can be found here.
    On 7 May 2019 SIN, supported by the Panoptykon Foundation, filed a lawsuit against Facebook, demanding restoration of access to the removed pages and accounts as well as a public apology.

    GET INVOLVED

    Join the #blocked campaign