Fundamental liberties at stake in copyright plan

By Ruth Coustick-Deal

Fundamental rights are under attack in EU plans for content filtering: freedom of expression, rule of law, and right to privacy.

Thousands of you have joined us in fighting for positive change in the EU copyright reform process, and thanks to you there has been some real movement against the link tax, with two major reports proposing ditching the plan altogether. We’re so excited to finally see some nails in the coffin of this terrible idea. But unfortunately the censorship machines idea (Article 13) still lingers.

What are we talking about here?

This part of the overall proposal would force websites to build tools that automatically stop content from being uploaded if it has been “identified by rightsholders”. This way, supporters of the proposal argue, copyright infringing material will never even appear on the web, and therefore won’t need to be taken down after the fact.

We just signed onto an open letter with 30 other groups against these censorship machine plans, which explains it like this:

“In effect, the proposed upload filter obligation will build a system where citizens will face Internet platforms blocking the upload of their content, even if it is a perfectly legal use of copyrighted content.”

Time for justice

When you take a look at the proposal from a civil liberties angle, the list of problems is just immense. It’s sad that MEPs are elected by citizens to do what’s best for their constituents, yet user rights are treated as some kind of wild idea, and only one small group of politicians is assigned to consider our rights.

However, the European Parliament Committee on Civil Liberties, Justice and Home Affairs is about to give its views. And boy, do we need them here.

There are three fundamental rights under attack in this filtering proposal: freedom of expression, the rule of law, and the right to privacy. So let’s break them down.

Free expression

Using automated tools to block content from being uploaded will build an infrastructure of censorship. The very concept is a serious attack on the fundamental right to free expression.

Furthermore, these proposals are not even concerned with whether the content involved is legal or not. The wording refers to blocking speech that has been flagged by rightsholders. It creates an expectation that perfectly legal and acceptable speech will be blocked.

The only system like this that exists so far is YouTube’s ContentID program, which cost Google $60 million to build and still often produces false positives and unwarranted censorship. The law sets no threshold for how much money a platform must be making before these monitoring and censorship obligations apply, so small innovative companies and nonprofits are caught too, which raises the question: how can anyone make a tool that won’t censor?

There are actually plenty of legal reasons to use an excerpt of a text that has been copyrighted.

It is incredibly unlikely that, in enforcing these proposals, companies will be able to build something capable of taking account of the different copyright permissions in different countries. Our friends at Communia did extensive research on what is permitted in different EU nations, compiled in this interactive map, and discovered that there are thousands of combinations of copyright rules across the EU. For example, you can make a parody in Spain, but not in Portugal; you can take photos of sculptures in public places in Germany, but not in France; you can quote for criticism everywhere except Slovenia. There is no evidence that automated tools are capable of making these kinds of sophisticated judgements, so they will harm freedom of expression.

Right to privacy

In order to identify which content to filter and block, internet hosting services will be obliged to monitor all uploads. Spying for the sake of copyright is another form of mass surveillance: big tech’s powers are turned against us, for the sake of handing power and control to the moneyed few.

In fact, these proposals should never have been published, given the existing EU case law which states that “monitoring and filtering content is a breach of freedom of expression and of privacy (Scarlet/Sabam ruling and Sabam vs. Netlog).”  (EDRi guide to Copyright for the Perplexed.)

It’s reasonable to assume that once technology which analyses posts on the Internet is in place, function-creep steps in and governments can ask platforms to use the same ‘filtering’ technology to block dissident content, or whatever the latest rulers of X country deem ‘immoral’.

We already have the evidence from the UK, where an Internet content filtering scheme was proposed in order to block ‘adult content’ online. Once enacted, what was a doubtfully useful measure against online porn was quickly expanded into a filter that blocked alcohol-related sites (pub websites, Alcoholics Anonymous), self-harm sites (mental health support sites), and classed LGBT sites as automatically sexual - and blocked them too.

Rule of Law

The EU Commission states that “respect for the rule of law is a prerequisite for the protection of all fundamental values”. The rule of law is the set of basic principles of justice: that we are all innocent until proven guilty; that we have a right to a fair trial; and that we are all equal before the law.

The Article 13 proposals chip away at these principles. An automated system of online take-downs reverses that standard by sidestepping court proceedings and assuming you are guilty until proven innocent.

There’s no fair hearing in this process. Rather, it aims to avoid judges and the burden of proof altogether. This is all happening between corporations, over the heads of citizens and the courts.

There are also very important questions about the lack of due process and legal responsibility, which La Quadrature du Net raise:
“Who will be able to certify that the robots have the analytic finesse to distinguish between a work's illicit use and its parody? Who will be able to validate that there will be no abuse, no excess, no abusive interpretation of copyright?”

Whose rights matter?

The stated goal of this proposal is to protect creativity and artists online, and there are genuine, important concerns behind it about ensuring artists are legitimately rewarded for their work.

Unfortunately, despite the emphasis on protecting creators in this law-making process, it’s only a certain kind of sanitised creator who will be viewed as legitimate.

One MEP, Mary Honeyball, made this explicit in one debate when she said, “User Generated Content is great, but we need to ensure there is quality as well.”

‘User generated content’ is the term for content that was made by the ‘users’ of any given web platform. It could be podcasts, posts, blogs, vines, art, stories, videos.

Her comments show the success of lobbying pressure that argues some forms of creativity are worth more than others. It’s ageist - it is usually young people’s creativity that is deemed not “quality”. It’s biased towards corporations who can afford to enforce their rights. And it rests on a dated concept of what makes a real artist: you are only “quality” if part of your income is going to a third-party company.

As our friends at EDRi say: “With respect to the very principle of these tools, they flagrantly neglect the status of amateur creators, who can only be acknowledged and protected when registered with a rights management company responsible for supplying the fingerprints of works to "protect" on sharing platforms.”

For example, here’s a cosplay from my favourite new game, Overwatch. It’s so good it could be a still from the game.

A huge amount of work went into this picture - hours of costume making, a professional photographer, image editing. It’s a new use, a transformative work - one which game companies frequently share back to show love to the fans. But a pixel-detecting automated tool might flag it as copyright infringement and stop it from being uploaded.

A vibrant, beautiful culture of learning and creativity is treated as a fair price to pay for handing massive rightsholder corporations control of the system.

It’s not just individual uploads that would be blocked here: whole categories of cultural creation would be censored. This approach would dramatically shrink the playing field for new companies in the user-generated content space. As Joshua Lamel of Re:Create put it, “Remix culture and fan fiction would likely disappear from our creative discourse.”

The infringement of our rights is completely disproportionate to the intended goals of the copyright updating process - of making copyright fit for the 21st century.

If Europe moves forward with this law, we will be moving towards the weakest possible human rights protections: enforcement by corporations and the defence of the creative rights of only the largest rightsholders, all in the name of copyright.

