[Plura-list] How copyright filters lead to wage-theft

Cory Doctorow doctorow at craphound.com
Sat May 8 12:27:06 EDT 2021


I'm doing two live events next Wednesday, May 12:

* Interoperability and Alternative Social Media, a panel for the Knight
Center's Reimagine the Internet event

* Book launch for Aminder Dhaliwal's Cyclopedia Exotica


Today's links

* How copyright filters lead to wage-theft: A Red Queen's race

* This day in history: 2001, 2011, 2016, 2020

* Colophon: Recent publications, upcoming/recent appearances, current
writing projects, current reading


🧑🏻‍🎤 How copyright filters lead to wage-theft

Last week, "Marina" - a piano teacher who publishes free lessons on her
Piano Keys Youtube channel - celebrated her fifth anniversary by
announcing that she was quitting Youtube because her meager wages were
being stolen by fraudsters.


Marina posted a video with a snatch of her performance of Beethoven's
"Moonlight Sonata," composed in 1801. The composition is firmly in the
public domain, and the copyright in the performance is firmly Marina's,
but it still triggered Youtube's automated copyright filter.

A corporate entity - identified only by an alphabet soup of initialisms
and cryptic LLC names - had claimed Ole Ludwig Van's masterpiece as
their own, identifying it as "Wicca Moonlight."

Content ID, the automated Youtube filter, flagged Marina's track as an
unauthorized performance of this "Wicca Moonlight" track. Marina
appealed the automated judgement, which triggered a message to this
shadowy LLC asking if they agreed that no infringement had taken place.

But the LLC renewed its claim of infringement. Marina now faces several
unpleasant choices:

* She can allow the LLC to monetize her video, stealing the meager wages
she receives from the ads that appear on it

* She can take down her video

* She can provide her full name and address to Youtube in order to
escalate the claim, with the possibility that her attackers will get her
contact details, and with the risk that if she loses her claim, she
could lose her Youtube channel
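The claim-and-dispute flow described above can be sketched as a small state machine. This is a simplified illustrative model, not YouTube's actual implementation: the class, state names, and transitions are all hypothetical, and the real flowchart has many more branches.

```python
# Hypothetical model of the Content ID dispute flow described above.
# This is an illustration, not YouTube's real system: the real
# flowchart has many more states and branches.

class ContentIDClaim:
    def __init__(self, claimant, video_owner):
        self.claimant = claimant            # e.g. a shadowy LLC
        self.video_owner = video_owner      # e.g. "Marina"
        self.state = "claimed"              # the filter matched the upload
        self.ad_revenue_goes_to = claimant  # monetization is redirected at once

    def dispute(self):
        """Uploader appeals; the claimant is asked to review the claim."""
        assert self.state == "claimed"
        self.state = "disputed"

    def claimant_releases(self):
        """Claimant agrees no infringement took place; revenue is restored."""
        assert self.state == "disputed"
        self.state = "released"
        self.ad_revenue_goes_to = self.video_owner

    def claimant_reinstates(self):
        """Claimant renews the claim. The uploader's remaining options are
        to concede the revenue, take the video down, or escalate -
        revealing their name and address and risking their channel."""
        assert self.state == "disputed"
        self.state = "reinstated"


claim = ContentIDClaim(claimant="Wicca Moonlight LLC", video_owner="Marina")
claim.dispute()
claim.claimant_reinstates()
print(claim.state, claim.ad_revenue_goes_to)  # → reinstated Wicca Moonlight LLC
```

Note who bears the cost at each step: the claimant's assertion takes effect immediately, while the uploader can only react - the asymmetry the rest of this piece describes.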

The incident was a wake-up call for Marina, who is quitting Youtube
altogether, noting that it has become a place that favors grifters over
creators. She's not wrong, and it's worth looking at how that happened.

Content ID was created to mollify the entertainment industry after
Google acquired Youtube. Google would spend $100m on filtering tech that
would allow rightsholders to go beyond the simple "takedown" permitted
by law, and instead share in revenues from creative uses.

But it's easy to see how this system could be abused. What if people
falsely asserted copyright over works to which they had no claim? What
if rightsholders rejected fair uses, especially criticism?

In a world where the ownership of creative works can take years to
untangle in the courts and where judges' fair use rulings are impossible
to predict in advance, how could Google hope to get it right, especially
at the vast scale of Youtube?

The impossibility of automating copyright judgments didn't stop Google
from trying to perfect its filter, adding layers of complexity until
Content ID's appeal process turned into a cod-legal system whose
flowchart looks like a bowl of spaghetti.


The resulting mess firmly favors attackers (wage stealers, fraudsters,
censors, bullies) over defenders (creators, critics). Attackers don't
need to waste their time making art, which leaves them with the surplus
capacity to master the counterintuitive "legal" framework.

You can't fix a system broken by complexity by adding more complexity to
it. Attempts to do so only make the system more exploitable by bad
actors, like blackmailers who use fake copyright claims to extract
ransoms from working creators.


But it would be a mistake to think that filterfraud was primarily a
problem of shadowy scammers. The most prolific filter scammers and
wage-thieves are giant music companies, like Sony Music, who claim
nearly *all* classical music:


The Big Tech companies argue that they have an appeals process that can
reverse these overclaims, but that process is a joke. Instagram
takedowns take a few seconds to file, but *28 months* to appeal.


The entertainment industry are flagrant filternet abusers. Take Warner
Chappell, whose subsidiary demonetizes videos that include the numbers
"36" and "50":


Warner Chappell are prolific copyfraudsters. For decades, they
fraudulently claimed ownership over "Happy Birthday" (!):


They're still at it - in 2020 they used a fraudulent claim to nuke a
music theory video, and then a human being working on behalf of the
company renewed the claim *after* being informed that they were mistaken
about which song was quoted in the video:


The fact that automated copyright claims can remove material from the
internet leads to a lot of sheer fuckery. In 2019, anti-fascists toyed
with blaring copyrighted music at far-right rallies to prevent footage
of those rallies from being posted online.


At the time, I warned that this would end badly. Just a month before,
there had been a huge scandal because critics of extremist violence
found that automated filters killed their videos because they featured
clips of that violence:


Since then, it's only gotten worse. The Chinese Communist Party uses
copyfraud to remove critical videos from Youtube:


and so does the Beverly Hills Police Department:


But despite all that, the momentum is for *more* filtering, to remove
far fuzzier categories of content. The EU's Terror Regulation has just
gone into effect, giving platforms just *one hour* to remove "terrorist"
content.

The platforms have pivoted from opposing filter rules to endorsing them.
Mark Zuckerberg says that he's fine with removing legal protections for
online platforms unless they have hundreds of millions of dollars to
install filters.


The advocates for a filternet insist that all these problems can be
solved if geeks just *nerd harder* to automate good judgment, fair
appeals, and accurate attributions. This is pure wishful thinking. As is
so often the case in tech policy, "wanting it badly is not enough."

In 2019, the EU passed the Copyright Directive, whose Article 17 is a
"notice and staydown" rule requiring platforms to do instant takedowns
on notice of infringement *and* to prevent content from being re-posted.

There's no way to do this without filters, but there's no way to make
filters without violating the GDPR. The EU is trying to figure out how
to make it work, and the people who said this wouldn't require filters
are now claiming that filters are fine.
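A "staydown" obligation implies matching every future upload against a blocklist of previously noticed works. Here is a deliberately naive sketch of that idea - the function names are invented for illustration, and real filters use fuzzy perceptual fingerprints rather than the exact hash used here:

```python
import hashlib

# Naive sketch of "notice and staydown": once a work is noticed,
# every future upload is checked against a blocklist of fingerprints.
# All names here are illustrative. Real systems use fuzzy perceptual
# fingerprints; an exact hash like this is trivially evaded by any
# tiny change, yet still blocks every exact match forever - including
# quotation, criticism, and public-domain performances.

noticed_fingerprints = set()

def fingerprint(content: bytes) -> str:
    """Stand-in for a content fingerprint (here, just an exact hash)."""
    return hashlib.sha256(content).hexdigest()

def takedown_notice(content: bytes):
    """Rightsholder files a notice; the work must now 'stay down'."""
    noticed_fingerprints.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Check every upload against all previously noticed works.
    Nothing here can tell a fair use from an infringement."""
    return fingerprint(content) not in noticed_fingerprints

takedown_notice(b"moonlight sonata recording")
print(allow_upload(b"moonlight sonata recording"))  # False: blocked forever
print(allow_upload(b"a slightly altered copy"))     # True: exact match evaded
```

The dilemma is baked into the design: make the matching fuzzier and you over-block lawful uses; keep it strict and infringers evade it with trivial changes. No threshold encodes fair use.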


Automating subtle judgment calls is impossible, not just because
copyright's limitations - fair use and others - are grounded in
subjective factors like "artistic intent," but because automating a
flawed process creates flaws at scale.

Remember when Jimmy Fallon broadcast himself playing a video game? NBC
automatically claimed the whole program as its copyrighted work, and
thereafter, gamers who streamed themselves playing that game got
automated takedowns from NBC.


The relentless expansion of proprietary rights over our virtual and
physical world raises the stakes for filter errors. The new Notre Dame
spire will be a copyrighted work - will filters block videos of protests
in front of the cathedral?


And ever since the US's 1976 Copyright Act abolished the registration
requirement, it's gotten harder to figure out who controls the rights to
any work, so that even "royalty free" music offered for Youtubers to use
safely has turned out to be copyrighted:


We need a new deal for content removal, one that favors working creators
over wage-thieves who have the time and energy to master the crufty,
complex private legal systems each platform grows for itself.


Back in 2019, Slate Future Tense commissioned me to write an sf story
about how this stuff might work out in the coming years. The result,
"Affordances," is sadly still relevant today:


Here's a podcast of the story as well:


Meanwhile, governments from Australia to the UK to Canada are adopting
"Harmful Content" rules that are poised to vastly expand the filternet,
insisting that it's better than the alternative.



🧑🏻‍🎤 This day in history

#20yrsago Denmark plans to legalize music trading

#10yrsago NRA and Florida gag pediatricians: no more firearm safety
advice for parents

#5yrsago Conservative economics: what’s happened to the UK economy after
a year of Tory rule

#1yrago Volcano gods demand workers


🧑🏻‍🎤 Colophon

Today's top sources: Webshit Weekly (http://n-gate.com/hackernews/).

Currently writing:

* A Little Brother short story about pipeline protests.  RESEARCH PHASE

* A short story about consumer data co-ops.  PLANNING

* A Little Brother short story about remote invigilation.  PLANNING

* A nonfiction book about excessive buyer-power in the arts, co-written
with Rebecca Giblin, "The Shakedown."  FINAL EDITS

* A post-GND utopian novel, "The Lost Cause."  FINISHED

* A cyberpunk noir thriller novel, "Red Team Blues."  FINISHED

Currently reading: Analogia by George Dyson.

Latest podcast: How To Destroy Surveillance Capitalism (Part 05)

Upcoming appearances:

* Interoperability and Alternative Social Media, Reimagine the Internet,
May 12, https://knightcolumbia.org/events/reimagine-the-internet

* Book launch for Aminder Dhaliwal's Cyclopedia Exotica (Indigo), May
12, https://www.crowdcast.io/e/udbva8py/register

* Seize the Means of Computation, Ryerson Centre for Free Expression,
May 19,

Recent appearances:

* In conversation with John Scalzi at the Gaithersburg Book Festival

* Hexapodia XIII with J Bradford De Long and Noah Smith

* Podcapitalism Podcast

Latest book:

* "Attack Surface": The third Little Brother novel, a standalone
technothriller for adults. The *Washington Post* called it "a political
cyberthriller, vigorous, bold and savvy about the limits of revolution
and resistance." Order signed, personalized copies from Dark Delicacies

* "How to Destroy Surveillance Capitalism": an anti-monopoly pamphlet
analyzing the true harms of surveillance capitalism and proposing a
solution.

* "Little Brother/Homeland": A reissue omnibus edition with a new
introduction by Edward Snowden:
https://us.macmillan.com/books/9781250774583; personalized/signed copies

* "Poesy the Monster Slayer" a picture book about monsters, bedtime,
gender, and kicking ass. Order here:
https://us.macmillan.com/books/9781626723627. Get a personalized, signed
copy here:

Upcoming books:

* The Shakedown, with Rebecca Giblin, nonfiction/business/politics,
Beacon Press 2022

This work licensed under a Creative Commons Attribution 4.0 license.
That means you can use it any way you like, including commercially,
provided that you attribute it to me, Cory Doctorow, and include a link
to pluralistic.net.


Quotations and images are not included in this license; they are
included either under a limitation or exception to copyright, or on the
basis of a separate license. Please exercise caution.


🧑🏻‍🎤 How to get Pluralistic:

Blog (no ads, tracking, or data-collection):


Newsletter (no ads, tracking, or data-collection):


Mastodon (no ads, tracking, or data-collection):


Medium (no ads, paywalled):


Twitter (mass-scale, unrestricted, third-party surveillance and
advertising):


Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):


"*When life gives you SARS, you make sarsaparilla*" -Joey "Accordion
Guy" DeVilla
