Background of Notice and Takedown
Since the birth of the internet, online service providers (OSPs) have butted heads with copyright holders over whether OSPs should be responsible for copyright-infringing material posted by their users. Should Google be liable for infringement when it provides links to websites that post photographs without a copyright license? Should YouTube owe damages for hosting a video that plays a song or shows a clip from a movie protected by copyright? In an effort to balance the interests of the copyright industry, OSPs, and the internet-using public, Congress passed the Digital Millennium Copyright Act (DMCA) in 1998. The DMCA grants several forms of “safe harbor” protection to OSPs, allowing them to avoid secondary liability for copyright infringement committed by users of their services. The DMCA also creates several streamlined forms of protection for copyright holders. Namely, Section 512 of the DMCA establishes a “notice and takedown” process. Under this process, a copyright holder can send a notice directly to an OSP, without obtaining a court order, demanding that the OSP remove or disable access to infringing material on a website in its network (e.g., asking YouTube to take down an infringing video); an OSP that complies expeditiously preserves its safe harbor. While the takedown process was intended to balance interests, modern takedown practices have led to widespread abuse that has given rightsholders unintended control over web content.
Modern Notice and Takedown: Automated Notices
Copyright law today deals with new technologies of a scale and type that could not have been anticipated by Congress when it passed the DMCA two decades ago. When the DMCA was enacted, copyright law did not have to contend with internet behemoths like Amazon and Google. These OSPs are too vast for a copyright holder to comb through using conventional methods. For example, four hundred hours of content are uploaded to YouTube every minute, meaning that manual review alone cannot police YouTube for infringing videos. Instead, the copyright industry has turned to new methods for identifying infringing material, such as “automated notice-sending systems.” These systems rely on “crawler” programs that autonomously search the internet (whether looking through YouTube videos or Google search results) for content that infringes a copyright. After identifying apparently infringing material, the system sends the OSP a takedown request demanding that it remove the material; an OSP that ignores the request risks losing its safe harbor and facing infringement litigation.
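The workflow described above can be illustrated with a minimal, hypothetical sketch (not any vendor’s actual system): crawled content is reduced to fingerprints, compared against a rightsholder’s catalog, and a takedown request is queued for each match. The catalog entries, URLs, and fingerprint strings here are invented for illustration; a real system would use audio or video hashing.

```python
# Hypothetical sketch of an automated notice-sending pipeline.
# Fingerprints are plain strings for illustration only; real systems
# derive them from audio/video content recognition.
from dataclasses import dataclass

@dataclass
class TakedownRequest:
    url: str          # location of the allegedly infringing material
    work_title: str   # the copyrighted work it allegedly matches

# Assumed rightsholder catalog: fingerprint -> copyrighted work.
CATALOG = {"a1b2": "Song A", "c3d4": "Film B"}

def scan(crawled_pages):
    """Yield a TakedownRequest for every crawled item whose
    fingerprint appears in the rightsholder's catalog."""
    for url, fingerprint in crawled_pages:
        if fingerprint in CATALOG:
            yield TakedownRequest(url, CATALOG[fingerprint])

requests = list(scan([("https://example.com/v/1", "a1b2"),
                      ("https://example.com/v/2", "ffff")]))
# Only the first page matches the catalog, so one request is queued.
```

Because the pipeline fires on every catalog match with no human in the loop, its output volume scales with the crawl, which is precisely the dynamic that overwhelms OSP review, as discussed below.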
Automated Notice Systems and Potential Abuse
Over the past five years, automated notice systems have become exceptionally popular in the copyright industry. In 2014 alone, automated notice systems generated 345 million takedown requests to Google. As a result, many OSPs are unable to vet all of the requests they receive. Instead, OSPs have developed a common practice of simply removing all material targeted by a takedown request, whether or not the material actually infringed, constituted fair use, or qualified for some other protection. Other OSPs have gone beyond statutory requirements by deploying content filtering systems, such as YouTube’s Content ID and Audible Magic, which screen content at upload and block matches without employing takedown notices at all.
One major concern about automated notice systems is that they often mistake non-infringing content for infringing content, yet still serve a takedown request on the OSP. Section 512 requires that a takedown notice include several elements, among them a statement of the copyright holder’s “good faith belief” that the specified material infringes its copyright and a statement that the information in the notice is accurate. Despite this requirement, automated notice systems often send takedown requests without sufficient documentation, and the requests are rarely reviewed by human eyes before they are sent to an OSP. Yet out of fear of liability, many OSPs comply with these requests and almost never make use of statutorily available remedies. Consequently, the public suffers by being stripped of access to many online materials, while OSPs may suffer a loss of web traffic and subscribers. This abuse of the DMCA notice and takedown system shifts control over online content into the hands of copyright holders.
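The statutory baseline is concrete: 17 U.S.C. § 512(c)(3)(A)(i)-(vi) lists six elements a notice must substantially include. A simple sketch can check a notice against that list; the field names below are this sketch’s own invention (the statute prescribes content, not a data format).

```python
# Illustrative check of the six notice elements required by
# 17 U.S.C. § 512(c)(3)(A)(i)-(vi). Field names are hypothetical;
# the statute specifies what a notice must contain, not how it is encoded.
REQUIRED_ELEMENTS = {
    "signature",             # (i) physical or electronic signature
    "work_identified",       # (ii) identification of the copyrighted work
    "material_identified",   # (iii) identification of the infringing material
    "contact_information",   # (iv) complainant's contact information
    "good_faith_statement",  # (v) statement of good-faith belief
    "accuracy_statement",    # (vi) statement of accuracy, under penalty of perjury
}

def missing_elements(notice: dict) -> set:
    """Return the statutory elements absent from a takedown notice."""
    return {e for e in REQUIRED_ELEMENTS if not notice.get(e)}

# An automated notice that omits the good-faith and accuracy statements,
# the kind of insufficiency described above, fails this check.
incomplete = {"signature": "agent", "work_identified": "Song A",
              "material_identified": "https://example.com/v/1",
              "contact_information": "agent@label.example"}
```

As the sketch suggests, verifying the formal elements is mechanical; what automation cannot supply is a considered good-faith judgment about infringement or fair use behind element (v).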
Accountability for Notice and Takedown Abuse
Arguably, automated takedown systems are not the problem in themselves. From the copyright holder’s perspective, they may be the only feasible tool for fighting online piracy. Rather, automated takedown systems create problems because: (1) they produce a constant stream of broad requests, generating too many for the OSP to actually review, and (2) OSPs and internet users rarely take advantage of statutory remedies for pushing back against takedown requests. One solution would be to empower OSPs to push back more aggressively against erroneous takedown requests, holding copyright holders accountable in extreme cases so that they improve their systems while continuing to use them generally.
In their paper, Urban, Karaganis, and Schofield similarly suggest that stronger liability for reckless or malicious notice use could deter some of the abuse behind widespread takedown tactics. The DMCA already gives OSPs a remedy against misrepresentation, creating liability for any damages caused by a person who knowingly and materially misrepresents information in a takedown request. Increasing statutory damages for these kinds of misrepresentations could help deter notice abuse. The authors further suggest that reforming the DMCA to require higher-quality notice claims would also incentivize rightsholders to improve their automated notice systems.
In their current state, automated takedown notices are a flawed method for handling copyright infringement, one that disrupts the balance between OSPs and the copyright industry. However, with some amendments to the DMCA, automated notices could play an important, responsible role.
 See Perfect 10, Inc. v. Amazon.com, Inc., 508 F.3d 1146 (9th Cir. 2007).
 See Viacom International, Inc. v. YouTube, Inc., 676 F.3d 19 (2d Cir. 2012).
 17 U.S.C. § 512(c)(3)(A)(i)-(vi).
 Zoe Carpou, Robots, Pirates, and the Rise of the Automated Takedown Regime, 39 Colum. J.L. & Arts 551, 564-66 (2016).
 Id. at 583-84.
 17 U.S.C. § 512(f).