The story so far:
In the ‘90s the Internet was created.
This has made a lot of people very angry and been widely regarded as a bad move.
- Analysis: “Interactive computer services” (ICSes) are already defined in Section 230, and the bill does not change that. The definition does not include devices, but does apparently cover messaging services, though there are few court cases on that point.
- Example: the bill would cover WhatsApp, Facebook Messenger, and Twitter DMs, but not iPhones.
- Analysis: ICS providers also include email providers, cloud storage providers, etc. – but this bill’s effect would mostly be to make them maintain the status quo, since email and cloud storage accounts already are typically encrypted in such a way that they’re still accessible to law enforcement.
- Section 230 immunity for CSAM can be earned via one of two “safe harbors”:
1: Compliance with “recommended” “best practices” for the prevention of online child exploitation conduct, to be determined by a newly created commission.
Analysis: Encryption, particularly end-to-end encryption, is likely to be targeted as being contrary to “best practices” for preventing CSAM, because if a provider cannot “see” the contents of files on its service due to encryption, it is harder to detect CSAM files.
- The commission would include at least 4 law enforcement reps, 4 tech industry reps, 2 reps of child safety organizations, and 2 computer scientists / software engineering experts.
Analysis: No representative is required to speak for users or civil society.
- The Commission “shall consider” users’ interests in privacy, data security, and product quality.
Analysis: This is very weak language; it means the commission can “consider” these interests for a few seconds, chuckle to themselves, and then move on.
- The Commission recommends best practices to the Attorney General, who has the power to unilaterally change them before they’re finalized, as long as he writes up some reason for the changes.
Analysis: This means the AG could single-handedly rewrite the “best practices” to state that any provider that offers end-to-end encryption is categorically excluded from taking advantage of this safe-harbor option. Or he could simply refuse to certify a set of best practices that aren’t sufficiently condemnatory of encryption. If the AG doesn’t finalize a set of best practices, then this entire safe-harbor option just vanishes.
- A “best practice” requires the approval of only a subset of the commission members in order to be recommended on to the AG.
Analysis: This means that the commission could totally ignore both of the computer scientists, or both of the child safety org reps, or all 4 tech industry reps, so long as it can hit the required vote threshold.
- An officer of the provider must certify compliance with the best practices; “knowing” false statements are a federal felony, carrying a fine and a 2-year prison term.
Analysis: The language of the certification requirement doesn’t sound optional; it sounds like officers are compelled to certify, whether it’s true or not.
- 2: Implementing other “reasonable measures” instead of the best practices. Unlike certifying compliance with the prescribed best practices, which guarantees Section 230 immunity, taking the “reasonable measures” option is not a guaranteed way of “earning” immunity.
- Analysis: It’s not exactly a real “safe harbor” if the provider still has to litigate the immunity question. Providers that can’t / won’t / don’t certify adherence to the “best practices” will have to take their chances on whether their chosen measures will be deemed “reasonable” by a court.
- Analysis: Would a court find end-to-end encryption to be “reasonable,” when the goal is not data security, but instead, combating CSAM? Providers would struggle to reconcile their duty to provide “reasonable” data security, as imposed by the FTC and dozens of state data-security laws, with a conflicting duty not to encrypt information because it’s “unreasonable” under the EARN IT Act.
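Several of the points above turn on a provider’s ability to “see” content. The following is a minimal, hypothetical Python sketch (not drawn from the bill or from any provider’s real system) of why end-to-end encryption frustrates server-side detection: scanning typically matches uploads against a list of known-file hashes, and that matching only works on plaintext. SHA-256 here is a simplified stand-in for perceptual hashes such as PhotoDNA, and XOR with a one-time random key is a toy stand-in for real end-to-end encryption.

```python
import hashlib
import os

# Hypothetical blocklist of known-file hashes (SHA-256 as a stand-in for
# perceptual hashing; the filename-like bytes are placeholder content).
KNOWN_BAD_HASHES = {hashlib.sha256(b"known-bad-file").hexdigest()}

def server_scan(uploaded_bytes: bytes) -> bool:
    """Return True if the upload matches the known-hash blocklist."""
    return hashlib.sha256(uploaded_bytes).hexdigest() in KNOWN_BAD_HASHES

def toy_client_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR with a random one-time key. Real E2EE (e.g. the
    Signal protocol) is far more elaborate, but the effect on the server
    is the same: it only ever receives ciphertext."""
    return bytes(p ^ k for p, k in zip(plaintext, key))

upload = b"known-bad-file"

# Unencrypted upload: the server sees plaintext and the hash match fires.
assert server_scan(upload) is True

# End-to-end encrypted upload: the server sees only ciphertext, so the
# very same file no longer matches the blocklist.
key = os.urandom(len(upload))
assert server_scan(toy_client_encrypt(upload, key)) is False
```

That is the whole tension in one place: the same scanning code, given ciphertext instead of plaintext, detects nothing.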
The EARN IT Act Has a Number of Serious Problems

This bill has a number of extremely serious problems, too many to fit into one blog post. It is potentially unconstitutional under the First, Fourth, and Fifth Amendments, for one thing. I have no hope of even enumerating all of the bill’s deficiencies. Instead, for now, let me list just a few:
- Thanks to Section 230(e)(1), federal prosecutors can already hold providers accountable for CSAM on their services (even if state prosecutors and civil plaintiffs can’t). Piercing Section 230 immunity is not necessary if the idea is to penalize providers for their role in CSAM online, because the DOJ already has that power. If providers such as Facebook or Dropbox are breaking federal CSAM law, why isn’t DOJ prosecuting them?
- If those providers are complying with their duties under federal CSAM reporting law (Section 2258A), but DOJ and Congress still think they aren’t doing enough and should do even more than the law requires, why isn’t the answer to simply amend Section 2258A to add additional duties? Why bring Section 230 into it?
- The bill would, in effect, allow unaccountable commissioners to set best practices making it illegal for online service providers (for chat, email, cloud storage, etc.) to provide end-to-end encryption – something it is currently 100% legal for them to do under existing federal law, specifically CALEA. That is, the bill would make providers liable under one law for exercising their legal rights under a different law. Why isn’t this conflict with CALEA acknowledged anywhere in the bill? (We saw the exact same problem with the ill-fated Burr / Feinstein attempt to indirectly ban smartphone encryption.)
- The threat of losing Section 230 immunity will be scary to major tech companies such as Facebook that try in good faith to abide by federal CSAM law. But that threat will have no effect on the bad actors in the CSAM ecosystem: dark web sites devoted to CSAM, which already don’t usually qualify for Section 230 immunity because they have a direct hand in the illegal content on their sites.
- If the good-faith platforms implement the new “best practices” to detect CSAM for fear of losing Section 230 immunity, but the bad-actor CSAM sites don’t, then CSAM traders will leave the good-faith platforms for the bad ones, where they’ll be harder to track down.
- The CSAM traders who do stay on the good-faith platforms (say, Facebook) will still be able to encrypt CSAM before sending it through, say, Facebook Messenger, even if Facebook Messenger itself were to no longer have any end-to-end encryption functionality. Even if the EARN IT Act bans providers from offering end-to-end encryption, that won’t keep CSAM offenders from cloaking their activities with encryption. It will just move the place where the encryption occurs to a different point in the process. File encryption technology is out there, and it’s been used by CSAM offenders for decades; the EARN IT Act can’t change that.
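The point that encryption simply moves to a different point in the process can be sketched in a few lines of Python. This is a hypothetical illustration, not real cryptographic tooling: a password-derived keystream (PBKDF2 over SHA-256, with XOR as a toy cipher) stands in for off-the-shelf tools like GPG or encrypted archives. Even if the messaging platform itself offers no encryption at all, the file it relays is already opaque by the time it arrives.

```python
import hashlib
import os

def derive_keystream(password: str, salt: bytes, length: int) -> bytes:
    # PBKDF2 stretches the password into a keystream as long as the file.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                               100_000, dklen=length)

def encrypt_file_bytes(data: bytes, password: str) -> bytes:
    """Encrypt locally, before the file ever touches any platform."""
    salt = os.urandom(16)
    stream = derive_keystream(password, salt, len(data))
    return salt + bytes(d ^ s for d, s in zip(data, stream))

def decrypt_file_bytes(blob: bytes, password: str) -> bytes:
    """The recipient decrypts offline, using a password shared out of band."""
    salt, ciphertext = blob[:16], blob[16:]
    stream = derive_keystream(password, salt, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

original = b"attachment contents"
sent_over_platform = encrypt_file_bytes(original, "shared secret")

# The platform relays only this opaque blob; it has nothing to scan.
assert sent_over_platform[16:] != original
assert decrypt_file_bytes(sent_over_platform, "shared secret") == original
```

Everything here is in the standard library of a language installed on millions of machines, which is the sense in which the genie is already out of the bottle.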
Let me say a little more about these problems. But first, I want to point out that if you believe this bill is about finally holding Big Tech accountable after it got too big for its britches under a permissive Section 230 regime, you are being had.
The EARN IT Act Is A Bait-And-Switch

While the EARN IT Act is ostensibly aimed at Section 230, it’s actually a sneaky way of affecting CALEA without directly amending it. Remember the two carve-outs in CALEA that I discussed above, for encryption and information services? Both DOJ and the Federal Bureau of Investigation (FBI) have been trying for at least a decade to close them. But Congress has shown no appetite for that. As said, CALEA has never once been amended in the quarter-century since it was passed. And even with the techlash in full swell, there isn’t a furious public frenzy over CALEA. Politicians know that many Americans are fed up with tech companies “hiding behind” Section 230 of the CDA. But nobody is saying, “I’m fed up with tech companies hiding behind CALEA!”

So, how can law enforcement achieve its long-desired CALEA goal? By pushing a bill that talks about Section 230 instead. People are angry about Section 230, so the DOJ is seizing upon that anger as its opening to attack encryption. I’ve been saying for some time that federal law enforcement agencies would take advantage of anti-Big Tech sentiment to get their way on encryption. Now the techlash is strong enough that they’re finally making their move. The bill is ostensibly taking a shot at Section 230, but that shot will ultimately land on CALEA.

Remember, CALEA makes it perfectly legal for providers of “information services” (such as Apple and Facebook) to design encryption that is not law enforcement-friendly. And even telco carriers can encrypt calls and throw away the decryption key. End-to-end encryption is legal under current federal law. Yet the EARN IT Act would allow an unelected, unaccountable commission to write “best practices” (not actual laws or regulations, yet liability would result from failing to abide by them) which, make no mistake, will condemn end-to-end encryption.
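“Throw away the decryption key” can be made concrete with a hypothetical sketch. None of this reflects any carrier’s actual system, and XOR with an ephemeral random key is a toy stand-in for a real stream cipher; the point is only that once the per-call key is discarded, the carrier itself has nothing useful to hand over.

```python
import os

class Carrier:
    """Toy carrier that encrypts each call with an ephemeral key,
    relays the ciphertext, and never retains the key."""

    def __init__(self):
        self.stored_ciphertexts = []

    def relay_call(self, audio: bytes) -> None:
        key = os.urandom(len(audio))          # ephemeral per-call key
        ciphertext = bytes(a ^ k for a, k in zip(audio, key))
        self.stored_ciphertexts.append(ciphertext)
        del key                               # key discarded, not stored

    def comply_with_decryption_demand(self) -> list:
        # Without the keys, ciphertext is all the carrier can produce.
        return self.stored_ciphertexts

carrier = Carrier()
carrier.relay_call(b"hello, this is a private call")
recovered = carrier.comply_with_decryption_demand()
assert recovered[0] != b"hello, this is a private call"
```

Under CALEA’s encryption carve-out, designing a system this way is lawful today; the EARN IT Act’s “best practices” are the mechanism by which that choice would become legally radioactive.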
The commission, after all, would be acting in the shadow of an Attorney General who despises encryption. For Barr, encryption can only be a “worst practice.” By engaging in that “worst practice,” companies would risk facing the potentially ruinous tide of litigation that Section 230 presently bars. That is: the EARN IT Act would use one law – a narrowed Section 230 – to penalize providers for exercising their rights under a different law – CALEA. Providers would be held legally liable for doing exactly what federal law permits them to do.

Meanwhile, providers are also evidently doing exactly what federal CSAM law compels them to do (report CSAM). As said, nobody’s claiming providers are violating their reporting duties under Section 2258A. They’re doing as the law requires. If they weren’t, they could already be held liable (under the existing law), and Section 230 wouldn’t save them. And they’re not violating those reporting duties by providing end-to-end encryption (as permitted by CALEA). As said, under Section 2258A, providers need only report CSAM they know about, and are under no duty to affirmatively look for CSAM by monitoring and filtering content. Nor are they under any duty to be able to “see” every piece of content on their service. If a provider doesn’t know about a piece of CSAM because it can’t “see” it due to end-to-end encryption, that does not run afoul of the CSAM reporting duties.

If Congress wants to cut off tech companies’ and telco carriers’ freedom to provide end-to-end encryption under CALEA’s two carve-outs, then Congress should write a bill that amends CALEA! Then we can have that conversation. If Congress wants providers to do more to fight CSAM, including something crazy like universal monitoring and filtering obligations, then Congress should write a bill that amends Section 2258A! Then we can have that conversation. But the EARN IT Act does neither of those things. The EARN IT Act is supposedly about CSAM, and it’s not-so-secretly about encryption. Yet it doesn’t amend the laws that are actually relevant to those topics. Instead, it targets another law entirely, Section 230, largely because it’s politically expedient to do so right now. This bill is a cynical ploy to exploit current anti-Section 230 sentiment in order to achieve an unrelated anti-encryption goal (one which, by the way, would be disastrous for cybersecurity, privacy, the economy, national security, …). Congress should not kill the freedom to encrypt by taking advantage of Section 230’s current unpopularity to get away with a bait-and-switch.

Amending Section 230 to affect CALEA is not the only bait-and-switch, though. Look closer: Why does law enforcement hate end-to-end encryption? Because it renders private conversations invisible to law enforcement (and to the provider of the messaging service). Now: Why is everyone mad about Section 230? Not because of private conversations – because of what’s said in public, in full view of law enforcement and the provider and everyone else. When people think of Section 230, they’re probably thinking of the horrible things they’ve seen in public tweets, Facebook posts, and every comments section ever – the stuff that makes it no fun to be online anymore. They’re probably not thinking about private conversations over chat apps. The vast majority of court cases involving Section 230 (where someone tries to sue a provider, but the case gets dismissed as barred by the statute) are defamation cases involving public online speech. Almost nobody is suing messaging providers because of something someone said in a private conversation. Section 230 has very, very rarely been invoked in that context, and that’s not what motivates the law’s current unpopularity. Nevertheless, private conversations are the not-so-secret reason for the EARN IT Act proposal to amend Section 230. In short: This bill takes popular rage at social media companies’ immunity under Section 230 for public speech on their platforms, and twists it into a backhanded way of punishing messaging service providers’ use of encryption for private conversations. That’s the deeper bait-and-switch.
Yes, this is about CSAM, not defamation or other less-egregious offenses. But it’s really solely about CSAM that occurs on end-to-end encrypted private messaging services. End-to-end encrypted private messaging is the only context where there’s any real need to threaten to punish providers for not doing enough about CSAM, because it’s the only context where providers don’t (already) report CSAM very frequently (because they can’t see the contents of end-to-end encrypted files; they can, of course, report CSAM where it’s reported to them by one of the “ends” of the communication). That 90 million reports number in the New York Times story is testament to how often providers are already reporting – and it makes it clear that it’s really only the specter of providers moving to encrypt more communications end-to-end (specifically, on Facebook Messenger) that is motivating this bill.
If the goal is to get providers to do more about CSAM, then there is no need to threaten to strip Section 230 immunity for CSAM that’s posted in public contexts (such as a public tweet) or private messages that aren’t end-to-end encrypted (such as Twitter DMs), because providers already report that in accordance with federal law. And if they violated federal criminal law, then as said, Section 230 wouldn’t save them. The purported rationale of incentivizing additional action on CSAM doesn’t bear up to scrutiny, because providers are already doing everything they are supposed to do (and indeed more than that), wherever they have the ability to “see” content. Ultimately, the problem this bill is really trying to solve isn’t inadequate provider action on CSAM. The “problem” is end-to-end encryption. Put another way, the “problem” is that there isn’t currently a mandate on providers to ensure they can “see” every piece of content on their services, and to then affirmatively, proactively monitor and filter all of it. That is, the “problem” is that legislators haven’t yet tried to force providers to conduct total surveillance of every piece of information on their services. And rather than propose a law that explicitly demands total surveillance – something that would force elected legislators, in an election year, to be accountable to their constituents for proposing it – Senators Graham and Blumenthal are instead trying to duck accountability, hide behind Section 230’s unpopularity, and instead let an unelected, unaccountable baseball team’s worth of commission members (their thumbs-up being the necessary minimum, as said) write “best practices” (again, not actual laws or even rules subject to mandatory agency rulemaking processes) that can be unilaterally rewritten by Attorney General Barr however he pleases.
The EARN IT Act Won’t Stop CSAM Online

The really galling thing about this bill is that, like SESTA / FOSTA before it, it won’t work. All SESTA / FOSTA did was put sex workers in more danger by making it harder for them to screen clients and share information with one another. Similarly, the likely effect of the EARN IT Act would be to induce CSAM traders to make their actions harder to detect. For one, they could still encrypt their illegal files anyway, regardless of any “best practices” implemented by providers. For another, they’d be incentivized to move off of big, legitimate platforms such as Facebook, which are already acting in good faith and complying with current CSAM law, and shift to dark web sites whose entire raison d’être is CSAM.

Threatening to curtail Section 230 immunity is only going to scare the good-faith providers. Those are the very ones that are already complying with Section 2258A and could be expected to comply with any additional duties Congress cared to add to 2258A. But the immunity threat will have no effect on the services that do not comply and do not care. The sites that are dedicated to CSAM are directly violating federal CSAM law. They certainly aren’t following Section 2258A’s reporting requirements. That means they already don’t qualify for Section 230 immunity. As Section 230(c)(1) states, Section 230 immunity bars attempts to treat providers as the publisher or speaker of information, such as CSAM, provided by someone else. If a site helps to create or develop the illegal content, it is not eligible for the immunity. Put simply, Section 230 is not carte blanche for the provider itself to violate the law.

Since sites devoted to CSAM already don’t qualify for Section 230 immunity, threatening to take that immunity away unless they “earn it” under this new bill will have absolutely zero effect on them. CSAM traders, who are highly adaptable to shifts in law enforcement strategy, will know that and act accordingly. And they’ll be much harder to track down on dark web sites than they are when they’re using their Facebook accounts.

Plus, the EARN IT Act won’t rid even the good-faith online services of CSAM. Even if there are “best practices” that induce a company not to provide end-to-end encryption, users can still encrypt files before transmitting them over the company’s service. Standalone file encryption software is already out there; that genie won’t go back in the bottle. Punishing the provider by stripping Section 230 immunity won’t fix this issue. As CALEA’s Section 1002(b)(3) recognizes, it makes no sense to try to hold a company responsible for an encrypted communication if the company wasn’t the one that provided the encryption.
The EARN IT Act Gives More Power to a Creepy Attorney General
Last but not least, the EARN IT Act gives AG Bill Barr the unchecked power to decide what “best practices” providers must implement if they want to guarantee that they’ll retain Section 230 immunity for CSAM. He can make providers dance for him, and he calls the tune. There is approximately zero chance that Barr won’t use that unilateral authority to set “best practices” that undermine Americans’ communications privacy. Right now, encryption is perfectly legal, and it stymies Barr’s ability to illegally snoop en masse on Americans’ communications, like he did the last time he was AG. This bill feels like it’s motivated by Barr’s wish to punish providers for frustrating his creepy desire to spy on everyone. But if he said that out loud, it wouldn’t go over well, so instead his punitive thirst is dressed up under the cover of Section 230, because Section 230 gets people riled up. He’s realized that if he invokes Section 230 in the current climate, he can get Americans to cut off their own nose to spite their face. He can get them to give away their entitlement to strong encryption that protects them from him, so long as they think that’ll stick it to Big Tech.
Conclusion

There is so much wrong with the EARN IT Act. I hope other members of civil society will chime in against this bill to explain these many other problems, because I’ve gone on long enough already and I’ve only addressed one tiny piece of why this bill is such a nightmare. It’s understandable if you have misgivings about the breadth of Section 230. It’s okay to be angry at Big Tech. But don’t let Senators Graham and Blumenthal dupe you into believing that the EARN IT Act would provide any vindication for you against the major tech companies. The bill’s ultimate intent is to penalize those companies for protecting your privacy and data security. That’s something that tech companies have been legally allowed to do for a quarter-century, and we can’t afford to stop them from doing it. Encryption should be encouraged, not punished. If you value your privacy, if you value data security, or if you just don’t want to see our rogue Attorney General singlehandedly set the rules for the Internet, you’ll contact your congressmembers and oppose the EARN IT Act.
This is a bastardization of the opening lines of Douglas Adams’ Hitchhiker’s Guide to the Galaxy series. Yes, I am aware that the Internet was not actually created in the ‘90s. Yes, I am aware that you think it should be “the internet” and not “the Internet.”
With regard to most civil and state criminal law claims, that is. Section 230 has always had a few exceptions, most significantly for federal criminal and intellectual property law.
With apologies to James Madison and the Federalist Papers.
The Ackerman case is mostly about Fourth Amendment issues that I won’t get into, at least not today.
Absent this safe harbor, providers are put in an untenable position, because CSAM is basically radioactive. Reporting CSAM by passing it along would otherwise be a crime, because transmitting CSAM is a crime; preserving CSAM as evidence would otherwise be a crime, too, because possessing CSAM is also a crime. Section 2258B was necessary to ensure that providers don’t have to face significant criminal liability for helping to fight CSAM.
Although it expressly mentions the CSAM law, Section 230(e)(1)’s criminal-law exception does not allow civil claims against providers for CSAM content posted by users; at present, those are barred by Section 230’s broad immunity for civil claims. M.A. ex rel. P.K. v. Village Voice Media Holdings (E.D. Mo.) (citing Doe v. Bates (E.D. Tex.)). The EARN IT Act bill would change that.
While courts have seldom addressed the applicability of Section 230 to private messaging services, a few courts have applied the law “to bar claims predicated on a defendant’s transmission of nonpublic messages, and have done so without questioning whether [Section 230] applies in such circumstances.” Fields v. Twitter, Inc. (N.D. Cal.) (citing Hung Tan Phan v. Lang Van Pham (Cal. App.); Delfino v. Agilent Techs., Inc. (Cal. App.)); Beyond Sys., Inc. v. Keynetics, Inc. (D. Md.). Thanks to Jeff Kosseff for pointing me to this authority.
The bill also gives the AG the power to launch intrusive investigations of any company he can claim he believes isn’t abiding by its certification to those best practices, using a tool called “civil investigative demands” (CIDs), but I’m not even going to get into that.