Alert

NTIA Files Rulemaking Petition Seeking “Clarification” of Section 230, as Required by Recent Executive Order

July 28, 2020

The National Telecommunications and Information Administration (NTIA), an executive agency in the U.S. Department of Commerce, filed a Petition for Rulemaking (Petition) yesterday with the Federal Communications Commission (FCC or Commission) urging the FCC to “use its authorities to clarify ambiguities” in Section 230 of the Communications Decency Act, a 1996 law that shields online platforms such as social media companies from liability for third-party content hosted on their platforms. This highly anticipated petition comes in a charged political environment in which some policymakers in Congress and the Executive Branch are looking to regulate social media and technology companies.

The Petition contains a detailed set of proposed regulations that would impose extensive limitations on the ability of online platforms to take advantage of the protections of Section 230. It was filed at the direction of an Executive Order (EO) issued in May entitled “Preventing Online Censorship.” The EO directed agencies under the control of the President to take steps to regulate decisions made by online platforms in response to what the EO characterized as “political bias” and “selective censorship” by online platforms in dealing with content posted by third parties, and directed NTIA to file a petition for rulemaking of this kind with the FCC.

As we previously explained, the interpretations of Section 230 in the EO were novel and in tension with court decisions, and the overall purpose of those interpretations—to shape the content-related practices of online platforms—raised significant First Amendment concerns (which were the subject of a prompt lawsuit that remains pending). The NTIA Petition cites the EO extensively, reiterating the EO’s concerns and adopting its statutory interpretations, asserting that “free speech faces new threats” from “large online platforms” that “appear to engage in selective censorship that is harming our national discourse,” and that “[m]uch of social media’s overarching influence and power stems from the immunities it enjoys under expansive interpretations of section 230[.]”

The EO directed NTIA’s Petition to ask the FCC to clarify two substantive aspects of Section 230: (i) that the “good faith” language found in subsection (c)(2) of the statute imposes some type of limitation on when the immunity under the separate subsection (c)(1) applies; and (ii) that acting in “good faith” for purposes of the statute excludes practices that are deceptive or fail to provide fair notice to users. As explained more fully below, after a lengthy argument about why the original circumstances that prompted adoption of Section 230 no longer apply, the 55-page Petition makes these requests, and also proposes interpretations of other terms in the statute (“otherwise objectionable,” “interactive computer service,” and “treated as the publisher or speaker,” among others) to further the EO’s objective of limiting the immunity afforded to online platforms under Section 230. Finally, the Petition asks the FCC to subject online platforms to disclosure requirements that it claims are similar to those imposed on broadband internet service providers under existing FCC regulations.

NTIA’s proposed rules are at odds with the language of the statute, court decisions interpreting that language, the FCC’s traditional practice in issuing implementing regulations and clarifying statutory ambiguity, and the First Amendment’s protection of the freedom of speech, which extends to speech made by private companies in dealing with the content on their platforms. While the NTIA petition seeks to address some concerns over FCC authority by requesting the adoption of rules defining statutory terms, rather than by imposing stand-alone obligations, the result is an awkward attempt to implement substantive obligations in the guise of complex restrictions on immunity from suits. Although FCC adoption of NTIA’s legal interpretations and proposed rules would invite numerous court challenges and create uncertainty surrounding the content moderation practices of online platforms, the FCC is under no obligation to act on the Petition.

If the FCC does act, it will likely first seek comment on the Petition itself, and then would provide another opportunity for public comment after issuing a notice of proposed rulemaking.

The Commission’s Authority to Interpret Section 230

The Petition grounds its assertion of FCC authority in Section 201 of the Communications Act, which gives the FCC the power to “prescribe such rules and regulations as may be necessary in the public interest to carry out this chapter.” Citing the Supreme Court decisions in Iowa Utils. Board and City of Arlington, which upheld the agency’s authority to implement Sections 251 and 252, and Section 332 of the Act, respectively, the Petition argues that because Section 230 was incorporated into the Communications Act by Congress, the FCC has general rulemaking and implementation authority over the statute, notwithstanding that it was adopted as part of the 1996 Telecommunications Act. Thus, according to NTIA, “[t]hat broad rulemaking authority includes the power to clarify the language of that provision, as requested in the petition.”

Prior commentary published during consideration of the EO has called into question the authority of the FCC to issue regulations pursuant to Section 230, noting that the statute does not expressly contemplate a role for the FCC in implementation, and arguing that the seemingly self-effectuating nature of the statute—which deals not with agency regulation of online platforms, but with the circumstances under which parties can bring private lawsuits against them over unlawful third-party content—leaves little room for the FCC to claim authority to impose substantive regulations implementing the statute. With respect to the first objection, the NTIA petition argues that express authorization is not required; depriving the FCC of authority over Section 230, it claims, would have taken an affirmative act, and “[n]either section [sic] 230’s text, nor any speck of legislative history, suggests any congressional intent to preclude the Commission’s implementation.” NTIA appears to sidestep the second objection, about the lack of authority to adopt substantive rules, by characterizing its request as solely for interpretation of what it claims are “ambiguous” statutory terms in Section 230. Compliance with the proposed regulations would not be enforced directly by the FCC; instead, online platforms would be expected to conform in order to avail themselves of the protections of Section 230 against private tort actions. Setting aside whether this mechanism, if adopted, would actually work to change behavior, it would almost certainly be subject to numerous challenges arguing, among other things, a lack of statutory authority to adopt the regulations, an absence of ambiguity in the statutory language, and First Amendment concerns about regulation of speech.

Proposed Regulations to Clarify Claimed Ambiguities in Section 230

In the part of its submission dealing with Section 230, NTIA asks the FCC to adopt regulations that would fundamentally change and substantially narrow Section 230’s immunities from civil liability. In essence, NTIA asks the FCC to transform the very broad immunity provided by Section 230(c)(1) into an immunity only for internet platforms that act as mere conduits of third-party information, and to strictly limit the immunity provided by Section 230(c)(2) to platforms that remove content only in accordance with a highly rigorous, newly defined good faith standard, and only when they have an objectively reasonable belief that the content is “obscene, lewd, lascivious, filthy, excessively violent, harassing” or “similar in type.” Perhaps of greatest concern, NTIA’s proposed rules, if adopted, would deem internet platforms to be the publishers or speakers of content posted by third parties, and thus outside Section 230 immunity, if the platforms are “substantively contributing to, modifying, altering, presenting or prioritizing with a reasonably discernible viewpoint, commenting upon, or editorializing about” such third-party content. Moreover, an internet platform would fall outside the immunity and thus be civilly liable when it “reviews third party content already displayed on the internet and affirmatively vouches for it, editorializes, recommends, or promotes such content on the basis of the content’s substance or message.”

Claimed Uncertainties Regarding the Interplay Between Sections 230(c)(1) and 230(c)(2) 

NTIA’s requests to the FCC are premised on NTIA’s claim that “Section 230 contains a number of ambiguities that courts have interpreted broadly in ways that are harmful to American consumers, free speech, and the original objective of the statute.” In its effort to greatly circumscribe the immunity provided by Section 230(c)(1), NTIA claims that there is “uncertainty about the interplay between section 230(c)(1) and (c)(2),” such that these statutory provisions do not make clear “at what point a platform’s moderation and presentation of content becomes so pervasive that it becomes an information content provider and, therefore, outside of section 230(c)(1)’s protections.”

In fact, little if any uncertainty exists regarding the interplay between Sections 230(c)(1) and 230(c)(2). As the statutory text itself makes clear, Section 230(c)(1) provides internet platforms with a broad immunity from civil liability by stating that they shall not “be treated as the publisher or speaker” of any content posted to the platform by a third party. In contrast, Section 230(c)(2), generally referred to as the “Good Samaritan” provision, provides internet platforms with a specific, narrow immunity for “any action taken voluntarily in good faith” to remove content that the platform “considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”

Circumscribing the Immunity Provided by Section 230(c)(1) by Treating Internet Platforms as Publishers or Speakers

NTIA requests that the FCC effectively nullify the broad immunity provided by Section 230(c)(1) by adopting regulations that treat internet platforms as publishers or speakers of information posted by third parties, thereby excluding them from the immunity. To this end, NTIA requests that the FCC “clarify when interactive computer services become information content providers through developing and creating content through the presentation of user-provided material.” Specifically, NTIA asks the FCC to promulgate a regulation interpreting the statutory definition of “information content provider” to include platforms “substantively contributing to, modifying, altering, presenting or prioritizing with a reasonably discernible viewpoint, commenting upon, or editorializing about content provided by another information content provider.”

Furthermore, NTIA requests that the FCC put in place regulations that take an internet platform outside Section 230(c)(1)’s immunity if the platform “reviews third party content already displayed on the internet and affirmatively vouches for it, editorializes, recommends, or promotes such content on the basis of the content’s substance or message.” This recommendation appears to be specifically designed to prevent actions such as those recently taken by Twitter in response to tweets by President Trump, including Twitter’s restricting user access to a tweet by the President that Twitter deemed to “glorify violence,” and Twitter’s adding a fact check to certain tweets by the President.

NTIA recommends that the reach of Section 230(c)(1)’s immunity be limited to instances in which an internet platform merely (1) displays third-party content precisely in the manner requested by the third party, or (2) transmits, displays, or otherwise distributes such content while applying the platform’s terms of service. Thus, the NTIA recommendations seek to limit Section 230(c)(1)’s immunity to internet platforms that act as mere conduits for third-party content.

Circumscribing the Immunity Provided by Section 230(c)(2)    

With respect to the Good Samaritan immunity provided by Section 230(c)(2), NTIA recommends that the FCC adopt regulations that limit the scope of the immunity in two principal respects: first, by limiting the scope of the term “otherwise objectionable,” and second, by imposing a rigorous good faith requirement that would be difficult for an internet platform to meet. With respect to the term “otherwise objectionable,” NTIA recommends that it be defined as “material that is similar in type” to “obscene, lewd, lascivious, filthy, excessively violent, [or] harassing” content. NTIA makes this recommendation based on its claim that if “‘otherwise objectionable’ means any material that any platform ‘considers’ objectionable, then section 230(c)(2) offers de facto immunity to all decisions to censor content.” That claim appears to be exaggerated, because the statutory language expressly requires an internet platform to act in “good faith” when it removes content that it considers to be “otherwise objectionable.”

With respect to the “good faith” requirement of Section 230(c)(2), NTIA requests that the FCC issue new regulations providing that a platform restricts access to content in “good faith” only if it:

(1) acts “consistent with publicly available terms of service or use that state plainly and with particularity the criteria the interactive computer service employs in its content-moderation practices,”

(2) “has an objectively reasonable belief that the material” is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable [defined as indicated above],”

(3) “does not restrict access to or availability of material on deceptive or pretextual grounds, and does not apply its terms of service or use to restrict access to or availability of material that is similarly situated to material that the interactive computer service intentionally declines to restrict,” and

(4) “supplies the interactive computer service of the material with timely notice describing with particularity the interactive computer service’s reasonable factual basis for the restriction of access and a meaningful opportunity to respond, unless the interactive computer service has an objectively reasonable belief that the content is related to criminal activity or such notice would risk imminent physical harm to others.”

This highly rigorous good faith standard would be difficult for any internet platform to meet when it acts to remove content that it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” The proposed standard fundamentally changes the scope of Section 230(c)(2) immunity by replacing the statute’s subjective good faith requirement with a requirement that the platform hold an “objectively reasonable belief.” The proposed standard also requires platforms to treat all similarly objectionable content in the same manner or risk losing the immunity. This appears to place an enormous burden on internet platforms by prohibiting them from removing a single instance of objectionable content unless they, at the same time, identify and remove all other instances of similarly objectionable content on the platform.

The proposed good faith standard also appears to impose further enormous burdens on internet platforms by requiring them to provide all third parties whose content the platforms remove with notice—including a detailed description of the internet platform’s reasons for removing the content—and an opportunity to respond, unless the platform has “an objectively reasonable belief that the content is related to criminal activity or such notice would risk imminent physical harm to others.” To give just one example of the burdens that this notice requirement could impose, in the first three months of this year YouTube removed over 6 million videos for violating YouTube’s policies.1 Requiring YouTube to provide notice and an opportunity to respond in most or all of these cases would appear to be unworkable.  

Proposed Disclosure Requirements

In addition to requesting “clarification” of language in Section 230, the NTIA Petition asks the FCC to impose substantive disclosure obligations on online platforms. Under NTIA’s proposal, a platform offering interactive computer services “through a mass-market retail offering to the public” would be required to “publicly disclose accurate information regarding its content-management mechanisms as well as any other content moderation, promotion, and other curation practices,” in sufficient detail to enable consumers to “make informed choices” regarding the use of such services and to allow “entrepreneurs and other small businesses” to “develop, market, and maintain offerings by means of such service.”

Unlike the requested interpretations of Section 230, this part of the Petition seeks an independent Commission rule that would mandate these disclosures directly, rather than simply tying them to the loss of immunity under Section 230. As a result, the Petition offers a separate legal authority argument for this request. NTIA argues that online platforms are “information services” under the Communications Act, and that the FCC has the power to mandate disclosures by information services under Section 163, which “charges the FCC to ‘consider all forms of competition, including the effect of intermodal competition, facilities-based competition, and competition from new and emergent communications services, including the provision of content and communications using the Internet’” and to assess whether laws and regulations pose a barrier to competitive entry, and under Section 257, which “requires the FCC to examine market entry barriers for entrepreneurs and other small businesses in the provision and ownership of telecommunications services and information services.”

Although it cites Section 163, the Petition offers essentially no argument about its application. The Petition is more detailed with respect to Section 257, noting that the FCC relied on that provision in the 2018 Restoring Internet Freedom Order when it imposed parallel disclosure obligations on broadband service providers and “reasoned that doing so would reduce entry barriers.” NTIA argues that the same logic applies to requiring disclosures by online platforms. The Petition ties this argument back to Section 230, in part, asserting that “empowering consumers with blocking technologies that they choose and control—rather than accepting a platform’s top-down centralized decisions, would directly advance section 230’s policy of encouraging ‘the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services.’” It also argues that “[i]ncreasing transparency about online platforms’ content moderation practices would…enable users to make more informed choices about competitive alternatives,” and “ensure that consumers can choose to consume social media whose policies they agree with without fear that manipulations to which they did not consent are happening behind the scenes.”

The disclosure obligations sought in the Petition are far-reaching. While the NTIA petition suggests that it would be a simple matter to import the type of disclosure obligations imposed on broadband providers to online platforms, it is not at all clear that this would be so. Unlike broadband providers, most online platforms are built around sophisticated algorithms designed to direct, prioritize, and serve content to users. Neither the draft rule nor the Petition itself makes clear how much detail about these mechanisms would be required to satisfy the ambiguous “sufficient to enable…consumers to make informed choices…[and] entrepreneurs…to develop…offerings” standards, which could lead to lengthy, case-specific litigation if the rule were adopted as written. If the FCC decided to adopt the rule, it would also almost certainly be subject to challenge, with online platforms likely arguing both that Section 257 offers far too tenuous a justification for such a sweeping set of regulatory obligations and that the First Amendment bars speech mandates of this type.

Next Steps

There is no required timetable for FCC action, so the FCC will address the Petition in due course; in doing so, it will have to grapple with significant political oversight and influence efforts from Congress and from others in the Executive Branch, including the Attorney General and others across the government who are directly accountable to the President.

Other implementation efforts related to the EO are underway as well; the EO contemplated activity across the government, including at the Federal Trade Commission, which has a number of options for how it might proceed.


1 https://transparencyreport.google.com/youtube-policy/removals
