The Supreme Court may be poised to weigh in on a subject that, even more than other far-reaching topics in its purview, affects nearly every citizen nearly every day: the internet. In doing so, the justices have the opportunity to make a muddled area of governance less murky. They also have a chance to do great harm along the way.
A divided panel of the U.S. Court of Appeals for the 5th Circuit last week upheld a Texas law that prohibits online platforms from removing user-generated material on their sites based on a user’s viewpoint or the viewpoint expressed in a post. Earlier this year, a unanimous panel of the U.S. Court of Appeals for the 11th Circuit determined that a Florida law that similarly restricted technology companies violated the First Amendment. Now Florida has asked the Supreme Court to review that ruling. The court, if it agrees to take the case, will confront questions about governments’ ability to regulate speech in the digital age — questions both sides so far have approached as all-or-nothing, but that really demand nuance and care.
Those two attributes were glaringly lacking in Judge Andrew Oldham’s majority opinion in NetChoice v. Paxton, the 5th Circuit case, which denies any First Amendment protection for what most people call content moderation by platforms but what its author insists on calling censorship. This conflicts with plenty of precedent on corporations’ right to decide what kind of speech they will host. But most alarming are the blatant mischaracterizations of social-media sites that the opinion uses to justify this position. The assertion that neo-Nazi and terrorist materials are “borderline hypotheticals” ignores the platforms’ documented and ongoing game of whack-a-mole with just that type of hatred. The claim that sites “exercise virtually no editorial control or judgment” somehow misses the millions of pieces of content they review daily — and the many more that algorithmic filters prevent from showing up at all.
This last point is supposed to prove that the government can classify platforms as “common carriers,” just like railroads or phone providers, and demand that they not discriminate. Those on the opposite side of this debate believe that’s the wrong analogy, and it is. But the alternative they propose is similarly shaky: They say these platforms are more like newspapers or radio broadcasters. The truth lies somewhere in between. Social-media sites do act as something of a public utility; they also exercise editorial control and judgment that is essential to the value they provide. They exist in a category all their own, and no court so far has figured out what standard should apply to them — or what types of speech regulation the Constitution permits, from the extreme restrictions in Texas and Florida, to more moderate transparency mandates under consideration elsewhere, to nothing at all.
The Supreme Court seems likelier than ever to do that thinking in the near future. If so, the justices should resist the temptation of seemingly easy answers that miss the digital age’s most difficult realities.
The Post’s View | About the Editorial Board
Editorials represent the views of The Washington Post as an institution, as determined through debate among members of the Editorial Board, based in the Opinions section and separate from the newsroom.