
A Supreme Court case could drastically change the internet

Nohemi Gonzalez, a 23-year-old California student, was one of 130 people killed by ISIS terrorists in Paris on November 13, 2015 (see details here).

Her family is suing Google, alleging its algorithms helped radicalize the killers by steering them to videos that ISIS posted on YouTube.

Vox says in an article here,

“The question of whether federal law permits a major tech company like Google to be sued over which content its algorithms served up to certain users divides some of the brightest minds in the federal judiciary. Although at least two federal appeals courts determined that these companies cannot be sued over their algorithms, both cases produced dissents.”

The Supreme Court has taken up the case, and will now resolve that question in its 2022-23 term.

“At stake [is] how the internet works, and what kind of content we … see online,” Vox says. Companies like Google use algorithms and automated search engines to steer individual users to content. A win for the Gonzalez family could impose “intolerable legal risk” on companies like Google and Facebook “that rely on algorithms to sort through content.”

Critics argue the algorithm approach causes social harm. Even so, Vox’s writer, a law school graduate, is skeptical the Gonzalez family will prevail. Their lawyers face formidable proof issues; they’d have to show that Nohemi’s killers watched the YouTube videos and that those videos caused her death. But proof isn’t the issue before the Supreme Court. The question there is immunity: a federal appeals court ruled Google is immune from such lawsuits under Section 230.

That law shields internet platforms like YouTube (which Google owns) from being sued over user-created content. This basically means the Gonzalez family has to sue ISIS, not YouTube. This is significant; Vox says it’s unlikely social media sites could exist “if their owners could be sued every time a user posts a defamatory claim,” nor could review sites like Yelp or Amazon’s review pages exist “if a restaurant owner or product maker could sue the website itself over negative reviews.” Plaintiffs have to go after the person who posted the review.

The lawyers for the Gonzalez family are trying to change that. They want to go after a defendant with deep pockets who’s easy to find. How do you sue dead terrorists? But nobody at Google or YouTube hurt their daughter. Just because you want to sue someone tangentially involved in a harm doesn’t mean you should be able to. If someone slips on a banana peel in the street in front of my house, should I be sued because I didn’t pick it up? In legal parlance, that depends on whether I had a legal duty to do so (the answer is no, unless I put the peel there).

But what if I did throw the peel in the street? And what if YouTube, even if it wouldn’t be liable for failing to remove ISIS’s videos, actively promoted them? Vox says, “The Gonzalez family argues that YouTube’s algorithm should be treated the same way,” and calls that “an entirely plausible reading of Section 230,” pointing out that bright legal minds (including judges) are divided on this question.

Congress enacted Section 230 in 1996 to protect online free speech. The principle behind it is that “a company that enables people to communicate with each other is not liable for the things those people say to one another.” In other words, the phone company isn’t liable if a Trump supporter in Iowa leaves a threatening message in an Arizona election official’s voicemail (see story here). Phone companies probably couldn’t stay in business if they could be sued every time someone uses a phone for illegal purposes.

But Vox’s writer notes the rule is “different for newspapers, magazines, or other publications,” which often can be sued for any content that appears in their pages. That’s because they have editorial control over it, which makes them active participants in its publication. This is why you see reporters using terms like “allegedly” in news stories.

The Vox writer says internet content falls into a “gray zone” between phone companies and print publications, because internet companies have some control over the content on their platforms, whether or not they choose to exercise it. Section 230 was enacted in reaction to a court decision holding that exercising any editorial control over content, even incomplete control, turned internet companies into publishers. It removed the “gray zone” ambiguity and firmly put internet companies into the phone company category by defining them as “platforms,” not “publishers.”

But that doesn’t fully answer the question of what happens if an internet company like YouTube actively promotes content it didn’t create but nevertheless published. Vox says, “In an ideal world, Congress would step in to write a new law that strikes a sensible balance.” But that ain’t gonna happen, because Republicans don’t want to solve problems; they want to force companies like Google and Facebook to let people like Trump spew lies on their platforms (see story here). (Who should decide whether someone is lying? Certainly not the government. But neither should the government tell private companies what to publish.)

Now that the Supreme Court has taken up the Gonzalez case, Vox says “we must wait and see if the Supreme Court hands down a decision that could smother many emerging forms of communication — and not because the justices necessarily have a particular axe to grind.” But, while Vox left this unsaid, I wouldn’t count on the conservative majority not having an axe to grind.

This case will be decided in an environment where conservatives in general feel they’re picked on and discriminated against by a broad array of media, most of all the internet companies who’ve kicked the most notorious rightwing liars and violence fomenters — including Trump — off their platforms.
