OPINION: Here is one solution to the problem of social media and political violence

The intensity of America’s political culture that factored into the assassination of conservative influencer Charlie Kirk has sharpened scrutiny of the role social media plays.


The intensity of America’s political culture that factored into the assassination of conservative influencer Charlie Kirk has sharpened scrutiny of the role our social media ecosystem plays. And rightly so. The social media platforms have gotten a free pass on responsibility for their content for far too long.

It goes back to the infancy of the Internet and something called Section 230. This was a provision in the Communications Decency Act, passed in 1996, that granted immunity to online platforms from liability for content posted by their users.

As a longtime media publisher – 20 years in newspapers and now a digital platform publisher – this has always been a head-scratcher for me. The exception for Facebook (Meta), Twitter (now X) and the rest that followed was based on the presumption that they are not publishers but merely technology conduits; only the users are publishers and, hence, liable for what is published.

Not so for legacy publishers. If a letter to the editor in the newspaper potentially libels someone, the newspaper shares liability along with the author. Likewise, our digital local news outlets at Issue Media Group do not escape responsibility for what we publish.

But the social media platforms? A completely different standard.

Because there is very little filtering, the platforms are a breeding ground for falsehoods, hate and rage. It isn’t hard to imagine how someone might become radicalized enough to act on those influences in the most violent, horrific ways. That’s not to mention social media’s role in youth bullying and mental illness. The platforms not only operate without fear of liability, they actually profit from users’ engagement with provocative content.

Of course, someone could read and see enough incendiary content in traditional media to become radicalized, but social media algorithms are designed to reinforce biases and encourage emotional response. Not that their goal is to breed radicalized assassins, but in the quest to encourage engagement and time on site – to deliver more advertising and make more money – engagement can lead to enragement in the most unimaginable ways.

If providing the platform doesn’t by itself make social media companies publishers, then engineering the algorithms that promote some content over other content certainly should.

We’re far past the need to nurture a newborn industry. And the language in the law pertaining to an “interactive computer service” is antiquated, even quaint. Facebook and Twitter didn’t even debut until almost a full decade after Section 230 was enacted. And of course they have morphed into much more than what they were at first: places to reconnect with high school classmates, share pictures of your family vacations, and tweet something innocuous like what you were having for dinner.

Over half of Americans (54 percent) now say that social media and video networks – such as YouTube, also subject to the same absolution for content responsibility as the other “interactive computer services” – are their primary source for “news,” surpassing TV, newspapers, news sites and apps (Reuters Institute, 2025).

Quite apart from being a source of news, social media are where misinformation and disinformation spread widely and rapidly. And the platform algorithms that lead users to falsehoods and hate also limit the legitimate news and content that might moderate views and emotions. Users end up in information silos that confirm their biases. This is at the core of the divisiveness tearing our country apart and the dangerous rhetoric that influences those vulnerable to violence.

Repealing Section 230, by the way, has bipartisan support, from presidents Joe Biden to Donald Trump.

“Section 230 needs to be repealed,” Sen. Lindsey Graham (R-S.C.) said in the wake of the Kirk killing. “If you’re mad at social media companies that radicalize our nation, you should be mad.” He and Sen. Dick Durbin (D-Ill.) have led the charge to repeal it.


Recommendations of the Aspen Institute Commission on Information Disorder:

  1. Withdraw platform immunity for content that is promoted through paid advertising and post promotion. 
  2. Remove immunity as it relates to the implementation of product features, recommendation engines, and design.



But progress on this initiative has so far been limited to dragging Mark Zuckerberg and company into a hearing room from time to time for political theater. Maybe Charlie Kirk’s killing will be the final straw.

If you want a solution to social media’s central role in the radicalization of so many perpetrators of ideological violence, there is a logical one. It doesn’t require curtailing First Amendment rights, and it doesn’t criminalize defamation that occurs on social media.

It does mean social media companies could be sued for defamation in civil court, same as CBS, the New York Times and Fox News (all of which have been). No more special treatment to protect social media companies. Simply apply the same standard of responsibility to social media as to other media.

Section 230 at least ought to be revised to redefine what a publisher is as it pertains to social media. That definition should include the application of algorithms that promote some content and demote other content, especially in the age of AI. Whether done by humans or by technology, this constitutes editing content. Content prioritization and recommendation should be done responsibly, as is required of other publishers.
