
League of Women Voters to discuss recent SCOTUS decision on social media censorship

League of Women Voters of Illinois hosting lecture on AI and misinformation

Commentary | Jan. 6 was an example of networked incitement

The shocking events of Jan. 6, 2021, signaled a major break from the nonviolent rallies that characterized most major protests over the past few decades.

I am a media and disinformation scholar, and my co-authors and I call this networked incitement: influential figures inciting large-scale political violence via social media. Networked incitement involves insurgents communicating across multiple platforms to command and coordinate mobilized social movements in the moment of action.
The reason there was not more bloodshed on Jan. 6 emerged during the investigation of the Oath Keepers, a vigilante organization composed mostly of former military and police officers. During their trials for seditious conspiracy, members of the Oath Keepers testified about weapons caches stashed in hotels and vans near Washington, D.C. As one member described it, “I had not seen that many weapons in one location since I was in the military.”
The Oath Keepers were following Washington, D.C., law by not carrying the weapons into the district while they waited for Trump to invoke the Insurrection Act, which gives the president the authority to deploy the military domestically for law enforcement.
The militia was waiting for orders from Trump. That was all that kept U.S. democracy safe from armed warfare that day.
Social media as command and control
What happened in D.C. on Jan. 6, 2021, does not easily fit into typical social movement frameworks for describing mobilization. The insurrectionists behaved like a networked social movement, with online platforms forming the infrastructure to organize action, but their leaders were politicians and political operatives rather than charismatic community leaders. On that day in particular, the insurrectionists, who are closely aligned with MAGA Republicans more broadly, functioned like Trump’s volunteer army rather than a populist movement.

Even with the availability of social media, networked social movements still need mainstream media coverage to legitimize their cause. Typically, community organizers push a particular issue – for example, Black Lives Matter and #MeToo – into the media spotlight to get the public to care about their issue. Social movements tend to struggle to gain exposure and to frame favorable narratives.

The insurrectionists had the advantage of being able to count on mainstream media coverage of Jan. 6, so they focused on gathering resources and coordinating attendance. As a result, Trump’s supporters did not need to expend much effort to bring attention to the event and instead concentrated on organizing ride-shares and splitting hotel costs. As in prior social movements, the networking capacity of social media proved to be an important conduit for bringing strangers together for the occasion. What the insurrectionists failed to do was convince key stakeholders, such as the mainstream media, Vice President Mike Pence and the U.S. Capitol Police, to join their fight.
Networked incitement is different from the legalistic understanding of incitement, where an inflammatory statement immediately precedes unlawful acts or creates a dangerous situation. The call to action for Jan. 6 came from the president himself in a series of social media posts enticing supporters to come to D.C. for a “wild” time.
Tweets like these from a prominent figure became social media’s equivalent of shouting fire in a crowded theater.
Mobilizing for violence
My colleagues and I sought data to better understand what motivated everyday folks to storm the Capitol that day at great personal risk. Using the method of qualitative content analysis, we assembled 469 charging and sentencing documents for 417 defendants and coded them for the stated reasons for attending the event. We chose these court documents because they represented the fullest narrative accounts available. The purpose of these documents was to explain the rationales and mental states of the accused, while also offering a defense or explanation for their actions.
We analyzed the documents, looking at the multiple motivations for the insurrectionist mobilization. Overwhelmingly, insurrectionists said they were motivated by a desire to support Trump, a motive cited about as often as the desire to stop what they believed was a rigged election. In sum, we concluded that disinformation mobilizes and incites political violence under specific conditions, such as a popular public figure calling for help.
The court documents also directly reference the social media posts of the accused. For example, on Dec. 22, 2020, Kelly Meggs, an Oath Keeper who was later convicted of seditious conspiracy and sentenced to 12 years in prison, wrote on Facebook:
“Trump said It’s gonna be wild!!!!!!! It’s gonna be wild!!!!!!! He wants us to make it WILD that’s what he’s saying. He called us all to the Capitol and wants us to make it wild!!! Sir Yes Sir!!! Gentlemen we are heading to DC pack your sh*t!!”
The reference to “it’s gonna be wild” was a response to the now-infamous tweet Trump sent after a reportedly difficult six-hour meeting with staff about how to proceed with the fraud inquiry and undo the election results. Meggs’ post illustrates that even before Jan. 6, militia groups were looking for signs from Trump about how to proceed. An investigation by NPR also illustrated how Trump’s messages emboldened participants and ignited the events of that day.
A dark future
No sitting president before Trump had exploited the capacity of social media to directly reach citizens to command specific actions.
The use of social media for networked incitement foreshadows a dark future for democracies. Rulers could well come to power by manipulating mass social movements via social media, directing a movement’s members to serve as the leaders’ shock troops, online and off.
Clear regulations preventing the malicious weaponization of social media by politicians who use disinformation to incite violence are one way to keep that future at bay.
Joan Donovan, Assistant Professor of Journalism and Emerging Media Studies at Boston University, is on the board of Free Press and the founder of the Critical Internet Studies Institute.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Voting in Illinois: Ensuring election integrity and voting security

- Be cautious if a caller or texter requests personal information, such as your Social Security number. They may falsely claim you can vote early by phone or fix nonexistent errors in your voter registration.
- Avoid sharing credit card or financial information over the phone, especially if the caller offers seemingly free gifts in exchange for participation in surveys.
- If asked to donate over the phone, verify the legitimacy of the request by asking for a website where you can find more information.
Local news options for some rural Illinois communities are limited

Illinois News Connection
CHICAGO - The days of thumbing through a community newspaper are retreating into history.

Commentary | Many online conspiracy-spreaders don't believe the crazy lies they spew

There has been a lot of research on the types of people who believe conspiracy theories, and their reasons for doing so. But there’s a wrinkle: My colleagues and I have found that there are a number of people sharing conspiracies online who don’t believe their own content.
They are opportunists. These people share conspiracy theories to promote conflict, cause chaos, recruit and radicalize potential followers, make money, harass, or even just to get attention.
Several types of these conspiracy-spreaders are trying to influence you.

In our chapter of a new book on extremism and conspiracies, my colleagues and I discuss evidence that certain extremist groups intentionally use conspiracy theories to entice adherents. They are looking for a so-called “gateway conspiracy” that will lure someone into talking to them, and then be vulnerable to radicalization. They try out multiple conspiracies to see what sticks.
Research shows that people with positive feelings for extremist groups are significantly more likely to knowingly share false content online. For instance, the disinformation-monitoring company Blackbird.AI tracked over 119 million COVID-19 conspiracy posts from May 2020, when activists were protesting pandemic restrictions and lockdowns in the United States. Of these, over 32 million tweets were identified as high on the company’s manipulation index. Those posted by various extremist groups were particularly likely to carry markers of insincerity. For example, one group, the Boogaloo Bois, generated over 610,000 tweets, of which 58% were aimed at incitement and radicalization.
You can also just take the word of the extremists themselves. When the Boogaloo Bois militia group showed up at the Jan. 6, 2021, insurrection, for example, members stated they didn’t actually endorse the stolen election conspiracy, but were there to “mess with the federal government.” Aron McKillips, a Boogaloo member arrested in 2022 as part of an FBI sting, is another example of an opportunistic conspiracist. In his own words: “I don’t believe in anything. I’m only here for the violence.”
Governments love conspiracy theories. The classic example of this is the 1903 document known as the “Protocols of the Elders of Zion,” in which Russia constructed an enduring myth about Jewish plans for world domination. More recently, China used artificial intelligence to construct a fake conspiracy theory about the August 2023 Maui wildfire.
Often the behavior of the conspiracists gives them away. In the 1980s, the Soviet Union spread the false claim that the U.S. government had created the AIDS virus; years later, Russia admitted it had been lying. But even before admitting to the campaign, its agents had forged documents to support the conspiracy. Forgeries aren’t created by accident. They knew they were lying.
As for other conspiracies it hawks, Russia is famous for taking both sides in any contentious issue, spreading lies online to foment conflict and polarization. People who actually believe in a conspiracy tend to stick to a side. Meanwhile, Russians knowingly deploy what one analyst has called a “fire hose of falsehoods.”
Likewise, while Chinese officials were spreading conspiracies about American roots of the coronavirus in 2020, China’s National Health Commission was circulating internal reports tracing the source to a pangolin.
In general, research has found that individuals with what scholars call a high “need for chaos” are more likely to indiscriminately share conspiracies, regardless of belief. These are the everyday trolls who share false content for a variety of reasons, none of which are benevolent. Dark personalities and dark motives are prevalent.
For instance, in the wake of the first assassination attempt on Donald Trump, a false accusation arose online about the identity of the shooter and his motivations. The person who first posted this claim knew he was making up a name and stealing a photo. The intent was apparently to harass the Italian sports blogger whose photo was stolen. This fake conspiracy was seen over 300,000 times on the social platform X and picked up by multiple other conspiracists eager to fill the information gap about the assassination attempt.
Often when I encounter a conspiracy theory I ask: “What does the sharer have to gain? Are they telling me this because they have an evidence-backed concern, or are they trying to sell me something?”
When researchers tracked down the 12 people primarily responsible for the vast majority of anti-vaccine conspiracies online, most of them had a financial investment in perpetuating these misleading narratives.
Some people who fall into this category might truly believe their conspiracy, but their first priority is finding a way to make money from it. For instance, conspiracist Alex Jones bragged that his fans would “buy anything.” Fox News and its on-air personality Tucker Carlson publicized lies about voter fraud in the 2020 election to keep viewers engaged, while behind-the-scenes communications revealed they did not endorse what they espoused.
Profit doesn’t just mean money. People can also profit from spreading conspiracies if it garners them influence or followers, or protects their reputation. Even social media companies are reluctant to combat conspiracies because they know they attract more clicks.
You don’t have to be a profiteer to like some attention. Plenty of regular people share content they doubt is true, or know is false.
These posts are common: Friends, family and acquaintances share the latest conspiracy theory with “could this be true?” queries or “seems close enough to the truth” taglines. Their accompanying comments show that sharers are, at minimum, unsure about the truthfulness of the content, but they share nonetheless. Many share without even reading past a headline. Still others, approximately 7% to 20% of social media users, share despite knowing the content is false. Why?
Some claim to be sharing to inform people “just in case” it is true. But this sort of “sound the alarm” reason actually isn’t that common.
Often, folks are just looking for attention or other personal benefit. They don’t want to miss out on a hot-topic conversation. They want the likes and shares. They want to “stir the pot.” Or they just like the message and want to signal to others that they share a common belief system.
For frequent sharers, it just becomes a habit.
Over time, the opportunists may end up convincing themselves. After all, they will eventually have to come to terms with why they are engaging in unethical and deceptive, if not destructive, behavior. They may have a rationale for why lying is good. Or they may convince themselves that they aren’t lying by claiming they thought the conspiracy was true all along.
It’s important to be cautious and not believe everything you read. These opportunists don’t even believe everything they write – and share. But they want you to. So be aware that the next time you share an unfounded conspiracy theory, online or offline, you could be helping an opportunist. They don’t buy it, so neither should you. Be aware before you share. Don’t be what these opportunists derogatorily refer to as “a useful idiot.”

A threat to democracy: fighting back against voter suppression and intimidation

Spanish: 888-VE-Y-VOTA (888-839-8682)
Asian Languages: 888-API-VOTE (888-274-8683)
Arabic: 844-YALLA-US (844-925-5287)
More Sentinel Stories


I heard it in Syrian tenor Sabah Fakhri’s powerful voice reverberating in my mom’s car on the way to piano lessons and soccer practice during my youth. I smelled it in the za’atar, Aleppo pepper, allspice, and cumin permeating the air in the family kitchen. Read more . . .