Why do polls show a majority of Americans support a ceasefire in the Israel-Palestine conflict, and yet only 18 members of the U.S. House are officially supporting a resolution calling for such a ceasefire?
What explains that huge gap between what the public wants and what Congress does? Part of the answer has to do with how America’s political discourse has been deliberately polarized by conservative groups like AIPAC, the pro-Netanyahu lobbying group, which seeks to equate support for Israel’s fundamental right to exist with support for the specific policies of Israel’s current right-wing government. These groups publicly equate opposition to such policies with antisemitism, polarizing the national conversation and demonizing Democrats who question Israel’s policies. Twelve House Democrats have accepted a combined $8 million in campaign support from AIPAC and its affiliates in the last year: Josh Gottheimer (NJ), Jared Moskowitz (FL), Debbie Wasserman Schultz (FL), Lois Frankel (FL), Jared Golden (ME), Juan Vargas (CA), Angie Craig (MN), Darren Soto (FL), Haley Stevens (MI), Frederica Wilson (FL), Don Davis (NC), and Greg Landsman (OH).
In the 2022 midterm elections, AIPAC and what was effectively its campaign arm, Democratic Majority for Israel (DMFI), outspent all comers to oust Democratic primary candidates who’d dared to criticize the Israeli right, most particularly liberal Michigan Rep. Andy Levin, who was also president of his local synagogue, and who surely spoke for most Jewish Democrats in criticizing Israeli governments in general for abridging Palestinian rights, and the ultra-Orthodox in Bibi’s coalition in particular for pushing for a more theocratic state. What was notable about DMFI’s campaigns was that the attack ads they ran didn’t focus on Israel at all—they often involved themselves in districts that weren’t heavily Jewish—but on whatever else they could come up with, verifiable or otherwise.
The fate of Andy Levin doubtless troubles the sleep of Schiff, Porter, the Democrats’ House leadership, and probably even the White House. Like those Florida retirees, Biden is also old enough to remember when Israel was the cynosure of liberals’ eyes. What’s left of that Israel, last week’s letters point out, may soon crash and burn. Democrats are calling on the president, forcefully if implicitly, to do more to avert that catastrophe.
X, formerly known as Twitter, has, under the ownership of Elon Musk, dismantled its content-moderation staff, throttled the reach of news publications, and allowed any user to buy blue-check verification, turning what was once considered a badge of trustworthiness on the platform into a signal of support for Musk’s regime. Meta’s Facebook has minimized the number of news articles in users’ feeds, following years of controversy over the company’s role in spreading misinformation. And TikTok, under increased scrutiny in the United States for its parent company’s relationship with the Chinese government, is distancing itself from news content. A little over a decade ago, social media was heralded as a tool of transparency on a global scale for its ability to distribute on-the-ground documentation during the uprisings that became known as the Arab Spring. Now the same platforms appear to be making conflicts hazier rather than clearer. In the days since Hamas’s attacks, we’ve seen with fresh urgency the perils of relying on our feeds for news updates.
An “algorithmically driven fog of war” is how one journalist described the deluge of disinformation and mislabeled footage on X. Videos from a paragliding accident in South Korea in June of this year, the Syrian civil war in 2014, and a combat video game called Arma 3 have all been falsely labeled as scenes from Israel or Gaza. (Inquiries I sent to X were met with an e-mail reading, “Busy now, please check back later.”) On October 8th, Musk posted a tweet recommending two accounts to follow for information on the conflict, @WarMonitors and @sentdefender; neither is a formal media company, but both are paid X subscribers. Later that day, after users pointed out that both accounts regularly post falsehoods, Musk deleted the recommendation. Where Twitter was once one of the better-moderated digital platforms, X is most trustworthy as a source for finding out what its owner wants you to see.
Facebook used to aggregate content in a “News Feed” and pay media companies to publish stories on its platform. But after years of complicity in disseminating Trumpian lies—about the 2016 election, the COVID pandemic, and the January 6th riots—the company has performed an about-face. Whether because of negative public opinion or because of the threat of regulation, it’s clear that promoting news is no longer the goal of any of Meta’s social media. In recent days, my Facebook feed has been overrun with the same spammy entertainment-industry memes that have proliferated on the platform, as if nothing noteworthy were happening in the world beyond. On Instagram, some pro-Palestine users complained of being “shadowbanned”—seemingly cut off without warning from algorithmic promotion—and shared tips for getting around it. (Meta attributed the problem to a “bug.”)
Our feeds continue to create a feeling of transparency and granularity, while providing fewer of the signposts that we need to chart a clear path through the thicket of content. What remains, perhaps aptly, is an atmosphere of chaos and uncertainty as war unfolds daily on our screens.
In July, Meta launched its newest social network, Threads, in an attempt to draw users away from Musk’s embattled X. But, unlike X, Threads has shied away from serving as a real-time news aggregator. Last week, Adam Mosseri, the head of Instagram and overseer of Threads, announced that the platform was “not going to get in the way of” news content but was “not going go [sic] to amplify” it, either. He continued, “To do so would be too risky given the maturity of the platform, the downsides of over-promising, and the stakes.” I’ve found Threads more useful than X as a source for news about the Israel-Hamas war. The mood is calmer and more deliberate, and my feed tends to highlight posts that have already drawn engagement from authoritative voices. But I’ve also seen plenty of journalists on Threads griping that they were getting too many algorithmic recommendations and insufficient real-time posts. Users of Threads now have the option to switch to a chronologically organized feed. But on the default setting that most people use, there is no guarantee that the platform is showing you the latest information at any given time.
Governments from Pakistan to Mexico to Washington are woefully unequipped to combat disinformation warfare. Eastern European countries living in Russia’s shadow can teach us how to start fighting back, but only if our politicians decide to stop profiting from these tactics and fight them instead.
These are just a few examples of the disinformation swirling around the conflict between Hamas and Israel, much of which has been enabled by X, formerly known as Twitter, and by Meta’s platforms and Telegram.
The platforms have also been used to terrorize. In one instance, a girl found out that a militant had killed her grandmother after he broadcast it on a Facebook livestream. Meta did not immediately reply to a request for comment.
X owner Elon Musk promoted two particularly virulent accounts spreading disinformation in a post that was viewed 11 million times before he deleted it a few hours later.
One of those accounts, @sentdefender, was described by Digital Forensic Research Lab (DFRLab) expert Emerson Brooking as “absolutely poisonous” and often retweeted “uncritically.”
X removed some of the most blatantly fake tweets, often hours after they were posted, but purveyors of disinformation like @sentdefender still operate freely.
A spokesperson for X responded to a request for comment only with a message saying to “check back later.”
The platform announced changes to its public interest policy over the weekend, according to a post on its safety channel. The post said X has seen an increase in “daily active users” based in the conflict area in the past few days and that more than 50 million posts worldwide have discussed the attack.
The use of video game and recycled news footage to spread false information about the conflict is a growing trend, making it even more difficult to root out disinformation, according to Dina Sadek, a Middle East research fellow with the Digital Forensic Research Lab.
“We’re laser-focused and dedicated to protecting the conversation on X and enforcing our rules as we continue to assess the situation on the platform,” the post said.
The post said X will remove newly created Hamas-affiliated accounts. It also said it is coordinating with industry peers and the Global Internet Forum to Counter Terrorism (GIFCT) “to try and prevent terrorist content from being distributed online.”
X said it is “proactively monitoring” for antisemitic accounts and has “actioned” tens of thousands of posts sharing graphic media and violent and hateful speech.
On Tuesday, European Commissioner Thierry Breton sent a letter to Musk, cautioning that X is spreading “illegal content and disinformation.” The EU’s Digital Services Act (DSA) requires large online platforms such as X to remove illegal content and to act quickly to mitigate its impact on the public.
“Given the urgency, I also expect you to be in contact with the relevant law enforcement authorities and Europol, and ensure that you respond promptly to their requests,” Breton wrote, adding that he would be following up on matters related to X’s compliance with the DSA.
“I urge you to ensure a prompt, accurate, and complete response to this request within the next 24 hours,” Breton said.
Telegram has been a major vehicle for disinformation, Sadek added, likely because it doesn’t restrict how often users can post and because the content arrives as text messages.
“The second that you think something happened you can give them a boost and give them pictures from the incident,” she told The Record. “There’s just the speed of how things happen when on messaging applications and some of those have large numbers of subscribers.”
Sadek said it is too soon to detect patterns in the disinformation being disseminated — both in terms of the amount and which side’s supporters are most active — but she said she has seen it emanate from all sides of the conflict.
Stanford disinformation scholar Herb Lin told Recorded Future News that he predicts the propaganda war will intensify significantly in the coming weeks, citing Russia’s likely support for Hamas, given its friendly relationship with Iran.
“They have a quick reaction disinformation force,” he said. “They have the ability to react promptly to this sort of stuff and the first people to get on the air tend to dominate the messages for a while.”