
How Social Media Abdicated Responsibility for the News

Illustration by Nicholas Konrad / The New Yorker

X, formerly known as Twitter, has, under the ownership of Elon Musk, dismantled its content-moderation staff, throttled the reach of news publications, and allowed any user to buy blue-check verification, turning what was once considered a badge of trustworthiness on the platform into a signal of support for Musk’s regime. Meta’s Facebook has minimized the number of news articles in users’ feeds, following years of controversy over the company’s role in spreading misinformation. And TikTok, under increased scrutiny in the United States for its parent company’s relationship with the Chinese government, is distancing itself from news content. A little over a decade ago, social media was heralded as a tool of transparency on a global scale for its ability to distribute on-the-ground documentation during the uprisings that became known as the Arab Spring. Now the same platforms appear to be making conflicts hazier rather than clearer. In the days since Hamas’s attacks, we’ve seen with fresh urgency the perils of relying on our feeds for news updates.

An “algorithmically driven fog of war” is how one journalist described the deluge of disinformation and mislabelled footage on X. Videos from a paragliding accident in South Korea in June of this year, the Syrian civil war in 2014, and a combat video game called Arma 3 have all been falsely labelled as scenes from Israel or Gaza. (Inquiries I sent to X were met with an e-mail reading, “Busy now, please check back later.”) On October 8th, Musk posted a tweet recommending two accounts to follow for information on the conflict, @WarMonitors and @sentdefender; neither is a formal media company, but both are paid X subscribers. Later that day, after users pointed out that both accounts regularly post falsehoods, Musk deleted the recommendation. Where Twitter was once one of the better-moderated digital platforms, X is most trustworthy as a source for finding out what its owner wants you to see.

Read More: Posts use fabricated audio to misrepresent CNN report during rocket attack in Israel (https://www.factcheck.org/2023/10/posts-use-fabricated-audio-to-misrepresent-cnn-report-during-rocket-attack-in-israel/)

Facebook used to aggregate content in a “News Feed” and pay media companies to publish stories on its platform. But after years of complicity in disseminating Trumpian lies—about the 2016 election, the COVID pandemic, and the January 6th riots—the company has performed an about-face. Whether because of negative public opinion or because of the threat of regulation, it’s clear that promoting news is no longer the goal of any of Meta’s social media. In recent days, my Facebook feed has been overrun with the same spammy entertainment-industry memes that have proliferated on the platform, as if nothing noteworthy were happening in the world beyond. On Instagram, some pro-Palestine users complained of being “shadowbanned”—seemingly cut off without warning from algorithmic promotion—and shared tips for getting around it. (Meta attributed the problem to a “bug.”)

Our feeds continue to create a feeling of transparency and granularity, while providing fewer of the signposts that we need to chart a clear path through the thicket of content. What remains, perhaps aptly, is an atmosphere of chaos and uncertainty as war unfolds daily on our screens.

In July, Meta launched its newest social network, Threads, in an attempt to draw users away from Musk’s embattled X. But, unlike X, Threads has shied away from serving as a real-time news aggregator. Last week, Adam Mosseri, the head of Instagram and overseer of Threads, announced that the platform was “not going to get in the way of” news content but was “not going go [sic] to amplify” it, either. He continued, “To do so would be too risky given the maturity of the platform, the downsides of over-promising, and the stakes.” I’ve found Threads more useful than X as a source for news about the Israel-Hamas war. The mood is calmer and more deliberate, and my feed tends to highlight posts that have already drawn engagement from authoritative voices. But I’ve also seen plenty of journalists on Threads griping that they were getting too many algorithmic recommendations and insufficient real-time posts. Users of Threads now have the option to switch to a chronologically organized feed. But on the default setting that most people use, there is no guarantee that the platform is showing you the latest information at any given time.

Source: “How Social Media Abdicated Responsibility for the News,” The New Yorker

Social Media is a (Largely) Lawless Cesspool

Governments from Pakistan to Mexico to Washington are woefully unequipped to combat disinformation warfare. Eastern European countries living in Russia’s shadow can teach us how to start fighting back, but only if our politicians decide to stop profiting from these tactics and fight them instead.

 

Suzanne Smalley

A screenshot from a video widely shared on social media purporting to show a Hamas fighter downing a helicopter; the footage is actually pulled from the video game Arma 3.

Video game clips purporting to be footage of a Hamas fighter shooting down an Israeli helicopter. Phony X accounts spreading fake news through fictitious BBC and Jerusalem Post “journalists.” An Algerian fireworks celebration described as Israeli strikes.

These are just a few examples of the disinformation swirling around the conflict between Hamas and Israel, much of which has been enabled by X, formerly known as Twitter, and by platforms like Meta and Telegram.

The platforms have also been used to terrorize. In one instance, a girl found out that a militant had killed her grandmother after he broadcast it on a Facebook livestream. Meta did not immediately reply to a request for comment.

X owner Elon Musk promoted two particularly virulent accounts spreading disinformation in a post that was viewed 11 million times before he deleted it a few hours later.

One of those accounts, @sentdefender, was described by Digital Forensic Research Lab (DFR) expert Emerson Brooking as both “absolutely poisonous” and often retweeted “uncritically.”

Read More: Hacktivists take sides in the Israel-Palestinian war

X removed some of the most blatantly fake tweets, often hours after they were posted, but purveyors of disinformation like @sentdefender still operate freely.

A spokesperson for X replied to a request for comment by saying to “check back later.”

The platform announced changes to its public interest policy over the weekend, according to a post on its safety channel. The post said X has seen an increase in “daily active users” based in the conflict area in the past few days and that more than 50 million posts worldwide have discussed the attack.

“We’re laser-focused and dedicated to protecting the conversation on X and enforcing our rules as we continue to assess the situation on the platform,” the post said.

The post said X will remove newly created Hamas-affiliated accounts. It also said it is coordinating with industry peers and the Global Internet Forum to Counter Terrorism (GIFCT) “to try and prevent terrorist content from being distributed online.”

X said it is “proactively monitoring” for antisemitic accounts and has “actioned” tens of thousands of posts sharing graphic media and violent and hateful speech.

On Tuesday, European Commissioner Thierry Breton sent a letter to Musk, cautioning that X is spreading “illegal content and disinformation.” The EU’s Digital Services Act (DSA) mandates that large online platforms such as X remove illegal content and take steps to quickly address their impact on the public.

“Given the urgency, I also expect you to be in contact with the relevant law enforcement authorities and Europol, and ensure that you respond promptly to their requests,” Breton wrote. He advised Musk that he would be following up on matters related to X’s compliance with the DSA.

“I urge you to ensure a prompt, accurate, and complete response to this request within the next 24 hours,” Breton said.

Rooting out disinformation is made more difficult by the growing use of video game clips and recycled news footage to promote falsehoods about the conflict, said Dina Sadek, a Middle East research fellow with DFR. Telegram has been a major vehicle for disinformation, she added, likely because it doesn’t restrict how often users can post and because the content is sent as a text message.

“The second that you think something happened you can give them a boost and give them pictures from the incident,” she told The Record. “There’s just the speed of how things happen when on messaging applications and some of those have large numbers of subscribers.”

Sadek said it is too soon to detect patterns in the disinformation being disseminated — both in terms of the amount and which side’s supporters are most active — but she said she has seen it emanate from all sides of the conflict.

Stanford disinformation scholar Herb Lin told Recorded Future News that he predicts the propaganda war will intensify significantly in the coming weeks, citing Russia’s likely support for Hamas due to its friendly relationship with Iran.

“They have a quick reaction disinformation force,” he said. “They have the ability to react promptly to this sort of stuff and the first people to get on the air tend to dominate the messages for a while.”

Learn more: https://guides.stlcc.edu/fakenews/spotfakenews

© 2024 CounterPoint