
Disinformation is one of the world’s biggest risks ahead of elections, reports say. But it doesn’t have to be.

https://checkmyads.org/updates/disinformation-reports-wef-adtech-google/

As we hurtle toward one of the most consequential election years of our lifetimes, major groups are warning of a huge risk on the horizon: mis- and disinformation.

That’s according to both the World Economic Forum and the Eurasia Group, which published separate but eerily similar reports on the biggest risks the world faces as we head into 2024.

With disinformation fueling division, the Eurasia Group warned that the upcoming US election will be “testing American democracy to a degree the nation hasn’t experienced in 150 years and undermining US credibility on the global stage.”

But that disinformation isn’t coming out of nowhere. There’s a business model that fuels it — the global ad tech market, which is expected to be worth $2.9 trillion by 2031, according to Forbes.

Thanks to an almost total lack of transparency in this industry, disinformation is profitable. But that doesn’t have to be the case.

Here’s what’s at stake, according to some of the biggest thinkers out there.

What do these reports actually say?

The biggest challenge of 2024, the Eurasia Group wrote, is “the United States vs itself.”

The political risk consultancy warned in its report that “public trust in core institutions—such as Congress, the judiciary, and the media—is at historic lows; polarization and partisanship are at historic highs.

“Add algorithmically amplified disinformation to the mix, and Americans no longer believe in a common set of settled facts about the nation and the world.”

That’s a scary thought ahead of an incredibly important election — and the Eurasia Group isn’t alone in that concern.

The WEF’s Global Risks Report 2024 — which surveyed 1,500 experts around the world — painted a picture of a treacherous road ahead with “optimism” in “short supply.”

The biggest short-term risk the experts outlined was “the spread of mis- and disinformation around the globe.”

This “could result in civil unrest, but could also drive government-driven censorship, domestic propaganda and controls on the free flow of information,” the WEF website summarized.

It could seriously affect the many elections set to take place in 2024, including in Bangladesh, Mexico, India, the United Kingdom, and the United States.

“The widespread use of misinformation and disinformation, and tools to disseminate it, may undermine the legitimacy of newly elected governments,” the WEF warned. “Resulting unrest could range from violent protests and hate crimes to civil confrontation and terrorism.”

What do ads have to do with this?

While the takeaways are terrifying, disinformation targeting voters can be tackled. Disinformation is a business, and its revenue source is ads.

Programmatic advertising — the automation of buying and selling ads — has let companies introduce so many middlemen and layers to the ad-buying process that brands often have no idea what their ad spend is funding.
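To make the middlemen problem concrete, here is a minimal sketch of how layered fees erode a brand's spend before it reaches a publisher. The fee percentages and the supply-chain roles below are invented for illustration, not measured industry figures:

```python
# Hypothetical illustration: how layered ad-tech fees erode a brand's spend.
# The fee percentages below are invented for this sketch, not measured data.

def remaining_after_fees(spend, fees):
    """Apply each middleman's percentage fee in turn; return what's left."""
    for fee in fees:
        spend *= (1 - fee)
    return spend

# A simplified supply chain: agency, DSP, exchange, and SSP each take a cut.
fees = [0.10, 0.15, 0.10, 0.15]
publisher_share = remaining_after_fees(100.0, fees)
print(f"Of a $100 ad buy, roughly ${publisher_share:.2f} reaches the publisher.")
```

With even these modest hypothetical fees, more than 40 percent of the buy evaporates before it reaches a publisher, and each added layer makes it harder to audit where the rest went.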

https://checkmyads.org/about/

When you consider that the global ad tech industry is worth hundreds of billions of dollars right now — and that as much as 3 percent of programmatic ad buys go toward an “unknown” — that’s a lot of money disappearing into the ether.

We’ve caught disinformation grifters sticking their hands into this cookie jar, swiping ad dollars from brands that want nothing to do with their websites.

Consider Breitbart, a site full of racism and disinformation that brands including BMW have publicly said they don’t want to advertise on. How was it still serving BMW retailer ads in December?

Because bad actors know how this incredibly technical process works and use its complexity to profit. One way they game the system is by pooling together their inventory and hiding their icky websites behind brand-safe fronts.
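One existing defense against this kind of inventory pooling is the IAB Tech Lab's ads.txt standard, which lets a publisher list exactly which exchange accounts are authorized to sell its inventory. The sketch below, using a made-up ads.txt file and made-up account IDs, shows how a buyer could check a bid's claimed seller against that list:

```python
# Sketch: validating a bid's claimed seller against a site's ads.txt file.
# ads.txt (an IAB Tech Lab standard) lists authorized sellers as lines of
# "exchange domain, account ID, relationship[, cert authority ID]".
# The file contents and IDs below are made up for illustration.

SAMPLE_ADS_TXT = """\
# ads.txt for example-news-site.com (hypothetical)
google.com, pub-1111111111111111, DIRECT, f08c47fec0942fa0
appnexus.com, 2222, RESELLER
"""

def authorized_sellers(ads_txt):
    """Parse ads.txt text into a set of (exchange_domain, account_id) pairs."""
    sellers = set()
    for line in ads_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line:
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3:  # domain, account ID, relationship at minimum
            sellers.add((fields[0].lower(), fields[1]))
    return sellers

def is_authorized(ads_txt, exchange, account_id):
    """True if the exchange/account pair appears in the site's ads.txt."""
    return (exchange.lower(), account_id) in authorized_sellers(ads_txt)

print(is_authorized(SAMPLE_ADS_TXT, "google.com", "pub-1111111111111111"))  # True
print(is_authorized(SAMPLE_ADS_TXT, "shady-exchange.example", "999"))       # False
```

A check like this only works, of course, if exchanges actually enforce mismatches instead of waving unauthorized sellers through, which is exactly the enforcement gap described below.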

Google and other ad exchanges are accomplices in the disinformation-for-profit business. Google controls most of the automated ad-buying-and-selling processes. It requires next-to-no transparency from the websites it works with, and regularly fails to enforce its own policies. We even caught Google profiting from scammers selling fake Shark Tank diet pills.

We don’t know if Google’s failures are because it doesn’t care or because it has lost control of its near-monopoly on the advertising ecosystem. But it doesn’t matter because the effect is the same: It makes disinformation profitable.

And that disinformation is threatening elections around the world.

But by holding ad exchanges accountable and empowering advertisers by pushing for greater transparency, we can close off the paths that make disinformation profitable.

And just maybe save democracy in the process.

How Social Media Abdicated Responsibility for the News

Illustration by Nicholas Konrad / The New Yorker

X, formerly known as Twitter, has, under the ownership of Elon Musk, dismantled its content-moderation staff, throttled the reach of news publications, and allowed any user to buy blue-check verification, turning what was once considered a badge of trustworthiness on the platform into a signal of support for Musk’s regime. Meta’s Facebook has minimized the number of news articles in users’ feeds, following years of controversy over the company’s role in spreading misinformation. And TikTok, under increased scrutiny in the United States for its parent company’s relationship with the Chinese government, is distancing itself from news content. A little over a decade ago, social media was heralded as a tool of transparency on a global scale for its ability to distribute on-the-ground documentation during the uprisings that became known as the Arab Spring. Now the same platforms appear to be making conflicts hazier rather than clearer. In the days since Hamas’s attacks, we’ve seen with fresh urgency the perils of relying on our feeds for news updates.

An “algorithmically driven fog of war” is how one journalist described the deluge of disinformation and mislabelled footage on X. Videos from a paragliding accident in South Korea in June of this year, the Syrian civil war in 2014, and a combat video game called Arma 3 have all been falsely labeled as scenes from Israel or Gaza. (Inquiries I sent to X were met with an e-mail reading, “Busy now, please check back later.”) On October 8th, Musk posted a tweet recommending two accounts to follow for information on the conflict, @WarMonitors and @sentdefender; neither is a formal media company, but both are paid X subscribers. Later that day, after users pointed out that both accounts regularly post falsehoods, Musk deleted the recommendation. Where Twitter was once one of the better-moderated digital platforms, X is most trustworthy as a source for finding out what its owner wants you to see.

https://www.factcheck.org/2023/10/posts-use-fabricated-audio-to-misrepresent-cnn-report-during-rocket-attack-in-israel/

Facebook used to aggregate content in a “News Feed” and pay media companies to publish stories on its platform. But after years of complicity in disseminating Trumpian lies—about the 2016 election, the COVID pandemic, and the January 6th riots—the company has performed an about-face. Whether because of negative public opinion or because of the threat of regulation, it’s clear that promoting news is no longer the goal of any of Meta’s social media. In recent days, my Facebook feed has been overrun with the same spammy entertainment-industry memes that have proliferated on the platform, as if nothing noteworthy were happening in the world beyond. On Instagram, some pro-Palestine users complained of being “shadowbanned”—seemingly cut off without warning from algorithmic promotion—and shared tips for getting around it. (Meta attributed the problem to a “bug.”)

Our feeds continue to create a feeling of transparency and granularity, while providing fewer of the signposts that we need to chart a clear path through the thicket of content. What remains, perhaps aptly, is an atmosphere of chaos and uncertainty as war unfolds daily on our screens.

In July, Meta launched its newest social network, Threads, in an attempt to draw users away from Musk’s embattled X. But, unlike X, Threads has shied away from serving as a real-time news aggregator. Last week, Adam Mosseri, the head of Instagram and overseer of Threads, announced that the platform was “not going to get in the way of” news content but was “not going go [sic] to amplify” it, either. He continued, “To do so would be too risky given the maturity of the platform, the downsides of over-promising, and the stakes.” I’ve found Threads more useful than X as a source for news about the Israel-Hamas war. The mood is calmer and more deliberate, and my feed tends to highlight posts that have already drawn engagement from authoritative voices. But I’ve also seen plenty of journalists on Threads griping that they were getting too many algorithmic recommendations and insufficient real-time posts. Users of Threads now have the option to switch to a chronologically organized feed. But on the default setting that most people use, there is no guarantee that the platform is showing you the latest information at any given time.

More Info: Websites for Fact-Checking

Source:

How Social Media Abdicated Responsibility for the News

Democracy in Distress: Unpacking America’s 5 Top Threats 

Threat 1: Cynicism and Apathy

Opposite sides of the same coin, cynicism and apathy are the reasons many Americans are not taking action right now. In a democracy, if we don’t use it, we lose it. The challenge is getting people to care enough to speak up and be part of the solutions. In the first of five articles, learn the single most effective way to overcome cynicism and apathy—others’ and your own.

Threat 2: Misinformation

Have you noticed how friends and family are resistant to correct information? The current state of siloed news in the U.S. is profoundly toxic to our democracy. Yet there are ways to share accurate information that cuts through resistance. Learn all about what works in week two.

Threat 3: Hate

Abuse aimed at people of different races, economic statuses, genders, religions, etc. is an abject failure of our society. While hate might seem intractable, it isn’t. In the third week, we’ll introduce you to straightforward solutions that stop hatred at its root, creating a kinder, more equitable nation.

Threat 4: Uncivil discourse

Collaboration is the key ingredient that makes democracy work. Disrespect and disdain kill it. We cannot tolerate extreme views, but most Americans have positive overlaps in their political Venn diagrams; the trick is learning to find them. In week four, learn techniques that restore trust, civility, and collaboration—from our neighborhoods to Congress.

Threat 5: News and social media

The myriad outlets that keep us informed can also cause apathy, cynicism, and overwhelm. Corporate advertising plays to our fears by design, decreasing the quality and depth of information we need to understand important issues. In the final week, discover new strategies that keep you in control as a consumer of news media and informed as an engaged citizen.

What’s next

As we gear up for the next presidential primaries and election, there’s no time like the present to take an active part in your nation’s future. Addressing these threats strategically with thoughtful, doable steps can create the welcoming country we know is possible. In the coming weeks, look for clear ways to speak up and show up for ourselves, each other, and future generations.

© 2024 CounterPoint