Anti-Vax Lies Spread on YouTube—Paid for ‘by Russian PR Company’

Disinformation is rife on social media: No news here. But shadowy interests are paying so-called “influencers” to spread it.

The latest scandal is a Russian PR firm paying YouTubers to flog a made-up story about vaccines killing people. Hey, Vladimir—you know what really kills people? Not getting vaccinated.

The motivation is unclear. But in today’s SB Blogwatch, we’re clear it’s despicable.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Titanic panic.

Lambs to the Slaughter

What’s the craic? Charlie Haynes and Flora Carmichael report—“The YouTubers who blew the whistle on an anti-vax plot”:

Russian connections
“It started with an email,” says Mirko Drotschmann, a German YouTuber. … An influencer marketing agency called Fazze offered to pay him to [say] the death rate among people who had the Pfizer vaccine was almost three times that of the AstraZeneca jab.

The information provided wasn’t true. … The data the influencers were asked to share had actually been cobbled together from different sources and taken out of context. … “I was shocked,” says Mirko “then I was curious, what’s behind all that?”

Fazze is part of AdNow, a digital marketing company registered in both Russia and the UK. … Eventually we managed to contact Ewan Tolladay, one of two directors of the British arm of AdNow, [who] said he had very little to do with Fazze – which he said was a joint venture between his fellow director – a Russian man called Stanislav Fesenko – and another person whose identity he didn’t know.

The identity of the agency’s mystery client remains unclear. There has been speculation about the Russian connections … and the interests of the Russian state in promoting its own vaccine – Sputnik V.

SRSLY? Max Fisher nets the story—“Disinformation for Hire, a Shadow Industry, Is Quietly Booming”:

For-hire disinformation
Private firms, straddling traditional marketing and the shadow world of geopolitical influence operations, are selling services once conducted principally by intelligence agencies. They sow discord, meddle in elections, seed false narratives and push viral conspiracies, mostly on social media.

And they offer clients something precious: deniability. … A wave of anti-American posts in Iraq, seemingly organic, was tracked to a public relations company that was separately accused of faking anti-government sentiment in Israel.

For-hire disinformation, though only sometimes effective, is growing more sophisticated as practitioners iterate and learn. … It is becoming more common in every part of the world, outpacing operations conducted directly by governments. The result is an accelerating rise in polarizing conspiracies, phony citizen groups and fabricated public sentiment, deteriorating our shared reality beyond even the depths of recent years.

And Katherine Hignett has, “How pro-Kremlin bots are fuelling chaos with lies about the pandemic”:

Russia’s scientific prowess
Sam’s phone buzzes with a new WhatsApp message … from her father. … The posts — anti-vaccine, anti-lockdown, anti-mask — just kept on coming. They’re full of false claims designed to frighten and enrage: that Bill Gates orchestrated the pandemic to make money or that Jewish people funded the development of vaccines designed to kill.

Sam is one of many thousands of people whose family members have been taken in by fake stories — often pre-Covid conspiracy theories repackaged for the pandemic — that have sprung up on social media since the early days of the outbreak. … Some of it’s manufactured by propaganda outlets like Russia’s RT and Sputnik News.

Some troll farms focus on automatic amplification, using numerous fake accounts to share material that aligns with their goals. … Others also share content manually, with staff carefully choosing which messages will spread most effectively in smaller online communities.

Russia likely has financial reasons to spread disinformation too. … Western vaccines like those made by Pfizer and Moderna … have become a key target of pro-Kremlin disinformation efforts during the pandemic, in what [looks like] an effort to boost sales of Russia’s Sputnik V vaccine abroad [and] to advertise Russia’s scientific prowess.

I don’t know if I buy the financial argument. Neither does JaredOfEuropa:

KGB
Russia has not even applied for approval to sell its vaccine in the EU, so it has no monetary interest. But the lack of a monetary interest does not mean the Russian state is not behind this.

Russia’s efforts are mostly aimed at sowing discord between Western allies, or between groups within those countries. If there’s a good argument going between two groups, they are not above sponsoring both sides in order to fan the flames a little. These are well-known tactics that go back to the KGB in Soviet times.

O RLY? elliekelly watched telly:

Stalin
I recently watched “How to Become a Tyrant” and they mentioned a [NY Times] reporter in the ’30s … Walter Duranty … who was more or less bribed by Stalin to write articles denying people were starving in Ukraine. He even won a Pulitzer for the “reporting.”

So ban social-media influencers? With a better idea, here’s drinkypoo, who knows where the blame lies:

Politicians
You can’t preserve democracy by clamping down on people’s free speech. The only way to fix this problem is with education, and neither “side” wants that, as then people would know better than to blindly follow politicians.

Back in my day, we never had this problem. Get off gabereiser’s lawn:

Giant mall
Ahh the good old days of web rings and IRC chatrooms and such. Now it’s a bazaar of monopolies, disinformation, vanity, and photo-sharing.

The issue is we no longer rely on some server in some guy’s basement, a BBS out of a college dorm closet, or a network of like-minded individuals for the sake of learning from each other. Everything is owned by a select few companies that add tracking cookies to advertise to you. It’s become a giant mall for the minds of the world.

Surely there must be a better way to detect disinfo? Don’t call Applehu Akbar Shirley: [You’re fired—Ed.]

Could an AI be trained?
Each of us would like to believe that the other guy’s argument originated as intentional disinformation. But how do we distinguish such a source from plain old misunderstanding?

This is a much more difficult problem than separating truth from untruth, because truth can be validated against history, observation or experiment. Could an AI be trained from large-scale pattern matches to make the distinction?
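To be fair, the easier half of that question is well-trodden ground: researchers routinely train text classifiers to flag posts that resemble known disinformation. Here’s a minimal sketch of that idea, assuming a purely hypothetical labeled corpus (a posts.csv file with text and label columns, which you’d have to assemble yourself, and that’s the genuinely hard part):

```python
# Minimal sketch: train a model to flag posts that read like known disinformation.
# Assumes a hypothetical CSV ("posts.csv") with columns "text" and "label",
# where label is 1 for known disinformation and 0 for ordinary posts.
# It only learns surface patterns in wording; it cannot tell deliberate
# disinformation from honest misunderstanding, which is the commenter's point.

import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("posts.csv")  # hypothetical labeled corpus
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42
)

# TF-IDF features plus logistic regression: a standard text-classification baseline
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```

Even a strong classifier only tells you that a post resembles past disinformation. Intent, the “paid-for” part, still takes the kind of human digging the BBC did.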

But who in their right mind is taking this money? mrspeaker calls you to Order:

Actually good for society
People are very good at justifying what they are doing if their livelihood is at stake. I know many programmers — nice people — working as the digital equivalent of seal-clubbers.

But there’s good money to be made clubbing seals. And if they don’t do it somebody else will. And the seals have a choice not to be clubbed. And they need the experience. And they only indirectly club the seals. And remember that time when that “good” guy did a bad thing for seals? And also there’s WAY worse industries. And there’s this one study that suggests clubbing seals is actually good for society, etc., etc.

Meanwhile, what’s the correct plural form of “influencer”? Opportunist grasps the opportunity:

Is it time yet? Can we finally officially call them “influenza”?

And Finally:

Before you go, you simply must watch this

Hat tip: buffet_the_appetite_slayer

Previously in And Finally


You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or [email protected]. Ask your doctor before reading. Your mileage may vary. E&OE. 30.

Image sauce: https://unsplash.com/photos/4ApmfdVo32Q (via Unsplash)

Richi Jennings

Richi Jennings is a foolish independent industry analyst, editor, and content strategist. A former developer and marketer, he’s also written or edited for Computerworld, Microsoft, Cisco, Micro Focus, HashiCorp, Ferris Research, Osterman Research, Orthogonal Thinking, Native Trust, Elgan Media, Petri, Cyren, Agari, Webroot, HP, HPE, and NetApp, on Forbes and CIO.com. Bizarrely, his ridiculous work has even won awards from the American Society of Business Publication Editors, ABM/Jesse H. Neal, and B2B Magazine.
