The Moscow Times has this story about repressive ballot access laws in Russia. It is authored by Yelena Kotyonochkina, a former Moscow municipal deputy who now lives outside Russia.
The Moscow Times was formerly a print and on-line newspaper in Russia, but it has been banned by the government and now operates as an on-line newspaper based outside Russia.
Does it mention Kamala was endorsed by Putin?
He was completely serious, too.
Russian history is interesting and tragic. Their current problems are all of their own making.
No, they’re all of other nations’ making.
FILE FOR RUSSIA OFFICE AND DIE — SUICIDE PACT ???
Western propaganda.
Speaking of bad news on Russia: read the article in Rappnews entitled “Rappahannock homeowners Dimitri and Anastasia Simes indicted for allegedly violating U.S. sanctions on Russia,” 6 September 2024. It claims that Mrs. Anastasia Simes was indicted with her husband for riding with him in an automobile whose driver was employed by Russian oligarch Aleksandr Yevgenyevich Udodov. If convicted, they could each get 20 years in prison for violating the IEEPA.
This brings me back to 1987, when I let the FBI set up surveillance of two different Soviet neighbors in my apartment building in Arlington, VA, from my own apartment. The only thing that happened was that on December 9, 1987, I was placed in custody by two FBI agents.
It is clear that the Department of Justice is trying to connect Dimitri Simes to the current Trump campaign because of the advice he gave Donald Trump’s campaign in 2016.
The US regime is the evil one. Russia is good.
“It is authored by Yelena Kotyonochkina, a former Moscow municipal deputy who now lives outside Russia.
The Moscow Times was formerly a print and on-line newspaper in Russia, but it has been banned by the government and now operates as an on-line newspaper based outside Russia.”
So if you know full well that it is anti-Russian propaganda, why bother sharing it at all? For everyone to gather around and have a good laugh at how laughable it is? Or what?
————
@Thecommander Jeff cc. Banderite Perun-worshipper
Russia’s current problems are largely the result of (national) socialism, both internal and external. The 1917 revolutions and subsequent civil war exiled and suppressed Russia and her values. By the time Soviet oppression finally collapsed, the west had become more socialist than Russia.
President Yeltsin botched the transition to a free market by allowing the socialist elites from the Soviet era to snatch up all state property being privatized at a fraction of its value. These internal socialists then happily conspired with external western socialists to try to turn Russia into another client state of western hegemony. And they would have succeeded, had it not been for altruists like Alexander Dugin successfully counseling President Putin on his duty to always act in the interests of the Russian people, largely returning to pre-1917 values.
However, the struggle for Russia continues. Externally, the socialist west instigates color revolutions in former Soviet countries neighboring Russia: they take control of those countries using (national) socialist terrorists who they then use against Russia, thereby driving Russia ever more into the arms of communist China, communist Iran, communist North Korea, etc.
Internally, there are still a great number of socialist elites and their dependents, who remember how good they had it exploiting ordinary people in the Soviet Union. They have worked tirelessly, and unfortunately quite successfully, at fusing Russian patriotism with Soviet communism. The communist party, for example, which continues to be disproportionately powerful, does everything it can to frustrate efforts to disentangle from Soviet communism the Russian heroes whom the Soviet Union used against its recent ally, national socialist Germany.
The internal socialists manage to hold onto power by subverting the patriotism necessary to defend Russia from external socialists. It creates a catch-22: until the external socialists are no longer a threat, Russia cannot afford to deal with its internal socialists; but until its internal socialists are dealt with, the external socialists continue to have propaganda ammunition against Russia.
And all the while this stalemate continues, both internal and external socialists have opportunities to hollow out the Christian soul of the Russian people, to try and replace it with their own satanic anti-values. (For example, external socialists wish to promote libertinism, feminism, infanticide, homosexuality, pederasty, etc., while internal socialists wish to restore a centrally planned economy of nationalized assets with which to enrich themselves.)
So while they are nominally opposed to one another (this is necessary for their facade), in practice it is in both their interests to keep up this status quo of internal and external socialist pressures under which Russia is suffering. That is the tragic cause of Russia’s problems.
You’re the Banderite, but nevertheless, you make good points, despite not actually believing what you write.
I just read an additional article directed against Trump’s and Kennedy’s views relating to so-called “Russian propaganda,” entitled “American company, Russian propaganda: New Kremlin tactic reveals escalating effort to sway U.S. Voters” by David Klepper (5 September 2024, AP).
It has a picture showing a panel with Attorney General Merrick Garland flanked by members of the Justice Department’s Election Task Force, viz., Nicole Argentieri, head of the Criminal Division; Deputy Attorney General Lisa Monaco; FBI Director Christopher Wray; and Matthew Olsen, Assistant Attorney General, National Security Division. Yes, this is the same Assistant AG who is currently going after Dimitri and Anastasia Simes.
My background: I have been to the Soviet Union three times in my life, viz., 1963, 1967, and 1969. I have a good knowledge of Soviet propaganda, and what is going on is propaganda from the Biden administration, not the Russians. My maternal grandfather was born in Odesa, Imperial Russia. My mother’s uncle was an officer in the Imperial Russian Army and was a friend of the last Tsar of Russia. He was the only person I met during the mid-1960s who remembered first-hand the events of 1917 at St. Petersburg, Russia.
While I was a student at San Francisco State College, I befriended both George and Vladimir Derugin. We were interested in restoring the Russian Tsar to the throne of Russia. With the assistance of Boris Vertloogan in San Francisco, talks took place on how that could happen.
I have served twice as Chairman of the American Independent Party of California, and party registration is now near 850 thousand electors, which makes the party the third largest in the State of California.
I recall the FBI in 1987 setting up my apartment in Arlington, VA, to tap into the apartment across the hall with the honey trap, and the apartment just underneath, which housed an additional Soviet.
FAILURE OF ALLIES IN 1918-1919 TO WIPE OUT THE LENIN COMMIES IN OLDE RUSSIA – ESP USING WW I GERMANS/JAPANESE.
PRICE PAID SO FAR – 100 PLUS MILLION DEAD IN RUSSIA / CHINA / KOREA / VIETNAM / CUBA / ETC.
Iran and China are trying to steal the election for Kamala Harris.
A major operation to influence public opinion saw comments posted on Western media articles in support of Russian interests, according to new research. Cardiff University’s Crime and Security Research Institute found 32 prominent media websites across 16 countries were targeted including Mail Online, the Express and the Times. The comments were fed back into Russian-language media outlets as the basis for stories, researchers said.
The tactic was spotted this year.
But the operation is believed to have been escalating since at least 2018 and recently focused on the Western withdrawal from Afghanistan.
In one example highlighted by the researchers, a Mail Online story last month about the Taliban’s victory in Afghanistan featured 2,500 comments from the public. But some, it is alleged, were part of an organized Russian campaign.
By selecting a small number of comments, a Russian news article was then headlined: “The British have compared the rise of the Taliban to power with the end of Western civilization.” This, Cardiff researchers say, was one of 18 recent stories on the fall of Kabul that were produced using reader comments in the UK and US, supporting Russia’s narrative about the end of liberal democracy or the failure of Nato, or drawing a link to the West’s willingness to support Ukraine.
The Cardiff team say this was just the latest part of a long-running campaign. They identified 242 stories where provocative pro-Russian or anti-Western statements were posted in reaction to articles relevant to Russia. Researchers used pattern recognition techniques to analyze reader comments, which suggested some accounts were posting pro-Kremlin content in an organized manner.
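The article does not spell out the researchers’ methodology, but the basic idea of pattern recognition over reader comments is easy to illustrate. The sketch below is purely hypothetical: it flags pairs of accounts posting near-duplicate text using TF-IDF cosine similarity, with invented toy data and an assumed similarity cutoff. A real analysis would also use posting times, account metadata, and far larger corpora.

# Hypothetical illustration (not the Cardiff method): flag accounts whose
# comments are suspiciously similar to other accounts' comments.
from itertools import combinations

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented toy data: (account_id, comment_text) pairs from comment sections.
comments = [
    ("acct_01", "The West is collapsing; only Moscow offers stability."),
    ("acct_02", "The West is collapsing, and only Moscow offers real stability."),
    ("acct_03", "Lovely weather for the match on Saturday."),
    ("acct_04", "Only Moscow offers stability while the West collapses."),
]

texts = [text for _, text in comments]
vectors = TfidfVectorizer().fit_transform(texts)  # TF-IDF term weights per comment
similarity = cosine_similarity(vectors)           # pairwise cosine similarity matrix

SIM_THRESHOLD = 0.5  # assumed cutoff for "suspiciously similar"

flagged = set()
for i, j in combinations(range(len(comments)), 2):
    # Only cross-account similarity suggests coordination, not one account reposting.
    if comments[i][0] != comments[j][0] and similarity[i, j] >= SIM_THRESHOLD:
        flagged.update((comments[i][0], comments[j][0]))

print("Accounts posting near-duplicate talking points:", sorted(flagged))

Run on the toy data, this flags acct_01, acct_02, and acct_04, whose comments recycle the same templated talking point; the point is only that repeated, lightly reworded messaging is detectable at scale.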
Sometimes those posting were accused by others on the site of being Russian trolls, but in most cases they did not respond. The comments were then picked up by Russian media organizations, which took the original news story and used the comments to construct a Russian-language story with a particular slant, suggesting “The British think X or Y.” Researchers showed a translated example of one such article on the RuEconomics website.
These were published to suggest extensive support among Western citizens for Russia or President Vladimir Putin or for a particular policy.
The stories were published in Russia, but also in other European countries, particularly Bulgaria, and were then further amplified on social media platforms, including Telegram.
Prof. Martin Innes said the “sophisticated” campaign takes advantage of the fact that while social media sites have put more resources into detecting influence campaigns, traditional mainstream sites have fewer security measures to stop people creating multiple, false identities.
One account was found to have changed location 69 times and changed name 549 times since it was created last June.
“There has been a tendency to think about influence operations as just pivoting around the use of fake social media accounts on Twitter and Facebook and the like,” Prof Innes told the BBC.
“What is important about this research is showing how other kinds of media can be and are being manipulated and on an almost industrial scale.”
Some media websites allow people to vote on comments as well and the researchers say pro-Kremlin comments received an unusually high number of “up-votes.”
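The article does not say how the researchers judged the up-vote counts unusual, but one simple, standard way to surface such outliers is a modified z-score against the median and MAD of votes on the same article. The sketch below is a hypothetical illustration with invented numbers and an assumed cutoff, not the Cardiff team’s actual method.

import statistics

# Invented data: up-votes per comment on a single article.
upvotes = [4, 2, 7, 3, 5, 180, 6, 2, 210, 4]

median = statistics.median(upvotes)
mad = statistics.median([abs(v - median) for v in upvotes]) or 1  # guard against zero MAD

for i, v in enumerate(upvotes):
    z = 0.6745 * (v - median) / mad  # 0.6745 rescales MAD toward a standard deviation
    if z > 3.5:  # common cutoff for the modified z-score (Iglewicz & Hoaglin)
        print(f"comment {i}: {v} up-votes looks anomalous (modified z = {z:.1f})")

Here the two inflated counts (180 and 210) stand out against an article whose other comments draw single-digit votes, which is the shape of anomaly the researchers describe.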
Russia has consistently denied being involved in propaganda and misinformation operations.
Other websites targeted include the US’s Fox News, Le Figaro in France, Der Spiegel and Die Welt in Germany, and Italy’s La Stampa. The work was undertaken by the OpenSource Communications Analytics Research program at Cardiff University’s Crime and Security Research Institute and was funded by the UK Foreign, Commonwealth and Development Office.
If you believe everything you read, it would seem the European Union wants to ban drivers from getting their old cars repaired, or that it plans to limit the flights people can take by implementing “carbon passports.” It might even impose Covid-style “climate lockdowns.”
None of these things are true, but as 373 million eligible voters head to polls this week to elect a new EU parliament, a torrent of disinformation is flooding the continent.
The EU and several of its member countries have set up investigative agencies to counter disinformation ahead of the vote — and they are squarely focused on campaigns originating in Russia.
Allegations of Russian disinformation around elections are nothing new — they have been a feature in the run-up to votes in the United States, United Kingdom and the EU before, though Russia has always denied waging such campaigns. But the tactics are becoming more sophisticated, and are leaping from online platforms into parliaments and public discourse.
Artificial intelligence and deepfakes are quickly becoming the tools of choice for those looking to spread false narratives, said Morgan Wright, chief security advisor for SentinelOne, an American cybersecurity company.
SentinelOne, alongside the independent research group EU DisinfoLab, has worked to uncover a Russia-based influence network operating in Europe since 2022 dubbed “Doppelgänger.”
The network puts out clone sites of prominent European media organizations, including major publications like Britain’s Guardian and Germany’s Bild, as vehicles to spread misleading and false content. There is a focus on fake stories to influence attitudes on subjects like the wars in Ukraine and Gaza. But for the past year, the climate crisis has been the second-most targeted subject, according to the European Digital Media Observatory (EDMO).
One such fake story, published on a website mimicking Bild, described how a teenage cyclist bled to death after streetlights were turned off to save electricity. The fake article claimed the German government cut the lights because of an energy crisis fueled by sanctions on Russia over its war in Ukraine. Before the war, Germany had relied heavily on Russian gas for energy. The story was debunked by numerous German media outlets, but continued spreading on Facebook.
Paula Gori, EDMO’s secretary general, said that spreading false climate narratives fits in with Russia’s geopolitical goals as the country’s lucrative oil and gas sector has been hit with sanctions and an EU ban on imports.
“It’s quite easy for Russians to spread disinformation that the EU is struggling because of the sanctions, and European citizens are struggling because there is no gas from Russia,” she said.
False narratives that renewables are doing little for the EU’s energy security have also emerged, Gori said. Official statistics, however, tell a different story: In 2022, renewables accounted for 23% of the energy consumed in the EU. Some European countries are now using more renewable energy than fossil fuels.
EU DisinfoLab found other stories falsely claiming wind turbines were causing toxic pollution.
The aim also appears to be to sow confusion and division, rather than to bring about a change in climate policy, Wright said.
“Russia has been very opportunistic. It’s looking for controversy and strife, and any current issues that they can exploit,” he said. “The goal is to get (people) fighting with each other. They don’t care about climate policy.”
Russia has another interest in undermining the EU’s messaging on climate. As it seeks to strengthen its relations in the Global South, particularly in Africa and Asia, where it is competing with the West for business and influence, it is trying to depict Europe’s climate policies as exploiting poorer countries and stopping them from industrializing, Gori said.
This idea of using disinformation to widen existing divisions is straight from Russia’s old disinformation handbook, according to Wright.
“If you go back to 1917, to the creation of the Cheka, the first of Russia’s intelligence organizations, they’ve been the masters at disinformation for over 100 years,” he said. “They’ve been using the same tactics for decades; it’s the tools that change — now it’s artificial intelligence and social media.”
Campaigns that begin online are seeping into Europe’s parliaments, where populist politicians are peddling some of the same false narratives.
Politicians in France and Italy have shared false news that climate policies to cut pollution from farming will force EU citizens to eat insects, while people in Croatia, Germany and Poland were told politicians in England were imposing “climate lockdowns” on their citizens and similar restrictions could be coming to their countries.
The campaigns are having real-life consequences, particularly for the EU’s Green Deal legislation, the bloc’s overarching vision for climate action.
The EU is considered a global leader in tackling planet-heating pollution, but climate disinformation could undermine the bloc’s ambitious goal to reduce carbon emissions by 90% by 2040, compared with 1990 levels.
That umbrella goal is already under threat. The “green wave” that brought many climate-focused politicians to power in the 2019 European elections looks to be over, with green parties predicted to suffer heavy losses this month, which would mean fewer progressive climate voices in parliament.
Other fake stories have targeted the EU’s agricultural policies, most notably through the farmers’ protests that swept through a number of EU states, including France, Germany, Spain and Poland, this year.
Gori said EDMO’s researchers found clear evidence of attempts to hijack the protests, which started as grassroots initiatives over genuine farmer concerns. She pointed to a widely shared false story claiming farmers in France and Spain would be “booted off their land” to make way for solar plants.
There were many reasons farmers took to the streets, but some of the perceived negative impacts of green farming proposals led to the EU dropping or diluting a number of the policies.
The EU scrapped a plan to cut the use of pesticides in half by 2030, and it delayed new rules on soil health and biodiversity. It also dropped a requirement to cut non-carbon greenhouse gas emissions in agriculture.
“The protests were legitimate, of course … but they were used and exploited by Russia to share disinformation that attacks the EU institutions and causes polarization,” Gori said.
Pallavi Sethi, a climate change misinformation policy fellow at the Grantham Research Institute at the London School of Economics, said climate is just the latest in a string of issues some far-right politicians have targeted to stoke division. Before climate, it was migration.
The far-right Alternative for Germany (AfD), for example, has weaponized the debate around heat pumps. When the ruling coalition government proposed phasing out home heating systems running on methane gas, the AfD branded the plan an “eco-dictatorship” and made the issue a key part of its campaign, despite scientific evidence showing the climate benefits of electrifying heating.
“Right-wing populist political ideology often emphasizes the rights of ordinary citizens and demonizes the ‘corrupt elite’ — the governments who want to do something about climate change, who are forming these climate policies, and the scientific community that is providing evidence,” she said.
[Photo caption: European Commission President Ursula von der Leyen speaks during a debate on the results of the latest EU summits, as part of a plenary session at the European Parliament in Strasbourg, eastern France, on February 6, 2024. That day she recommended the bloc bury a plan to cut pesticide use in agriculture as a concession to protesting European farmers. The original proposal, put forward by her European Commission as part of the European Union’s green transition, “has become a symbol of polarisation,” she told the parliament.]
The EU’s answer to the problem has been its Digital Services Act, which specifically targets illegal content, misleading advertising and disinformation. It has been using the new legislation to force big social media companies to clean up their platforms. Most recently, the European Commission — part of the EU’s executive government — opened formal proceedings against Facebook and Instagram over disinformation targeting the European election.
And last month, the EU imposed sanctions on the Prague-based Voice of Europe, an online media outlet that it accused of running a pro-Russian influence operation. Voice of Europe could not be reached as its contact page shows an error message.
But these efforts can only go so far to address what has become an enormous problem. Climate Action Against Disinformation, an international coalition of groups, said the response by social media companies and governments has been woefully inadequate.
CNN in February reported on a fake AI-generated audio recording of a top candidate in Slovakia “admitting” he would rig the parliamentary elections, which was published on Facebook just days before a key vote last fall.
A spokesperson for Meta, Facebook’s parent company, said in a statement at the time: “Our independent fact-checking network reviews and rates misinformation — including content that’s AI-generated — and we label it and down-rank it in feed so fewer people see it.”
While the statement said content that violates company policies is removed, it did not address why some posts containing the Slovak deepfake were not marked as false.
Facebook has removed content that breaches its policies for several years, but it does not automatically remove content just because it has been doctored or generated by AI. Instead, it aims to label altered content as such.
Its community standards policy in the past had only targeted video, but in April, it was expanded to include audio.
In Slovakia, however, the damage may have already been done. The candidate targeted — a pro-Western politician — was defeated by another with close ties to Moscow.
“There was a flaw in how Meta — Facebook — looked at stuff. They would only take down things if it was a video that was doctored,” Wright said. “They had no policy on taking down doctored audio. That was a flaw, and they exploited it.”
Russian troll farms and social media bots are now old school. The Kremlin’s favorite way to sway U.S. elections in 2024, we learned this week, makes use of what many Americans consider a harmless pastime — content created by social media influencers.
A DOJ indictment on Wednesday alleged that content created and distributed by a conservative social media company called Tenet Media was actually funded by Russia. Two Russian government employees funneled nearly $10 million to Tenet Media, which hired high-profile conservative influencers such as Tim Pool, Benny Johnson and Dave Rubin to produce videos and other content that stoked political divisions. The indictment alleges that the influencers — who say they were unaware of Tenet’s ties to Russia — were paid upward of $400,000 a month.
It’s the latest sign that Russia’s online influence efforts are evolving, said Pekka Kallioniemi, a Finnish disinformation scholar who is the author of “Vatnik Soup,” a book on Russia’s information wars set to publish Sept. 20. Influencers with a fanatic following are far more successful at spreading disinformation than bots and trolls, he told POLITICO Magazine in an interview.
“These people, they are also idolized. They have huge fan bases,” he said. “They are being listened to and they are believed. So they are also a very good hub for spreading any narratives in this case that would be pro-Kremlin narratives.”
This conversation has been edited for length and clarity.
Why are far-right social media influencers ripe targets for Russia? How has the Kremlin been able to infiltrate far-right media so effectively?
The main reason is that they share a similar ideology. This kind of traditionalism and conservatism is something that Russia would also like to promote: They show Putin as the embodiment of traditionalism and family values. And this is very similar, of course, in U.S. politics. Anti-woke ideology is also behind this.
There are also these kinds of narratives promoted by people on the left. It is an extremely cynical system where the whole idea is to polarize the U.S. population by providing extreme ideologies and extreme ideas and push them to a U.S. audience.
So it isn’t just a right-wing thing, it happens on both sides?
Yes, and I would emphasize that it is far-left and far-right. It is the far ends of the political spectrum that are both targeted. The narratives [on the left] are the same as the ones promoted by right-wing influencers.
How have Russia’s influencing tactics been changing? Is there a reason behind that evolution?
If you go way back to the launch of Russia’s Internet Research Agency in 2013, they started mass producing online propaganda and they used these so-called troll farms. Later on, they also started using automated bots. But in addition, the Russians seem to be using these big, big social media accounts that are called “superspreader” accounts. They are being utilized to spread the narrative far and wide. This term came from Covid-19 studies: There was this Covid study that found out 12 accounts were responsible for two-thirds of Covid vaccine disinformation, and actually Robert F. Kennedy Jr.’s account was one of them. These studies, also in the geopolitical sphere, discovered that actually a lot of this disinformation is spread through the superspreader accounts. Russia had probably realized this, and this incident is a good indicator that they are being utilized by the Kremlin.
What about the superspreader accounts does the Kremlin find useful?
Because their reach is so big. They have usually organically grown to be popular. Whereas with troll and bot accounts, the following is not organic. They usually have a smaller following, and it’s very hard to spread these narratives outside the network. So if you have a main hub — a superspreader account with 2 million followers — it is much easier to spread a narrative because these accounts already have a huge reach and a big audience and sometimes their content even goes into the mainstream media or traditional media.
These people, they are also idolized. They have huge fan bases. Huge superspreader social media personalities — they are being listened to and they are believed. So they are also a very good hub for spreading any narratives that would be pro-Kremlin narratives.
Would you say that the rise of social media has helped Russia’s disinformation campaign?
Of course. Before social media, they had a lot of difficulties penetrating the Western media. It happened, but not so often. So social media has been a useful tool for Russia to spread its propaganda. They were the first ones to actually utilize social media to do these kinds of mass disinformation campaigns and information operations, and they had a really good head start in that sense. It took the Western media and intelligence services years to figure out the whole thing.
The Internet Research Agency was established in 2013. First, they started in a more domestic environment, so they were defaming the opposition, Alexei Navalny and so on, and of course Ukraine. But after that, when there was no more opposition in Russia, they moved on to the U.S. audiences and U.S. elections in 2016.
It is also worth mentioning that probably they are using AI now and in the future, because it’s just automating things. It’s so much cheaper and also more effective. You can create huge volume by using AI. So for example, what Russian operatives have done is create fake news sites or blogs, and the content on these blogs is completely generated by AI, but sometimes they inject Russian narratives or propaganda manually. There are hundreds of these blogs. Also, of course, they use the traditional system of bots and trolls to then make these stories seem much bigger. It’s kind of this multilevel system, and sometimes one of the superspreader accounts can pick up the story, and then it really goes viral. It’s a very sophisticated system that is still not very well understood.
Are you surprised at all by this DOJ indictment that involves two Russian media executives pushing pro-Kremlin propaganda in the U.S.?
I was not surprised. For a long time, people have thought, “There is no smoking gun, there is no direct evidence of any kind of foreign influencing.” But now this is it — and I think that this is just the tip of the iceberg. There’s so much more happening, especially through these shell companies located in the United Arab Emirates or Czech Republic, or whatever because Russia’s very good at masking money flows.
What is the ultimate goal of Russia’s disinformation campaign? Electing Donald Trump? Or is there a broader objective?
They want to polarize and divide countries, especially the U.S., which has a two-party system. Whenever a country is focusing on domestic disputes and arguments, its foreign policy becomes much weaker. We saw that with the Ukraine aid that was delayed for months and months and months, and that’s basically their goal: to create these internal conflicts, so the foreign policy of various countries becomes much weaker and indecisive.
So they want division and also for people to stop paying attention to what Russia does?
Yes. But the famous thing about Russian disinformation is that it rarely even mentions Russia. So it’s usually talking about other issues, for example, the southern border of the U.S. or woke culture or losing traditional values. I think the main narrative that is pushed is that the U.S. shouldn’t send any more money to Ukraine, because there are so many domestic problems that should be fixed instead.
And the reason is that when you start doing an investigation on Russian culture in general, you realize that it’s not really that traditional or conservative or anything like that. You see that they have very big problems, and they are actually quite secular. The image that Russia tries to create of themselves, it’s not the same as reality. They just decide, OK, let’s not talk about Russia at all. Let’s talk about other countries and their problems. It’s very different from China. China likes talking about China and how great they are. So it’s like this complete opposite in that sense.
Some people refer to Americans sympathetic to Kremlin arguments as “useful idiots.” Is that a fair characterization of this situation? Has there been a change in the type of “useful idiots” Russia is seeking out?
I’m quite sure that the owners of Tenet Media, Lauren Chen and Liam Donovan, I’m pretty sure they knew what they were getting into. There were a lot of signs that they actually knew that the money was coming from Russia. About the influencers? I’m not sure. I think almost all of them have stated that they didn’t know. But I mean, it raises questions, if somebody is willing to pay you $400,000 for four videos a month. There has to be due diligence. You have to think, where is this money coming from? Why is somebody willing to pay so much for producing these YouTube videos that get maybe 100,000 views, which isn’t that much, or 200,000 views? Maybe they didn’t know, but they certainly didn’t do their due diligence. They didn’t do proper background checks of where the money was coming from, because that was a lot of money.
When it comes to seeking useful idiots, I think it’s pretty much the same as before. There is a counterintelligence acronym called MICE. Basically, it lists what motivates somebody to do espionage: money, ideology, compromise or ego. This is a very simplified model, but I think it fits quite well in this propaganda domain. So there’s usually something that motivates these people. And I think “useful idiot” as a term is not very good, because a lot of these people, they are not idiots. They might be greedy. People have different motivations to do things. But I think the basic idea behind the so-called useful idiot is still the same. It is somebody who’s willing to work for a foreign nation, usually in order to undermine their own country.
So who do they seek out to spread propaganda? What kind of person are they looking for?
I think a lot of these people who are doing it very well are usually charismatic and in some ways controversial. They know how to create controversy around topics and on social media. Creating controversy usually also brings engagement — people like your content, share your content, comment on your content. So charismatic people are probably the most valuable assets right now.
Do you think people have a growing understanding of Russia’s disinformation campaign? And to what degree do they care?
I think a lot of people simply don’t care. Most people care about inflation, food prices, energy prices, the kind of stuff that actually affects their day-to-day life. If somebody is being paid to promote Russian narratives, I don’t think a lot of people care about that, because it doesn’t really affect their life that much. But the interesting thing is that Russian narratives usually revolve around these day-to-day topics. In the indictment, the narratives being pushed were about food prices and everything becoming too expensive and so on. So Russia also promotes this day-to-day stuff in their disinformation. But yeah, I don’t think people care as much as they maybe should.
Ahead of the election, how can we be vigilant against Russia’s disinformation campaigns?
Well, I’ve always said that the best antidote to this is education, but I think it’s too late when it comes to the November elections. But Finland, it’s a great example. We have a very good education system that promotes media literacy and critical thinking, and also cognitive resilience, against propaganda and disinformation. I think this would be the best solution.
In general, people should be more critical of what they read, especially on social media, and realize that there are people who are willing to spread lies and fake news just for engagement. Always remember that people might be paid to spread these stories like we just witnessed with Tenet Media. So critical thinking as a general rule is a good way to stay vigilant.
But also, I always say that people should just close their computers and smartphones and go out and just live their lives and enjoy it. The digital world can be pretty hostile, and it can bring out these negative emotions. Maybe take a break and go for a hike. Just enjoy life.
The Russian government’s covert efforts to sway the 2024 presidential election are more advanced than in recent years, and the most active foreign threat this political season, U.S. intelligence officials said Friday.
Russia’s activities “are more sophisticated than in prior election cycles,” said a senior official with the Office of the Director of National Intelligence (ODNI) in a briefing with reporters, noting the use of “authentic U.S. voices” to “launder” Russian government propaganda and spread socially divisive narratives through major social media, as well as on sham websites that pose as legitimate American media organizations.
Moscow is targeting U.S. swing states in particular, the official said, and using artificial intelligence to more quickly and convincingly create fake content to shape the outcome in favor of former president Donald Trump.
That is “consistent with Moscow’s broader foreign policy goals of weakening the United States and undermining Washington’s support for Ukraine,” said the ODNI official, who like others spoke on the condition of anonymity under ground rules set by the agency.
China, for its part, is not attempting to influence the presidential race, but it is seeking to do so in state-level and regional races, officials said, referencing similarities to Beijing’s efforts in the 2022 midterm elections.
Officials noted that they have seen no foreign efforts to directly interfere in the 2024 election by, for instance, hacking into voting machines.
However, the ODNI officials acknowledged for the first time that the Iranian government was behind not only the hack of the Trump campaign revealed last month, but also the leak of internal campaign documents. A person who used an AOL email account and the name “Robert” sent the documents to media outlets, including The Washington Post.
This week, the U.S. government announced a sweeping set of actions to counter Russian influence campaigns, including an indictment of two Russian employees of the state-run news site RT for allegedly paying an American media company to spread English-language videos on YouTube, TikTok, Instagram and X.
Prosecutors also seized 32 Russian-controlled internet domains that were used in a state-led influence effort called “Doppelganger” to undermine international support for Ukraine. In addition, the Treasury and State departments announced sanctions on Russian individuals and entities that are accused of disseminating propaganda.
RT has cultivated networks to disseminate narratives friendly to Moscow, while trying to mask the content as authentic Americans’ free speech, ODNI said in an election security update Friday.
Two things place “Russia at the top of the list” of foreign governments seeking to influence the election, the ODNI official said. “They’re fairly robust and quite practiced at doing this type of activity. Also the scope and the scale of their activities are quite significant.”
“Russia is working up- and down-ballot races,” the official said. They are using artificial intelligence “to more quickly and convincingly create synthetic content” and influence-for-hire firms that leverage marketing, public relations and other expertise to complicate attribution.
“Americans are more likely to believe other Americans’ views compared to content with clear signs of foreign propaganda,” the official said. “So what we see them doing is relying on witting and unwitting Americans to seed, promote and add credibility to narratives that serve these foreign actors’ interests.”
These new tactics and technologies are an advance on what Russia was doing in 2016, when its efforts to interfere in the election stunned the public and caught U.S. intelligence flat-footed, said Brandon Van Grack, a former federal prosecutor who investigated Moscow’s efforts to sway the 2016 campaign.
“The main thing that struck me is they were able to launder their influence through U.S. media personalities in a way that, had they been overt as in identifying themselves as the Russian government or RT, those viewpoints would never have been disseminated,” he said.
None of the influencers alluded to in the indictment unsealed Wednesday were identified. But conservative YouTuber Matt Christiansen said on a live stream that day that he was among the six influencers or “commentators” mentioned in the document.
He insisted, however, that he had never been influenced by the Russians in his coverage and merely promoted values dating back to America’s founding.
“I guess if that aligns with [Russian President Vladimir] Putin, then I’ve been had,” Christiansen said.
In 2016, RT was operating openly in the United States. It was forced to register a year later as a foreign agent, but was effectively silenced in the West in 2022 after Russia’s invasion of Ukraine.
“You did not have influencers like this, who were spreading this information directed and funded by the Russian government in 2016, at least not nearly in the same manner,” said Van Grack, now a partner at Morrison Foerster.
The ODNI also noted, however, that as Moscow’s sophistication has increased, the U.S. intelligence community’s “understanding of these efforts has also increased — in our effort to understand and counter it.”
Gavin Wilde, a former U.S. intelligence analyst focused on Russia, noted that the FBI named Kremlin Deputy Chief of Staff Sergei Kiriyenko as directing the Doppelganger effort.
“That’s a level of detail and formal orchestration by higher levels in the Kremlin” than was ever publicly alleged by the U.S. government, including during the 2016 election, when a Russian troll farm known as the Internet Research Agency (IRA) spread disinformation on American social media, he said. The IRA campaign, detailed in a January 2017 ODNI assessment on Russian interference, was run by Russian oligarch Yevgeniy Prigozhin, who as founder of the Wagner mercenary group would later run afoul of Russian President Vladimir Putin and die in a mysterious midair plane explosion.
The same 2017 intelligence assessment identified RT as the Kremlin’s principal international propaganda outlet, which under editor in chief Margarita Simonyan, contributed to the influence campaign.
“In 2016, the best we understood at the time was the propaganda efforts by RT or the Russian troll farms were tacitly approved by the Kremlin,” said Wilde, a White House aide in the Trump administration who coordinated policy to counter foreign malign influence. “So Kiriyenko having steered the propaganda aspects of this seems like a level of orchestration we haven’t seen before.”
The Washington Post, citing internal Kremlin documents, revealed in April 2023 that Kiriyenko directed a network of political strategists to conduct propaganda campaigns promoting Russian interests and undermine support for Ukraine across Europe. In April this year, The Post revealed that under Kiriyenko’s direction, the political strategists turned in earnest to undermining support for Ukraine in the United States as a supplemental aid bill was pending in Congress. They also sought to stir fear over U.S. border security and to stoke economic and racial tensions.
A Russian influence operation paid millions of dollars to right-wing American “influencers” who promote Russia-friendly narratives, according to a detailed indictment alleging money laundering and other crimes. Through a company called “Tenet Media,” two Russian state media employees used nearly $10 million to fund right-wing influencers who often ape Russian propaganda, including Dave Rubin (2.45 million YouTube subscribers), Benny Johnson (2.39 million), Tim Pool (1.37 million), and Lauren Southern (712,000). Russia’s operation also allegedly built fake websites that mimicked media outlets such as the Washington Post and Fox News, and used a network of fake accounts and bots on social media to spread conspiracy theories and other Russia-friendly content.
For example, after ISIS terrorists killed 145 at a Moscow concert hall this March, Tenet Media allegedly received instructions to ask the Americans on its payroll to blame the attack on Ukraine and the United States. At least one of the influencers, believed to be Benny Johnson, cast doubt on ISIS’s claim of responsibility, and raised the (absurd) possibility that the United States was responsible.
Johnson and the other Tenet Media–paid influencers are pleading ignorance. They’re not accused of a crime, and there’s no evidence they knew the ultimate source of the payments.
But they didn’t bother looking, either. Pool reportedly got $100,000 per video for videos he was already making, and he didn’t even have to give up the rights to the videos. That surely struck him as an unusually good deal.
Apparently the crowd that defends conspiracy theories as “just asking questions” didn’t think a relevant question was “who are these people giving me a lot of money, and what do they want?”
Pool, Johnson, etc. are why we have the term “useful idiots.” And as Russia presumably knew when it picked them, they’ll remain useful, even after being exposed.
This is a quintessential instance of Russia’s Internet Age influence operations, which most infamously targeted the United States in the 2016 presidential election, and didn’t stop there. Their primary goal is weakening the United States—along with the European Union, NATO, and democracy—by stoking division and undermining public confidence in institutions and establishment sources of information, namely government and mainstream media.
It’s much easier to stoke preexisting divisions than to create new ones; to promote and amplify existing voices and positions rather than invent some from scratch. Russian influence operations played all sides of America’s race and policing debate, including by setting up fake Facebook groups with names such as “Black Matters,” “Don’t Shoot,” and “Back the Badge” that garnered hundreds of thousands of members before Meta shut them down.
It’s not a coincidence that the American (and Canadian) influencers Russia promoted are right-wing culture warriors—the sort who cast themselves and their followers as victims of the nearly-all-powerful forces of “wokeness,” complain about being “silenced” in videos viewed by millions, and present a mix of personal opinion and falsehoods as the secret truth some ambiguous They “don’t want you to know.”
A lot of culture war divisiveness is organically American. But for America’s foreign adversaries, especially those like Vladimir Putin who promote right-wing cultural views, the more the merrier.
Even better are Western voices directly promoting Russian interests, especially on Russia’s current top concern: Ukraine. Russian influence operations don’t really aim to convince skeptical people that something false is real. That rarely works. Rather, they aim to give people already inclined to support Russia’s position something to say, and to muddy the waters enough that average people get frustrated and check out.
Most people have strong personal opinions on a few issues, and otherwise get their stances from elite cues. Rather than ask, What do I think? about every issue and seek out politicians who agree, most affiliate with a group—based on their top personal issue, or sometimes just acculturation—and then ask, What does my side think about this?
We can see signs of Republicans following elite cues on Ukraine. In March 2022, a month into the war, most Republicans wanted to help Ukraine stand up to Russian aggression. In a Pew survey, 49 percent of Republicans said the United States was not helping Ukraine enough, with another 23 percent saying the amount of aid was “about right,” and just 9 percent saying it was too much. That was a little more supportive than Americans as a whole.
By September 2022, the numbers had flipped, with Republicans’ most popular answer being that we were helping Ukraine “too much” (32 percent), followed by “about right” (30 percent) and “not enough” (16 percent). They were less supportive of Ukraine than Americans on average and have been ever since. At the end of 2023, 48 percent of Republicans wanted to reduce aid to Ukraine, outnumbering “about right” (20 percent) and “not enough” (13 percent) combined.
It’s hard to explain this flip-flop without elite cues. A negative opinion of Ukraine aid at the end of 2023 could be a reaction to the costly and largely unsuccessful counteroffensive earlier that year, but in September 2022, Ukraine was retaking territory. American policy was consistent throughout: No U.S. troops, only assistance, primarily via older equipment, along with ammunition production upgrades the Pentagon had been wanting to do, all worth less than a tenth of the annual U.S. military budget.
What changed? Republican leaders.
Republican voters’ quick support for Ukraine, a U.S.-partnered European democracy, against Russian invasion reflected the old Reaganite worldview. But Donald Trump reacted to the invasion by praising Putin, calling the move “genius” and “savvy.” Influential right-wing figures such as Tucker Carlson, Elon Musk, David Sacks, and Sen. Mike Lee—along with online influencers like Tim Pool and Benny Johnson—pushed Russia-friendly narratives. And millions of their followers came around.
The Republican shift is far from costless. From fall 2023 into spring 2024, Republicans delayed Ukraine aid in Congress. Bipartisan majorities in both the House and Senate supported it, but a large faction of Republicans did not, and Speaker Mike Johnson refused to allow a vote. By the time Congress passed the support bill, the shortages of ammunition and other essentials caused by the manufactured delay had enabled Russian advances and allowed more Ukrainians to die.
We don’t know if the delay in aid would have been shorter if not for Russian influence operations. What we know is that the Kremlin and MAGA Republicans push in the same direction, the effect is greater than zero, and Russia wouldn’t keep devoting money and operatives to the effort if they thought it wasn’t serving their interests.
Russian influence operations didn’t swing a sizable percentage of American opinion on their own—not even close. A variety of Americans oppose U.S. aid to Ukraine for a variety of reasons, some would have argued against it no matter what, and not all of them are voting for Trump or consuming right-wing media.
But the Russian influence agents know that.
They weren’t convincing people who support Ukraine, and the Western democratic alliance more broadly, to switch sides. But they probably strengthened the convictions of some anti-Ukraine Americans, contributed to making their position mainstream in the Republican party, and spread it more widely.
If you oppose Russia’s aggression, this wouldn’t make you think otherwise, but by perpetuating and amplifying Russia-sympathetic arguments, it may have helped convince you that the anti-Ukraine stance is more widespread than it actually is, and therefore warrants more consideration in public debate.
It’s not election-swinging on its own, but it’s worth opposing in the name of sovereignty and democracy. Americans should decide American elections absent foreign manipulation, and with the most accurate information possible.
For that reason, it’s good that the Justice Department chose to expose this operation now, before the election. Perhaps we’ve learned something since the Obama administration, which didn’t tell the public about Russia’s operation targeting the 2016 election. Senate Majority Leader Mitch McConnell doubted the CIA’s conclusions that Russia was hacking U.S. targets and trying to elect Trump. McConnell told the White House he would consider public pushback on Russia an act of partisan politics, and Obama, who as president had the authority to disclose the information no matter what the Senate Majority Leader thought, decided not to.
The government is more on the lookout for foreign influence operations, especially from Russia, and took action to counter it. Instead of a presidential announcement, disclosure came via a detailed indictment filed in court.
By going public now, the Justice Department informed voters about Russia’s ongoing efforts and undermined this operation. The Russia-paid influencers have less credibility on Ukraine. YouTube closed Tenet Media’s channel. Websites and independent media figures may be warier of getting involved with this sort of thing in the first place.
However, while the U.S. government can expose specific influence operations, there remains a large swath of the American right that sees Putin’s Russia as an ally against the “real” enemy—the broadly defined Left—and is eager to believe Russian propaganda as long as it’s negative about Americans they don’t like. That’s a harder problem.
In early 2022, a young couple from Canada, Lauren Chen and Liam Donovan, registered a new company in Tennessee that went on to create a social media outlet called Tenet Media.
By November 2023, they had assembled a lineup of major conservative social media stars, including Benny Johnson, Tim Pool and Dave Rubin, to post original content on Tenet’s platform. The site then began posting hundreds of videos — trafficking in pointed political commentary as well as conspiracy theories about election fraud, Covid-19, immigrants and Russia’s war with Ukraine — that were then promoted across the spectrum of social media, from YouTube to TikTok, X, Facebook, Instagram and Rumble.
It was all, federal prosecutors now say, a covert Russian influence operation. On Wednesday, the Justice Department accused two Russians of helping orchestrate $10 million in payments to Tenet in a scheme to use those stars to spread Kremlin-friendly messages.
The disclosures reflect the growing sophistication of the Kremlin’s longstanding efforts to shape American public opinion and advance Russia’s geopolitical goals, which include, according to American intelligence assessments, the election of former President Donald J. Trump in November.
In 2016 and 2020, Russia employed armies of internet trolls, fake accounts and bot farms to try to reach American audiences, with debatable success. The operation that prosecutors described this week shows a pivot to exploiting already established social media influencers, who, in this case, generated as many as 16 million views on Tenet’s YouTube channel alone.
Most viewers were presumably unaware, as the influencers themselves said they were, that Russia was paying for it all.
“Influencers already have a level of trust with their audience,” said Jo Lukito, a professor at the University of Texas at Austin’s journalism school who studies Russian disinformation. “So, if a piece of information can come through the mouth of an existing influencer, it comes across as more authentic.”
The indictment — which landed like a bombshell in the country’s conservative media ecosystem — also underscored the growing ideological convergence between President Vladimir V. Putin’s Russia and a significant portion of the Republican Party since Mr. Trump’s rise to political power.
The Kremlin has long sought to exploit divisions on both sides of the American political spectrum, but contentious conservative voices provide ample fodder for its own propaganda, especially when it involves criticism of the Biden administration or, more broadly, of the country’s foreign policy, including support for Ukraine in its war against Russia.
The federal investigation that led to the indictment unsealed on Wednesday is part of a broader government effort, first reported in The New York Times, to combat Russian disinformation, election interference and cyberattacks. Administration officials have said the effort could lead to more charges.
The indictment detailed the lengths Russia went to try to make Tenet a player in the country’s political discourse, while obfuscating the fact that it was footing the bill.
That included transferring at least $9.7 million from Russia through shell companies in countries like Turkey, the United Arab Emirates and Mauritius. Those payments accounted for 90 percent of the company’s revenue from last October to August, the indictment said.
Prosecutors have not, so far, charged Ms. Chen and Mr. Donovan. It is unclear where they are, and they did not respond to requests for comment. The indictment did note that neither they nor Tenet had registered as a representative of a foreign government, a requirement of the Foreign Agents Registration Act, known as FARA.
Tenet’s influencers all described themselves as victims of the Russian ruse, and at times disparaged the federal investigation. They emphasized that they took no direction from Russians, though the indictment details various efforts by the company’s sponsors to sow specific narratives, some of which appeared in the content they posted.
In one instance, Mr. Johnson, a former journalist with 2.4 million subscribers on YouTube, suggested on his own show that Ukraine might have been responsible for a deadly attack at a concert hall in Moscow in March, reflecting a since-debunked Russian claim. (A branch of the Islamic State claimed responsibility.) The next day, according to the indictment, one of the Russians asked Ms. Chen to push that narrative on Tenet’s influencers, stating in a message “I think we can focus on the Ukraine/U.S. angle.”
Another influencer on Tenet’s roster, Lauren Southern, a far-right Canadian commentator with more than 1.2 million followers between YouTube and X, produced a video mocking the Summer Olympics in Paris in July, echoing Russia’s efforts to denigrate the Games and their French hosts.
The Russians even pushed Tenet to highlight a video from Tucker Carlson, the former Fox News star who now produces his own online show. He made it during a visit to Moscow this year, marveling at the abundance on display in a supermarket in the city.
A producer working for Tenet, in a message cited in the indictment, thought Mr. Carlson’s video “just feels like overt shilling” but, after being pressured by Tenet’s owners, agreed to post the clip in any case.
Nina Jankowicz, a co-founder of the American Sunlight Project, an advocacy group in Washington that fights disinformation online, said that “this is a classic case of information laundering.”
“The Russians and other foreign actors have used it for decades to obscure the source of influence operations,” she went on. “In this case, they chose influencers who were already engaging in rage bait, exploiting the pre-existing fissures in our society for clicks.”
Flush with Russian cash, Tenet certainly compensated some of its influencers well. It paid at least $8.7 million to the top three influencers, who were not named but who appear to be Mr. Rubin, Mr. Pool and Mr. Johnson based on details in the indictment, such as the number of followers on social media.
According to the indictment, Mr. Rubin received $400,000 a month, plus a $100,000 signing bonus, to produce four videos a week on Tenet’s YouTube channel. Mr. Pool was paid $100,000 per video, which he produced weekly.
The contracts put those three on the same pay scale as some of those on Forbes’s “Top Creators 2023” list, though Mr. Pool portrayed his payment as standard in an interview on “The Ben Shapiro Show” on Friday.
Under terms of their arrangements, the influencers could keep producing other content separate from the work they did for Tenet.
A representative of Mr. Johnson declined to comment but provided details of the timeline and nature of his contract with Tenet. On X, however, Mr. Johnson said he had acted “as an independent contractor” under what he termed “a standard, arms-length deal which was later terminated.”
Mr. Rubin, who is the creator and host of “The Rubin Report,” a political talk show on YouTube and Blaze Media, a conservative media company, said in a post online that he had no knowledge of connections between Tenet and Russia.
So did Mr. Pool, who has promoted Mr. Trump’s election fraud conspiracies and portrayed Ukraine as an “enemy” on his popular online show. In his response on X, he directed a crude insult to Mr. Putin. On Thursday, he said that the F.B.I. had invited him to a “voluntary interview” and that he would cooperate with the investigation.
Prosecutors said the two Russians charged on Wednesday, Kostiantyn Kalashnikov and Elena Afanasyeva, had violated FARA and laws against money laundering. The pair are employees of RT, the Russian global television network.
In a response to a request for comment about the indictment, the network replied sarcastically. “We eat U.S. D.O.J. indictments for breakfast,” its statement said. “With lots of sour cream, usually.”
Mr. Donovan, 30, Tenet’s co-founder, appears on corporate records in Tennessee as a founder of Roaming USA Corporation, the company that later created Tenet. His account on X, which has not posted any messages since July, describes him as Tenet’s president. Among those who shared his posts, along with the company’s, was the owner of X, Elon Musk.
Tenet appears to have ceased operations since Wednesday. YouTube, in a statement on Thursday evening, said it had taken down its account on the platform, along with four others associated with Ms. Chen.
Ms. Chen, who is married to Mr. Donovan and is also 30, worked for RT from March 2021 until February 2022. RT’s website still describes her as a YouTuber who was “most passionate” about topics that “include dating culture, family values, individual liberty, gender equality and issues surrounding race.”
She also produced podcasts on Blaze Media and served as a contributor to Turning Point USA, the conservative organization run by Charlie Kirk. Her profiles on Mr. Kirk’s site and on Blaze Media’s disappeared this week. Her account on X, which remains active, has nearly 600,000 followers.
In a statement, Blaze Media’s chief executive, Tyler Cardon, said, “Lauren Chen was an independent contractor, whose contract has been terminated.”
For at least two of the influencers, the offer to join Tenet appeared to raise concerns about the origins of such generous contracts.
The indictment detailed how they questioned the company’s backers. In response, Mr. Kalashnikov and Ms. Afanasyeva, along with Ms. Chen and Mr. Donovan, provided a profile page of a fictitious European banker, Eduard Grigoriann.
They also arranged a phone call with someone purporting to be the banker. That was enough, apparently, to assuage any concerns.
“It’s lamentable that these influencers conducted so little due diligence,” Ms. Jankowicz said. “When something seems too good to be true — in this case, getting paid $100,000 per video for content you were already making — it probably is.”
Another of those who worked for Tenet was Tayler Hansen, who is perhaps best known for filming the shooting of Ashli Babbitt in the U.S. Capitol during the violence on Jan. 6, 2021.
For years, he scraped by financially by licensing his footage, selling branded merchandise and soliciting donations from supporters as he gradually built a following of more than 170,000 on X.
When Tenet approached him last year and offered the opportunity to work for a biweekly salary, he jumped at the chance, he said in an interview. Tenet also hired Mr. Hansen’s producer and covered his travel expenses.
“I had full autonomy, and there’s really no point in not working with a company that grants you full autonomy,” Mr. Hansen said. “I’ve never had as much freedom.”
Asked how he thought Tenet made money, he said simply, “Donors.”
Martin J. Riedl, a journalism professor at the University of Tennessee, Knoxville, who studies the spread of misinformation on social media, said the case of Tenet spotlighted gaping regulatory holes when it came to the American political system.
While the Federal Election Commission has strict disclosure rules for television and radio advertisements, it has no such restrictions for paid social media influencers.
The result is an enormous loophole — one that the Russians appeared to exploit.
“Influencers have been around for a while,” Mr. Riedl said, “but there are few rules around their communication, and political speech is not regulated at all.”
$tock, you can’t bury the truth.
You serial plagiarizer of ridiculous propaganda. You suck.