Honestly, decentralized social media are probably bad news for the current state of the art of disinformation campaigns. The bullshit that has been thriving on Facebook and Twitter is not only a chorus of bigoted aunts and uncles, but (perhaps more importantly) a coordinated attack from state-sponsored troll farms seeking, among other things, to destabilise Western democracies.
The fediverse is, by design, less vulnerable to these attacks. Your trolls can generate activity around your disinformation content all they want: if nobody I follow boosts it, it’s not going to show up in my Mastodon feed. And you can feel free to recreate r/conservative or whatever in the fediverse, but if it becomes a cesspool like on Reddit you’ll be stuck with your trolls talking to each other on a defederated instance with no-one listening. Disinformation strategies currently employed successfully on centralized social media platforms are likely to fail here, causing a problem for bad actors.
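To make the mechanism concrete, here's a toy model of follow-based propagation (a deliberate simplification with made-up account names, not how any actual ActivityPub implementation works): a post only reaches people who follow the author or someone who boosted it, so a troll cluster boosting its own content gains nothing.

```python
# Toy model: content spreads only along explicit follow/boost edges,
# unlike an engagement-ranked global feed. All accounts are hypothetical.

follows = {
    "alice": {"bob"},       # alice sees what bob posts or boosts
    "bob": {"carol"},
    "troll2": {"troll1"},   # trolls following each other
    "troll3": {"troll1"},
}

def reachable_audience(author, boosters):
    """Accounts that see a post: followers of the author or of any booster."""
    amplifiers = {author} | boosters
    return {user for user, followed in follows.items() if followed & amplifiers}

# Trolls generate "activity" around their own post...
audience = reachable_audience("troll1", boosters={"troll2", "troll3"})
print(sorted(audience))  # ['troll2', 'troll3'] -- only the cluster itself

# ...whereas one boost from an account real people follow expands reach:
print(sorted(reachable_audience("carol", boosters={"bob"})))  # ['alice', 'bob']
```

The troll cluster's "engagement" is invisible to everyone outside it, which is exactly the failure mode for amplification-based disinformation.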
It is probably paranoid to think there’s any geopolitical actor behind the current attack, but I fully expect the fediverse to come under attack from Russian troll farms as soon as they realize they’re no longer reaching people on Twitter, Reddit or Facebook.
If you don’t think state sponsored troll farms exist in the “West,” I’ve got a bridge to sell you.
From 2011: US government working on Persona Management “sock puppet” software to flood forums with pro-US talking points
https://www.theguardian.com/technology/2011/mar/17/us-spy-operation-social-networks
From 2013: US ally Israel pays Israeli college students to defend Israeli government online
https://www.usatoday.com/story/news/world/2013/08/14/israel-students-social-media/2651715/
From 2014: Reddit lists Eglin Air Force Base as the most “Reddit Addicted City”
https://old.reddit.com/r/Blackout2015/comments/4ylml3/reddit_has_removed_their_blog_post_identifying/
From 2014: Research done at Eglin Air Force Base on influencing opinions, in which “a decentralized potential field-based influence algorithm is developed in this work to ensure that all individuals’ states achieve consensus asymptotically to a desired convex hull spanned by the stationary leaders’ states, while maintaining consistent influence between individuals (i.e., network connectivity).”
https://arxiv.org/pdf/1402.5644.pdf
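For anyone who can't parse that abstract: it's describing leader-follower opinion consensus. Here's a hedged sketch of the general idea (plain neighbor averaging on a made-up graph, not the paper's actual potential-field algorithm): "leaders" hold fixed opinions, "followers" repeatedly average with their neighbors, and follower opinions converge into the range spanned by the leaders.

```python
# Hedged sketch of leader-follower consensus, NOT the paper's algorithm.
# Stationary leaders never update; followers average with their neighbors.

leaders = {"L1": 0.0, "L2": 1.0}      # fixed opinions
followers = {"F1": 5.0, "F2": -3.0}   # arbitrary starting opinions
neighbors = {                          # hypothetical influence graph
    "F1": ["L1", "F2"],
    "F2": ["L2", "F1"],
}

def step(opinions):
    """One synchronous round: each follower moves to its local average."""
    new = dict(opinions)
    for f in followers:
        peers = [opinions[n] for n in neighbors[f]] + [opinions[f]]
        new[f] = sum(peers) / len(peers)
    return new

opinions = {**leaders, **followers}
for _ in range(200):
    opinions = step(opinions)

lo, hi = min(leaders.values()), max(leaders.values())
# Followers started way outside [0, 1] but end up inside it:
print(all(lo - 1e-6 <= opinions[f] <= hi + 1e-6 for f in followers))  # True
```

That "convex hull spanned by the stationary leaders' states" is the opinion range the planted accounts get to choose; everyone connected to them drifts into it.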
From 2018: Facebook works with Cambridge Analytica to undermine US elections
https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html
Can we stop acting like we’re the fucking good guys in this? It’s absolutely fair that there are Russian troll farms pushing disinformation, but to act like there are only Russian troll farms and they only exist to destabilize Western democracies is a fucking joke.
Last I checked, there are plenty of US conservatives and rich people who want to dismantle democracy, and these people own fucking news organizations; we don’t even need to go to Russian troll farms for that horseshit. It’s fucking home-grown. (I mean Musk and Murdoch weren’t even born in America, and these two dipshits control some of the biggest names in US media I can think of, and both of these motherfuckers hate democracy. Reddit’s Steve Huffman literally looks up to Musk. Facebook is MAGA central because of Mark Zuckerberg.)
So let’s stop acting like the phone call isn’t coming from inside the house. When the state-actors show up, it’s gonna be all of them not just some of them.
You’ll notice I only mentioned Russia once, and not even in the paragraph you cited. The Russian troll farms are without doubt the most famous, and they have backed candidates like Farage, Trump and Le Pen with uncanny success. But it would be incredibly naive to think other actors are not involved with similar strategies, which is why I kept my post general. Steve Bannon has ties to Russia, but he’s American as apple pie.
You listed a few interesting things and seemed to miss an important one.
I might be a bit wrong in what I’m about to say, but the basics are right. Meta released the source code for a ChatGPT-like LLM and had the weights leaked. The model is named LLaMA.
People have been messing around with LLaMA-inspired LLMs on their personal computers, thanks to Meta, for months now.
Bad-actor LLM bots are now a hobbyist-level task. The fediverse is showing few signs of caring or preparing for this. IMO, Lemmy instances aren’t ready for it.
https://ai.meta.com/blog/large-language-model-llama-meta-ai/
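And the countermeasures don't have to be fancy to be better than nothing. As an illustration of the kind of cheap heuristic instances could at least start with (every threshold here is made up by me, and this is not anything Lemmy actually ships): a sliding-window posting-rate check that a naive LLM bot flood would trip almost immediately.

```python
from collections import defaultdict

# Naive per-account posting-rate heuristic an instance could run.
# All thresholds are hypothetical illustrations, not real Lemmy defaults.
WINDOW_SECONDS = 3600
MAX_POSTS_PER_WINDOW = 20   # made-up cap for a single account

post_times = defaultdict(list)  # account -> recent post timestamps

def record_post(account, now):
    """Record a post; return True if the account looks like a flood bot."""
    post_times[account].append(now)
    # Keep only timestamps inside the sliding window.
    post_times[account] = [t for t in post_times[account]
                           if now - t < WINDOW_SECONDS]
    return len(post_times[account]) > MAX_POSTS_PER_WINDOW

# A bot posting every 10 seconds trips the heuristic within minutes:
flagged = any(record_post("bot", t) for t in range(0, 3600, 10))
print(flagged)  # True
```

Rate limits alone won't catch a slow, plausible-sounding LLM account, obviously; the point is that right now most instances aren't even doing this much.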
The US government as a whole is, comparatively, the good guy in this. The US government is pretty cautious and tends to shy away from spreading propaganda to its own people.
There are a lot of caveats there, absolutely. I’ll get into some of those. But let’s not pretend the US government is on par with the Russian or Chinese governments when it comes to social media propaganda.
The GOP being in bed with the Russians and collaboratively pushing narratives is not being done on behalf of the government. And I doubt whatever is happening at Eglin is targeting Americans.
I don’t think “not targeting their own citizens” is quite the flex you think it is when it comes to pushing disinformation and misinformation.
I agree with this entirely.