General Discussion
Study: The political effects of X's feed algorithm
https://www.nature.com/articles/s41586-026-10098-2

Abstract
Feed algorithms are widely suspected to influence political attitudes. However, previous evidence from switching off the algorithm on Meta platforms found no political effects1. Here we present results from a 2023 field experiment on Elon Musk's platform X shedding light on this puzzle. We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks, measuring political attitudes and online behaviour. Switching from a chronological to an algorithmic feed increased engagement and shifted political opinion towards more conservative positions, particularly regarding policy priorities, perceptions of criminal investigations into Donald Trump and views on the war in Ukraine. In contrast, switching from the algorithmic to the chronological feed had no comparable effects. Neither switching the algorithm on nor switching it off significantly affected affective polarization or self-reported partisanship. To investigate the mechanism, we analysed users' feed content and behaviour. We found that the algorithm promotes conservative content and demotes posts by traditional media. Exposure to algorithmic content leads users to follow conservative political activist accounts, which they continue to follow even after switching off the algorithm, helping explain the asymmetry in effects. These results suggest that initial exposure to X's algorithm has persistent effects on users' current political attitudes and account-following behaviour, even in the absence of a detectable effect on partisanship.
Main
Social media platforms have fundamentally transformed human lives: a large and growing share of the global population connects with others, gets entertained and learns about the world through social media2. These platforms have also become increasingly important for political news consumption. A quarter of US adults report social media as their primary news source, and one half say they at least sometimes get news from these platforms3. Typically, platforms use feed algorithms to select and order content in personalized feeds for each user4. Before algorithms were introduced, users saw a simple chronological feed that displayed posts from followed accounts, with the most recent posts appearing at the top.
Public intellectuals and scholars have raised concerns about the potential adverse effects of social media, particularly feed algorithms, on social cohesion, trust and democracy5,6,7,8. These concerns arise from the spread of misinformation9,10,11, the promotion of toxic and inflammatory content12,13,14 and the creation of filter bubbles with increasingly polarized content15,16,17,18. There is substantial rigorous quantitative evidence that internet access and social media indeed have important negative effects19,20,21,22. Research on search engine rankings also shows that the order in which information is presented can influence user behaviour and political beliefs23. However, previous literature on the effects of social media feed algorithms reports zero political effects. A large study of Facebook and Instagram, conducted by academics in cooperation with Meta during the 2020 US election, found that experimentally replacing the algorithmically curated feed with a chronological feed did not lead to any detectable effects on users' polarization or political attitudes, despite causing a substantial change in political content and lowering user engagement with the platforms1. Similarly, studies on Google's search engine and YouTube algorithms found little evidence of filter bubbles24,25,26,27. Studies of Meta platforms linking content to user behaviour and attitudes also found no impact, despite prevalent like-minded content and amplified political news28,29,30.
Yet, the fact that switching off a feed algorithm does not affect users political attitudes does not mean that algorithms have no political impact. If the initial exposure to the algorithm has a persistent effect on political outcomes, switching off the algorithm might show no effects despite its importance. For instance, this could happen because people start following accounts suggested by the algorithm and continue following them when the algorithm is switched off. In addition, different platforms may have different effects, for instance, due to different informational environments or the different objectives of their owners31,32,33,34.
*snip*
canetoad
(20,520 posts)

Of any social networking company's feed consequences, because I refuse to allow any tech bro to dish up what he/she thinks I should pay attention to. (It's ok, Rob's Words on YouTube says it's fine these days to leave dangling participles hanging about)
Using the web sites of some trusted and unbiased news organisations and of course DU, I'm able to find what's current, pertinent and newsworthy.
I'm reminded of an anecdote told by my brother, "Dave the Bastard", a blues harp player, when asked as a teenager how he found so many wallets, watches, money, keys - everything. Dave was always finding things and I asked him how. It's stunning in its simplicity.
"If you want to find something, you have to look for it."
He turned everything of value in to the police station, and most of his finds were returned to him when they weren't claimed within the time. He had no conflicts of conscience.
****
Missed your posts for a while there, but good to see you back in the saddle, so to speak.
Nevilledog
(54,872 posts)

Jack Valentino
(4,719 posts)

moral compass or political beliefs....
people who don't pay enough attention, and are easily seduced by "a strong-man"...
Perhaps Democrats should 'invest' in social media trolls to promote OUR views,
like the Russians do!
canetoad
(20,520 posts)

Is a form of mind control. Is it worth it?
Jack Valentino
(4,719 posts)

and in our case, we need only tell the truth!
The truth is certainly on our side! But it needs to be more widely disseminated!
canetoad
(20,520 posts)

No need to go into CTs; there are plenty of common techniques used to break down critical thinking. Suggestibility is not about having a weak mind - advertising uses verified mind control techniques to sell us more shit.
It's not necessarily being weak to be susceptible to persuasion or mild hypnosis. It's how advertising works.