Social media algorithms refer to a variety of processes that rank user-provided signals to determine what content and advertisements to display to users (Mosseri, 2021). These signals include information about users and how they interact with content on the platform.
Platforms' algorithms rank this information in an attempt to serve users the content and advertisements most likely to keep them engaged on the platform.
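To make the idea of "ranking user-provided signals for engagement" concrete, the following is a minimal, purely illustrative sketch in Python. The `Post` class, the signal names, and the weights are all hypothetical and are not drawn from any platform's actual ranking system; the point is only that candidate items get scored on predicted engagement and sorted.

```python
# Purely illustrative toy model of engagement-based ranking.
# The signal names and weights are hypothetical, not any platform's actual system.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    predicted_like_prob: float   # signal: how likely this user is to like the post
    predicted_share_prob: float  # signal: how likely this user is to reshare it
    predicted_watch_time: float  # signal: expected seconds of attention


def engagement_score(post: Post) -> float:
    """Combine predicted engagement signals into a single ranking score."""
    return (
        1.0 * post.predicted_like_prob
        + 2.0 * post.predicted_share_prob    # reshares weighted more heavily (hypothetical)
        + 0.05 * post.predicted_watch_time
    )


def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order candidate posts so the most 'engaging' ones appear first."""
    return sorted(candidates, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("calm_news", 0.30, 0.05, 20.0),
        Post("outrage_clip", 0.25, 0.40, 45.0),  # provokes reshares, so it scores higher
    ])
    print([p.post_id for p in feed])  # ['outrage_clip', 'calm_news']
```

Under these assumed weights, content that provokes reshares outranks calmer content even when it is no more likely to be "liked," which is the kind of dynamic the reporting below describes.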
Mass personalization refers to psychological targeting carried out at scale (Hermann, 2022, p. 5).
Social media platforms are incredibly proficient at holding users' attention with hyper-personalized content.
A single “like” can be enough to infer psychological traits, enabling actors to send targeted messages that significantly affect user behavior (Matz et al., 2017, p. 12717).
Tech companies can enable advertisers to send microtargeted messages based on algorithmically determined personality profiles (Kozyreva et al., 2020, p. 118).
YouTube recommendations have been shown to lead people to more extremist content (Kozyreva et al., 2020, p. 115).
In 2018, in an effort to address declining engagement on the platform, Facebook reconfigured its recommendation algorithms, which led outrageous and sensationalized content to go viral at higher rates (Hagey & Horwitz, 2021). Internal documents stated, "Misinformation, toxicity, and violent content are inordinately prevalent among reshares" (Hagey & Horwitz, 2021).
Researchers have shown that viewing 20 widely shared videos sowing doubt about election systems can retrain TikTok’s algorithm so that it pushes more “election disinformation, polarizing content, far-right extremism, QAnon conspiracy theories and false Covid-19 narratives,” even when users search with neutral terms (Hsu, 2022).
The terms "filter bubbles" and "echo chambers" are often used interchangeably; both generally refer to environments in which users encounter mainly information and opinions that reinforce their existing beliefs (Kozyreva et al., 2020, p. 118).
While algorithms do not necessarily create these situations on social media, they do reinforce user choices, which tends to steer users toward more extreme, polarizing content that is more likely to contain misinformation (Hameleers et al., 2022; Kozyreva et al., 2020).
Even though some studies show that algorithmic recommendations push only a minority of users toward hyper-extremist content, the sheer number of people who use these platforms means that millions are still led toward extremist content through algorithmic recommendations (Brown et al., 2022).
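The reinforcement dynamic described above can also be sketched in a few lines of hypothetical Python: a recommender that boosts whatever a user engages with will, after only a handful of interactions, drift toward showing mostly that kind of content. The topics, the update rule, and the numbers here are invented for illustration and do not model any real platform.

```python
# Purely illustrative toy model of how reinforcing user choices can narrow a feed.
# Topics, probabilities, and the update rule are hypothetical.

import random


def recommend(interest_weights: dict[str, float]) -> str:
    """Pick a topic in proportion to the user's estimated interest weights."""
    topics = list(interest_weights)
    return random.choices(topics, weights=[interest_weights[t] for t in topics])[0]


def update_on_click(interest_weights: dict[str, float], clicked_topic: str, boost: float = 1.5) -> None:
    """Reinforce whatever the user engaged with, making it more likely to be shown again."""
    interest_weights[clicked_topic] *= boost


if __name__ == "__main__":
    random.seed(0)
    weights = {"sports": 1.0, "cooking": 1.0, "conspiracy": 1.0}  # starts out neutral
    for _ in range(30):
        topic = recommend(weights)
        if topic == "conspiracy":          # suppose the user clicks only this content
            update_on_click(weights, topic)
    total = sum(weights.values())
    print({t: round(w / total, 2) for t, w in weights.items()})
    # After a few clicks, 'conspiracy' dominates what the toy recommender will show next.
```

In this sketch nothing "creates" the narrowed feed except the user's own clicks, yet the feedback loop amplifies them, which mirrors the point that algorithms reinforce user choices rather than originate them.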