Social media algorithms are designed to keep users engaged and coming back for more. But what happens when the algorithms start to dictate our mental health?
We’re all familiar with FOMO (fear of missing out). We scroll through our feeds, see what everyone else is posting, and feel the need to keep up by posting something just as worthy of attention. And then there’s SMH (shaking my head): the reaction we have when something makes us cringe, laugh, or cry, but we have no way to express it other than scrolling past without commenting.
But there’s also another phenomenon that social media has brought into our lives: FOBO (fear of being offline). It's a fear of not being connected—a fear that manifests itself in compulsive checking of Facebook, Instagram, Twitter, Snapchat… the list goes on. This is often referred to as "social media addiction." It's all too easy for these addictions to become self-perpetuating cycles: when we're feeling down or depressed, we turn to social media hoping that seeing other people's happy lives will make us feel better. But social media is a poor substitute for real human connection, and it's only a temporary distraction from our problems.
Are social algorithms manipulative?
We've all heard the news. Facebook has been using its algorithms to manipulate our emotions, and it's not great. But is there anything we can do about it?
First off, we must acknowledge that a lot of this is out of our control. We're all just trying to find what makes us feel good, and social media sites are going to try their best to show us posts that will make us feel good—or at least keep us on their site longer. That's the way business works!
What are they showing us?
While social media algorithms are in no way designed to highlight the issues of the day, it seems that they've done exactly that. Let's call it collateral damage. It's by no means a conspiracy, merely a consequence of the world that we have designed.
The “black box problem”
The "black box dilemma," as described by Kelley Cotter, assistant professor in Penn State University's College of Information Sciences and Technology, is the feeling that what little we do know about how social media algorithms operate is dwarfed by how much we don't.
According to Cotter, these social networks purposely hide their algorithms' inner workings to safeguard proprietary technology and deflect potential regulatory scrutiny. The platforms have offered brief explanations of why certain content appears in your feed, though they are precisely what you would expect: high levels of engagement (comments, likes, shares, and other interactions) make videos and photos more likely to go viral. But according to Cotter, these explanations mostly read as PR stunts.
“A lot of it also is made up of rationales,” Cotter says. “So not just, ‘This is what the algorithm does,’ but ‘It does this because we want X to happen.’ Usually, it’s like, ‘We want to make sure that you’re seeing the things that you care about or you’re making real connections with people.’ So, it’s a lot of couching of the information in these lofty goals that they have.”
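To make that concrete, here is a minimal, purely hypothetical sketch of the kind of engagement-weighted ranking the platforms' public explanations describe. None of the signal names or weights come from any real platform; no company publishes its actual formula.

```python
# Hypothetical illustration only: no platform discloses its real ranking
# formula. This just mirrors the shape of the public explanations: content
# with more engagement scores higher and surfaces more often.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Invented weights: comments and shares are assumed to signal
    # stronger interest than a passive like.
    return 1.0 * post.likes + 3.0 * post.comments + 5.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-engagement content first, so popular posts keep compounding.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("a", likes=120, comments=4, shares=1),
    Post("b", likes=30, comments=25, shares=10),
])
print([p.post_id for p in feed])  # "b" outranks "a" despite fewer likes
```

The point is the shape of the logic, not the numbers: whatever already attracts interaction gets ranked higher, which in turn attracts more interaction.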
When it comes to social networking algorithms, TikTok's has emerged as the one to take seriously.
In the company's first-quarter earnings call, Meta CEO Mark Zuckerberg disclosed that Facebook's and Instagram's feeds would include more content from accounts users don't follow but might find interesting. It's blatantly an effort to compete more effectively with TikTok's For You page, an unending scroll of discoverability that has been a key driver of the platform's growth, and a source of mystery.
TikTok's algorithm is widely regarded as almost too effective. Since TikTok shot to fame during the pandemic, fans and creators have been trying to figure out what makes the crucial For You page so good at forecasting which videos will go viral. The New York Times was able to verify documents from TikTok's engineering team in Beijing that explained the algorithm's operation to non-technical staff. According to a computer scientist who analyzed the documents for the Times, TikTok's recommendation engine is "perfectly acceptable, but typical stuff," and the platform's real advantage lies in its vast amounts of data and a framework designed around recommended content.
An addiction to the algorithm
Social media algorithms are created with retention in mind: the more engaged users are, the more money the platform makes. For some people, spending hours scrolling through social media mostly leaves them feeling bad about wasting their time. For others, being drawn in that way can harm their mental health. Studies have found that both youth and adults who use social media frequently experience higher levels of anxiety and depression.
Dr. Nina Vasan, a psychiatrist, has seen it firsthand. As the founder of Brainstorm, Stanford's academic lab devoted to mental health innovation, she works to help social media platforms lessen their impact. For instance, Brainstorm and Pinterest collaborated on a "compassionate search" experience that offers guided activities to users who search for themes like stress quotes or work anxiety, to help them feel better. Vasan and her team also worked with Pinterest's developers to stop potentially hazardous or triggering content from auto-filling in search results or being recommended.
Breaking the habit of constant (and thoughtless) social media scrolling is one of the biggest problems Vasan is working to solve.
“The algorithms have been developed to keep us online. The problem is that there’s no ability to pause and think. Time just basically goes away,” Vasan says. “We need to think about how we can break the cycle and look at something else, take a breath.”
One issue with these algorithms is their tendency to show you more of what you enjoy. That sounds fantastic in theory, but it can mean that posts from the same brands or people dominate your feed, crowding out a wider variety of content and viewpoints. It also means that, on occasion, you should think twice before engaging, especially on platforms like Instagram, because every like and comment tells the algorithm to serve you more of the same.
That is how the algorithm operates.
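To see why, consider a small, hypothetical simulation (all account names, weights, and boost factors below are invented): each time you engage with a creator, the algorithm weights that creator more heavily, so a handful of accounts can gradually crowd out everything else.

```python
# Hypothetical feedback-loop simulation (all values invented): engaging
# with a creator raises their weight, which wins them more feed slots,
# which invites more engagement. Variety shrinks over time.

import random

weights = {"brand_a": 1.0, "brand_b": 1.0, "friend_c": 1.0, "news_d": 1.0}

def pick_post() -> str:
    # The feed samples creators in proportion to their current weight.
    creators = list(weights)
    return random.choices(creators, [weights[c] for c in creators])[0]

random.seed(7)
for _ in range(200):
    creator = pick_post()
    if creator == "brand_a":      # suppose you reliably like one account
        weights[creator] *= 1.05  # each like compounds its advantage

total = sum(weights.values())
for creator, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{creator}: {w / total:.0%} of your feed")
```

Even a modest per-like boost compounds quickly, which is why a feed can narrow around the same few brands or people.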
What do we think?
As marketers, we're often painted in a negative light. We are the ones who interrupt mindless scrolling with an ad, after all. That said, we do have a responsibility to our audiences: to place their well-being above pushy tactics designed to ignite anxiety and fear. We must be mindful of the power we have and how we wield it. If a brand is going to use social media to create fear or anxiety to sell a product, it should at least take responsibility for the fallout.