Is it Anxiety? Or Simply Facebook? A Look at Psychological Manipulation by Social Media

Photo courtesy of The Social Dilemma

We have all heard the well-worn complaints that large tech companies are selling our data. More than 28% of users were unaware when they joined Facebook that companies like it monetize personal information, and the practice makes 51% of Facebook users uncomfortable, according to the Pew Research Center.

Some of you may be wondering, so what?

Is it that much of a security issue to receive more ads for hanging wall plants or cat toys? What are the major consequences of Facebook selling your preferences to a company and building a profile to optimize your ads?

There are consequences, and according to research from Business Insider they are much graver than persuading you to buy an apple slicer.

The profiles social media companies build are so detailed that they reach the deepest levels of our subconscious. These systems track our hand and eye movements to identify deeply embedded emotions that trigger stress responses. Understanding and playing on those triggers is one of the main ways these companies induce addiction in their users.

With this more nuanced description of “profile building” in mind, let’s look at an example of a “trigger response” used by Instagram, as reported by the BBC.

Let’s say you are in your mid-twenties and job hunting. Instagram infers from your search history and saved posts that you are looking for work, and it begins to track your interactions more closely. The algorithm notices that your hand hesitates before scrolling past posts related to your friends’ careers. It notices that you pause 0.001 seconds longer than usual before liking a friend’s post tagged “adulting” and “new job.”
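To make the idea concrete, the kind of signal described above can be sketched as a toy dwell-time model. Everything here is illustrative: the field names, thresholds, and scoring are assumptions for the sake of the example, not Instagram's actual code.

```python
# Toy sketch of dwell-time-based interest detection (illustrative only;
# all names and thresholds are invented, not any platform's real system).

from dataclasses import dataclass

@dataclass
class PostView:
    hashtags: list          # hashtags on the viewed post
    dwell_seconds: float    # how long the user lingered before scrolling

def flag_sensitive_topics(views, baseline_dwell, margin=0.001):
    """Count unusually long pauses per hashtag to guess which topics
    carry emotional weight for this user."""
    counts = {}
    for view in views:
        if view.dwell_seconds > baseline_dwell + margin:
            for tag in view.hashtags:
                counts[tag] = counts.get(tag, 0) + 1
    return counts

views = [
    PostView(["adulting", "newjob"], 2.44),
    PostView(["catsofinstagram"], 1.20),
    PostView(["newjob"], 2.45),
]
# With a 2.43-second baseline, the two job-related views stand out.
print(flag_sensitive_topics(views, baseline_dwell=2.43))
```

Even a crude counter like this, fed millions of views, would surface which hashtags a given user lingers over.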

Instagram notices, it learns, and it feeds you material to keep you addicted. 

“Social media might be fueling the increase in mental illness, as Gen Z is the first truly digital generation.”

Instagram now knows that certain content taps a deeply embedded emotion: your current anxiety is linked to finding a job. The algorithm has its hook; its job is to find the best way to exploit that anxiety and generate more clicks.

Contrary to first intuition, monetizing this finding does not mean serving more internship-related ads. Rather, every time you reload Instagram, it will scan your friends’ posts and prioritize those from friends posting about being “productive” or “young professionals.” This leaves you more aware of your worry and cultivates a deeper desire to stay on the app, quelling that anxiety through more scrolling.
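The re-prioritization described above amounts to boosting posts that match a detected anxiety topic. A minimal sketch might look like this; the scoring weights and field names are hypothetical, not any platform's real ranking algorithm:

```python
# Toy feed ranker that boosts posts matching a user's detected anxiety
# topics (hypothetical weights; purely illustrative).

def rank_feed(posts, anxiety_topics, boost=2.0):
    """Sort posts by a score: base engagement, multiplied when the
    post touches a topic the user is anxious about."""
    def score(post):
        base = post["likes"] + 2 * post["comments"]
        if any(tag in anxiety_topics for tag in post["hashtags"]):
            base *= boost   # anxiety-adjacent content floats to the top
        return base
    return sorted(posts, key=score, reverse=True)

feed = [
    {"id": "cat_video", "likes": 90, "comments": 5,  "hashtags": ["cats"]},
    {"id": "new_job",   "likes": 40, "comments": 10, "hashtags": ["newjob"]},
    {"id": "brunch",    "likes": 60, "comments": 2,  "hashtags": ["food"]},
]
ranked = rank_feed(feed, anxiety_topics={"newjob", "adulting"})
print([p["id"] for p in ranked])
```

Note that the job post outranks a far more popular cat video once the boost applies: the ranker does not need to know *why* you linger, only that you do.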

“Since 2014, millennials (or people who turned 23 to 38 in 2019) have seen a 47% increase in major-depression diagnoses.”

It’s no coincidence that people who are anxious are 44 times more likely to spend more than 5 hours on their screens daily. The same research also found that ads placed between anxiety-inducing content are more likely to be clicked or viewed because of the viewer’s “heightening state of mind and vulnerability,” so everyone behind the screen wins.

Pew Research Center found that 45% of teens aged 13 to 17 say they use the internet “almost constantly,” and Gen Z teenagers told Business Insider that constant social-media use was driving a longing for interpersonal connection.


Combating social media companies’ ability to psychologically manipulate us is a formidable task. Is it fair that we have to tackle it alone? The answer is a bit complicated, but in short: no.

A wide array of products already have regulations in place to protect vulnerable people. From tobacco to alcohol, both the government and the regulated companies themselves recognize that these products can cause severe harm to underdeveloped brains. Similarly, a host of research links eating disorders, isolation, depression, and anxiety to certain content shown to young viewers on social media.

There are ways to put restrictions in place, but the proposals will likely have to come from the social media companies themselves. The average age on the congressional committees that would regulate this technology is 62, and even with younger members, it takes a fairly sophisticated understanding of technology to craft realistic content regulations for young users.

This creates a tough dilemma for Silicon Valley: the people making money from these algorithms are the same people with the expertise needed to regulate them. Many will argue that we live in a free, capitalist society and that these companies have no duty to self-regulate, but times have changed and new evidence has arisen.

Doctors recommended cigarettes to patients in the 1950s to ease joint pain; now that we know the product is harmful, we have regulations protecting vulnerable users. Five years ago we did not understand the harm social media does to the younger generation, because the world needed time to identify the patterns it was causing. We would never hand a 5-year-old a cigarette, because we know it is harmful. Similarly, we need to start taking the evidence seriously and restricting certain content from younger viewers. And that limit must be created by those who understand the technology best: its creators.