Social Media Creates Negative Feedback Loops Driving Mental Illness

by Leroy on March 3rd, 2023. Last updated March 4th, 2023.

I now understand the anti-television arguments from the 1950s and '60s. Those people saw the screen's power to capture attention and glimpsed a very dark future. But I actually think that darkness was held back by one simple fact of television: everything shown on the screen was broadcast. Every television program and advertisement is broadcast to an audience of more than one person, even if that audience is just a bunch of individuals on the couch. It is this fundamental barrier that modern social media apps have broken. These apps show videos and advertisements directly to individuals just like television; however, there is no broadcast and thus no barrier to individual feedback. These apps are designed to mine an individual user's engagement and adapt immediately, showing content which exploits their unconscious to keep them scrolling. This feedback loop is the root cause of social-media-borne mental illness.

How Television Entices Us

Early television advertisements were short and often limited by the technology of the time. One advertisement from 1941 promoted watches, was 10 seconds long, and aired before an entire baseball game [1]. Because broadcast television was new at the time, most television programs were sandwiched between advertisements, one at the beginning and one at the end. By the 1960s, television was reaching over 90% of American homes [2] and advertising certainly followed. An episode of the Dick Van Dyke Show from 1963 had four advertising slots out of nine broadcast segments [3]. What this really means is that television programs were already being designed around advertisements first and foremost. The structure of these shows was built so that each episode could be broadcast around advertisements, not the other way around.

Debates over the effects of violent television on its viewers were roaring at this time. This was the era of the Vietnam War, high-profile political assassinations, racial division, and social unrest -- and it was all being broadcast on television. Just getting people to look was, and remains, the main interest of television networks, funded by advertisements. What better way to get you to look than to show you the abyss? Audiences were certainly captivated by watching Neil Armstrong step onto the Moon. But audiences responded just as strongly to coverage of the atrocities of the Vietnam War, the protests against the war at universities, and events like the assassination of John F. Kennedy. The broadcast networks and cinema companies funded violent shows and movies to try to profit off this kind of audience response. Stories of drama, war, and evil are well known, but their visual depiction is a completely different stimulus. Humans simply respond more to negative than to positive stimuli [4], and that most certainly includes the videos and images shown to us on television. But "negative stimuli" doesn't need to be restricted to wanton violence on television shows. What about anxiety? Loneliness? What happens when we start depicting that visually?

The Fundamental Barrier of Television Broadcasts

While the modern television format has existed for more than 60 years, there's a lot of evidence that audiences are shrinking [5]. Many have argued that television isn't novel anymore and that there are far more enticing alternatives: social media, video games, streaming services, the Internet in general, and much else. While there is merit to these arguments, I think there's a fundamental truth about television broadcasting that cannot be overcome and that limits its hold on our attention: television is broadcast. Television broadcasts are one-way, fire-and-forget. When you change the channel, the networks have no idea. They know people are changing channels, tuning in without watching, or turning off their televisions, but they can't collect that information directly. In fact, television networks must work around this limitation by using other channels to get feedback. A famous example is Nielsen [6], which sends out surveys with dollar bills included to entice individuals to respond. But that's just audience information, information en masse. They cannot gain information about an individual in that audience. I don't think the television networks are losing their grip on the American audience accidentally; I think their hand was forced by this limitation.

As television audiences have widened and their interests have become more diverse, the television networks have not been able to keep up with social media. Television's feedback loop is too slow. The networks are incapable of getting hyper-specific and filling out niches the way the Internet does countless times over, because audiences cannot seek those niches out; everything is broadcast to them. But this barrier is also a form of protection for the audience.

I don't think the television networks care about their audience; they think of their audience as an unfortunate side-effect of the business of putting advertisements in front of eyes. The fewer the eyes, the less the networks care. We can see evidence of this every night, when the audience has no expectation of quality programming: every late-night ad-roll, every three-easy-payments-of-$19.99 commercial, every hour-long advertisement-masquerading-as-show, all played from 3am until the minute before the morning show comes on. We can also see evidence during the day, where audiences are large enough that the networks show something real, but small enough that the networks will risk offending them: Seinfeld reruns, and likely many other reruns, are sped up to fit in more advertisements [7]. The television networks want to find any way of fitting in more advertisements, but this is in direct opposition to most of their audience. The networks would like to be more direct and respond to the individuals in their audience in order to sell more advertisements, but the modern audience is already numb to wanton violence, pornography, and the kind of shock advertisement which teases your brain into looking. They would like to dig deeper but can't, and instead must tailor their broadcasts to the greatest number of people. Because of the barrier between a television network and its audience, the networks cannot tap into the individual human psyche. Modern social media apps can, and do.

The Barrier Has Been Broken

Social media has evolved into a form of television. The term "social media" is rather archaic given how quickly technology and culture evolve on the Internet. The term once described websites for exchanging words with one another. Websites like MySpace, Facebook, Twitter, Tumblr, Reddit, and all of those old forums fit this definition. Now, individuals are rarely exchanging words on social media; at best a post gets a small comment section. "Social media" as a term now means apps for consuming videos or images. Social media is now defined by apps like Instagram and TikTok.

The barrier between the audience and the network doesn't exist for modern social media apps. In fact, social media apps have direct feedback from the individuals using them. This level of feedback took some time for the social media companies to understand, not too different from the 20-year gap between the 10-second advertisement and the advertising slots in the Dick Van Dyke Show. But they certainly understand it now and have built systems to take advantage of it. Why have apps turned from collections of buttons, pages, and menus into a single screen with a giant feed? Because your feedback directly drives what the feed produces next. The social media companies have realized they can directly exploit your impulses and behavior to keep your attention [8]. With television, the networks were far removed from the individual. With social media apps, the individual responds directly and immediately. These apps aren't trying to tie advertisements into the wants and needs of an individual. Instead, they are exploiting the unconscious part of the human psyche -- the part of your brain that simply responds to the negative -- to keep you looking. Worse yet, they have successfully marketed this psychically-mined data as yours.

Of course, that's the main rebuttal I hear when I talk to individuals about this level of direct feedback: these social media apps have become theirs. "I love my cat videos," one said. Many I've spoken to believe they have transcended the app's clutches: "TikTok is a learning app for me, all my videos teach me something." Perhaps the most insidious phrase, and I've heard it from a not-insignificant number of people, is "it's my algorithm."

"The algorithm" is the name given to the invisible machine that produces what comes up next in the feed. In all of those previous cases of people I talked to, the algorithm has trapped them without them even realizing. Their feed feels personal but the truth is that the algorithm isn't personalized to anyone. In fact, the algorithm isn't personal at all. It is just a big machine which accepts inputs is so arbitrary that it cannot possibly learn about you but instead learns how you use it. "The algorithm" is a feedback machine that wants to learn how you use it so it can keep your attention. The algorithm is designed to brute-force your attention by exploiting your behavior. It simply exists to use you, not provide you with anything. It has been said repeatedly, but bears to be repeated again: you are the product of these social media apps.

Modern Social Media Apps are Brute-Forcing Your Attention

When I talk about "direct feedback" I'm not thinking of angry letters sent to company offices or comment-section rants. So what is feedback for these social media apps? Feedback is every input you make to the app. Every input. Many people, especially those less tech-savvy, never consider that every tap, scroll, or press on their phone screen is being tracked, catalogued, and used as input. Most see the widgets and interactable features of the app, such as the comment section or the like button, as inputs, but those are merely a part. No, feedback comprises every single interaction one makes with these apps. Every casual thumb-press on your phone screen is an explicit input to the algorithm regardless of how implicit it may feel. Any slight, and I do mean slight, indicator of increased attention will be noticed and acted on. Did you open the comment section at all? Comment anything? Did you scroll far into the comments? Did your thumb touch the screen in a different spot rather than the usual thumb-scroll of disinterest? Did you press and hold before swiping because you wanted to see just a bit more? Did you scroll back in the feed? Yes to any of those? Ok, let's show more of that.
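If it helps to see what "every input" means, here is what a few seconds of ordinary scrolling might look like as client-side telemetry. The field names are hypothetical -- I'm not quoting any real app's schema -- but the shape is the point: every gesture becomes a structured record, not just the likes and comments.

    from dataclasses import dataclass, field
    import time

    @dataclass
    class InteractionEvent:
        # One row of hypothetical client-side telemetry. Field names are invented
        # for illustration; the point is that every gesture, not just likes and
        # comments, becomes structured input.
        video_id: str
        event_type: str            # "impression", "press_hold", "comment_open", ...
        dwell_ms: int = 0          # how long the video stayed on screen
        touch_y: float = 0.0       # where the thumb landed, normalized 0..1
        scroll_back: bool = False  # did the user return to this video?
        timestamp: float = field(default_factory=time.time)

    # A few seconds of ordinary scrolling already produces a detailed trace:
    session = [
        InteractionEvent("v1", "impression", dwell_ms=900),                # flicked past
        InteractionEvent("v2", "impression", dwell_ms=6400, touch_y=0.7),  # lingered, odd thumb spot
        InteractionEvent("v2", "press_hold", dwell_ms=1200),               # wanted one more look
        InteractionEvent("v2", "comment_open"),                            # opened the comments
        InteractionEvent("v1", "impression", dwell_ms=300, scroll_back=True),
    ]

None of those records required the person to do anything they would call an action. They just watched, held, and scrolled.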

It may seem innocuous; most of those things one does without even thinking. But how many things have you given unwanted attention without thinking first? I have given my attention to all sorts of things I didn't want to see: car crashes, poorly-behaved children, rants from questionable people, personal drama spilling out publicly, street fights, and much else. All of these dark things somehow capture my attention before I can rationally pull it away. These apps know, probabilistically based on your inputs, that you will look if they send you this video over that video. The problem starts when those videos start having an effect on you because of their negativity. Again, humans respond more to negative stimuli, and that certainly includes negative feedback. People find themselves stuck in negative feedback loops caused by their own thoughts [9]. Now we are creating negative feedback loops with visuals too? How much stronger an effect does that have?

Modern social media apps have cleverly designed a feedback loop with a human right in the middle. The person is given a stimulus, often a video, and responds directly using the app. The app uses the person's response and sends out a new stimulus immediately. Because the person may have simply looked longer at something negative (negative relative to that person), the app responds with a stimulus of that kind again. The person may have done this unconsciously, but that doesn't matter; the algorithm has noticed.
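The loop is simple enough to simulate. The toy below (all numbers invented, no real data) gives the "user" nothing but a mild negativity bias -- negative videos hold the eye a little longer -- and gives the feed nothing but a dwell-time objective. The feed drifts toward negative content without either side ever choosing it.

    import random

    def simulated_dwell(negativity):
        # The viewer never "chooses" anything; more negative items simply
        # hold the eye a little longer (a stand-in for negativity bias).
        return random.gauss(2.0 + 3.0 * negativity, 0.5)

    negativity_weight = 0.0   # the feed's single learned parameter
    learning_rate = 0.05
    baseline_dwell = 2.5      # what the feed treats as "just scrolling past"

    for step in range(200):
        # 1. Stimulus: show the candidate the current weight predicts will hold
        #    attention, plus a little exploration noise so the feed tries things.
        candidates = [random.random() for _ in range(10)]   # each candidate's negativity, 0..1
        shown = max(candidates, key=lambda neg: negativity_weight * neg + 0.1 * random.random())

        # 2. Response: observe how long the viewer lingered, nothing more.
        dwell = simulated_dwell(shown)

        # 3. Update: longer-than-baseline dwell reinforces whatever was just shown.
        negativity_weight += learning_rate * (dwell - baseline_dwell) * shown

    print(f"learned preference for negative content: {negativity_weight:.2f}")

Neither the simulated viewer nor the simulated feed contains a line that says "prefer negative content", yet that is where the loop settles, because the only signal exchanged is attention.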

It's no wonder social media causes mental illness [10] when social media apps are arbitrary machines using us directly, all with the sole purpose of keeping us looking. This level of direct feedback and response is the underlying exploitative mechanism driving the proliferation of mental illness. I certainly don't think we should be living with society-wide traps like this. We are letting these social media companies perform a brute-force hack on the human psyche, using millions of individual people, who are unaware, as training data. It seems to be working.

  1. Brief History on Television Advertising
  2. Television on Television Violence: Perspectives from the 70s and 90s
  3. Vintage Sponsor Spots
  4. Not all emotions are created equal: The negativity bias in social-emotional development
  5. Fading Ratings: A Special Report on TV’s Shrinking Audiences
  6. Nielsen Company History
  7. Giddy Up: “Seinfeld” Reruns Are Being Sped Up to Fit More Commercials
  8. An impulse to exploit: the behavioral turn in data-driven marketing
  9. Breaking the Cycle: Negative Thought Patterns
  10. Social Media is a Major Cause of the Mental Illness Epidemic in Teen Girls