
Bahl ’24: Facebook needs to treat users as people, not sources of revenue


A few months ago, a friend of mine logged into her Instagram account on my phone. Later that day, I found myself killing time by scrolling through social media, having forgotten that she was still logged in. With some confusion, I noticed that I was seeing post after post advertising the next 1,200-calorie diet, fat-burning workout or “full day of eating” — quite the departure from my usual content of niche TikTok trends and college student memes. I half-jokingly wondered if the algorithm was broken or glitching until it hit me that I wasn’t logged into my own account. The account immersing me so deeply in diet culture belonged to someone who had struggled with body image issues for years and was in recovery from an eating disorder.

Facebook, the parent company of Instagram, has recently come under fire for its platforms’ impact on female minors, spurred by whistleblower Frances Haugen, a former member of its civic integrity team. In a 2019 internal presentation to employees, Facebook acknowledged that it worsened body image issues for one in three teenage girls by directing eating disorder content to them despite knowing its negative impact on their mental and physical health. But, Haugen alleges, Facebook is making a conscious choice to prioritize the maximal revenue generation of its algorithms over the safety of those consuming its content. Facebook’s advice for mitigating the harm that social media causes, like deleting the app for a few days or scheduling screen breaks, addresses the symptom, not the disease — and is a way of deflecting responsibility for knowingly using an algorithm that harms users’ mental health.

Facebook is not the only social media company that has chosen an algorithm that doesn’t differentiate between positive and negative impacts when recommending content. The type of algorithm that generated this feed for my friend runs on data that, even with recent data protection legislation, remain unregulated and exploitable. These algorithms track likes, comments, follows and reposts — exactly what a user is doing every millisecond spent on the app — to make pinpointed decisions about what content to display. To a human, these data points are meaningless, but to a server with access to billions of them from millions of users, extracting patterns is easy. The result is an algorithm that maximizes engagement time and profit, not user experience and certainly not user health.
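
To make the mechanism concrete, here is a minimal sketch of what engagement-driven ranking could look like. It is emphatically not Instagram’s actual code; the topics, weights and function names are invented for illustration. What matters is the objective: sort by predicted engagement, and nothing else.

```python
# A toy sketch of engagement-driven feed ranking. All names, topics and
# weights are hypothetical illustrations, not Instagram's actual system.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

@dataclass
class Interaction:
    topic: str     # topic of the post the user engaged with
    weight: float  # e.g. a like counts for less than a comment or repost

def engagement_score(post: Post, history: list[Interaction]) -> float:
    """Predict engagement as the user's past engagement with the same topic."""
    return sum(i.weight for i in history if i.topic == post.topic)

def rank_feed(candidates: list[Post], history: list[Interaction]) -> list[Post]:
    # The only objective is predicted engagement. Nothing in this function
    # asks whether the content is good for the person seeing it.
    return sorted(candidates, key=lambda p: engagement_score(p, history),
                  reverse=True)

# A user who has repeatedly engaged with dieting content gets served more of it.
history = [Interaction("diet", 3.0), Interaction("diet", 1.0),
           Interaction("memes", 1.0)]
feed = rank_feed([Post("a", "diet"), Post("b", "memes"), Post("c", "news")],
                 history)
print([p.topic for p in feed])  # ['diet', 'memes', 'news']
```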

This algorithm also unites users who employ similar rhetoric or have similar interests, which is, at best, a well-intentioned goal with pernicious byproducts. In many cases, these newly connected communities are harmless or even positive: fans of the same TV show, students at the same school or gamers on the same platform. But if an algorithm recognizes similarities in rhetoric between users exhibiting symptoms of depression and begins recommending them each other’s content, the connection can be actively dangerous. Interacting exclusively with one another, without the moderation of a mental health professional, they could very feasibly enable each other’s destructive behaviors. That avenue for normalizing invisible, destructive behaviors can end up preventing mentally ill users from seeking professional help at all.
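
A toy version of that grouping logic might look like the following. Again, this is a hypothetical sketch rather than any platform’s real recommender: it connects users whose engagement patterns resemble each other, with no notion of whether the shared pattern is a TV fandom or a spiral of depressive content.

```python
# A toy sketch of similarity-based user matching; hypothetical, not any
# platform's real system.
import math

def cosine_similarity(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two users' topic-engagement vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def similar_users(user: dict[str, float],
                  others: dict[str, dict[str, float]],
                  threshold: float = 0.8) -> list[str]:
    # Connect users whose engagement patterns point the same way. The code
    # cannot tell a fan community from a cluster of users in distress.
    return [name for name, vec in others.items()
            if cosine_similarity(user, vec) >= threshold]

me = {"sad_quotes": 5.0, "late_night": 3.0}
others = {"user_a": {"sad_quotes": 4.0, "late_night": 2.0},
          "user_b": {"cooking": 6.0}}
print(similar_users(me, others))  # ['user_a']
```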


Facebook is aware that, by prioritizing engagement time and profit and by connecting users engaged in similar discourse, its software can damage its users’ mental health, particularly that of young women. But, like other social media companies, it treats its algorithm like a black box that can never be dissected or modified. By doing so, it has made harming users a systematic part of how it operates. Frankly, I find it beyond disconcerting to see a supposedly “intelligent” social media algorithm exacerbate the struggles of those dealing with mental illness — including my own loved ones. The time has come for big tech to ascribe value to its users’ mental well-being. The people scrolling deserve to be seen as more than just revenue.

Anika Bahl ’24 can be reached at anika_bahl@brown.edu. Please send responses to this opinion to letters@browndailyherald.com and other op-eds to opinions@browndailyherald.com.
