Don’t like social media and the stress it seems to cause? How much of the fault lies with social media companies and their opaque algorithms, and how much is your own?
That’s something that Wendy Moe, a marketing professor at the University of Maryland, has done some research into.
Social media companies make money from engagement, and to get it, they have to provoke reactions from users like you. So how much blame does a feed that agitates you deserve for drawing that reaction?
“What some of our research has shown is that human nature gravitates to content that creates anxiety. Sometimes, it gravitates toward content that causes a little bit of anger. And so if that’s the content that consumers … are dwelling on and watching, all the algorithms are doing is saying, ‘Hey, this customer really likes to watch this anger-inducing, anxiety-inducing content, so I’m just going to show them more of it,’” Moe said.
“There’s no malice involved. It’s just a function of how the algorithms are built,” she added.
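The loop Moe describes can be illustrated with a minimal, hypothetical sketch: the ranker has no notion of a post's emotional tone or of "malice" at all. It only records how long a user dwelled on each category of content and surfaces more of whatever accumulated the most watch time. (All names, categories, and numbers here are invented for illustration, not taken from any real platform.)

```python
from collections import defaultdict

def update_dwell_profile(profile, category, seconds_watched):
    """Accumulate watch time per content category for one user."""
    profile[category] += seconds_watched
    return profile

def rank_feed(candidates, profile):
    """Order candidate posts by the user's accumulated dwell time on
    each post's category -- engagement in, engagement out."""
    return sorted(candidates,
                  key=lambda post: profile[post["category"]],
                  reverse=True)

profile = defaultdict(float)
update_dwell_profile(profile, "outrage", 45.0)  # user lingered on an angry clip
update_dwell_profile(profile, "puppies", 3.0)   # scrolled quickly past a puppy

candidates = [{"id": 1, "category": "puppies"},
              {"id": 2, "category": "outrage"}]
print([p["id"] for p in rank_feed(candidates, profile)])  # -> [2, 1]
```

Because the user stared longer at the anger-inducing clip, the outrage post ranks first — exactly the "no malice, just how it's built" dynamic Moe describes.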
Moe used an analogy involving a slice of chocolate cake: If you simply offer someone a piece, they might turn it down out of concern for their health. But if you put that piece in front of them, it becomes a lot easier to give in to temptation.
“If you ask people what they want, they’re going to say, ‘I don’t want things that make me unhappy. I don’t want things that are violent. I don’t want all these things,’” she said. “But when the content is put out in front of them, human nature kind of takes over, and they stare at it. And that staring at it is what’s causing the algorithms to show them more of it.”
Moe is advocating for giving consumers more control over what the algorithms send their way, but she said eliminating tracking entirely would only fill feeds with irrelevant content you don’t enjoy or engage with either. Rather, she hopes companies will start giving users a way to tell their feeds that they liked a certain type of post and want more of it in the future.
“If we trust the consumer a little bit more to choose what they want to see more of and choose what they don’t want to see more of, outside of the algorithm, then we might be able to shape content a little bit better,” she argued.
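What Moe is proposing could look something like the following sketch, in which an explicit, user-set "more of this / less of this" preference outweighs the inferred dwell-time signal. This is a hypothetical illustration of her idea, not a description of any platform's actual controls; the weighting scheme is an assumption.

```python
def rank_with_user_controls(candidates, dwell_profile, user_prefs):
    """Rank posts so that a stated preference (user_prefs, -1.0 to 1.0)
    dominates the behavioral dwell-time signal -- a sketch of letting
    users choose 'outside of the algorithm,' as Moe puts it."""
    def score(post):
        cat = post["category"]
        inferred = dwell_profile.get(cat, 0.0)   # what the user watched
        stated = user_prefs.get(cat, 0.0)        # what the user asked for
        # A strong "less of this" (-1.0) flips the score negative,
        # overriding even heavy dwell time on that category.
        return inferred * (1.0 + 2.0 * stated)
    return sorted(candidates, key=score, reverse=True)

dwell = {"outrage": 45.0, "puppies": 3.0}        # behavior says: outrage
prefs = {"outrage": -1.0, "puppies": 1.0}        # user says: puppies, please
posts = [{"id": 1, "category": "outrage"},
         {"id": 2, "category": "puppies"}]
print([p["id"] for p in rank_with_user_controls(posts, dwell, prefs)])
```

Here the user dwelled far longer on outrage content, but their stated preference demotes it below the puppy post — the feed is shaped by choice rather than by staring.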
Moe is also a mother and said when her teens were a bit younger, she tried to “ruin the algorithm” with them from time to time.
“What I do in ‘ruin the algorithm’ is push them content and videos of happy, fluffy, cute, baby animals,” she said. “Randomly, they’ll get a video of a little bunny or a really cute, white puppy, and just them watching that video that I show them and interacting with it influences their algorithm to be a little bit more lighthearted.”
It helps, but she admits it’s not the best way to improve social media for people whose mental health is being affected by what shows up in their feeds.
“Some folks want to blame some of these companies that are building these algorithms and wanting them to design the algorithm so that it’s better,” Moe said. “And that worries me a little bit, because I don’t really know that I want some individual at Facebook or Instagram, or whatever other platform is doing this — I’m not sure I want some other individual telling me what I should and should not be looking at.”
“It would be great if I didn’t have to do that, and we can just have this interface where we choose the kinds of content we want to see,” she added. “Both for myself and maybe for parents.”
© 2025 WTOP. All Rights Reserved.