(Illustration: Nicole Xu for NPR)
The Super Bowl is over, so we are officially in Valentine’s Day mode over here. We’re going to get into a story we haven’t been able to stop thinking about for weeks: West Elm Caleb. Why was he dubbed one of the year’s first internet villains? And what does his saga tell us about race on the internet? Let’s dive in.
Here’s the backstory: In early January, a woman posted a TikTok about her frustration with being ghosted after dates. In the video, she specifies she’s in New York City, and her caption references a tall guy named Caleb. That’s about it.
This essay first appeared in NPR’s Code Switch newsletter. Subscribe to the newsletter so you don’t miss the next one, plus get recommendations on what to read, watch and listen to.
But this smattering of info, catalyzed by TikTok’s powerful recommendation algorithm, turned the video into a lighthouse for a swath of young, mostly white women in the NYC dating scene. In the comments and their own videos, about a dozen women shared stories about meeting a tall guy named Caleb on Hinge, who said he worked at West Elm as a furniture designer.
In front of a growing audience, women who said they had dated or were currently dating Caleb pieced together a character built out of red flags. The man who told them he had “just deleted Hinge” had apparently left one date early to go to another, they said. As the story spread, more and more people chimed in, analyzing Caleb’s behavior, trying to find their local equivalents and calling him everything from emotionally manipulative to a love bomber to a sociopath.
And so, as these things go, Caleb went from a local Hinge player to dating supervillain, the target of a massive, righteous internet investigation. An audience of millions helped uncover and blow up his personal information. Several brands made posts trying to dunk on Caleb and edge in on the attention: a non-West Elm furniture store, a rival dating app, Hellmann’s mayo for some reason.
It was around this time that the story jumped from TikTok trend to Internet News, with pieces bursting forth about how, actually, this punishment didn’t fit the crime. Caleb’s behavior, while certainly questionable-sounding, really didn’t seem outside the realm of normal (given the generally low standards for boys on the apps).
And while it’s good that many called out the overstep in reaction, it also made us think about the ways that these outrage cycles happen, and often get ignored, when people of color are involved. Mass harassment, character attacks and doxxing are all pretty par for the course for people who stumble into internet attention. But this case seemed to play out in a very specific way.
Which raises some questions: How exactly did Caleb get so much attention in the first place, and what kind of attention was it? While social media companies tend to stay tight-lipped about how their algorithms work, TikTok describes part of its recommendation system as “collaborative filtering” — a system that creates “personalized” recommendations by showing you what other users who like the same things as you also like (yeah, it’s the kind of thing that takes a few diagrams to explain).
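For readers who want the gist without the diagrams, here is a toy sketch of user-based collaborative filtering. This is an illustration of the general technique only, not TikTok’s actual system (whose details are proprietary); the users and video names are made up.

```python
# Toy sketch of user-based collaborative filtering -- NOT TikTok's
# real recommender. The idea: score videos you haven't seen by how
# much the users who liked them overlap with your own likes.

likes = {  # hypothetical users -> sets of videos they liked
    "ana":  {"dance1", "caleb1", "caleb2"},
    "bea":  {"dance1", "caleb1", "recipe1"},
    "cora": {"recipe1", "recipe2"},
}

def recommend(user, likes):
    """Rank unseen videos, weighting each other user's likes by
    their Jaccard similarity (shared likes / combined likes) to `user`."""
    seen = likes[user]
    scores = {}
    for other, their_likes in likes.items():
        if other == user:
            continue
        similarity = len(seen & their_likes) / len(seen | their_likes)
        for video in their_likes - seen:  # only recommend unseen videos
            scores[video] = scores.get(video, 0.0) + similarity
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ana", likes))  # videos liked by users most similar to "ana"
```

The bias the researchers describe falls out of this structure: the system never looks at what a video *is*, only at who engaged with it, so whatever patterns exist in audience behavior get amplified back to similar audiences.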
Marc Faddoul, an A.I. researcher who raised concerns about racial bias in TikTok’s recommendations, told BuzzFeed News that “Collaborative filtering may also reproduce whatever bias there is in people’s behavior. People who tend to like blonde teens tend to like a whole lot of other blonde teens.”
But viral zeitgeistiness isn’t just about the number of eyeballs on a story. It’s also about the type of audience attached to those eyeballs. Are the people watching you mostly looking for easy fun? Or is your audience saturated with content makers who make response videos or work for ad agencies or national news organizations?
On TikTok, there’s a flipside to the white stories that get the most mainstream attention — Black users have consistently had to fight for visibility and credit. Last year, creators flagged that terms like “Black Lives Matter” and “Black people” were seemingly being suppressed by automated moderation. The Black dancers and choreographers who consistently created the biggest dance trends on the Internet watched white users skyrocket to popularity by copying their work — to the point of a content strike, also last year.
Another recent, more sobering social media phenomenon also raised questions about the line between pre-programmed and live human editorial bias. The disappearance of Gabby Petito last year got national attention in part because it attracted a huge, algorithm-driven wave of internet sleuthing. That tragedy and its widespread coverage led to a renewed spotlight on “Missing White Woman Syndrome,” since many noted that the case stood in harsh contrast to the 710 Indigenous people who were reported missing in Wyoming in the past decade, who received relatively little media attention.
The more our lives become intertwined with tech and social media algorithms, the more it’s worth trying to understand and unpack just how those algorithms work. Who goes viral, and why? Who gets harassed, who gets defended, and what are the lasting repercussions? And how does the internet both obscure and exacerbate the racial and gender dynamics that already animate so much of our social interaction?
If you liked this excerpt from Code Switch, subscribe to our newsletter.