The Bluesky app logo is displayed on a smartphone. Image Credits: NurPhoto / Contributor / Getty Images

Bluesky at a crossroads as users petition to ban Jesse Singal over anti-trans views, harassment

Now with 25 million users, Bluesky is facing a test that will determine whether its platform will still be seen as a safe space and a refuge from the toxicity of X. In recent days, a large number of Bluesky users have been urging the company to ban one newcomer over his opinions and work shared both on and off the platform.

Writer and podcast host Jesse Singal joined Bluesky 12 days ago to the horror of much of the Bluesky community. Singal has been cataloged by GLAAD’s Accountability Project for his writings on transgender issues and other matters. The GLAAD project “catalogs anti-LGBTQ rhetoric and discriminatory actions of politicians, commentators, organization heads, religious leaders, and legal figures, who have used their platforms, influence, and power to spread misinformation and harm LGBTQ people.”

He is now the most blocked user on the social network, and user outrage over his presence on the platform is growing. People are demanding that Bluesky take a stand: It's either a place that promises not to host bad actors at all, or a place that promises only to limit bad actors' reach through its various moderation tools.

It cannot be both.

In Bluesky’s earlier days, marginalized groups, including trans, Black, and queer users, flocked to the social network from Twitter (now called X) after Elon Musk bought it and began to promote more right-wing and conservative views. Bluesky offers many custom tools that let people control their own social networking experience, including block lists, layered moderation, custom feeds, custom algorithms, labeling services, and more. These allow users to block groups of people they may not want to interact with, such as MAGA supporters, who can be found on a dedicated block list.

Many in the community who escaped Twitter now feel their Bluesky experience is at risk because Singal has joined. As a high-profile user, he brings a network of followers with him, which they say could lead to increased harassment.

Users can point to a wide range of Singal’s writings and opinions as evidence of his bad reputation, but for Bluesky to ban him, he would have to have specifically violated its terms of service and guidelines.

Some Bluesky users say he has done so: He has already engaged in targeted harassment on the platform, they say, has a long history of harassing trans people, and has shared private medical information without patients’ consent, a charge Singal disputes.

A new Change.org petition is circulating that argues Bluesky should simply ban Singal.

As of the time of writing, the petition asking Bluesky to enforce its guidelines already has over 18,000 signatures, including one from singer Lizzo.

Bluesky users have also reported Singal’s account en masse, leading the company to ban him, reinstate him, and then have its moderation service label his account as intolerant. (That means users can go into their Bluesky settings to hide posts in this category, show them, or choose to be warned about them.)

But many Bluesky users don’t want to just moderate and ignore Singal; they want him gone. It’s become a dealbreaker.

By keeping him, Bluesky risks harming the community, depleting its goodwill, and losing users, while also sending a signal to others that bad actors and harassers are welcome there.

But by banning Singal, Bluesky could come under attack from the next head of the FCC, Brendan Carr, who is ready to come after social networks he believes are suppressing conservative views. Whatever Bluesky does here will attract attention, for better or for worse.

This is not the first time Bluesky has faced user backlash over moderation issues. The startup strained its relationship with some Black users after failing to crack down sufficiently on hateful and violent rhetoric last year.

In February, the company hired a former Twitter/X trust and safety executive, Aaron Rodericks, to lead Bluesky’s trust and safety efforts and, hopefully, prevent further problems. Rodericks is now receiving a flood of messages from Bluesky users asking him to take action on Singal’s account.

Bluesky did not return requests for comment ahead of publication. Singal declined to comment.

After publication, the Bluesky Safety account shared the following statement on the platform:

In the past month alone, over 12M people joined Bluesky, with daily active users growing 10x. We received more reports in two days than in all of last year, and have integrated new systems and quadrupled our moderation team to try and keep up. Sometimes, we have to work quickly to correct errors.

We recently introduced a new system to detect impersonation. This tool automatically flagged some accounts that were taken down, then reinstated. Other prominent impersonation accounts were quickly removed, contributing to a false perception that we take action based on behavior outside of Bluesky.

The current controversies reveal weaknesses in the clarity of our Community Guidelines and failures in our communications. This has led to understandable confusion about how we interpret our guidelines. We need to be more transparent – without revealing to bad actors how to circumvent our systems.

Our growing moderation team needs clear evidence and well-defined rules to make decisions that can be consistently enforced, so they action accounts based primarily on content and behavior that appears in our app, where the context and authenticity can be directly verified.

We do not currently take action on accounts that share Bluesky screenshots with commentary, unless that commentary violates our Guidelines. We will take action when someone’s private information is shared without their consent, but only when it is personally identifiable and verifiable in-app.

Our moderation team continues to monitor and review all received reports, and will action accounts based on in-app violations of our Community Guidelines. We want our actions to be clear and consistent, recognizing that we have more work to do to make that happen.

We’ve been working on an improved version of our Community Guidelines that provides greater detail and more examples, and plan to release it early next year and solicit feedback from the community. As we grow, we will continue to learn and adjust.

Moderation decisions draw intense public scrutiny from many places. That’s why Bluesky’s architecture is designed to enable communities to control their online spaces, independent of the company or its leadership. We will continue to work on empowering people with more control over their experience.

We talked with Singal after publication of this article; he declined to comment. This article was also updated to include Bluesky’s statement after publication and to correct an error.
