Bluesky has experienced explosive growth in the past year, which has required the platform to step up its moderation efforts. In its recently released moderation report, Bluesky said its user base grew by about 23 million in 2024, rising from 2.9 million users to almost 26 million. Its moderators also received 17 times more user reports than in 2023—6.48 million in 2024, up from 358,000 the previous year.
The bulk of these reports related to "harassment, trolling, or intolerance," spam, and misleading content (including impersonation and misinformation). After accounts impersonating other people proliferated in the wake of Bluesky's surge in popularity, the platform took a "more aggressive" approach to dealing with impersonation and announced that it had quadrupled its team of moderators. The new report says Bluesky's moderation team has grown to around 100 people and is still hiring. "Some moderators specialize in particular policy areas, such as dedicated agents for child safety," it notes.
Other categories that Bluesky says generated a large number of reports include "illegal and urgent issues" and unwanted sexual content. There were also 726,000 reports tagged "other." Bluesky says it fulfilled 146 of the 238 requests it received last year from "law enforcement, governments, legal firms."
The platform plans to make some changes this year to the way it handles reports and appeals, which it says will "make it easier to communicate with users." These include giving users updates on the actions it has taken on content they have reported, and, later, allowing users to appeal takedown decisions directly within the app. In 2024, moderators took down 66,308 accounts, while automated systems removed 35,842 spam and bot profiles. "Looking ahead to 2025, we are investing in stronger proactive detection systems to complement user reporting, as a growing network needs multiple detection methods to rapidly identify and address harmful content," says Bluesky.