Prioritizing user safety and enhancing privacy controls, the social media platform Discord has rolled out a new feature called “Ignore.” Once activated, Discord says the feature allows users to hide new messages, direct messages (DMs), server notifications, profiles and activities from the ignored user.
Why ignore instead of block?
Discord already offers a block option, allowing users to cut off unwanted users for reasons such as privacy, safety, dislike of their content, or simply not wanting to interact with them. Blocking provides a permanent solution to these issues. However, Discord says it has received feedback from users, particularly teens, indicating that blocking someone doesn’t always solve the problem, and in some cases it can make things worse.
With “Ignore,” Discord has introduced a quieter option for avoiding confrontation while still managing privacy. When a user is added to the “Ignore” list, they are not notified that they have been ignored, allowing for discreet management of interactions.
How to activate Ignore on Discord?
To ignore a user, go to their profile and tap the triple-dot menu in the top right corner of the screen. This opens a dropdown menu; select “Ignore,” which is located just above the red-highlighted “Block” option. Once activated, the ignored user’s activities will be minimized in your Discord feed. However, the ignored user will still be able to view your profile.
In a blog post shared on February 10, Discord announced the rollout of its new “Ignore” feature. The company is also partnering with prominent organizations such as Google, OpenAI, Roblox, Eric Schmidt, the John S. and James L. Knight Foundation, AI Collaborative, the Patrick J. McGovern Foundation and the Project Liberty Institute. Together, they are launching a nonprofit foundation called Robust Open Online Safety Tools (ROOST).
ROOST will provide free, open-source tools aimed at improving internet safety with a focus on child protection and digital security. These user-friendly tools will assist in detecting, reviewing, and reporting child sexual abuse material (CSAM).