Instagram and Facebook should update nudity rules, says Meta Oversight Board

    Content creators have long criticized Facebook and Instagram for their content moderation policies related to photos that show partial nudity, arguing that their practices are inconsistent and often biased against women and LGBTQ people.

    This week, the Oversight Board of Meta, the platforms’ parent company, strongly recommended clarifying the guidelines for such photos after Instagram removed two posts showing non-binary and transgender people with bare chests.

    The posts were quickly reinstated after the couple appealed, and Meta’s Oversight Board overturned the original decision to remove them. It was the board’s first case directly involving gender non-conforming users.

    “The restrictions and exceptions to the female nipple rules are extensive and confusing, especially as they apply to transgender and non-binary people,” Meta’s Oversight Board said Tuesday in its summary of the case. “The ambiguity inherent in this policy creates uncertainty among users and reviewers and makes it unworkable in practice.”

    The issue arose when a transgender and non-binary couple posted photos of their bare chests, with their nipples covered, in 2021 and 2022. The captions included details of a fundraiser for one member of the couple to undergo top surgery, a gender-affirming procedure to flatten a person’s chest. Instagram removed the photos after other users reported them, saying the depiction of breasts violated the site’s Sexual Solicitation Community Standard. The couple appealed the decision and the photos were subsequently reinstated.

    The couple’s back and forth with Instagram underscored criticism that the platform’s guidelines for adult content are unclear. Under its community guidelines, Instagram bans nude photos but makes exceptions for a range of content types, including mental health posts, breastfeeding images, and other “health-related situations,” parameters the Oversight Board described in its summary as “complicated and ill-defined.”

    How to decide which images of people’s breasts should be allowed on social media platforms has long been a source of debate. Dozens of artists and activists argue there is a double standard in which posts showing women’s chests are removed while comparable posts of men are not. Advocates say the same is true for transgender and non-binary people.

    Meta’s Oversight Board, a body of 22 academics, journalists and human rights lawyers, is funded by Meta but operates independently, and its decisions are binding on the company. The group recommended that the platforms further clarify their adult nudity and sexual activity standard “so that all people are treated in a manner consistent with international human rights standards, without discrimination based on sex or gender”.

    It also called for “a comprehensive human rights impact assessment of such change, involving various stakeholders, and the development of a plan to address any harm identified.”

    Meta has 60 days to review the board’s recommendations, and a company spokesperson said the company would respond publicly to each of them by mid-March.