The Instagram algorithm maintains a double discourse. On the one hand, it favors posts by users wearing little clothing - photos of this type are more likely to appear in our feeds - and on the other hand, it censors nudity. These are among the findings of a study carried out by AlgorithmWatch, a German research and advocacy organization focused on algorithmic decision-making, in collaboration with the European Data Journalism Network.

Images that sexualize and objectify the body, showing more skin, are surfaced more frequently than other types of content. The algorithm aims to capture the user's attention easily so that they keep using the platform. The photographs shown tend to feature women in underwear or bathing suits and bare-chested men.

The study involved 26 volunteers who installed a plug-in that recorded their behavior on Instagram.

They were also asked to follow 37 content creators from 12 countries who used the social network to advertise their brands and reach new customers. Between February and May, the team then analyzed which types of posts appeared most prominently in the users' accounts.

What to do?

Consumers of social media content can do little to avoid algorithms. Still, reducing the sexualization and objectification of the body would require governments to commit to auditing tech companies' algorithms and ensuring that, instead of sexualizing the body to sell and hook users, platforms dignify consumers and empower their decision-making.

With information from El País

Translation: Valentina K. Yanes