An exclusive investigation reveals that Instagram prioritizes photos of scantily-clad men and women, shaping the behavior of content creators and the worldview of 140 million Europeans in what remains a blind spot of EU regulations.
Sarah is a food entrepreneur in a large European city (her name has been changed). The company she created helps women feel at ease with their food intake and advocates “intuitive eating”. Like many small-business owners, Sarah relies on social media to attract clients. Instagram, Europe’s second-largest social network after Facebook, is a marketing channel she could not do without, she said.
But on Instagram, which is heavily oriented towards photos and videos, she felt that her pictures did not reach many of her 53,000 followers unless she posed in swimwear. Indeed, four of her seven most-liked posts of the last few months showed her in a bikini. Ely Killeuse, a book author with 132,000 followers on Instagram who agreed to speak on the record, said that “almost all” of her most liked pictures showed her in underwear or bathing suits.
It could be that their audiences massively prefer to see Sarah and Ely in bathing suits. But since early 2016, Instagram has arranged the pictures in a user’s newsfeed so that the photos a user “cares about most will appear towards the top of the feed”. If the other pictures Sarah and Ely post are less popular, it could be that they are simply not shown to their followers as much.
Which photos are shown and which are not is not just a matter of taste. Entrepreneurs who rely on Instagram to acquire clients must adopt the norms the service encourages to reach their followers. Even if these norms do not reflect the values they built their businesses on, or those of their core audience and clients.
To understand what pictures Instagram prioritized, the European Data Journalism Network and AlgorithmWatch asked 26 volunteers to install a browser add-on and follow a selection of professional content creators. We selected 37 professionals from 12 countries (14 of them men) who use Instagram to advertise brands or to acquire new clients for their businesses, mostly in the food, travel, fitness, fashion or beauty sectors.
The add-on automatically opens the Instagram homepage at regular intervals and notes which posts appear at the top of the volunteers’ newsfeeds, providing an overview of what the platform considers most relevant to each volunteer.
If Instagram did not interfere with the newsfeed, the diversity of posts it shows users should match the diversity of the posts published by the content creators they follow. And if Instagram personalized each user’s newsfeed according to their personal tastes, the diversity of posts should be skewed in a different way for each user. This is not what we found.
Between February and May, we analyzed 1,737 posts published by the content creators we monitored, containing 2,400 photos. Of these posts, 362, or 21%, were recognized by a computer program as containing pictures of women in bikinis or underwear, or of bare-chested men. In the newsfeeds of our volunteers, however, posts with such pictures made up 30% of all posts shown from the same accounts (some posts were shown more than once).
Posts that contained pictures of women in underwear or bikinis were 54% more likely to appear in the newsfeeds of our volunteers. Posts containing pictures of bare-chested men were 28% more likely to be shown. By contrast, posts showing pictures of food or landscapes were about 60% less likely to appear in the newsfeed.
These results, which can be read in detail on a dedicated page, pass standard tests of statistical significance.
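As a simplified illustration of what "more likely to be shown" means, the over-representation of a category can be estimated by comparing its share among published posts with its share among posts that actually appeared in newsfeeds. This is a naive aggregate sketch, not the study's actual statistical model: the reported 54% figure presumably comes from finer-grained, per-post or per-volunteer data, while the crude aggregate ratio below yields about +43%.

```python
# Naive sketch of an over-representation estimate. The input figures
# are the aggregate shares reported in the article; the study's own
# tests likely work on finer-grained data, so this is illustrative only.

def over_representation(share_published, share_shown):
    """Relative change in a category's share between published posts
    and the posts that newsfeeds actually displayed."""
    return share_shown / share_published - 1

# Posts with semi-nude pictures: 21% of published posts, but 30% of
# the posts shown in volunteers' newsfeeds.
boost = over_representation(0.21, 0.30)
print(f"{boost:+.0%}")  # prints +43% -- a crude aggregate estimate
```

A value of 0 would mean the feed mirrors what creators publish, which is the "no interference" baseline described above.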
The skew towards nudity might not apply to all Instagram users. While it was consistent and apparent for most volunteers, a small minority were served posts that better reflected the diversity published by content creators. It is likely that Instagram’s algorithm favors nudity in general, but that personalization, or other factors, limits this effect for some users.
Our results fall short of a comprehensive audit of Instagram’s newsfeed algorithm. They only document what happened in the newsfeeds of our volunteers. (You can help us improve the results by installing the add-on; we will publish updates as more data comes in.) Without access to Facebook’s internal data and production servers, drawing definitive conclusions will remain impossible.
Facebook did not answer our precise questions but sent a statement: “This research is flawed in a number of ways and shows a misunderstanding of how Instagram works. We rank posts in your feed based on content and accounts you have shown an interest in, not on arbitrary factors like the presence of swimwear.”
We nevertheless have reasons to believe that our findings are representative of how Instagram generally operates.
In a patent published in 2015, engineers at Facebook, the company that runs Instagram, explained how the newsfeed could select which pictures to prioritize. When a user posts a picture, it is analyzed automatically on the spot, according to the patent. Pictures are given an “engagement metric”, which is used to decide whether or not to show an image in the user’s newsfeed.
The engagement metric is partly based on past user behavior. If a user liked a specific brand and a photo shows a product of the same brand, the engagement metric increases. But the engagement metric can also be computed based on past behavior from all users of the service. The patent specifically states that the gender, ethnicity and “state of undress” of people in a photo could be used to compute the engagement metric.
While Instagram claims that the newsfeed is organized according to what a given user “cares about most”, the company’s patent explains that it could actually be ranked according to what it thinks all users care about. Whether or not users see the pictures posted by the accounts they follow depends not only on their past behavior, but also on what Instagram believes is most engaging for other users of the platform.
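The ranking logic the patent describes can be sketched roughly as follows. Every name, weight, and data structure here is invented for illustration; Facebook's actual model is not public. The point is only that a per-user affinity signal and a global engagement score (which, per the patent, could draw on attributes such as "state of undress") are summed, so the global signal can outweigh the user's own interests.

```python
# Hypothetical sketch of the two-signal ranking described in the patent.
# All names, weights and attribute labels are invented for illustration.

def engagement_metric(post, user, global_scores):
    score = 0.0
    # Per-user signal: affinity with brands the user interacted with
    # before, as the patent describes.
    if post["brand"] in user["liked_brands"]:
        score += 1.0
    # Global signal: how engaging similar pictures were for ALL users.
    # The patent says attributes such as gender, ethnicity and
    # "state of undress" could feed into this score.
    score += global_scores.get(post["image_attributes"], 0.0)
    return score

user = {"liked_brands": {"acme"}}
global_scores = {"undress": 1.5, "landscape": 0.1}  # invented values
posts = [
    {"brand": "acme", "image_attributes": "landscape"},
    {"brand": "other", "image_attributes": "undress"},
]
ranked = sorted(posts, key=lambda p: engagement_metric(p, user, global_scores),
                reverse=True)
# With these invented weights, the semi-nude post from an unknown brand
# outranks the landscape post from a brand the user actually likes.
```

Under this toy weighting, personalization exists but is dominated by the platform-wide signal, which is the pattern our measurements are consistent with.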
Before its algorithm decides which pictures to show in a user’s newsfeed, Facebook automatically analyzes them with computer vision software. Such software draws automated inferences from a training data set made of thousands of manually annotated images. Its limitations could impact how Instagram prioritizes pictures in newsfeeds.
Computer scientists have known for years that such systems replicate and amplify the biases of their training data, leading to spurious correlations. For instance, a program tasked with telling wolves from dogs based on pictures found online will not recognize the animals in the human sense of the word. Because most wolf pictures in such a data set show snow, it may instead label any animal on a snowy background a “wolf”.
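The wolf example can be reproduced in a few lines with a toy one-feature classifier. The data and feature names below are invented: wolves appear on snow 90% of the time, so a classifier that only looks at the background still scores well on its training set while learning nothing about the animals themselves.

```python
# Toy demonstration of a spurious correlation. The data is invented:
# 90% of wolf photos have a snowy background, 90% of dog photos do not.

from collections import Counter

# Each training example: (has_snow_background, label)
training = [(True, "wolf")] * 45 + [(False, "wolf")] * 5 \
         + [(False, "dog")] * 45 + [(True, "dog")] * 5

def train_stump(data):
    """Pick the majority label for each value of the snow feature."""
    by_feature = {True: Counter(), False: Counter()}
    for has_snow, label in data:
        by_feature[has_snow][label] += 1
    return {k: c.most_common(1)[0][0] for k, c in by_feature.items()}

model = train_stump(training)
# The stump learned "snow => wolf" -- nothing about the animal.
# A husky photographed on snow is therefore misclassified as a wolf.
print(model[True])   # prints "wolf"
```

The model is 90% accurate on its own training data, which is precisely why such shortcuts go unnoticed until the system meets a dog on snow.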
Training data for computer vision is usually produced by poorly-paid workers with an incentive to work quickly and provide results that fit the expectations of their employers. This leads them to uncritically adopt the categories offered to them and to overlook the subtleties a photo might contain, wrote Agathe Balayn, a PhD candidate at the Delft University of Technology on the topic of bias in automated systems.
The consequences can be severe. In December, a Brazilian artist tried to advertise one of his Instagram posts. The request was denied on the grounds that the post contained violent content. It depicted nothing more than a boy and Formula One driver Lewis Hamilton. Both were dark-skinned. In April, a yoga teacher was denied an advertisement on the grounds that the picture showed profanity, even though she was only doing the side crane pose. She is Asian-American.
(In our experimental setup, we also used a computer vision system, Google Vision. While some of its results are egregious – the label “beauty”, for instance, was only ever returned for women – its biases are very likely similar to those of Facebook’s computer vision engine, if only because both were built in part by the same people.)
Instagram’s guidelines state that nudity is “not allowed” on the service, yet its algorithm favors posts that show skin. The subtle difference between what is encouraged and what is forbidden is decided by unaudited, and likely biased, computer vision algorithms. Every time they post a picture, content creators must walk a very fine line: revealing enough to reach their followers, but not so much that they get booted off the platform.
A 2019 survey of 128 Instagram users by the US magazine Salty showed that abusive removal of content was common. Just how common such occurrences are, and whether People of Color and women are disproportionately affected, is impossible to say as long as Instagram’s algorithms remain unaudited.
However, a review of 238 patents filed by Facebook containing the phrase “computer vision” showed that, of the 340 persons listed as inventors, only 27 were women. Male-dominated environments usually lead to outcomes that are detrimental to women. Car seat-belts, for instance, were long tested mainly on dummies modeled on male bodies, leading to higher rates of injury for women. Our research shows that Facebook’s algorithms could follow this pattern.
Sarah and other entrepreneurs who rely on Instagram were terrified to speak to the press. Most professional Instagram content creators fear retaliation from Facebook, in the form of account deletion or shadow-bans (a practice where a user’s posts are shown to none or very few of their followers, without the user’s knowledge) – a death sentence for their business.
A young entrepreneur with about 70,000 followers, who said that Instagram was “very important” for her business, specifically told AlgorithmWatch that she did not want to be named for fear of a shadow-ban. Ely Killeuse, who talked on the record, said that having another source of income was the “number one condition” for her. Too much dependence on Instagram would mean losing her freedom and her sanity, she added.
The General Data Protection Regulation (GDPR), which came into force in 2018, and the Platforms to Business (P2B) regulation, which will be applicable from 12 July 2020, already provide many guarantees for users and professionals. In particular, GDPR states that users have a “right to explanation” regarding automated decisions, and the P2B regulation shall force online intermediation services to disclose the “main parameters determining [algorithmic] ranking”.
This new measure will not force platforms to disclose the inner workings of their algorithms, according to Petra de Sutter, who chairs the Committee on the Internal Market and Consumer Protection at the European Parliament. Preventing Instagram from sorting its users’ newsfeeds would not be legally feasible, she wrote in an email to AlgorithmWatch. Instead, the transparency P2B will bring should allow for well-informed policy decisions at a later point, she added. As for fears of shadow-bans, Ms de Sutter considers them overblown. “A question never brought retaliation”, she wrote.
P2B might fare better, but two years after GDPR came into force, several experts lament its patchy implementation. One problem is that the Irish data protection authority, which is responsible for regulating Facebook’s Dublin-based European subsidiary, appears to be woefully understaffed and “does not seem to understand GDPR”, as a specialist in platform work put it to AlgorithmWatch. Another issue lies in the lack of policing. No authority, at the European level or within Member States, has the power or the tools needed to audit any of the giant platforms, including Instagram, leaving many of GDPR’s provisions unenforced.
While our results show that male and female content creators alike are pushed to show skin if they want to reach their audience, the effect could be larger for women, and could amount to discrimination against female entrepreneurs. However, although discrimination based on gender is prohibited by the Charter of Fundamental Rights of the European Union, no legal avenue exists for an Instagram user to bring proceedings. The specifics of social media entrepreneurship are not taken into account in legislation.
Miriam Kullmann, an assistant professor at WU Vienna University of Economics and Business, wrote to AlgorithmWatch that European anti-discrimination legislation deals almost exclusively with employment relationships. Self-employed persons, such as the professionals we monitored, are not protected.
Some groups do fight for the rights of independent creators on social media. IG Metall, Europe’s largest union, supports a collective action of YouTubers demanding more fairness and transparency from Google (which owns YouTube) when videos are demonetized. But it does not plan to extend this program to content creators on Instagram or other platforms.
With European entrepreneurs creating content for Instagram likely numbering in the thousands, the impact of their posts is massive: they routinely boast hundreds of thousands of followers. Facebook claims that close to 140 million residents of the European Union, or one in three, used Instagram in April.
Among the 18-to-24-year-olds, Instagram penetration is about 100% in every EU country. The stay-at-home orders linked to the Covid-19 pandemic increased the time spent on Instagram by staggering amounts. In one week in locked-down Italy, Instagram views doubled over normal, Facebook reported to investors.
Do you use Instagram professionally? Have you seen your posts or your account suspended, disabled or shadow-banned? We’d love to hear from you. Contact Nicolas Kayser-Bril securely at email@example.com or Signal +491702875332.
Almost a century ago, famed British author Virginia Woolf said that women needed “a room of one’s own” to allow for their creativity to flourish. Deferring to the opinion of external authorities, she wrote, was like inviting rot to develop at the heart of one’s work.
On Instagram, deferring to the opinion of the authorities that built the newsfeed algorithm is not a choice. Refusing to show body parts dramatically curtails one’s audience. Male and female entrepreneurs must abide by the rules set by Facebook’s engineers if they want to stand a chance of making a living.
This investigation was the result of a collaboration between EDJNet and AlgorithmWatch, coordinated by Nicolas Kayser-Bril.