There are thousands of distortion filters available on major social platforms, with names like La Belle, Natural Beauty, and Boss Babe. Even the goofy Big Mouth on Snapchat, one of social media’s most popular filters, has distortion effects.
In October 2019, Facebook banned distortion effects, citing “public debate about a potential negative impact.” Awareness of body dysmorphia was growing, and a filter called FixMe, which let users mark up their faces the way a cosmetic surgeon would, had drawn a wave of criticism for encouraging plastic surgery. But in August 2020, the effects were reinstated, along with a new policy banning filters that explicitly encouraged surgery. Effects that resize facial features, however, are still allowed. (When asked about the decision, a spokesperson referred me to Facebook press releases from that period.)
When the effects were reinstated, Rocha took a stand and began posting criticism of body shaming online. She pledged to stop using the warping effects herself unless they were clearly humorous or dramatic rather than beautifying, saying she didn’t want to “be responsible” for the ill effects of some filters on women: some, she says, have sought plastic surgery to make themselves look like their filtered selves.
“I wish I had a filter on right now”
Krista Crotty is a clinical education specialist at the Emily Program, a leading center for eating disorders and mental health based in St. Paul, Minnesota. Much of her work over the past five years has focused on teaching patients to consume media in healthier ways. She says that when patients present themselves differently online and in person, she sees an increase in anxiety. “People post information about themselves – whether it’s size, shape, weight, whatever – that doesn’t look anything like what they actually look like,” she says. “There’s a lot of anxiety between that authentic self and the digital self, because that’s not who you really are. You don’t look like the photos that have been filtered.”
For young people, who are still figuring out who they are, navigating between a digital and authentic self can be particularly complicated, and it’s unclear what the long-term consequences will be.
“Online identity is a bit like an artifact, almost,” says Claire Pescott, a researcher at the University of South Wales. “It’s kind of a projected image of yourself.”
Pescott’s observations of children led her to conclude that filters can have a positive impact on them. “They can kind of try out different personas,” she explains. “They have these ‘as if’ identities that they can change, and they can evolve with different groups.”
But she doubts that all young people can understand how filters affect their self-perception. And she worries about how social media platforms deliver instant validation in the form of likes and comments. Young girls, she says, have particular difficulty distinguishing between filtered photos and ordinary ones.
Pescott’s research also revealed that while children are now often educated about online behavior, they receive “very little education” about filters. Their safety training “was related to the overt physical dangers of social media, not the emotional and more nuanced side of social media,” she says, “which in my opinion is more dangerous.”
Bailenson expects that we can learn more about some of these emotional unknowns from established VR research. In virtual environments, people’s behavior changes with the physical characteristics of their avatar, a phenomenon called the Proteus effect. Bailenson found, for example, that people who had taller avatars were more likely to behave confidently than those who had shorter avatars. “We know that visual representations of the self, when used meaningfully in social interactions, change our attitudes and behaviors,” he says.
But sometimes these behaviors can play on stereotypes. A well-known study from 1988 found that athletes who wore black uniforms played more aggressively and violently than those who wore white uniforms. And this carries over into the digital world: a recent study showed that video game players who used avatars of the opposite gender actually behaved in line with gender stereotypes.
Bailenson says we should expect to see similar behavior on social media, as people adopt masks based on filtered versions of their own faces, rather than entirely different characters. “The world of filtered video, in my opinion – and we haven’t tested it yet – is going to behave very similarly to the world of filtered avatars,” he says.
Despite the power and ubiquity of filters, there is very little in-depth research into their impact – let alone guardrails around their use.
I asked Bailenson, who is the father of two young daughters, how he feels about his daughters’ use of AR filters. “It’s really difficult,” he says, “because it goes against everything we’re taught in all of our kids’ cartoons, which is ‘Be yourself.’”
Bailenson also says playful use is different from constant, real-time augmentation of ourselves, and that it’s important to understand what these different contexts mean for kids.
What few rules and restrictions exist around the use of filters depend on companies policing themselves. Facebook’s filters, for example, must go through an approval process that, according to the spokesperson, uses “a combination of human and automated systems to review effects when submitted for publication.” They are reviewed for certain issues, such as hate speech or nudity, and users can also report filters, which are then reviewed manually.
The company says it regularly consults with expert groups, such as the National Eating Disorders Association and the JED Foundation, a non-profit mental health organization.
“We know people may feel pressured to look a certain way on social media, and we are taking action to address this issue on Instagram and Facebook,” a statement from Instagram said. “We know the effects can play a role, so we ban those that clearly promote eating disorders or encourage potentially dangerous cosmetic surgery procedures… And we’re working on more products to help lessen the pressure people may feel on our platforms, like the option to hide like counts.”
Facebook and Snapchat also label filtered photos to show they’ve been altered, but it’s easy to bypass the labels by simply applying changes outside the apps, or by downloading and re-uploading a filtered photo.
Labeling may be important, but Pescott says she doesn’t think it will significantly improve an unhealthy beauty culture online.
“I don’t know if that would make a huge difference, because I think it’s the fact that we see it, even though we know it’s not real. We still have that aspiration to look that way,” she says. Instead, she believes the images kids are exposed to should be more diverse, more authentic, and less filtered.
There is also another concern, especially since the majority of users are very young: the amount of biometric data that TikTok, Snapchat, and Facebook have collected through these filters. While Facebook and Snapchat say they don’t use filter technology to collect personally identifiable data, a review of their privacy policies shows that they do have the right to store data from the photographs and videos on their platforms. Snapchat’s policy states that snaps and chats are deleted from its servers once a message is opened or expires, but stories are stored longer. Instagram stores photo and video data for as long as it wants or until the account is deleted, and it also collects data on what users see through its camera.
Meanwhile, these companies continue to bet on AR. In a speech to investors in February 2021, Snapchat co-founder Evan Spiegel said, “Our camera is already capable of extraordinary things. But it is augmented reality that is driving our future,” adding that the company would “double down” on augmented reality in 2021 and calling the technology “useful.”
And while Facebook and Snapchat say the facial detection systems behind their filters aren’t tied to users’ identities, it’s worth remembering that Facebook’s smart photo-tagging feature – which scans your photos and tries to identify people who might appear in them – was one of the first large-scale commercial uses of facial recognition. And TikTok recently paid $92 million to settle a lawsuit alleging that the company misused facial recognition for ad targeting. A Snapchat spokesperson said that “Snap’s Lens product does not collect any identifiable information about a user and we cannot use it to link to or identify individuals.”
And Facebook in particular sees facial recognition as part of its AR strategy. In a January 2021 blog post titled “No turning back,” Andrew Bosworth, the head of Facebook Reality Labs, wrote, “We’re in the early days, but we intend to give creators more to do in AR and with greater capabilities.” The company’s planned release of AR glasses is highly anticipated, and it has already hinted at the possible use of facial recognition in the product.
In light of all the effort it takes to navigate this complex world, Sophia and Veronica say they just want to know more about beauty filters. Apart from their parents, no one has ever helped them understand all of this. “You shouldn’t have to get a specific college degree to figure out that something might be unhealthy for you,” says Veronica.