A viral video shows a young woman leading an exercise class at a roundabout in the Burmese capital, Naypyidaw. Behind her, a military convoy approaches a checkpoint to make arrests at the Parliament building. Did she inadvertently film a coup? She keeps dancing.
The video later became a viral meme, but for the first few days online sleuths debated whether it was green-screened or otherwise manipulated, often borrowing jargon from image verification and forensics.
For many viewers online, the video captures the absurdity of 2021. Yet claims of audiovisual manipulation are increasingly used to make people question whether what’s real is fake.
At WITNESS, in addition to our ongoing work helping people film the reality of human rights violations, we have led a global effort to better prepare for increasingly sophisticated audiovisual manipulation, including so-called deepfakes. These technologies provide tools to make it look as if someone said or did something they never did, to fabricate an event or person that never existed, or to edit a video far more seamlessly.
The hype falls short, however. The political and electoral threat of actual deepfakes lends itself well to headlines, but the reality is more nuanced. The real reasons for concern became clear through expert meetings that WITNESS led in Brazil, South Africa, and Malaysia, as well as in the US and Europe, with people who had suffered attacks on their reputations and their evidence, and with professionals, such as journalists and fact-checkers, charged with fighting lies. They highlighted the harm already being done by non-consensual sexual deepfake images targeting ordinary women, journalists, and politicians. This is a real, existing, and widespread problem, and recent reports have confirmed its growing scale.
Their testimony also showed how claims of deepfakery and video manipulation were increasingly being used for what law professors Danielle Citron and Bobby Chesney call the “liar’s dividend”: the ability of the powerful to claim plausible deniability on incriminating footage. Statements like “It’s a deepfake” or “It’s been manipulated” have often been used to disparage a leaked video of a compromising situation, or to attack one of the few sources of civilian power in authoritarian regimes: the credibility of smartphone footage of state violence. This builds on histories of state-sponsored deception. In Myanmar, the army and authorities have repeatedly both shared fake images themselves and challenged the veracity and integrity of real evidence of human rights violations.
During our discussions, journalists and human rights defenders, including those from Myanmar, described fearing the burden of having to relentlessly prove what’s real and what’s fake. They worried that their work would come to involve not just debunking rumors, but also having to prove that something is authentic. Skeptical audiences and factions of the public seize on doubt about evidence to reinforce and protect their worldviews, and to justify their actions and partisan reasoning. In the United States, for example, conspiracy theorists and right-wing supporters dismissed former President Donald Trump’s awkward concession speech after the attack on the Capitol by claiming it was a deepfake.
There are no easy solutions. We must support stronger audiovisual forensic and verification skills in community and professional leaders around the world who can help their audiences and community members. We can promote widespread access to platform tools that make it easier to see and challenge the perennial miscontextualized or edited “shallowfake” videos that simply miscaption footage or do a basic edit, as well as more sophisticated deepfakes. Responsible “authenticity infrastructure” that makes it easier to track whether and how an image has been manipulated and by whom, for those who want to “show their work,” can help if it is developed from the start with an awareness of how it could also be abused.
We also need to frankly acknowledge that promoting verification tools and skills may in fact perpetuate a conspiratorial “disbelief by default” approach to media that is itself at the heart of the problem with so many videos that in fact show reality. Any approach to providing better skills and infrastructure must recognize that conspiratorial reasoning is only a step away from constructive doubt. Media literacy approaches and forensic tools that send people down the rabbit hole rather than promoting common-sense judgment can be part of the problem. We don’t all need to be instant open-source investigators. First we should apply simple frameworks like the SIFT methodology: Stop, Investigate the source, Find trusted coverage, and Trace the original context.