The move comes after Google suspended the application from the Play Store on Friday. Like Apple, Google cited the fact that Parler allowed its users to continue to call for violence as the reason for its decision. Those who took part in Wednesday's U.S. Capitol riot reportedly used the network to plan the incident. Despite the suspension, Android and iOS users can continue to use Parler, as long as the app is already installed on their device.
When news of Apple’s request first came out, Parler CEO John Matze said he disagreed with the company’s moderation policies. “Apparently they think Parler is responsible for ALL user-generated content on Parler,” he wrote in a post on Parler. “Therefore [sic] by the same logic, Apple should be responsible for ALL actions taken by their phones.” Matze maintained that he had not seen users of the platform promoting illegal activity. “If people break the law, break our terms of service or do something illegal, we would definitely be involved,” he told The New York Times. “But if people are just trying to get together or plan an event… there’s nothing particularly wrong with that.”
What happens next for Parler will likely be decided by Amazon. As TechCrunch points out, the company currently hosts its platform through Amazon Web Services (AWS). Not only does Parler appear to violate Amazon’s Acceptable Use Policy, but there is also a push within the company to cut ties with Parler.
In many ways, this mirrors events of recent days. According to a report by The Washington Post, Twitter’s decision to ban Donald Trump came after hundreds of company employees pushed CEO Jack Dorsey to make the president’s suspension permanent.
The full text of the letter Apple sent to Parler can be seen below:
To the developers of the Parler application,
Thank you for your response regarding the dangerous and harmful content on Parler. We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app.
Parler has failed to live up to its commitment to moderate and remove harmful or dangerous content that encourages violence and illegal activity, and does not comply with App Store review guidelines.
In your response, you indicated that Parler had taken this content “very seriously for weeks.” However, the processes put in place by Parler to moderate or prevent the distribution of dangerous and illegal content have proved insufficient. Specifically, we continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 – Safety – Objectionable Content.
Your response also references a moderation plan “for now,” which does not meet the ongoing requirements of Guideline 1.2 – Safety – User Generated Content. While there is no perfect system to prevent all harmful or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary “task force” is not a sufficient response given the widespread proliferation of harmful content.
For these reasons, your app will be removed from the App Store until we receive an update that complies with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.
App Review Board