Amazon Web Services will block Parler's access to its hosting services at the end of the weekend and may take the service offline if a new provider cannot be found.
"Because Parler is unable to comply with our Terms of Service and poses a very real public safety risk, we plan to lock Parler's account effective Sunday, January 10th at 11:59 p.m. PST," Amazon wrote in one Email Parler received and first reported from BuzzFeed.
The AWS email to Parler cited several examples of violent and threatening posts from recent days, including calls to "systematically murder liberal leaders, liberal activists, BLM leaders and supporters," among others. "Given the unfortunate events that transpired this past week in Washington, D.C., there is serious risk that this type of content will further incite violence," the message added.
Parler launched in 2018 as a "free speech" alternative to Twitter and Facebook. In 2019 and 2020, it attracted a number of conservative, right-wing, and far-right fringe users. Usage has surged in recent days following Wednesday's events at the US Capitol and the subsequent permanent ban of President Donald Trump from Twitter and other platforms.
This increased traffic has also heightened the threat of violence on the platform, which tech companies across the board seem to be taking seriously after this week – and no wonder, given that the insurrectionists who attacked the Capitol made extensive use of social media to plan, carry out, and brag about their actions.
Parler, however, has not articulated a clear plan for handling violent threats on its platform. As Amazon wrote:
It is clear that Parler does not have an effective process for complying with the AWS Terms of Service. It also appears that Parler is still trying to determine its position on content moderation. You remove some violent content when contacted by us or others, but not always with urgency. Your CEO recently stated publicly that he "doesn't feel responsible for any of this or the platform." This morning, you shared that you have a plan to more proactively moderate violent content, but plan to do so manually with volunteers. We believe this nascent plan of using volunteers to promptly identify and remove dangerous content will not work, given the rapidly growing number of violent posts.
Apple also removed Parler from its iOS App Store today, citing similar concerns.
"Parler has failed to comply with its obligation to moderate and remove harmful or dangerous content that leads to violence and illegal activity, and does not comply with the guidelines for reviewing the App Store," wrote Apple. "Your app will be removed from the App Store until we receive an update that meets the guidelines for review in the App Store, and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service. "
Google booted Parler from its app store on Friday, likewise citing the spread of explicitly violent content on the platform.