Apple has informed the social media network Parler that it has 24 hours to rid the platform of inappropriate content, or else face removal from the App Store. The news was first reported by Input Mag, which obtained a copy of the email Apple sent to Parler.
In the email, sent this morning Pacific Time, Apple provided numerous examples of Parler users explicitly calling for violence and referenced CEO John Matze’s comment that he doesn’t “feel responsible for any of this and neither should the platform.”
Apple added in the email that it disagrees with Parler’s hands-off approach to moderation, emphasizing that it will not distribute apps that include the types of content found on Parler. Apple says that Parler is responsible for all user-generated content, as per the App Store Review Guidelines.
“We want to be clear that Parler is in fact responsible for all the user generated content present on your service and for ensuring that this content meets App Store requirements for the safety and protection of our users,” the company said. “We won’t distribute apps that present dangerous and harmful content.”
Apple specifically cites the “illegal activities” that took place in Washington D.C. on January 6 as an example. Apple writes that Parler was used to “plan, coordinate, and facilitate” what happened.
“We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities.”
If Parler does not remove the content cited by Apple within the next 24 hours, the app will be removed from the App Store.
More to come…