Parler said on Monday that it plans to make a comeback on the Apple App Store next week, after Apple said it had approved several proposed changes that would lead to Parler’s reinstatement.
Parler said its anticipated return to the Apple ecosystem follows months of communication between the two companies. As a result, the social media company has crafted several new safeguards to help it detect posts that fall outside the scope of speech protected by the First Amendment.
It also said that the version of Parler available to Apple users will differ from the web-based and Android versions. The Apple version will not show some posts that are acceptable under Parler’s policies but prohibited by Apple.
“Parler has and will always be a free and open forum where users could engage in the free exchange of ideas in the full spirit of the First Amendment to the United States Constitution,” Parler’s interim CEO Mark Meckler said in a statement.
“We have worked to put in place systems that will better detect unlawful speech and allow users to filter content undesirable to them, while maintaining our strict prohibition against content moderation based on viewpoint.”
This comes after Apple responded to a letter from Sen. Mike Lee (R-Utah) and Rep. Ken Buck (R-Colo.), telling the lawmakers that it had agreed to bring Parler back to its App Store after the platform made changes to how it moderates user content.
“Apple anticipates that the updated Parler app will become available immediately upon Parler releasing it,” the company wrote in the letter.
This is the latest development in an ongoing feud between Parler and big tech companies, which sought to shut the platform down following the Jan. 6 incident. Apple and Google removed Parler from their app stores, while Amazon cut the platform off from its web hosting service. All three companies took issue with Parler’s allegedly lax approach to removing violent content posted by its users and cited “repeated violations” of their terms of service related to such content.
In a letter to the House Committee on Oversight and Reform, sent in response to the panel’s request for documents, Parler disclosed that it had formalized its working relationship with the FBI in November 2020 and had begun regularly forwarding screenshots of unlawful posts that called for violence or merited additional investigation for public safety. Such posts included users threatening to kill former Attorney General Bill Barr and other politicians.