By LAURA ROMERO, ABC News
(NEW YORK) — The popularity of a social media platform hinges in large part on a robust exchange of views. But what happens when it turns into an echo chamber where most people there share the same opinions?
That’s the hurdle now facing Parler, a Twitter-style social media platform that has gained popularity mostly among President Donald Trump’s supporters and right-wing conservatives in the wake of the 2020 presidential election. Experts ABC News spoke with say they believe it’s unlikely the platform will ever rival networks like Twitter or Facebook in size, though it could continue to influence a smaller sphere.
Parler was founded in 2018 by John Matze and Jared Thomson, two Nevada-based conservative programmers. The app receives financial backing from Rebekah Mercer, the daughter of hedge fund manager and Cambridge Analytica co-founder Robert Mercer; she revealed her involvement in a post on the app on Sunday.
“John and I started Parler to provide a neutral platform for free speech, as our founders intended, and also to create a social media environment that would protect data privacy,” Mercer said in the post. “The ever increasing tyranny and hubris of our tech overlords demands that someone lead the fight against data mining, and for the protection of free speech online. That someone is Parler, a beacon to all who value their liberty, free speech and personal privacy.”
Last week, the app gained over 3.5 million users, according to Jeffrey Wernick, the company’s chief operating officer. It is now at the top of Apple’s App Store list of free apps.
The app advertises itself as a platform for free speech, where users can post “without fear of being ‘deplatformed’ for your views.”
“Parler is a breath of fresh air for those weary and wary of the way they’ve been treated by our competitors,” said Wernick.
But Benedict Evans, a tech analyst, told ABC News he believes the new platform will not gain enough popularity to compete with Facebook and Twitter.
“Parler is a weak clone of Twitter, but you can go there to talk about one particular issue that’s now mostly blocked on Twitter,” Evans said, referring to the fact that Twitter has been fact-checking or deleting posts and profiles promoting misinformation about things such as voter fraud in the presidential election.
“But how many people care about that one issue? And do they care enough to spend all their time there, and not on Twitter or Facebook where all the other news and discussions are happening?” Evans said.
Renée DiResta, a technical research manager at Stanford University who studies how information spreads on the internet and social media platforms, noted that despite the surge in new Parler accounts, Facebook and Twitter have not reported a significant dip in the usage of their platforms.
“It is not the first niche social platform to pop up, or to achieve a lot of downloads,” said DiResta. “In fact, for Parler in particular, some very similar articles speculating about a mass exodus were written in June 2020, as prominent conservatives announced they were creating accounts in response to Twitter and Facebook shadow banning conservatives, censoring the president and other similar rationales.”
Parler gained popularity after top social media platforms like Facebook and Twitter imposed stringent measures during and after the election to stop the spread of misinformation.
Sen. Ted Cruz, Rep. Devin Nunes, Fox News host Sean Hannity, radio personality Mark Levin and far-right activist Laura Loomer are among those on Parler.
Quran, campaign director at the advocacy group Avaaz, said Facebook and Twitter also bear blame for the spread of election misinformation, arguing the platforms are doing too little, too late.
“We saw fake videos circulating on social media after the election of people allegedly burning ballots that got millions of views before they were taken down,” said Quran. “Facebook could retroactively message them and say, ‘hey, we noticed you watched this video, it turned out to be misinformation,’ but it’s not doing that.”
Last week, Facebook shut down a “stop the steal” group that had gathered over 356,000 members.
A spokesperson for Facebook told ABC News that while there is no “silver bullet solution” to combat misinformation, the company has taken a multi-pronged approach that includes removing accounts and content that are against the platform’s community standards, reducing the spread of false news by fact checking, and adding warning labels to posts.
Similarly, Jesse Littlewood, vice president of campaigns at Common Cause, a nonpartisan watchdog group, told ABC News that some of Facebook’s actions have been insufficient.
“I would give them a mixed grade,” said Littlewood. “On the one hand, they’ve been quick on some issues, but I still think they step in after a post has been circulating on the web for a while, when the damage has already been done and there’s already a huge amount of influence.”
DiResta, the technical research manager at Stanford University, said “misinformation is less of a problem to solve than a condition to be managed.”
“We are never going to fact-check our way to a misinformation-free internet,” said DiResta. “A lot of this is cat and mouse, minimizing or contextualizing information as people are searching for it, yet knowing that there will be another hashtag and another claim to address imminently.”
“This happens in particular when the stakes are high and the narrative directly impacts a lot of people’s lives,” DiResta added.
Copyright © 2020, ABC Audio. All rights reserved.