House chair wants to know who’s financing Parler


Rep. Carolyn Maloney (D-NY), chair of the House Oversight Committee, is demanding documents from the free speech social network Parler related to its financing after last month’s deadly attack on the Capitol.

In her letter to Parler’s chief operating officer Jeffrey Wernick Monday, Maloney called on the company to hand over documents outlining who or what entities have ever had any control over the company, a list of some of its creditors, and any documents or communications tying the company to a Russian individual or entity.

“Since the attacks, numerous Parler users have been arrested and charged for their roles, with the Department of Justice citing in several instances the threats that individuals made through Parler in the days leading up to and following the attack,” wrote Maloney. “Individuals with ties to the January 6 assault should not—and must not—be allowed to hide behind the veil of anonymity provided by shell companies.”

Earlier this month, Maloney called on FBI Director Christopher Wray to “conduct a robust examination of the role that the social media site Parler played” in the pro-Trump assault on the Capitol.

After several Parler users were arrested for participating in the January 6th attack on the Capitol, the company’s finances have been the subject of numerous news reports and now a congressional inquiry. Last year, Rebekah Mercer, a prominent conservative donor, revealed that she was helping fund Parler. Mercer is the daughter of Robert Mercer, a hedge fund manager who bankrolled the now-defunct political data shop Cambridge Analytica.

BuzzFeed News reported last week that former President Donald Trump’s company, the Trump Organization, sought a stake in Parler in exchange for Trump starting an account on the platform. BuzzFeed said that Parler offered the Trump Organization a 40 percent stake in the company.

Late last month, Parler’s CEO John Matze was fired. In an Axios interview Sunday, Matze said he felt “betrayed” by Rebekah Mercer after he was cast out from the company, and said he didn’t want to make a deal with the Trump Organization.

Parler has been removed from both Apple’s and Google’s app stores and was forced offline after Amazon Web Services stopped hosting the website last month. It’s unclear when, or if, Parler will come back online.


The fight against Parler is a witch hunt. Those Parler users were most likely Twitter, Facebook, Instagram, Gmail, etc. users too, and could have used any of those services to do whatever they did.

Parler itself did not do anything wrong. It is possible some of the users did something illegal on the site itself; if they did, the law should handle those specific users. Social media companies should not be able to interfere with a US citizen’s First Amendment rights. On the flip side of that, social media companies should also not be held liable for what their users do.

Parler was set up to host extremist content and publish disinformation. It’s good it’s gone.

Parler was predominantly populated by right-leaning people, but there were others there too. It could have been a censorship-free platform for all.

Except Parler censored and banned people too.

I don’t necessarily agree with them censoring or banning people.

If Parler had censored or banned more people before January 6, they might still be around today. They paid the ultimate price for their naive view that allowing the inmates to run the asylum was a good idea.

And nothing of value was lost.

Twitter hosts Antifa who posts images depicting a burning Capitol, should that shut down all of Twitter?

Do you not understand the difference between posting an image depicting something and planning to murder actual people? One is legal, the other is not.

Twitter and other social media networks moderate. Parler doesn’t. That was literally the entire issue.

Social media companies should not be able to interfere with a US citizens First Amendment right.

That’s not what the First Amendment is about, how it works, or what it is supposed to protect against.

I disagree.

The First Amendment is there to protect citizens from the government regulating what its citizenry can or cannot say. It has absolutely nothing to do with what private companies do.

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

So, where exactly does it talk about private companies?

Furthermore, when you sign up for an account on those websites, you agreed to their terms and conditions for using their service. If you don’t agree with how they’re run, you’re more than welcome to delete your account and go off to another service.

I see what you are saying, and can agree. That said, I don’t think a private company should be limiting free speech.

So, Christian forums should host porn, and every private company should allow racial slurs? A private company limiting free speech means it’s setting the rules on what you can do on their property. And with that gone, it’s a free for all.

Perhaps in a case like that, a user should have the ability to opt in to a filter provided by the site?

No moderation is hard and maybe not attainable, but I feel like we have gone off the deep end with companies moderating their users. It’s very scary to me how a company, especially a large influential company, can silence a voice.

So, what you’re saying is that the user should have the ability to opt into some sort of moderation provided by the site and then decry the idea of moderation on the company’s terms?

Private companies have always had the ability to moderate those who consume their services. This is literally nothing new.

After 1/6, when Facebook banned Trump and others, hate speech on Facebook dropped 73%.

Moderation isn’t that hard. You get rid of the problem users, and voila!

Reddit banned Qanon in 2018. Snap moderates extremely well.

The big problems are Facebook, Twitter, Google, and YouTube. Oh yeah. And Parler and Gab.

If that is the case, then why can’t I yell "FIRE!" in a crowded theatre? Is that not free speech, even if it means hundreds of people might be hurt by it?

No Shirt, No Shoes, No service.

There has never been a time in the entire history of the United States when speech has been free from consequence. This fantasy about "free speech" anywhere and everywhere is not based in reality. No speech has ever been free from pushback in the court of public opinion (which can be much more brutal than the law). Nor has it ever been free from censorship when made on private property.

It doesn’t matter what you think. They can. And do. Every private company moderates speech. Go read an employee handbook.

What do you think would happen if someone walked into work one day and started to tell everyone they were going to go to Washington DC and kill Nancy Pelosi or Mike Pence? Do you think they would keep their job?

Let’s talk about forums. I’ll take Team Liquid as an example.

That place is a cesspool of easily offended users, with all kinds of privileges given to the "celebrities" there. They warn and ban users over posts that they don’t like, and don’t exercise the same sort of heavy-handedness against their favored users.

Would you say that TL is limiting free speech for the nobodies, and just flagrantly playing favorites?

How do you think the First Amendment protects Parler, or any other social media users, from being moderated?

Serious question. If you don’t know what the First Amendment actually says, here’s a link:


Parler itself did not do anything wrong.

I guess if you believe that not moderating posts that invite and incite violence and death is OK, then you’re right. But that is a world the vast majority of us don’t want to live in.

I don’t think the First Amendment protects Parler. Did Parler do something illegal? I don’t think a lack of moderation should itself be treated as wrongdoing.

I think posts that are illegal should be brought to the correct institution’s attention, such as the police or FBI. They should decide from there. If that institution deems it necessary, then the social media company can censor.
