Meta has rejected a demand from United Australia party leader Craig Kelly to suspend its community standards enforcement and factchecking on politicians’ posts on Facebook during the upcoming federal election campaign.
Kelly, who had his Facebook and Instagram accounts banned last year for allegedly breaching the company’s misinformation policy over posts promoting unproven Covid-19 treatments ivermectin and hydroxychloroquine, described the practice as “foreign interference”. Kelly has previously denied his social media posts contained “misinformation”, claiming they were backed by evidence and that their removal amounted to censorship.
At a hearing of the parliamentary committee examining social media and online safety, Kelly asked officials from Meta to guarantee “there will be no foreign interference by Meta in the Australian election” by blocking, shadow-banning, or deplatforming political candidates or parties.
Facebook’s head of public policy in Australia, Josh Machin, said the company would continue to apply its community standards policy. Kelly asked whether that meant posts would be factchecked and their reach reduced or posts removed, and Machin said the current rules would continue to apply.
“If a piece of content violates our community standards then yes, we’ll be removing it,” he said. “And that’s a really important protection that we have in place in order to protect the safety and the integrity of the election campaign.”
Kelly argued the policy was potentially a breach of the implied freedom of political communication. Machin said the policy was applied evenly to politicians and the rest of the public.
Although Kelly is banned from the platform, his party is not, and according to Facebook’s transparency report, the United Australia party has spent over $513,000 on ads in the last 90 days alone on Facebook – the largest political ad spend in Australia on the platform. Party founder Clive Palmer has separately spent over $161,000 in the same period.
The Liberal party spent over $44,000 in the same period.
Separately, Meta has come under pressure from the Labor party in particular to crack down on misinformation on its platform ahead of the federal election expected in May.
The party fears a repeat of the 2019 election, when Facebook refused to remove content, unrelated to Kelly and the UAP, that falsely claimed during the campaign that Labor would institute a death tax if it won.
Machin said Facebook did factcheck and demote the original posts and similar ones making the claim at the time, but would not remove politicians debating about what a potential government might do, or people expressing their opinions about it.
“We’ve discussed at the committee previously the importance of ensuring that we’re able to have proper democratic debate and that platforms such as ourselves, [are] only taking steps in relation to limiting what political parties say when there is a real-world harm element,” he said. “And so that category of material wasn’t subject to factchecking.”
Labor’s shadow assistant minister for communications, Tim Watts, asked the company to put a timeline on fact checks in the context of a four-week election campaign, but the company would not say how long such checks would take.
Meta also rejected Reset Australia research claiming five ads designed by the researchers and containing obvious election misinformation were approved by Facebook. The ads included claims the poll had been cancelled due to Covid, and people who weren’t vaccinated could not vote. They were deleted by the researchers before they could go live.
Machin said none of the ads went live on Facebook, and that they would have been picked up by other checks before publication, as they all violated the company’s policies on political advertising and misinformation.
“I think it’s important just to confirm that these ads did not go live, but [approval] was only the initial point of our enforcement approach which would have applied, and so it’s not correct to look at the exercise undertaken by this lobby group as an indication of whether our systems are able to adequately detect or enforce potential misinformation,” he said.