The election was held on May 18th, and searches for ‘death tax’ and ‘inheritance tax’ in Australia ramped up significantly in that final week.
The Coalition clearly saw this as a key area of concern for voters, and worked to amplify the message in the final lead-up to the vote. Given that, it could be argued that, with more time, the Labor Party may have been able to counter the concern more effectively.
As such, stopping the amplification of such messaging in the final week could be critically important. It’s not a full ban, as some had hoped for, and Facebook still won’t fact-check political ads, but it may prove a more significant measure than many are anticipating.
Only time, of course, will tell.
Removing Election Misinformation
Facebook will also expand its efforts to remove misinformation about voting.
“We already committed to partnering with state election authorities to identify and remove false claims about polling conditions in the last 72 hours of the campaign, but given that this election will include large amounts of early voting, we’re extending that period to begin now and continue through the election until we have a clear result.”
The act of voting itself will be a key focus, with US President Donald Trump repeatedly criticizing the voting process and the changes being made to accommodate voters amid COVID-19.
Just this week, Trump suggested that voters test the integrity of the system by seeking to vote twice, which is illegal in every US state.
With doubts like this being cast over the process, Facebook is looking to get ahead of any such activity, and take more action to remove voting misinformation from its platform.
Limiting Message Forwarding
Facebook has also announced a new limit on message forwarding in Messenger, designed to restrict the spread of viral misinformation via private message.
As per Facebook:
“We’re introducing a forwarding limit on Messenger, so messages can only be forwarded to five people or groups at a time. Limiting forwarding is an effective way to slow the spread of viral misinformation and harmful content that has the potential to cause real world harm.”
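As a rough illustration of the rule described in the quote above, the cap can be thought of as a simple per-action check on the number of recipients. This is a hypothetical sketch, not Facebook's actual implementation; the names (`ForwardLimiter`, `can_forward`, `MAX_FORWARDS`) are illustrative.

```python
# Hypothetical sketch of Messenger's forwarding cap: a single forward
# action may target at most five people or groups at a time.
MAX_FORWARDS = 5  # the announced per-forward recipient limit


class ForwardLimiter:
    def __init__(self, max_recipients: int = MAX_FORWARDS):
        self.max_recipients = max_recipients

    def can_forward(self, recipients: list) -> bool:
        """Allow the forward only if it targets no more than
        `max_recipients` people or groups in one action."""
        return len(recipients) <= self.max_recipients


limiter = ForwardLimiter()
print(limiter.can_forward(["friend_a", "friend_b", "group_c"]))   # True
print(limiter.can_forward(["u1", "u2", "u3", "u4", "u5", "u6"]))  # False
```

Note that a cap like this doesn't block forwarding outright; it only adds friction, forcing viral content to spread in smaller hops, which is what slows it down.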
Facebook implemented the same limit in WhatsApp back in April to stem the flow of COVID-19 misinformation, which, Facebook says, led to a 70% reduction in the number of highly forwarded messages sent in the app.
As Facebook has added more measures to restrict the flow of misinformation in its main app, many activists and campaigners have shifted to messaging to continue their efforts, and this proactive step could do a lot to curb any such push.
Cracking Down on Voting Misrepresentation
Facebook is also expanding its enforcement efforts against voting misinformation in posts.
“We already remove explicit misrepresentations about how or when to vote that could cause someone to lose their opportunity to vote – for example, saying things like “you can send in your mail ballot up to 3 days after election day”, which is obviously not true. (In most states, mail-in ballots have to be *received* by election day, not just mailed, in order to be counted.) We’re now expanding this policy to include implicit misrepresentations about voting too, like “I hear anybody with a driver’s license gets a ballot this year”, because it might mislead you about what you need to do to get a ballot, even if that wouldn’t necessarily invalidate your vote by itself.”
The expanded crackdown will help to dispel falsehoods about the voting process.
In addition, Facebook is implementing new rules against using threats related to COVID-19 to discourage voting.
“We’ll remove posts with claims that people will get COVID-19 if they take part in voting. We’ll attach a link to authoritative information about COVID-19 to posts that might use the virus to discourage voting, and we’re not going to allow this kind of content in ads.”
Already, claims about protest activity and COVID-19 have been used to discourage people from voting in some areas.
Policing Premature Claims About the Election Outcome
Finally, another key area of concern, which Facebook has already flagged, is the possibility of civil unrest as a result of the final outcome of the vote.
Last month, The New York Times reported that Facebook has been exploring measures it might take in case President Trump decides not to accept the results of the 2020 election.
Trump, who has repeatedly criticized the integrity of the voting process, has thus far avoided questions about whether he will accept the final result. Facebook has now announced a range of additional measures to counter any premature claim of victory, or effort to question the result, in the wake of the poll.
First, Facebook says that it will partner with Reuters and the National Election Pool to provide authoritative information about election results.
“We’ll show this in the Voting Information Center so it’s easily accessible, and we’ll notify people proactively as results become available. Importantly, if any candidate or campaign tries to declare victory before the results are in, we’ll add a label to their post educating that official results are not yet in and directing people to the official results.”
Facebook will also add an “informational label” to any post that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, which, Facebook says, will include any posts by the President.
Facebook will also increase its monitoring and enforcement efforts for groups like QAnon, which some are concerned may seek to organize violence or civil unrest in the period after the election. Facebook removed thousands of groups and Pages associated with QAnon specifically last month.
These are significant measures, and while Facebook still, as noted, won’t be fact-checking political ads, the steps introduced here could go a long way toward combating efforts to manipulate voters during the campaign.
It’s difficult to know how effective these measures will be, and we won’t have any definitive insight until after the election. But within the parameters of Facebook’s approach to political content, these are important steps, which could have a major impact.
As to how effective they are, Facebook is also conducting a large-scale analysis of its impact on the political process, which will involve gaining users’ permission to analyze their activity throughout the campaign period.
And this week, reports have emerged that as part of this effort, Facebook may actually pay some users not to use their Facebook and Instagram accounts.
That likely relates to a control group: if Facebook wants to measure the full impact of its posts and updates on voting behavior, it needs a comparison. Having a group of users stay off Facebook and Instagram, then gaining insight into how they voted and engaged with political content without these platforms, will help the researchers establish a better baseline of what impact Facebook actually has.
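The logic of that control group can be shown with a toy calculation: compare some outcome measure between users who kept using the platforms and users paid to stay off them, and the difference in group means is a naive estimate of the platforms' effect. All numbers below are invented purely for illustration; the real study would be far more sophisticated.

```python
# Toy illustration of a control-group comparison. The engagement scores
# are made-up numbers, not data from Facebook's actual research.
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical political-engagement scores for users who kept using
# Facebook/Instagram (treatment) vs. users paid to deactivate (control).
active_users = [0.62, 0.71, 0.58, 0.65, 0.69]
control_users = [0.51, 0.49, 0.55, 0.47, 0.53]

# The control group supplies the baseline: the difference in means is
# a naive estimate of the platforms' effect on the outcome.
effect = mean(active_users) - mean(control_users)
print(round(effect, 3))  # → 0.14
```

Without the deactivated group there is no baseline, and any observed engagement could be attributed to factors other than the platforms themselves.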
There’s a lot going on, and with Facebook set to come under intense scrutiny, it’s working to do all it can to protect users from manipulation.
Will it work? Should Facebook do more? We’ll soon find out, as the campaign is about to kick into overdrive.