The Washington Post revealed yesterday that Facebook has been assigning users a “trustworthiness rating” based on unknown criteria, giving everyone a score from zero to one.
As part of Facebook’s “war on fake news,” this new feature allows sophisticated algorithms to decide which users fall in line with the company’s own private view on ethics and credibility, potentially creating even more problems with distribution and shadow banning.
Tessa Lyons, the product manager in charge of misinformation, claims that the score is only one of thousands of signals used to assess a person’s credibility. However, the criteria behind the score are kept secret to prevent users from gaming the algorithm.
“Not knowing how [Facebook is] judging us is what makes us uncomfortable, but the irony is that they can’t tell us how they are judging us — because if they do, the algorithms that they built will be gamed.” – Claire Wardle, director of First Draft research lab, Harvard Kennedy School
Lyons acknowledges that the system is not perfect, but many Facebook users are wary of being assigned numbers when they do not know the criteria behind the score.
Based on the information currently available, the rating reflects how accurately a user flags misleading content, but critics argue that the program allows Facebook to judge users on a variety of conditions, including posting genuine content that criticizes the company or its partner firms.
“One of the signals we use is how people interact with articles. For example, if someone previously gave us feedback that an article was false, and the article was confirmed false by a fact-checker, then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true.” – Tessa Lyons
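The mechanism Lyons describes can be pictured as a simple weighting scheme: flags from users whose past reports were confirmed by fact-checkers count for more than flags from indiscriminate reporters. The sketch below is purely illustrative; the function names and formula are assumptions, not Facebook’s actual system.

```python
# Illustrative sketch of the weighting Lyons describes. All names and the
# formula are hypothetical -- Facebook has not disclosed its actual method.

def reporter_weight(confirmed_flags, total_flags):
    """Fraction of a user's past false-news flags that fact-checkers upheld.
    Returns a weight in [0, 1]; users with no flagging history get 0.5."""
    if total_flags == 0:
        return 0.5
    return confirmed_flags / total_flags

def weighted_flag_score(reports):
    """Aggregate the weights of everyone who flagged an article as false.
    `reports` is a list of (confirmed_flags, total_flags) per reporter."""
    return sum(reporter_weight(c, t) for c, t in reports)

# A careful reporter (9 of 10 past flags upheld) outweighs two
# indiscriminate reporters (1 of 10 upheld each):
print(weighted_flag_score([(9, 10)]))          # 0.9
print(weighted_flag_score([(1, 10), (1, 10)])) # 0.2
```

Under a scheme like this, one user with a strong track record can carry more weight than several users who flag everything they disagree with, which matches Lyons’s stated goal.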
Many journalists and political commentators are comparing this to China’s Social Credit Score, a score assigned by the Chinese government to determine the ‘trustworthiness’ of its citizens. The ‘Credit Score’ is supposed to be in full swing by 2020 and has already barred 11 million citizens from traveling by plane and 4 million from traveling by train, according to CBS.
Tarl Warwick, a popular YouTube commentator and author of several books on the occult, warns that the Facebook score could end with users being passed over for certain jobs depending on their rating.
“If you never rock the boat and don’t have any odd beliefs, you’re a trustworthy person. Since employers occasionally demand the Facebook pages of their employees, wouldn’t this be a little bit of a problem? Don’t you think that this could lead to people [saying] ‘well, this person’s applying for a job, but they’ve only got a .7 trustworthiness rating, we need at least a .75 – because Zuckerberg is God and he has everybody’s private messages.’” – Tarl Warwick, AKA Styxhexenhammer666
Facebook’s ‘trustworthiness rating’ is also similar to “Knowledge Graph,” a system Google has already put in place that ranks web pages by their ‘truthiness’ rather than their popularity. The New American describes the program as “an automated and super-charged version of Google’s manually compiled fact database.”
As internet users grow increasingly concerned that powerful tech companies are gaining too much control over their online lives, alternative networks such as Minds and BitChute are rapidly gaining popularity. Although they are not yet true competitors to the major tech giants, they do pose a threat as users tire of tech companies determining what they can see and do online, now through the threat of ‘low ratings’.
“[It’s] not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher…I like to make the joke that, if people only reported things that were false, this job would be so easy!” – Tessa Lyons