Google and Facebook are hoping to improve their news algorithms to provide a more balanced view. Media industry initiatives could have the solution they need.

Industry standards and certifications have been floated as solutions to journalism’s trust issues. (Image: The Climate Reality Project, Unsplash, CC BY 4.0)

By Eleonora Maria Mazzoli, London School of Economics and Political Science (LSE)

All your news is curated by unseen hands. Content moderation and content curation are two sides of the same coin, and they sit at the heart of digital intermediary services. The soft behavioural nudges behind those measures can channel the choices of Google Search and Facebook audiences in one direction or another, through processes that legal scholar Karen Yeung describes as “subtle, unobtrusive yet extraordinarily powerful”.

Questions about which content the public should see have always been part of media and communication discussions. Newsrooms and parliaments alike echo with debates about free speech and what content is in the ‘public interest’. And while governments mull how to intervene with the digital giants to preserve democratic institutions and provide diverse, trustworthy news to the public, the media industry is attempting to address the problem from within.

News organisations around the world are advancing principles and criteria that could define who “public interest news providers” are. One of the leading examples is the Journalism Trust Initiative (JTI) in Europe. It started as a collaborative standard-setting process, run according to the guidelines of the European Committee for Standardisation (CEN), led by Reporters Without Borders, and supported by the European Broadcasting Union (EBU), Agence France-Presse (AFP) and more than 120 experts and entities.

In 2019, the initiative published a reference document establishing technical standards and professional norms for journalists and media outlets. These standards include transparency requirements covering ownership, funding, editorial mission and data collection practices, as well as accountability requirements intended to raise professional norms, improve accuracy and strengthen both internal and external accountability systems for media outlets that apply for JTI certification.

These standards could enable trustworthy, public interest journalism to thrive in the digital era. As the implementation phase proceeds, JTI is also calling for the standards to be factored into the algorithms of search engines and social media platforms, in order to surface, recommend and make more prominent “reliable and trustworthy sources of information online for the benefits of societies and democracy”. Such measures could be applied through self-regulatory instruments, a code of practice, or more stringent co-regulatory frameworks.
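As a thought experiment, here is a minimal Python sketch of how a certification signal such as a JTI label might be factored into a prominence ranking. Everything in it — the field names, the weights, the boost — is a hypothetical assumption for illustration, not a description of any platform’s actual system.

```python
# Hypothetical sketch: factoring a trust certification into a prominence ranking.
# The field names, weights and boost are illustrative only; they are not drawn
# from any real platform's recommender system.

from dataclasses import dataclass


@dataclass
class NewsItem:
    title: str
    relevance: float      # query relevance, 0..1
    engagement: float     # popularity / engagement signal, 0..1
    jti_certified: bool   # assumed certification flag for the source


CERTIFICATION_BOOST = 0.2  # illustrative weight for certified sources


def prominence_score(item: NewsItem) -> float:
    """Blend commercial signals with a trust-based boost."""
    score = 0.6 * item.relevance + 0.4 * item.engagement
    if item.jti_certified:
        score += CERTIFICATION_BOOST
    return score


results = [
    NewsItem("Certified outlet's report", 0.7, 0.5, True),
    NewsItem("Viral uncertified post", 0.7, 0.9, False),
]

for item in sorted(results, key=prominence_score, reverse=True):
    print(f"{item.title}: {prominence_score(item):.2f}")
```

Even in a toy version like this, the policy question is plain: who sets the size of that boost, and who gets to audit it.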

To ensure the JTI’s standards are implemented through fair and accountable frameworks, the self-assessment and accreditation processes have to be independent and auditable. This is especially the case if governments were to support implementation through complementary co-regulatory interventions, as transparent and procedurally fair processes for reviewing the criteria for ‘public interest’ journalism will be crucial. While prominence algorithms have the potential to promote trusted news sources, they can just as easily be exploited for soft forms of censorship or propaganda, with implications for democracy and human rights.

These standards also need to find broad support and consensus among journalists and media outlets, which could be a challenge. In some countries there are existing independent press regulators with their own industry standards, to which not all press and news providers have signed up (such as IPSO and IMPRESS in the UK).

And with rising concerns about misinformation online, other networks are also striving to develop principles and guidelines for ‘trusted’ or ‘public interest’ journalism, such as NewsGuard and its ‘trust ratings’, the Trust Project with its ‘trust indicators’, or the Credibility Coalition and its guidelines for promoting online information quality.

As the European Digital Media Observatory has highlighted, using indicators as the sole means of determining the trustworthiness of content sources may create a media environment in which “established players gain further competitive advantage, while new players face unprecedented barriers to entry,” harming media pluralism and distorting the media market. The benefits and consequences would have to be assessed, taking into account the voluntary nature of both technical standards and other emerging indicators, and the need for transparency about the indicators’ methodology so that users are aware of their limits.

Following growing public pressure, services like Google Search and Facebook have been improving the transparency of their algorithms. However, as highlighted by the 2020 Ranking Digital Rights Accountability Index, they still have a long way to go.

When it comes to the prioritisation of content on these services, a 2020 Council of Europe study argues that these companies increasingly mix their usual commercial criteria with some vague public interest considerations. For instance, Google’s search ranking guidelines state that, alongside criteria such as the meaning of the query, relevance, recency and context, its algorithms also take into account the “quality of content” and “expertise, authoritativeness and trustworthiness”.

Whether a news source is deemed trustworthy or authoritative, and therefore granted a higher ranking, seems to depend on several factors, one being whether “other prominent websites link or refer to the content”. However, it is unclear what other factors are considered and how they are weighted in the final recommender system. Industry standards do not appear to be among the differentiating criteria.
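The one factor that is named — links from prominent websites — resembles classic link-based authority measures. The sketch below, again in Python and purely illustrative (the site names, damping factor and iteration count are assumptions, not Google’s actual parameters), shows how such a signal can be computed from a link graph alone, with no reference to any industry standard.

```python
# Hypothetical sketch of a link-based authority signal, in the spirit of
# "other prominent websites link or refer to the content". The link graph,
# damping factor and iteration count are illustrative assumptions only.

links = {                 # site -> sites it links to
    "outlet_a": ["outlet_b", "outlet_c"],
    "outlet_b": ["outlet_c"],
    "outlet_c": ["outlet_a"],
    "blog_d":   ["outlet_a"],
}

DAMPING = 0.85
sites = list(links)
rank = {s: 1.0 / len(sites) for s in sites}

for _ in range(50):       # simple power iteration until scores settle
    new_rank = {s: (1 - DAMPING) / len(sites) for s in sites}
    for site, outgoing in links.items():
        share = rank[site] / len(outgoing)
        for target in outgoing:
            new_rank[target] += DAMPING * share
    rank = new_rank

for site, score in sorted(rank.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{site}: {score:.3f}")
```

A score like this rewards being linked to, not being accurate — which is exactly why how it is weighted against trust signals matters.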

Facebook provides some general information on how the company curates and ranks “newsworthy content”. The company says its choices are based on a balancing test that weighs the public interest against the risk of harm. It states that these tests and the related judgments are based on “international human rights standards”, but which standards, and which types of human rights risks, is not clear.

These measures show that search engines and social media companies are willing to tackle misinformation partly through measures that support public interest news providers. But there is an overall lack of coordination. Industry standards shared across the world, or at least benchmarked best practices, are likely to help guide the digital giants.

Meanwhile, in both Google Search and Facebook’s cases, there are no independent evaluations of how these criteria feed into content prioritisation measures, how they are weighed against other criteria that reward popularity, relevance or user engagement, and what impact they have on users’ access to and consumption of news. Greater transparency on these points would promote real change.

Ultimately, though, the core purpose of digital intermediaries like these is to moderate, curate, select and filter the content that can be found on their services. It remains to be seen whether they would embrace signing up to non-discrimination and public interest principles. Industry and policy practices could benefit from a more coordinated approach. If society hopes to future-proof regulatory proposals that can also be scaled up at European and international level, then the debate needs to widen its scope, because ultimately we are dealing with shared problems of flawed platform governance.

Good practice principles for prominence algorithms could draw on industry technical standards like the JTI, but also leverage existing Council of Europe recommendations on freedom of expression and information, media freedom and media pluralism. It is a question at the intersection of all three that demands our attention.

Eleonora Maria Mazzoli is an ESRC-funded researcher at the LSE Media and Communications Department. Alongside her current academic work in the area of content curation and platform governance, she also acts as an external expert and policy advisor, consulting on media and digital policy challenges for European institutions, media industry organisations and civil society groups. Prior to this, she worked for the Legal and Policy Department of the European Broadcasting Union, and for RAI, the Italian public service broadcaster. Ms Mazzoli declared no conflicts of interest in relation to this article.

Ms Mazzoli’s ongoing PhD research project is funded by the UK Economic and Social Research Council (ESRC) through the Doctoral Training Partnership (DTP) Scholarship n° 2098308.

This article was first published on February 14, 2022.

Originally published under Creative Commons by 360info™.
