
Submission on extremist movements and radicalism in Australia – Reset Australia



To: Parliamentary Joint Committee on Intelligence and Security (PJCIS)
From: Reset Australia

Reset Australia would like to thank the PJCIS for the opportunity to provide input to the inquiry into extremist movements and radicalism in Australia.

Reset Australia is an independent, non-partisan organisation committed to driving public policy advocacy, research, and civic engagement agendas to strengthen our democracy. We are the Australian affiliate of Reset, the global initiative working to counter digital threats to democracy. As the Australian partner in Reset’s international network, we bring a diversity of new ideas home and provide Australian thought-leaders access to a global stage.

Executive Summary

Our submission details the increasing role that digital platforms play in facilitating extremist rhetoric, radicalisation and violence, and provides tangible policy recommendations to arrest this trend. We recognise that many factors contribute to radicalisation and the development of extremist ideology; however, the scope of this submission is limited to the role of the digital platforms.

The digital platforms have fundamentally shifted the information landscape, powered by the relentless collection of personal data. This information is used to actively manipulate users into remaining on their respective products, all with the aim of serving more ads. This system has contributed to the rise of serious negative externalities – hate speech, disinformation, echo chambers and radicalisation. To approach this issue, we must first understand the system. As such, our organisation’s recommendations include:

  • Mandatory investigative and audit powers to understand how the platforms’ algorithms perpetuate harm
  • A transparent, independent regulator with a clearly defined scope
  • An enforceable disinformation code that focuses on disrupting monetisation features that promote false information
  • Comprehensive digital literacy programs

Introduction

The rise in populism, ideological fanaticism and extremist movements exists in a complex system. As such, to provide meaningful input to this inquiry, our organisation’s submission must clearly define a scope that is within our area of expertise.

This submission will illustrate how an ‘attention economy’ propped up by the digital platforms has been a contributing factor in facilitating the general rise in extremism we see in the early 21st century. We will provide a set of recommendations that focus on this aspect of radicalisation, and in doing so will address points 3 (d) and (f) of this Inquiry.

However, 3 (f) states:

The role of social media, encrypted communications platforms and the dark web in allowing extremists to communicate and organise

Through this submission, we hope to challenge the framing of this point and illustrate that the role of social media is not limited to communication and organisation: the platforms also act as a gateway to further harm, a function they have (however unintentionally) been engineered to perform.

Context

Internet and social media usage has become ubiquitous in Australian life. With over 85% of Australians using social media ‘most days’, the role that digital platforms such as Facebook (which also owns Instagram and WhatsApp), Twitter, TikTok and Google (Search and YouTube) play in our society has become fundamental to how we live, work and entertain ourselves. Whilst there is literature on how social media platforms contribute to radicalisation as formal recruitment tools and as spaces for extremist communities to interact, in this submission we go further to posit that the fundamental design of these platforms, and the ‘attention economy’ they have engendered, is a key facilitator for extremist ideologies to develop.

The Attention Economy

What all the digital platforms have in common is the commoditisation of user attention as their fundamental resource. These platforms have successfully and efficiently monetised our collective attention to fuel multi-billion dollar profits.

The business models of the digital platforms have a single objective – to capture and maintain user attention in order to maximise advertisements served and profits generated. As such, the algorithms which dictate the content and information we consume are optimised to fulfil this objective, resulting in an attention economy, i.e. the effective commoditisation of attention. To feed this machine, the platforms have built a system of unfettered and limitless personal data collection, building comprehensive profiles of their users that encapsulate their interests, vices, political leanings, triggers and vulnerabilities. This data is then used to predict our engagement behaviour, constantly calculating which content has the greatest potential to keep us engaged. This content has been shown to lean towards the extreme and sensational, as it is more likely to attract higher engagement.
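
To make this mechanism concrete, below is a minimal, hypothetical sketch of an engagement-optimised ranking loop. The items and predicted-engagement scores are invented for illustration; no platform’s actual system is shown.

```python
# Hypothetical illustration of engagement-optimised ranking.
# Items and scores are invented; no platform's real system is shown.

posts = [
    {"title": "Local council opens new library",       "predicted_engagement": 0.02},
    {"title": "Measured analysis of migration policy", "predicted_engagement": 0.04},
    {"title": "OUTRAGE: 'They' are coming for YOU",    "predicted_engagement": 0.31},
]

# The feed is sorted purely by predicted engagement (clicks, comments, shares).
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(f'{post["predicted_engagement"]:.2f}  {post["title"]}')
```

Because the objective function contains no term for accuracy or societal harm, the most inflammatory item rises to the top by construction.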

Whilst not the intended design, this system has had wide-ranging impacts on our society. From the breakdown of public discourse due to targeted ‘filter bubble’ polarisation to the manipulation of this online architecture by malicious actors, the myriad issues can be collectively characterised by their effective facilitation of the breakdown of our ‘public square’ – fracturing social cohesion, decreasing trust in government and halting productive civic debate.

Whilst further research in this area is much needed, the digital platform algorithms which push users towards ever more ‘engaging’ content can be linked to radicalisation pathways (for example, through YouTube’s recommender system). From the US Capitol insurrection to the Christchurch massacre, the role of the internet and social media platforms is becoming intrinsically linked with how extremist, radical and increasingly violent movements manifest. Even within ASIO’s Annual Threat Assessment, the Director-General specifically noted the rise of right-wing extremism as a concern.

These are all data points showing that a problem is festering. What we need now is the ability to understand how these harms occur. This action must enshrine stringent adherence to individual privacy, and focus on how the platforms themselves are actors in this ecosystem.

How the ‘Attention Economy’ facilitates harm

The ‘attention economy’ has two key features which constitute components in a potential radicalisation pathway. To reiterate: research in this space is still evolving, so these examples are not comprehensive and the full scope of these harms is still coming to light – but they begin to illustrate the emerging scale of the issue.

Proliferation of Hate Speech and Misinformation

Extremist movements, particularly right-wing extremist movements, operate heavily on an ‘anti-other’ narrative, and this is largely driven by the content consumed online. A recent report by the Centre for Resilient and Inclusive Societies (CRIS) found that key themes in online right-wing extremist discussions include anti-minority, anti-BLM and anti-Muslim rhetoric. While vilification of such groups existed before the digital age, the attention economy promotes the outrageous content that fuels such views.

This argument is supported by the Australian Muslim Advocacy Network (AMAN) report on extremist movements and radicalism, which found that Facebook and Twitter’s auto-detection and content review systems cannot detect violations of their own policies, leading to questions about the reliability of their processes.

In addition to the direct targeting of minority communities in Australia, the experiences and identities of these communities have also been utilised to stoke division for both political and financial motives (the latter by pushing divisive content that redirects to ad-heavy sites controlled by the profiting entity). Examples include:

  • A collection of 21 Facebook groups (including one from Australia) with over 1 million followers disseminating targeted Islamophobic content, for apparent financial gain
  • A network of Facebook pages run out of the Balkans profited from the manipulation of Australian public sentiment. Posts were designed to provoke outrage on hot button issues such as Islam, refugees and political correctness, driving clicks to stolen articles in order to earn revenue from Facebook’s ad network

The way these platforms have been designed leaves them not just vulnerable and open to bad actors, but actively incentivises this inflammatory and divisive content because of the engagement it generates.

Polarisation and Echo Chambers

A related but distinct phenomenon is how the digital platforms accelerate polarisation, creating ‘filter bubbles’ and ‘echo chambers’ for discourse that are the antithesis of Habermas’ concept of the ‘public sphere’. These can be characterised as environments in which people are exposed only to facts, ideas, people and news consistent with their own political or sociological ideology – i.e. an information diet fuelled by confirmation bias.

Research has shown that in information-rich ecosystems, we have significant psychological limitations in our ability to process information. From tendencies that make us seek out beliefs similar to our own (polarisation) to the visibility of other people’s choices, which leads us down ‘group-think’ paths that reduce our desire to seek out new information, our inherent cognitive heuristics take on a completely different implication in the information-laden world of the internet.

Indeed, where people do not otherwise have strong ideological convictions, social information can lead to herding and undermine collective wisdom – a plausible mechanism for the piecemeal radicalisation we are seeing.

These psychological quirks are exploited by the attention economy. Algorithmic curation systems drive users to whatever content is engaging, whatever their cognitive biases – pushing users down ideological rabbit holes. Whilst this has been clearly demonstrated on Twitter (owing to its more public nature) as early as the 2010 US midterm elections, and across various geographies, research into more private channels (such as Facebook groups, messaging forums and YouTube) is regularly stalled as these companies restrict access for researchers and public officials.

The consequences of this were clearly seen in the US Capitol riots of January 6: after months of stoking narratives of a stolen election, Donald Trump incited a group of people to storm the Capitol Building. Evidence on the drivers, mechanics and implications of this, and in particular the role of social media, is still being researched; however, it is clear that social media was not just a communication tool but a platform for radicalisation. This is especially concerning as users migrate to alternative platforms with more relaxed community guidelines and vastly different patterns of content and engagement.

The prospect of users exposed only to their own view of the world, in which every engagement further reinforces their perspective, is deeply concerning.

Policy Approach

Too often, policy approaches to ‘online safety’ issues have focussed on content moderation. Whilst the takedown of material that is clearly false, misleading or intended to divide and misinform is important, these approaches will always leave us playing catch-up. The speed with which content can be distributed and amplified to Australian users (especially the types of content used to target, manipulate and exploit diverse and diaspora communities) means that these approaches lack the adaptivity required to respond.

Reset believes effective policy to counter these harms is rooted in transparency, privacy and data rights, and public oversight. Online algorithms are an unregulated black box, and regulators should no longer stand at the other end waiting to play catch-up. Digital platforms have become deeply embedded in modern society, and thus the platforms themselves should be the focus of change. We must also begin to pull policy levers that look upstream: rather than only pulling down extremist content (which remains important), we must begin to unpack these algorithmic curation systems structurally and systematically.

Current Focus: Content takedown/ moderation

  • The problem: is seen to be caused by malicious actors, whether they be terrorists, cyberbullies or perpetrators of hate speech
  • The scope: is content which is illegal (black & white)
  • The solution: is seen to be policies which require platforms to deploy more robust content moderation practices (take down)

Future Focus: The attention economy

  • The problem: is seen to be the exploitation of user data & algorithms to maintain user attention, resulting in the amplification of extremist and sensational content
  • The scope: becomes design, practices and models that cause societal harm and division
  • The solution: is policies that promote transparency, regulate algorithmic amplification, and protect data rights and privacy

Recommendations

Transparency and Public Oversight

The current model of self-regulation and self-reporting is insufficient and disproportionate to the potential harm to the public.

Information on these harms is held solely by the digital platforms, who do not make it available for transparent independent review under any circumstances. It seems extraordinary that the digital platform companies have all the data and tools needed to track, measure and evaluate these harms – indeed these tools are a core part of their business, but they make nothing available for public oversight, even as they avoid all but the most basic interventions to protect the public from harm.

An algorithmic audit is a review process by which the outputs of algorithmic systems (in this case the curation systems of the digital platforms, which might radicalise users and promote disinformation) can be assessed for unfavourable, unwanted and/or harmful results.
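
As an illustration, the following is a minimal, hypothetical sketch of one measurement such an audit might take: whether repeatedly following a recommender’s top suggestion drifts towards more extreme content. The recommend() stub and extremity scores are illustrative assumptions; a real audit would query the platform’s actual systems under mandated access.

```python
# Hypothetical 'rabbit hole' audit measurement. The recommender below is a
# stand-in stub; a real audit would exercise the platform's actual system.

def recommend(extremity: float) -> float:
    """Stub recommender that, on average, suggests slightly more
    extreme content than the current item (0 = mainstream, 1 = extreme)."""
    return min(1.0, extremity * 1.15 + 0.01)

def audit_drift(start: float = 0.10, steps: int = 20) -> float:
    """Follow the top recommendation `steps` times and report the
    change in extremity score."""
    extremity = start
    for _ in range(steps):
        extremity = recommend(extremity)
    return extremity - start

print(f"extremity drift after 20 hops: {audit_drift():+.2f}")  # positive drift = rabbit hole
```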

As such, an independent regulator (such as the ACMA or the eSafety Commissioner) must be empowered to have:

  • Compulsory audit and inspection powers
  • Enforced information-gathering powers that extend beyond training data to include evidence on policies, processes and outcomes
  • Powers to access and engage third-party expertise both within and outside government

Recommendation: Institute an audit authority under an independent regulator empowered to conduct mandatory investigations and audits on the impact of algorithmic amplification on Australian society.

These powers must operate under a system of transparency, legitimacy and due process, and should include checks and balances such as:

  • Mandatory transparency reporting
  • Avenues for recourse, objection and appeal for the digital platforms
  • Specific guidelines and scope for the audits

Additionally, there is a key gap in knowledge on this issue at the moment. Actors which serve the public interest – researchers, civil society, regulators – are operating in the dark when it comes to understanding the platforms’ algorithmic systems, with access for research being selective, opaque and unclear. This empowered regulator must work with industry to open up this access (with appropriate privacy and trade-secret disclosure arrangements) so that research can be conducted.

This must be underpinned by enforcement mechanisms, as highlighted by the second assessment of the EU’s Code of Practice on Disinformation, which found that the goals under Pillar 5 (research cooperation) had largely not been achieved, with a ‘shared opinion amongst European researchers that the provision of data and search tools required to detect and analyse disinformation cases is still episodic and arbitrary, and does not respond to the full range of research needs’. Specifically, the discretionary approach the platforms take of entering into bilateral relationships with specific members of the academic and fact-checking community flies in the face of the open and non-discriminatory approach needed for the levels of research, analysis and accountability required.

Recommendation: Commit to developing data sharing arrangements that empower academic researchers, civil sector actors and think tanks to undertake the requisite research on the role of social media in disinformation and radicalisation. These arrangements must preserve user privacy, but also make good-faith attempts to increase transparency on data that is vital for our understanding of disinformation (e.g. demographic data of user engagement, content engagement).

An example of a proposal for such an agreement can be found in a policy memo we developed called the Data Access Mandate for a Better COVID-19 Response in Australia. Whilst this memo focuses specifically on COVID-19 disinformation, this transparent data access proposal can and should be extended to other areas of disinformation research that impact our community.

The first step in crafting a solution is understanding the problem. Until there is greater transparency over the digital platforms’ black box, we will never see true progress on this and other issues.

Disrupt Disinformation – an enforceable Code and disrupting monetisation incentives

We must address issues within the fundamental profit models of the digital platforms that have allowed for the propagation of disinformation, targeting the underlying financial drivers that are used to propagate it.

As highlighted in the EU’s second assessment of its Code of Practice on Disinformation, inconsistent implementation of measures intended to address the placement of advertisements on platforms’ own services limited progress against this commitment. Additional challenges were seen in implementing measures intended to limit ad placements on third-party websites that spread disinformation. The Assessment goes on to state that ‘the Code does not have a high enough public profile to put sufficient pressure for change on the platforms in this area’. These limitations were largely attributed to ineffective participation and collaboration by relevant stakeholders, including the advertising sector, fact-checking organisations and the platforms themselves.

The financial drivers which propagate disinformation represent the key opportunity for initial action, and this Objective is a valuable first step in recognising the economic incentives responsible. Whilst these measures have been referred to in the development of the ACMA’s Disinformation Code, a voluntary code will be wholly insufficient to fulfil these aims.

Recommendation: Work towards developing defined and enforceable guidance, practices and collaboration pathways that will effectively disrupt the economic drivers of disinformation, moving beyond broad commitments and self-regulatory approaches. This should include:

  • Developing a common structure for risk assessment and an escalation framework for ad accounts that propagate disinformation (a minimal sketch of such a structure follows this list)
  • Developing an application-approval system for actors intending to use advertising, based on agreed-upon trustworthiness indicators
  • Defining concrete ways in which transparency can be embedded in on-platform advertisements to users, as well as wider transparency measures to the public and relevant stakeholders to ensure accountability.
  • Defining pathways for greater collaboration with other relevant stakeholders, in particular the advertising sector.
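
As flagged in the first point above, the following is a minimal, hypothetical sketch of what a common risk-assessment and escalation structure for ad accounts could look like. The signals, weights and thresholds are illustrative assumptions only.

```python
# Hypothetical risk-assessment and escalation structure for ad accounts that
# propagate disinformation. Signals, weights and thresholds are illustrative.

RISK_SIGNALS = {
    "links_to_known_disinformation_domains": 0.5,
    "repeated_fact_check_failures": 0.3,
    "coordinated_inauthentic_behaviour": 0.4,
}

ESCALATION = [
    (0.8, "suspend ad account and refer to regulator"),
    (0.5, "demonetise pending review"),
    (0.2, "flag for enhanced monitoring"),
]

def assess(account_signals: dict) -> str:
    """Score an ad account against the shared signal weights and
    return the corresponding escalation step."""
    score = sum(weight for signal, weight in RISK_SIGNALS.items()
                if account_signals.get(signal, False))
    for threshold, action in ESCALATION:
        if score >= threshold:
            return action
    return "no action"

print(assess({"links_to_known_disinformation_domains": True}))
# score 0.5 -> "demonetise pending review"
```

The value of a common structure is that every platform and the regulator would apply the same signals and thresholds, rather than each platform escalating (or not) at its own discretion.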

Digital literacy education

The information landscape is changing, and increasingly so every day. The infinite ability for people to obtain and share information is a social experiment that is actively unfolding. Governments (Federal and State), whether within curricula or through civil sector service providers, must resource and develop education materials that assist the next generation to navigate this world.

Additionally, the root of many of these problems is the unregulated use and exploitation of personal data. As such, children must have the highest safeguards put in place so that their data cannot be used to fuel the attention economy.

Recommendation: Develop a comprehensive policy which both provides young people with the educational tools to navigate online threats and establishes rights which protect them from harmful practices. This should include:

  • Educational resources on how to identify fake news and take appropriate action
  • The maximum level of privacy protections for children – similar to the UK’s Age Appropriate Design Code
  • A commitment to building ongoing literacy in regard to personal data and privacy rights

Thank you for the opportunity to engage with this inquiry. This submission was prepared by:

Matt Nguyen
Policy Lead – Reset Australia
[email protected]

Amal Wehbe
Policy Intern – Reset Australia

If you require any further information, please do not hesitate to get in contact with our organisation.





Submission on Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020 – Reset Australia



Who we are

Reset Australia is an independent, non-partisan organisation committed to driving public policy advocacy, research, and civic engagement to strengthen our democracy within the context of technology. We are the Australian affiliate of Reset, the global initiative working to counter digital threats to democracy. As the Australian partner in Reset’s international network, we bring a diversity of new ideas home and provide Australian thought-leaders access to a global stage.

Executive Summary

We commend the Government for putting forward a proposal which seeks to rein in the influence of the digital platform companies, especially within a sector which is vital for a well-functioning democracy. Our core recommendations can be found in our prior submission to the consultation run by the ACCC on the final draft of this proposed amendment in 2020. In summary, our primary recommendations are to:

  1. Work to clearly define within the legislative text:

    a) The scope and types of data required under Section 52R, Subsection (3)(a)

    b) The measure of ‘significant effect’ under Sections 52S-U, Subsections (1)(c)

  2. Ensure a rights-based framework of user protections, similar to the EU GDPR prior to the passage of wholesale data sharing provisions outlined under Section 52R

  3. Institute an audit authority under an independent regulator to both verify the provisions set out under the Minimum Standards of this Code, and empowered to investigate/audit the impact of algorithmic amplification on Australian society

  4. Ensure proper regulatory oversight and guidance for how the information from the data sharing and advance notifications provisions under the Minimum Standards should be used

  5. Impose specific investigative powers on the independent regulator to conduct inquiries on market implications of new products and/or services

  6. Conduct a comprehensive annual assessment of how this Code has impacted the information and news media landscape

Context

We must recognise that, whilst the stated purpose of this amendment is to address bargaining power and competitive imbalances between media companies and the digital platforms, the true impact of this legislation will be changes to the news, media and journalism landscape in Australia. In order to strengthen our democracy, we must ensure that this impact is positive, with a goal of promoting greater diversity and pluralism within our media landscape.

This goal must be the guiding principle for both the final version of this amendment, and any iterations and complementary pieces of legislation that might be built from this Code into the future.

We implore the Government not to lose sight of the real goal: ensuring a diverse and pluralistic media landscape.

This submission will focus on recommendations aimed at rectifying market power imbalances between the platforms and media companies; however, we stress that Big Tech sits in the middle of a complex web of issues, and a singular commercial lens for a single industry seems short-sighted. The ‘harmonised framework’ of the ACCC Digital Platforms Inquiry must be expanded beyond the media to recognise the integral and expansive ways the digital platforms shape our lives.

1.0 Primary Recommendations – Minimum Standards

Our recommendations centre on the provisions set out under Division 4, Subdivision B – Minimum Standards.

1.1 Vague Definitions

Currently, the data explanation and provision requirements (Section 52R) and advance algorithmic notification requirements (Sections 52S, 52T and 52U) have impossibly vague definitions that, depending on interpretation, have far-reaching and significant implications for data rights and privacy, as well as limiting the ability for this Code to be meaningfully implemented.

The specific passages are:

Section 52R, Subsection (3)(a) – [that lists and explanations of data must be given] that relates to interactions of users of the designated digital platform service with covered news content made available by the designated digital platform service.

Whilst some efforts have been made to explain what ‘interactions’ means, both in Section 52C of the Bill and under point 1.107 of the Explanatory Materials, the wording remains vague. This opens up a spectrum of interpretation with significant downstream implications. Does it cover all interactions of users related to news media content? The profile data of users who engage with this type of content? If a user shares a news link with another user, is that profiling data also shared?

For comments around why this level of transparency without the appropriate safeguards is concerning and further recommendations, please see Section 1.2 of this document.

Section 52S, 52T and 52U, Subsection (1)(c) – Changes to algorithm or practice will bring about significant effect on the respective components under each subsection.

Similarly, the term ‘significant effect’ has not been adequately defined, despite attempts to clarify it under Section 52W of the Bill. Whilst point 1.127 of the Explanatory Materials mentions that a significant effect constitutes a 20% or more change in referral traffic, this is not reflected in the Bill text, so it is again open to a risk of differing interpretation. Furthermore, there is no guidance around what ‘significant’ means in relation to Section 52U and changes to the distribution of advertising.
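
To illustrate how the 20% measure from the Explanatory Materials could operate if it were written into the Bill, here is a minimal sketch. Note that even this simple test forces assumptions (the baseline period, the measurement window) that the current text does not specify, underscoring the need for clearer definitions.

```python
def is_significant_effect(baseline_referrals: int, new_referrals: int,
                          threshold: float = 0.20) -> bool:
    """Apply the 20%-change-in-referral-traffic measure from point 1.127
    of the Explanatory Materials. The baseline and measurement window are
    assumptions; the Bill text does not define them."""
    if baseline_referrals == 0:
        return new_referrals > 0  # any traffic from a zero baseline counts
    relative_change = abs(new_referrals - baseline_referrals) / baseline_referrals
    return relative_change >= threshold

# e.g. a fall from 1,000,000 to 780,000 weekly referrals is a 22% change
print(is_significant_effect(1_000_000, 780_000))  # True -> notification required
```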

Recommendation: Work to clearly define within the legislative text:

  1. The scope and types of data required under Section 52R, Subsection (3)(a)
  2. The measure of ‘significant effect’ under Sections 52S-U, Subsections (1)(c)

1.2 Section 52R – Explanation and provision of data and implied data privacy risk

Whilst we support the intention behind the data sharing and explanation requirements detailed under Section 52R, we are concerned about the lack of a rights-based user protections framework that will support these provisions.

The EU’s proposal for the Digital Markets Act (DMA) has provided pathways for business users to gain access to the data generated from their usage of the digital platforms’ services. This provision is a good step towards rebalancing the market, and we are tentatively supportive of the Government’s proposal to incorporate similar measures within this Code. Additional stipulations should also be considered to clarify that platforms must not impose barriers and must facilitate this release of data, such as mandating that they provide high-quality APIs free of charge.

However, our support for this section is conditional on several fundamental changes to ensure that user protections are guaranteed.

Firstly – addressing vague definitions outlined under Section 1.1 of this submission.

Secondly – this provision must be supported by a rights-based framework to data privacy.

Whilst the EU DMA sits under the protections afforded by the General Data Protection Regulation (GDPR), there is no Australian equivalent protective framework. The lack of end-user rights around consent, data processing, erasure and automated individual decision-making (profiling), amongst others, is especially concerning when taken in concert with the vague definition of scope. We therefore believe that the passage of this Section is untenable until we update our privacy framework to recognise data rights. We respect that this is a current and ongoing process, and you can find more of our comments within our submission to the Privacy Act Review.

Recommendation: Ensure a rights-based framework of user protections, similar to the EU GDPR prior to the passage of wholesale data sharing provisions outlined in this Code.

1.3 Audit Authority – The need for verification and algorithmic audits

Ensuring that the Minimum Standards would work: However the provisions under the Minimum Standards are interpreted, under the current Code there is no way to verify whether the information provided by the digital platforms is accurate. The digital platforms operate with near-monopoly status and hold a tight grip of control over how their data is used. Whilst this Bill orders them to share information, how will the media companies ascertain whether this data is meaningful, even if it is ‘accurate’ under the proposed law?

This entire section of the Bill becomes redundant without verification measures. There is thus an integral need for an audit authority to be instituted under the independent regulator, most likely the ACCC.

The case for a broader remit and algorithmic audits: Whilst verification provides a clear-cut case for instituting an audit authority, a discussion must be had on how this authority might work to ensure that data transparency measures serve broader public interests. We recognise that the harms caused by the digital platforms, ranging from foreign interference to disinformation, need a holistic approach, and the remit of this authority should expand to provide insights into bigger questions – such as how platform curation algorithms open up risk and create harm to the public. Importantly, this is not at the exclusion of the platform/publisher content visibility issues remedied by this Bill, merely an expansion that might provide a systematic legislative approach, rather than one focussing on a specific sector.

A purely commercial lens on the data sharing and advance notification provisions (particularly 52R, 52S and 52T) completely misses the systemic impacts of algorithmic amplification – that is, the promotion or demotion of content currently dictated by the digital platforms’ internal algorithmic processes. This is an issue that goes far beyond traffic and advertising revenue, and requires an expansive remit to address. Whilst market imbalances are important (and would be addressed under this authority), unilateral algorithmic curation and amplification has an outsized impact in harming the Australian public and our democracy. This is most clearly seen within the news media sector, and as such this Bill provides the perfect springboard to enact this kind of reform, but we must not ignore that these harms go far beyond news content.

Information on these harms is held solely by the digital platforms, who do not make it available for transparent independent review under any circumstances. It seems extraordinary that the digital platform companies have all the data and tools needed to track, measure and evaluate these harms – indeed these tools are a core part of their business, but they make nothing available for public oversight, even as they avoid all but the most basic interventions to protect the public from harm.

Without mandated access, regulators are forced to rely on the companies to police themselves through ineffective codes of conduct. This failed approach has been seen overseas and yet is still being tried here in Australia.

This is not the impossible suggestion the digital platforms might have you believe. Algorithmic audits have been specifically proposed in the EU Digital Services Act (DSA), and represent a clear model to emulate here in Australia. Our legislative approach must be as flexible and encompassing as the harms we seek to address.

Recommendation: Institute an audit authority under an independent regulator to both verify the provisions set out under the Minimum Standards of this Code, and empowered to investigate/audit the impact of algorithmic amplification on Australian society.

What would an algorithmic audit authority do?

An audit authority under an independent regulator (most likely the ACCC) must have the tools and powers to verify the actions of the digital platforms, test the operation of algorithms and to undertake inspections themselves.

Its responsibilities with respect to this Bill would be:

  1. Verification: This authority must be empowered not just to oversee but to verify that the obligations of the digital platforms under this Code are being met (a minimal sketch of one such check follows this list). Its expanded (and in our opinion necessary) responsibility would be the holistic investigation of how algorithmic curation systems impact wider society.
  2. Algorithmic Audits: An algorithmic audit is a review process by which the outputs of algorithmic systems (in this case the curation systems of the digital platforms which display news media content) can be assessed for unfavourable, unwanted and/or harmful results. In addition to assessing if design decisions within the digital platform algorithms are actively anti-competitive, this process can also be used to assess numerous online harms to wider society and democracy such as disinformation and foreign interference.
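
As flagged in point 1 above, the following is a minimal, hypothetical verification check such an authority might run: comparing a platform’s self-reported referral-traffic change against an independent estimate, such as publisher-side analytics. The function names and tolerances are illustrative assumptions only.

```python
# Hypothetical verification check: does the platform's self-reported change
# match an independent estimate? The tolerance here is illustrative only.

def verify_reported_change(platform_reported_pct: float,
                           independent_estimate_pct: float,
                           tolerance_pts: float = 5.0) -> str:
    """Compare reported vs independently measured percentage changes and
    recommend an action when the gap exceeds the tolerance."""
    gap = abs(platform_reported_pct - independent_estimate_pct)
    if gap <= tolerance_pts:
        return "consistent: no further action"
    return (f"discrepancy of {gap:.1f} percentage points exceeds tolerance; "
            "escalate to compulsory audit")

# e.g. platform reports a 12% fall in referrals; independent data shows 25%
print(verify_reported_change(-12.0, -25.0))
```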

How would an audit authority work?

The authority must have the ability to carry out an algorithm inspection with the consent of the digital platform company; or, if the company does not provide consent and there are reasonable grounds to suspect it is failing to comply with requirements, to use compulsory audit powers. The resourcing to carry out these investigations could sit within the ACCC, but the authority should also have the power to instruct independent experts to undertake an audit on its behalf. Examples of how this might be structured can be seen in multiple industries, from aviation to drug therapeutics.

1.4 Regulatory Oversight

This Code does little to define the responsibilities of the news media companies for what happens after the data sharing and advance notification provisions are enacted. Whilst we respect that commercial entities should be free (to a certain degree) to use this information as they wish, these impacts have implications beyond commercial competitiveness, including safeguarding democracy, public health information and security. We therefore recommend that the Commission or an appropriate independent regulator be tasked with the necessary oversight, guardrails and powers to address potential issues of harm.

For example, the Explanatory Materials states that the advance notification requirements are intended to capture internal practice changes, with examples including:

  • Removal of inappropriate content
  • Suspending user accounts
  • Rules around permitted types of advertising content

It is our opinion that the changes these advance notification provisions seek to capture, as specifically referenced by the Government, represent such a significant public and democratic interest that regulatory oversight must be incorporated.

This might include:

  • Clear limitations and guidance around the usage of information shared through the Minimum Standards
  • A transparent procedural route for organisations to contest decisions (such as content takedowns and user removals)
  • Mandated risk assessments before these provisions are enacted, to ensure the information is not used to harm the public
  • Annual public reporting on how these provisions have been enacted and their results

Recommendation: Ensure proper regulatory oversight and guidance for how the information from the data sharing and advance notifications provisions under the Minimum Standards should be used.

1.5 Market Investigations

We recommend imposing specific powers, under a standardised framework, for the independent regulator to conduct investigations into the market implications of new products and/or services. By instituting these investigative powers in the natural policy lifecycle of this Code, rather than relying on sporadic investigations led by the ACCC, we can ensure proper resourcing and agile delivery, as well as engendering an iterative and future-forward approach to the implementation and evolution of this Code. These powers should allow the regulator to deep-dive into anti-competitive practices and their implications for the wider sector as the use cases for digital platforms change and as new products and services enter the market.

This should be modelled after corresponding sections of the DMA. Chapter IV details the specific circumstances in which these investigations can be instigated, of particular relevance is Article 17 around new products and/or services.

Recommendation: Impose specific investigative powers on the independent regulator to conduct inquiries on market implications of new products and/or services

1.6 Monitoring and Evaluation

As mentioned in our previous submission, annual impact assessment of this Code must be resourced and undertaken to ensure that it is actively working towards increasing media diversity in Australia.

Understanding how this Code has impacted the media landscape is vital to ensuring that this legislation is appropriately iterated to adapt to the rapidly evolving information landscape.

Questions may include:

  • Has this Code contributed to an increase in the number of journalists, news media companies and news innovation?
  • Has this Code contributed to an increase in the quality and objectivity of reporting?
  • Has this Code contributed to an increase in diversity within the Australian news media landscape?
  • Has this Code inadvertently concentrated bargaining power amongst a few news media outlets?
  • How has this Code affected regional, minority and independent news media companies and journalists?

Recommendation: Conduct a comprehensive assessment into how this Code has impacted the information and news media landscape annually

2.0 Future Directions

Attempts to rectify market and information imbalances between digital platforms and news media companies must not end with this Code. The impact of the digital platforms on society is expansive and still emerging, and a broad view of reform must be taken to ensure that legislation addresses these issues systematically and fairly.

1. News, Media and Journalism: Looking solely within the news and information sector, anti-competitive practices that ‘steal ad revenue’ are not the sole reason for the decline of news media organisations. From adequately resourcing public interest journalism to potentially exploring a digital ‘sin’ tax to fund it, there is a suite of policy options that have not been explored to ensure a pluralistic media landscape. The Government must constantly reaffirm its goal of achieving a diverse media landscape, and work to iterate, evolve and build upon this Code to achieve it.

2. Holistic Regulation: We must move beyond the news sector, as the impact of the digital platforms extends far beyond news publishers. From commerce to small business, innovation to defence, the policy approach to regulating the digital platforms must reflect the diverse and interdependent impacts they have on our society. To achieve the core intent of this proposed Code, we must open a discussion on what a holistic framework underpinned by integral user rights would look like.

The EU’s proposals for the DSA, the European Democracy Action Plan and the DMA provide an example of how disparate pieces of legislation (such as this Code) might be tied together under a cohesive framework. As the Government moves forward with scheduled regulatory action (namely the Privacy Act review, the Online Safety Act and the Code on Disinformation) and future reform, we look forward to contributing to the work of organising this action under an appropriate framework.

Please review our previous submission, where we made additional recommendations on:

  • Adequately resourcing public interest journalism
  • Exploring new frameworks to holistically classify the digital platforms in order to appropriately address their impacts and ensure that they are systematically addressed

3.0 Conclusions

We thank the Government for the opportunity to share our thinking on an exciting and leading piece of legislation that will curb the outsized influence of Big Tech in Australia. News media is a fundamental pillar of our democracy, and we look forward to working with you, the platforms and civil society to ensure that the media remains open and accessible.