The federal government’s social media big brother bill

 

“It is not difficult to deprive the great majority of independent thought. But the minority who will retain an inclination to criticize must also be silenced…. Public criticism or even expressions of doubt must be suppressed because they tend to weaken public support…. When the doubt or fear expressed concerns not the success of a particular enterprise but of the whole social plan, it must be treated even more as sabotage.”

― Friedrich August von Hayek, The Road to Serfdom

“The principle that the end justifies the means is in individualist ethics regarded as the denial of all morals. In collectivist ethics it becomes necessarily the supreme rule.”

― Friedrich Hayek

“It is one of the saddest spectacles of our time to see a great democratic movement support a policy which must lead to the destruction of democracy and which meanwhile can benefit only a minority of the masses who support it. Yet it is this support from the Left of the tendencies toward monopoly which make them so irresistible and the prospects of the future so dark.”

― Friedrich August von Hayek, The Road to Serfdom

“Our faith in freedom does not rest on the foreseeable results in particular circumstances but on the belief that it will, on balance, release more forces for the good than for the bad.”

― Friedrich A. Hayek

 

Introduction

The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (“the Bill“), if passed, will allow the government, through the Australian Communications and Media Authority (“ACMA”), to monitor social media content it disapproves of and to conscript social media companies to do its dirty work of curtailing such content.

As such, the Bill represents a dangerous threat to freedom of online discourse.

The Bill

The Bill’s definitions make it clear that it is concerned with preventing online social media platforms, and websites which aggregate content, from publishing or permitting content the government considers to be misleading and harmful:

2 Definitions

harm means any of the following:

(a) hatred against a group in Australian society on the basis of ethnicity, nationality, race, gender, sexual orientation, age, religion or physical or mental disability;

(b) disruption of public order or society in Australia;

(c) harm to the integrity of Australian democratic processes or of Commonwealth, State, Territory or local government institutions;

(d) harm to the health of Australians;

(e) harm to the Australian environment;

(f) economic or financial harm to Australians, the Australian economy or a sector of the Australian economy.

disinformation has the meaning given by subclause 7(2).

misinformation has the meaning given by subclause 7(1).

 

4 Digital platform service

Digital platform services

(1) For the purposes of this Schedule, a digital platform service is a digital service that is:

(a) a content aggregation service (see subclause (2)); or
(b) a connective media service (see subclause (3)); or
(c) a media sharing service (see subclause (4)); or
(d) a digital service specified by the Minister in an instrument under subclause (6);
but does not include a digital service to the extent to which it is:
(e) an internet carriage service; or
(f) an SMS service; or
(g) an MMS service.

7 Misinformation and disinformation

(1) For the purposes of this Schedule, dissemination of content using a digital service is misinformation on the digital service if:

(a) the content contains information that is false, misleading or deceptive; and

(b) the content is not excluded content for misinformation purposes; and

(c) the content is provided on the digital service to one or more end-users in Australia; and

(d) the provision of the content on the digital service is reasonably likely to cause or contribute to serious harm.

 

(2) For the purposes of this Schedule, dissemination of content using a digital service is disinformation on the digital service if:

(a) the content contains information that is false, misleading or deceptive; and

(b) the content is not excluded content for misinformation purposes; and

(c) the content is provided on the digital service to one or more end-users in Australia; and

(d) the provision of the content on the digital service is reasonably likely to cause or contribute to serious harm; and

(e) the person disseminating, or causing the dissemination of, the content intends that the content deceive another person.

(3) For the purposes of this Schedule, in determining whether the provision of content on a digital service is reasonably likely to cause or contribute to serious harm, have regard to the following matters:

(a) the circumstances in which the content is disseminated;

(b) the subject matter of the false, misleading or deceptive information in the content;

(c) the potential reach and speed of the dissemination;

(d) the severity of the potential impacts of the dissemination;

(e) the author of the information;

(f) the purpose of the dissemination;

(g) whether the information has been attributed to a source and, if so, the authority of the source and whether the attribution is correct;

(h) other related false, misleading or deceptive information disseminated;

(i) any other relevant matter.

 

The Bill requires online platforms to provide ACMA with any and all information it wants in order to monitor online discourse:

14 ACMA may make digital platform rules in relation to records

Records

(1) The digital platform rules may require a digital platform provider of:

(a) a digital platform service specified in the rules; or

(b) a digital platform service in a class of digital platform services specified in the rules;

to make and retain records relating to the following:

(c) misinformation or disinformation on the service;

(d) measures implemented by the provider to prevent or respond to misinformation or disinformation on the service, including the effectiveness of the measures;

(e) the prevalence of content containing false, misleading or deceptive information provided on the service (other than excluded content for misinformation purposes).

 

15 Compliance with the digital platform rules

(1) A digital platform provider must not contravene digital platform rules made for the purposes of clause 14.

Civil penalty provision

(2) Subclause (1) is a civil penalty provision.

(3) A digital platform provider who contravenes subclause (1) commits a separate contravention of that subclause in respect of each day (including a day of the making of a relevant civil penalty order or any subsequent day) during which the contravention continues.

 

18 ACMA may obtain information and documents from digital platform providers

Scope

(1) This clause applies to a digital platform provider of a digital platform service if:

(a) the ACMA has reason to believe that the provider:

(i) has information or a document that is relevant to a matter mentioned in subclause (2); or

(ii) is capable of giving evidence which the ACMA has reason to believe is relevant to a matter mentioned in subclause (2); and

(b) the ACMA considers that it requires the information, document or evidence for the performance of the ACMA’s function under paragraph 10(1)(mb), (mc), (md), (me), (mf), (mg) or (q) of the Australian Communications and Media Authority Act 2005.

(2) For the purposes of paragraph (1)(a), the matters are as follows:

(a) misinformation or disinformation on the service;

(b) measures implemented by the provider to prevent or respond to misinformation or disinformation on the service, including the effectiveness of the measures;

(c) the prevalence of content containing false, misleading or deceptive information provided on the service (other than excluded content for misinformation purposes).

(3) The ACMA may, by written notice given to the provider, require the provider:

(a) to give to the ACMA, within the period and in the manner and form specified in the notice, any such information; or

(b) to produce to the ACMA, within the period and in the manner specified in the notice, any such documents; or

(c) to make copies of any such documents and to produce to the ACMA, within the period and in the manner specified in the notice, those copies; or

(d) if the provider is an individual—to appear before the ACMA at a time and place specified in the notice to give any such evidence, either orally or in writing, and produce any such documents; or

(e) if the provider is a body corporate or a public body—to cause a competent officer of the body to appear before the ACMA at a time and place specified in the notice to give any such evidence, either orally or in writing, and produce any such documents; or

(f) if the provider is a partnership—to cause an individual who is:

(i) a partner in the partnership; or

(ii) an employee of the partnership;

to appear before the ACMA at a time and place specified in the notice to give any such evidence, either orally or in writing, and produce any such documents.

(5) A digital platform provider must comply with a requirement under subclause (3).

Civil penalty provision

(6) Subclause (5) is a civil penalty provision.

(7) A digital platform provider who contravenes subclause (5) commits a separate contravention of that subclause in respect of each day (including a day of the making of a relevant civil penalty order or any subsequent day) during which the contravention continues.

Designated infringement notice provision

(8) Subclause (5) is a designated infringement notice provision.

 

19 ACMA may obtain information and documents from other persons

Scope

(1) This clause applies to a person if:

(a) the ACMA has reason to believe that the person:

(i) has information or a document that is relevant to a matter mentioned in subclause (2); or

(ii) is capable of giving evidence which the ACMA has reason to believe is relevant to a matter mentioned in subclause (2); and

(b) the ACMA considers that it requires the information, document or evidence for the performance of the ACMA’s function under paragraph 10(1)(md) of the Australian Communications and Media Authority Act 2005.

(2) For the purposes of paragraph (1)(a), the matters are as follows:

(a) misinformation or disinformation on a digital platform service;

(b) measures implemented by a digital platform provider to prevent or respond to misinformation or disinformation on a digital platform service, including the effectiveness of the measures;

(c) the prevalence of content containing false, misleading or deceptive information provided on a digital platform service (other than excluded content for misinformation purposes).

ACMA may require information, documents or evidence

(3) The ACMA may, by written notice given to the person, require the person:

(a) to give to the ACMA, within the period and in the manner and form specified in the notice, any such information; or

(b) to produce to the ACMA, within the period and in the manner specified in the notice, any such documents; or

(c) to make copies of any such documents and to produce to the ACMA, within the period and in the manner specified in the notice, those copies; or

(d) if the person is an individual—to appear before the ACMA at a time and place specified in the notice to give any such evidence, either orally or in writing, and produce any such documents; or

(e) if the person is a body corporate or a public body—to cause a competent officer of the body to appear before the ACMA at a time and place specified in the notice to give any such evidence, either orally or in writing, and produce any such documents; or

(f) if the person is a partnership—to cause an individual who is:

(i) a partner in the partnership; or

(ii) an employee of the partnership;

to appear before the ACMA at a time and place specified in the notice to give any such evidence, either orally or in writing, and produce any such documents.

 

The Bill even tramples on the privilege against self-incrimination:

 

21 Self-incrimination

(1) An individual is not excused from giving information or evidence or producing a document or a copy of a document under this Division on the ground that giving the information or evidence or producing the document or copy might tend to incriminate the individual in relation to an offence.

 

The Bill encourages online platforms to develop codes for containing misinformation and disinformation, allows ACMA to enforce those codes, and empowers ACMA to impose its own rules and standards if platforms do not create codes of their own:

Division 3—General principles relating to misinformation codes and misinformation standards

 

32 Statement of regulatory policy

The Parliament intends that one or more bodies or associations that the ACMA is satisfied represent sections of the digital platform industry should develop one or more codes (misinformation codes) that require participants in those sections of the digital platform industry to implement measures to prevent or respond to misinformation and disinformation on digital platform services.

 

33 Examples of matters that may be dealt with by misinformation codes and misinformation standards 

(1) This clause sets out examples of matters that may be dealt with by misinformation codes and misinformation standards.

(2) The applicability of a particular example will depend on which section of the digital platform industry is involved.

(3) The examples are as follows:

(a) preventing or responding to misinformation or disinformation on digital platform services;

(b) using technology to prevent or respond to misinformation or disinformation on digital platform services;

(c) preventing or responding to misinformation or disinformation on digital platform services that constitutes an act of foreign interference (within the meaning of the Australian Security Intelligence Organisation Act 1979);

(d) preventing advertising involving misinformation or disinformation on digital platform services;

(e) preventing monetisation of misinformation or disinformation on digital platform services;

(f) supporting fact checking;

(g) allowing end-users to detect and report misinformation or disinformation on digital platform services;

(h) giving information to end-users about the source of political or issues-based advertisements;

(i) policies and procedures for receiving and handling reports and complaints from end-users;

(j) giving end-users and others information about misinformation or disinformation on digital platform services.

 

Subdivision C—Compliance with misinformation codes

43 Compliance with registered misinformation code

(1) If:

(a) a misinformation code that applies to participants in a particular section of the digital platform industry is registered under this Part; and

(b) a digital platform provider is a participant in that section of the digital platform industry;

the provider must comply with the code.

Civil penalty provision

(2) Subclause (1) is a civil penalty provision.

Designated infringement notice provision

(3) Subclause (1) is a designated infringement notice provision.

Warnings

(4) If the ACMA is satisfied that a digital platform provider has contravened subclause (1), the ACMA may issue a formal warning to the provider.

 

44 Remedial directions—contravention of misinformation code

Scope

(1) This clause applies if:

(a) a misinformation code that applies to participants in a particular section of the digital platform industry is registered under this Part; and

(b) a digital platform provider is a participant in that section of the digital platform industry; and

(c) the ACMA is satisfied that the provider has contravened, or is contravening, the code.

(2) The ACMA may give the provider a written direction requiring the provider to take specified action directed towards ensuring that the provider does not contravene the code, or is unlikely to contravene the code, in the future.

(3) A digital platform provider must not contravene a direction under subclause (2).

Civil penalty provision

(4) Subclause (3) is a civil penalty provision.

 

45 General requirement—consideration of freedom of political communication

Before determining a standard under this Division, the ACMA must consider:

(a) whether the standard would burden freedom of political communication; and

(b) if so, whether the burden would be reasonable and not excessive, having regard to any circumstances the ACMA considers relevant.

 

46 ACMA may determine standards—request for a code is not complied with

(1) This clause applies if:

(a) the ACMA has made a request under subclause 38(1) in relation to the development of a code that is to:

(i) apply to participants in a particular section of the digital platform industry; and

(ii) deal with one or more matters relating to the operation of digital platform services by those participants; and

(b) any of the following conditions is satisfied:

(i) the request is not complied with;

(ii) if indicative targets for achieving progress in the development of the code were specified in the notice of request—any of those indicative targets were not met;

(iii) the request is complied with, but the ACMA subsequently refuses to register the code; and

(c) the ACMA is satisfied that it is necessary or convenient for the ACMA to determine a standard in relation to that matter or those matters in order to provide adequate protection for the community from misinformation or disinformation on the services.

(2) The ACMA may, by legislative instrument, determine a standard that applies to participants in that section of the digital platform industry and deals with that matter or those matters. A standard under this subclause is to be known as a misinformation standard.

(3) Before determining a standard under this clause, the ACMA must consult the body or association to whom the request mentioned in paragraph (1)(a) was made.

 

47 ACMA may determine standards—no industry body or association formed

(1) This clause applies if:

(a) the ACMA is satisfied that a particular section of the digital platform industry is not represented by a body or association; and

(b) the ACMA has published a notice under subclause 39(1) relating to that section of the digital platform industry; and

(c) that notice:

(i) states that, if such a body or association were to come into existence within a particular period, the ACMA would be likely to give a notice to that body or association under subclause 38(1); and

(ii) sets out one or more matters relating to the operation of digital platform services by participants in that section of the digital platform industry; and

(d) no such body or association comes into existence within that period; and

(e) the ACMA is satisfied that it is necessary or convenient for the ACMA to determine a standard in relation to that matter or those matters in order to provide adequate protection for the community from misinformation or disinformation on the services.

(2) The ACMA may, by legislative instrument, determine a standard that applies to participants in that section of the digital platform industry and deals with that matter or those matters. A standard under this subclause is to be known as a misinformation standard.

 

48 ACMA may determine standards—total failure of misinformation code

(1) This clause applies if:

(a) a registered misinformation code that:

(i) applies to participants in a particular section of the digital platform industry; and

(ii) deals with one or more matters relating to the operation of digital platform services by those participants;

has been registered under this Part for at least 180 days; and

(b) the ACMA is satisfied that the code is totally deficient (as defined by subclause (6)); and

(c) the ACMA has given the body or association that developed the code a written notice requesting that deficiencies in the code be addressed within a specified period; and

(d) that period ends and the ACMA is satisfied that it is necessary or convenient for the ACMA to determine a standard that applies to participants in that section of the digital platform industry and deals with that matter or those matters.

(3) The ACMA may, by legislative instrument, determine a standard that applies to participants in that section of the digital platform industry and deals with that matter or those matters. A standard under this subclause is to be known as a misinformation standard.

(6) For the purposes of this clause, a misinformation code that:

(a) applies to participants in a particular section of the digital platform industry; and

(b) deals with one or more matters relating to the operation of digital platform services by those participants;

is totally deficient if, and only if, the code is not operating to provide adequate protection for the community from misinformation or disinformation on the services.

 

Concerns

The Bill allows the federal government, through ACMA, to surveil and investigate the political and public discourse of citizens, and it imposes penalties on online platforms that do not comply with what ACMA determines to be adequate ways of combating misinformation or disinformation.

In our view, this represents a dangerous step in the direction of greater government oversight and control of public discourse. What is true and what is not, and what is harmful and what is not, are often the subject of reasonable disagreement. History shows that governments should not be the arbiters of truth. Governments are often mistaken about what is true, and political interests inevitably predominate, consciously or subconsciously. Although the Bill professes not to conflict with the implied constitutional freedom of political communication in Australia, it is difficult to see how provisions requiring online platforms to curtail public speech the government deems untrue and harmful could operate consistently with that freedom.

In a day and age where social media censorship is already a matter of concern, the last thing Australia needs is a government encouraging and inducing social media websites to censor information the government deems incorrect, or punishing platforms that refuse to go along with it.

 

© Sterling Law QLD . All Rights Reserved. Copyright 2017-2026 Sterling Law (Qld) Pty Ltd ACN 165 643 881
