Online Content and Services
Overview
- Online content and services in Australia are regulated primarily under the Online Safety Act 2021 (Cth). The Act establishes a framework, and grants powers to the eSafety Commissioner, to protect Australians from online harms. It operates alongside a co-regulatory model under which the technology industry is required to develop mandatory, enforceable codes regulating harmful content.
- Recent developments include the introduction of mandatory industry standards for certain services, a new law to restrict social media access for children under 16, and the tabling of a statutory review of the Act in February 2025. The review recommends changes, including a new 'duty of care' for online platforms to prevent foreseeable harm.
- The Online Safety Act commenced on 23 January 2022, replacing the previous legislative scheme under the Enhancing Online Safety Act 2015 (Cth) and parts of the Broadcasting Services Act 1992 (Cth). The Online Safety Act creates a stronger system to keep pace with technological advances and address threats posed by harmful online content and abusive behaviour.
Legal Framework
The Online Safety Act empowers the eSafety Commissioner to administer several complaints-based schemes that can compel online service providers to remove specific types of harmful content, such as cyberbullying material, adult cyber-abuse, and non-consensual intimate images. This works alongside a co-regulatory model under which eight sections of the online industry are required to develop mandatory codes to regulate illegal and restricted content, with the Commissioner having the power to create binding industry standards if codes are inadequate. The Act also imposes proactive obligations on platforms, including the Basic Online Safety Expectations and a new requirement to take reasonable steps to prevent users under 16 from having social media accounts.
The office of the eSafety Commissioner was originally established by the earlier Enhancing Online Safety Act 2015 (Cth). The Online Safety Act continued the office of the Commissioner and significantly expanded its functions and powers.
- Takedown Schemes: The Act includes several complaints-based schemes that allow the eSafety Commissioner to require the removal of harmful content, often within a 24-hour timeframe. These schemes include:
- An Adult Cyber-Abuse Scheme (Part 7) for severe online abuse directed at an Australian adult, which operates at a high threshold of 'serious harm'. Non-compliance with a removal notice can attract a civil penalty of up to 500 penalty units.
- A broadened Cyberbullying Scheme (Part 5) for Australian children, covering a full range of online services including social media, games, and messaging apps.
- An updated Image-Based Abuse Scheme (Part 6) providing a mechanism for victims to seek help directly from eSafety for the removal of intimate images shared without their consent. This complements offences under the Criminal Code Act 1995 (Cth) for the non-consensual sharing of private sexual material (see Computer-Based Crime).
- An Online Content Scheme (Part 9) to regulate illegal and restricted content (defined as Class 1 and Class 2 material by reference to the National Classification Scheme) no matter where in the world it is hosted. The Commissioner has a range of powers including issuing removal notices, remedial notices to restrict access, and link deletion notices to search engines. Class 1 material includes content that advocates the doing of a terrorist act (see Counter-Terrorism).
- A rapid website-blocking power for responding to online crisis events by requiring internet service providers to block access to material depicting abhorrent violent conduct, such as live-streamed terrorist attacks.
- Social Media Minimum Age (Part 4A): The Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth) introduced a new Part 4A into the Act, receiving Royal Assent on 10 December 2024.
- It requires providers of 'age-restricted social media platforms' to take "reasonable steps" to prevent Australian children under 16 from having accounts with the platform. An 'age-restricted social media platform' is an electronic service whose sole or significant purpose is to enable online social interaction between users. The Minister may provide exemptions for certain platforms, such as messaging apps or online games.
- The obligation has a lead-in period of up to 12 months from the date of commencement. The eSafety Commissioner is responsible for formulating guidelines on what constitutes "reasonable steps".
- Platforms are prohibited from using personal information collected for age verification for any other purpose unless consent is obtained.
- Liability attaches to systemic failures to take reasonable steps, not to individual instances of under-age users holding accounts. The maximum civil penalty is up to 150,000 penalty units (currently AUD $49.5 million).
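- As an indicative calculation (assuming the current Commonwealth penalty unit value of AUD $330), 150,000 penalty units equates to 150,000 × $330 = AUD $49.5 million; the dollar figure will change as the penalty unit value is adjusted.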
- Industry Codes and Standards (Part 9, Division 7): The Act establishes a co-regulatory scheme requiring eight sections of the online industry to develop enforceable codes to regulate illegal and restricted content.
- After industry-drafted codes for 'Relevant Electronic Services' (RES) and 'Designated Internet Services' (DIS) were rejected because the Commissioner was not satisfied they provided "appropriate community safeguards", the Commissioner developed two mandatory industry standards, which came into effect on 22 December 2024.
- These standards require providers to conduct a risk assessment by 21 June 2025 to determine the risk of Class 1 material on their service, with obligations tiered based on the assessed risk. The eSafety Commissioner has indicated an initial focus on awareness and education, with no enforcement action to be taken in the first six months except in cases of serious or deliberate non-compliance.
- The Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth) also increased the penalty for non-compliance with industry codes and standards to 30,000 penalty units (up to AUD $49.5 million for corporations).
- Basic Online Safety Expectations (BOSE) (Part 4): The Act empowers the Minister to determine a set of expectations for providers. The BOSE include the expectation that providers will take reasonable steps to ensure users can use the service safely and to minimise harmful content. The eSafety Commissioner can require providers to report on their compliance.
- The Broadcasting Services Act originally contained the main regulatory framework for online content, particularly in its Schedules 5 and 7. These schedules established the original Online Content Scheme. Most of the functions previously under the BSA's Online Content Scheme are now covered by the more expansive and powerful schemes in the Online Safety Act.
- Section 231 of the Online Safety Act explicitly states that the new Act does not limit the operation of Schedule 8 to the Broadcasting Services Act. Schedule 8 specifically regulates online gambling promotional content provided during live sports broadcasts.
- The Online Safety Act continues to rely on the BSA for some definitions. For example, section 5 of the Online Safety Act defines "broadcasting service" and "on-demand program service" by referring to their meanings in the Broadcasting Services Act.
The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Cth) amended the Criminal Code Act 1995 (Cth) (the Criminal Code) to create specific offences for providers dealing with Abhorrent Violent Material (AVM). AVM is defined as material produced by a perpetrator (or their accomplice) that records or streams a terrorist act, murder, attempted murder, torture, rape, or kidnapping.
- Failure to Notify (s 474.33): An internet, content or hosting service provider commits an offence if it becomes aware of AVM relating to conduct in Australia and does not refer the details to the Australian Federal Police within a reasonable time. The penalty is up to 800 penalty units.
- Failure to Remove (s 474.34): A content service or hosting provider commits an offence if they do not ensure the expeditious removal of AVM from their service. The penalty for an individual is up to 3 years imprisonment or a fine of up to $2.1 million. For a corporation, the penalty is a fine of up to $10.5 million or 10% of the company's annual turnover.
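- These dollar amounts correspond to 10,000 and 50,000 penalty units respectively, calculated at the 2019 penalty unit value of AUD $210 (figures assumed here): 10,000 × $210 = $2.1 million for an individual; 50,000 × $210 = $10.5 million for a corporation.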
Regulatory & Policy Framework
Key instruments in the regulatory framework include the Basic Online Safety Expectations determination made under Part 4, the industry codes registered under Part 9, Division 7, and the mandatory industry standards for Relevant Electronic Services and Designated Internet Services.
Relevant Organisations
Inquiries & Consultations
- Statutory Review of the Online Safety Act 2021
- An independent statutory review of the Act, authored by Delia Rickard, was tabled in Parliament on 4 February 2025. The report made 67 recommendations to overhaul Australia's online safety laws. Key recommendations include:
- A new statutory 'Duty of Care': To introduce an overarching legal duty on service providers to take reasonable steps to prevent foreseeable online harms, shifting the onus onto platforms to be proactive. This would require risk identification, embedding safety by design, and transparency reporting. The Report recommends that eSafety be empowered to make mandatory codes about how to comply, but notes that compliance with a code would not create a safe harbour.
- Decoupling from the National Classification Scheme: This recognises that the classification rules for traditional media are not suited for the dynamic and user-generated nature of online content.
- Increased Penalties: Increasing maximum civil penalties, including a penalty for breach of the duty of care of up to the greater of 5% of global annual turnover or $50 million, and a penalty of up to $10 million for non-compliance with removal notices.
- Restructuring eSafety: Replacing the single eSafety Commissioner with a multi-member 'Online Safety Commission' to resemble agencies like the ACCC or ACMA.
- Domestic Presence: Recommending the Government consider requiring major online platforms to establish a domestic legal presence in Australia as a condition of operation.
Industry Materials
- Valeska Bloch et al, 'New industry standards for online safety: what service providers need to know', Allens (Web Page, 6 February 2025).
- Philip Catania et al, 'Social media use in Australia to be restricted for under 16s', Corrs Chambers Westgarth (Web Page, 13 January 2025).
- Michael Swinson, Luke Hawthorne and Whye Yen Tan, 'Report of the Statutory Review of the Online Safety Act 2021 released', King & Wood Mallesons (Web Page, 10 February 2025).
- Paul Kallenbach, Dean Levitan and Milashni Richardson, 'Online Safety Act 2021 – Statutory review released', MinterEllison (Web Page, 13 February 2025).
- Industry Codes and Standards:
- Industry bodies like the Communications Alliance have been developing new industry codes to comply with the Online Safety Act. Six of these codes, covering sectors such as social media and search engines, were registered in 2023. The older codes from 2005 and 2008 have been superseded by this new framework.
- Digital Rights Watch, Submission to the Office of the eSafety Commissioner, Draft Consolidated Industry Codes of Practice for the Online Industry (Class 1C and Class 2 Material) (22 November 2024).
- Digital Rights Watch, Submission to the Environment and Communications Legislation Committee, Online Safety Amendment (Social Media Minimum Age) Bill 2024 [Provisions] (22 November 2024).