Internet & online safety

Contributed by Luke Hannath and current to June 2025

Introduction

Media includes print (newspapers, magazines, newsletters); digital broadcast (radio, television and subscription television); and online via the internet.

The media plays a vital role in sharing information, shaping opinions, and connecting people. The internet has expanded the scope and reach of the media exponentially. Through the emergence of new technologies, and digital platforms, we live in a world where consumers are also creators and disseminators of media content. These advancements have brought with them both opportunities and challenges.

Although the law has often lagged behind the development of new media such as the internet, in general, internet 'publishers' face the same set of legal issues as traditional publishers. A publication on the internet may be an act of contempt, defamation, or a breach of copyright, for example (see Copyright; Defamation).

This section explains the legal and regulatory framework of online safety, in particular issues relating to image-based abuse, technology-facilitated abuse and cyberbullying, and provides guidance on preventing and reporting online harms.

The eSafety Commissioner

Australia’s scheme for regulating internet content is administered by the federal government. The eSafety Commissioner (eSafety) is Australia’s independent regulator for online safety.

The Online Safety Act 2021 (Cth) (the OSA) establishes the eSafety Commissioner, giving eSafety substantial powers to keep Australians safer online. Providers of certain kinds of online services captured under the OSA are also expected to adhere to a set of Basic Online Safety Expectations, set by the Government. These include taking reasonable steps to proactively minimise material or activity that is unlawful or harmful, ensuring that users can use a service in a safe manner, putting in place user-reporting mechanisms, clearly outlining providers’ terms of service and enforcing penalties for people who breach those terms.

eSafety, as a complaints-based regulator, also has investigation teams that investigate online content related to the cyberbullying of children, adult cyber abuse, image-based abuse (sharing, or threatening to share, intimate images without the consent of the person shown) and illegal and restricted content.

Part 9, Division 7 of the OSA provides for developing industry codes and standards: industry bodies or associations are to develop codes to regulate certain types of harmful online material, and eSafety may register and enforce those codes. Codes and standards provide safeguards for the prevention and removal of harmful content online.

Four investigation schemes

eSafety supports people experiencing online harms by administering four complaints and investigations schemes. These are:
  • the Adult Cyber Abuse Scheme;
  • the Cyberbullying Scheme;
  • the Image-Based Abuse Scheme; and
  • the Online Content Scheme.
These schemes compel the removal of abusive and harmful content, limiting the ability of perpetrators to continue their abuse, and enable eSafety to take enforcement action against those platforms and service providers who fail to comply with regulatory notices.

Note: Reports of cyberbullying/adult cyber abuse can be made to eSafety at: https://www.esafety.gov.au/report/forms.

Detailed information about eSafety’s investigations schemes can be found in guidance documents on eSafety’s website: Regulatory guidance | eSafety Commissioner.

Adult Cyber Abuse Scheme

Adult cyber abuse means online communication to or about a person who is 18 years or older, which is intended to cause them serious harm and is menacing, harassing or offensive in all the circumstances. Part 7 of the OSA establishes the adult cyber abuse scheme. ‘Adult cyber abuse’ is the term reserved for the most severely abusive material intended to cause serious psychological or physical harm. This would include material which sets out realistic threats, places people in real danger, is excessively malicious or is unrelenting. The scheme is not intended to regulate hurt feelings, purely reputational damage, bad online reviews, strong opinions or banter.

For eSafety to investigate, the material must target a specific Australian adult, not groups of people. Under the OSA, ‘an Australian adult’ means an individual aged 18 or older who ordinarily resides in Australia. eSafety cannot use its powers under the Adult Cyber Abuse Scheme to help an adult who does not ordinarily reside in Australia.

Sections 88, 89 and 90 of the OSA give eSafety the power to issue removal notices to the platform that provided the cyber abuse material, to people who have posted cyber abuse material online, or to a hosting service provider that has hosted cyber abuse material, for the material to be taken down.

Cyberbullying Scheme

Section 6 of the OSA defines cyberbullying material as online communication to or about an Australian child, where the material was intended to have a particular effect on an Australian child, and the material was likely to have the effect of seriously threatening, seriously intimidating, seriously harassing or seriously humiliating the Australian child. It can include posts, comments, emails, messages, memes, images and videos.

As with Adult Cyber Abuse, for the Cyberbullying scheme to investigate, the material must target a specific Australian child, not groups of children. When the OSA refers to ‘an Australian child’ it generally means any individual under 18 who ordinarily resides in Australia.

Section 6(4) of the OSA recognises that a person in a position of authority over a child (such as their parent, carer, teacher or employer) may need to send, post or share material that could upset the child. If this action is considered reasonable in the circumstances, it will not be treated by eSafety as cyberbullying. For example, if a teacher posts class exam results online or an employer emails a young person to notify them of their dismissal, neither of those materials would meet the definition of cyberbullying.

The OSA gives eSafety the power to give removal notices to online service providers (section 65), to a hosting service provider (section 66) and to people (end-users) who have posted, shared or sent cyberbullying material, requiring them to remove the material (section 70).

Adult cyber abuse and child cyberbullying: Intersection with other laws

There is no specific criminal offence of cyberbullying and/or adult cyber abuse, but by engaging in cyberbullying/adult cyber abuse behaviour, a person may commit several offences. In the Northern Territory, these offences include:
  • Using a telecommunications network with intention to commit a serious offence, under section 474.14 of the Criminal Code Act 1995 (Cth);
  • Using a carriage service to make a threat, under section 474.15 of the Criminal Code Act 1995 (Cth);
  • Using a carriage service for a hoax threat, under section 474.16 of the Criminal Code Act 1995 (Cth);
  • Using a carriage service to menace, harass or cause offence, under section 474.17 of the Criminal Code Act 1995 (Cth);
  • Using a carriage service for suicide related material, under section 474.29A of the Criminal Code Act 1995 (Cth);
  • Stalking, under section 189 of the Criminal Code Act 1983 (NT);
  • Defamation, under sections 203 and 204 of the Criminal Code Act 1983 (NT); and
  • Assault, under section 187 of the Criminal Code Act 1983 (NT), or at common law.

Image-Based Abuse Scheme

Part 6 of the OSA establishes the image-based abuse scheme. Image-based abuse (sometimes called ‘revenge porn’) means sharing online, or threatening to share, an intimate image without the consent of the person shown. This can include intimate images that have been digitally altered, like deepfakes.

Image-based abuse is generally intended to cause harm, distress, humiliation and embarrassment. This can be through making the images or videos visible to particular people such as a person’s workplace, family and/or friends, or the public using an online service, or by threatening to make them visible (often to control, coerce, ‘punish’ or blackmail the person targeted by the image-based abuse).

Under section 75 of the OSA, it is unlawful for a person to post online (or threaten to post online) an intimate image of a person without their consent, if either the person posting or threatening to post the intimate image, or the person depicted in the image, ordinarily resides in Australia.

eSafety has a range of formal compliance and enforcement options available when a person contravenes section 75. This includes issuing a service provider notification informing an online service provider that an intimate image has been shared without consent on its service, or giving a removal notice to the end-user, service provider or hosting service provider requiring them to take all reasonable steps to remove the material within 24 hours. A remedial direction can also be given to the end-user who posted or threatened to post the intimate images, requiring them to take specified action directed to ensuring the person does not contravene section 75 in the future, for example deleting all intimate images of the complainant that they have in their possession.

Section 21 of the OSA details that consent must be ‘express, voluntary and informed’, which means that the person understands what they are being asked and has not been tricked or forced into agreeing to their intimate image being shared.

It is against the law to share an intimate image of someone who is under the age of 18 or someone who cannot give express, voluntary and informed consent even if that person has said that they agree.

In the NT, it is a crime to distribute an intimate image of a person without their consent, or to threaten to distribute an intimate image. These offences are in Part VI, Division 7A of the Criminal Code Act 1983 (NT) (specifically sections 208AB and 208AC). Blackmail is also an offence in the NT under section 228 of the Criminal Code Act 1983 (NT).

If you, or someone you know, is experiencing image-based abuse, please see the eSafety Commissioner’s Image-Based Abuse webpage for information and guidance.

Online Content Scheme

The Online Content Scheme, established by Part 9 of the OSA, is designed to protect Australians, particularly children, from exposure to harmful material online. The Online Content Scheme allows members of the public to make complaints to eSafety about illegal or offensive content online, and eSafety will assess these complaints. The scheme also allows eSafety to issue notices to online service providers directing them to remove certain material (or access to material) from their service, or to ensure that access to certain types of material is restricted. It also provides for the development of industry codes and standards that relate to illegal and restricted online content.

The OSA defines illegal or restricted content as either 'class 1 material' or 'class 2 material', defined by reference to Australia's National Classification Scheme, a cooperative arrangement between the Australian Government and state and territory governments for the classification of films, computer games and certain publications.

Class 1 material includes that which is or would be refused classification under the National Classification Scheme as it:
  • depicts, expresses or otherwise deals with matters of sex, drug misuse or addiction, crime, cruelty, violence or revolting or abhorrent phenomena in such a way that they offend against the standards of morality, decency and propriety generally accepted by reasonable adults;
  • describes or depicts in a way that is likely to cause offence to a reasonable adult, a person who is, or appears to be, a child under 18; or
  • promotes, incites or instructs in matters of crime or violence.
Class 2 material is material that is, or would likely be, classified as either:
  • X18+; or
  • R18+
under the National Classification Scheme, because it is considered inappropriate for general public access and/or for children and young people under 18 years old.

Where class 1 or class 2 material is reported, eSafety has several regulatory options, including giving a service provider notification, which is a written notice that informs an online service provider that eSafety is aware of illegal or restricted online content on its service. eSafety can also give a removal notice, requiring the recipient to take all reasonable steps to remove class 1 material or class 2A material from a service within 24 hours or a longer timeframe specified by eSafety, or a link deletion notice, requiring the recipient to stop providing a link that gives Australian service users access to class 1 material within 24 hours (or longer as determined).

If eSafety considers particular material to be of a sufficiently serious nature to warrant referral to a law enforcement agency, eSafety must notify a member of an Australian police force. Sufficiently serious online material will ordinarily include material that:
  • depicts or describes child sexual exploitation;
  • advocates a terrorist act; or
  • promotes, incites or instructs in matters of crime.
eSafety has Memorandums of Understanding in place with the Australian Federal Police, and all State and Territory law enforcement agencies, to enable the fast referral of sufficiently serious material.

Abhorrent violent conduct material

Interaction with the Criminal Code Act 1995 (Cth)

There are several offences aimed at reducing the incidence of online platforms being misused by perpetrators of violence, which include:
  • Failure to report: an offence for internet service providers, hosting and content providers that fail to notify the Australian Federal Police within a reasonable time about material relating to abhorrent violent conduct occurring in Australia (section 474.33 of the Criminal Code Act 1995 (Cth)); and
  • Failure to remove: offences for content service and hosting services that fail to remove access to abhorrent violent material expeditiously where that material is reasonably capable of being accessed within Australia (section 474.34 of the Criminal Code Act 1995 (Cth)).

Powers of the eSafety Commissioner

The OSA includes a number of powers which allow eSafety to request or require an internet service provider to block material that promotes, incites, instructs in or depicts ‘abhorrent violent conduct’.

Abhorrent violent conduct occurs when a person: engages in a violent terrorist act; murders another person; attempts to murder another person; tortures another person; rapes another person; or kidnaps another person using violence or the threat of violence. Abhorrent violent conduct is defined in section 474.32 of the Criminal Code Act 1995 (Cth).

eSafety may give notices relating to abhorrent violent material under the Criminal Code Act 1995 (Cth). These are not removal notices, but the purpose is to make certain online service providers aware of abhorrent violent material on or hosted by their services. If a service is later prosecuted for failing to remove or cease hosting abhorrent violent material, the notice creates a presumption that the service was reckless as to whether the material was abhorrent violent material and could be accessed using their service.

These powers protect the Australian community by seeking to prevent the viral, rapid and widespread distribution online of terrorist and extreme violent material, such as the video created by the perpetrator of the March 2019 Christchurch terrorist attack.

It is intended that eSafety can issue blocking requests or blocking notices in situations where an online crisis event has been declared by eSafety.

Any blocking direction made under the Act would only be in place for a limited time, to be determined on a case-by-case basis. Following the initial blocking period, eSafety can take further action to address the relevant material, in consultation with the ISPs and affected websites.

Basic Online Safety Expectations

The Basic Online Safety Expectations, known as ‘the Expectations’, are a key element of the OSA, and are established under Part 4 of the OSA. The Expectations are legislated through the Online Safety (Basic Online Safety Expectations) Determination 2022 (Cth).

The Expectations set out the Australian Government’s expectations of the steps that should be taken by providers of social media services, messaging services, gaming services, file sharing services, apps and certain other sites accessible from Australia to keep Australians safe online.

The Basic Online Safety Expectations are another systemic tool designed to improve online service providers’ safety practices, transparency and accountability. They set out the Australian Government’s non-binding expectations that certain service providers will, among other things, take reasonable steps to minimise certain material, including Class 1 material and material that depicts abhorrent violent conduct.

The OSA provides the eSafety Commissioner with the power to give enforceable notices to service providers requiring them to report on the steps they are taking to comply with the Basic Online Safety Expectations. Information obtained from the reporting notices is published in transparency summaries where appropriate. These reports are available on eSafety’s website: Responses to transparency notices | eSafety Commissioner.

Industry Codes and Standards for Illegal and Restricted Content

The OSA also provides for the development of industry codes to protect Australians from illegal and restricted online content by setting obligations across eight online industry sectors to deal with this material at a systemic level.

These industry codes and standards contain measures to address class 1 (inclusive of ‘class 1A’ and ‘class 1B’) online material. These classes cover the most seriously harmful online content, such as child sexual exploitation material and pro-terror material. The Phase 1 Codes require providers of certain online services to take proactive steps to reduce the availability of this seriously harmful content.

Industry associations representing these sectors have the opportunity to draft the codes and submit them to the eSafety Commissioner for consideration as part of this co-regulatory approach. If the codes provide appropriate community safeguards, the eSafety Commissioner may register them, and they become enforceable. If not, the eSafety Commissioner may determine enforceable standards instead.

Industry codes

As at June 2025, six industry-drafted codes and an overarching set of head terms have been registered and are in effect, with obligations for social media services, search engines, app stores, hosting providers, equipment providers and internet service providers (ISPs) to address Class 1 material. Examples of measures in the codes include:

  • Using systems, processes and/or technologies to detect and remove certain types of pro-terror material.
  • Implementing systems, processes, and technologies that enable the provider to take appropriate enforcement action against end-users who breach policies prohibiting pro-terror material.
  • Providing tools which enable Australian end-users to report, flag, and/or make a complaint about pro-terror material accessible on the service.

Industry standards

As at June 2025, the eSafety Commissioner has determined two standards, for relevant electronic services (such as messaging and gaming services) and designated internet services (including some generative AI services, and online file and photo storage services).

Complaints and reports

eSafety provides a safety net for Australians harmed by serious online abuse or exposed to illegal and restricted content. eSafety has legal powers to protect Australians across most online services and platforms, via its complaints and reporting schemes.

eSafety investigates complaints and helps to stop and remove serious online abuse, as well as illegal and restricted online content.

Where appropriate, eSafety’s investigators work with the online industry to resolve individual complaints. Where a collaborative approach is not appropriate or sufficient to protect Australians, eSafety draws on its robust range of regulatory options.

You can also report online crimes to the police on the ReportCyber website.

What you can complain about

Anyone who is affected can report and complain about online content that they perceive to be cyberbullying, adult cyber abuse, image-based abuse or illegal and restricted content under the schemes described above.

The harmful content could be a post, comment, text, message, chat, livestream, meme, image, video or email. It can be sent or shared via an online or electronic service or platform, including a:
  • social media service
  • email service
  • chat app
  • interactive online game
  • forum
  • website.
Further information about how to lodge a report or complaint and which steps to take first can be found on the eSafety Commissioner’s website: https://www.esafety.gov.au/report/what-you-can-report-to-esafety

When a complaint is received

Valid complaints will be assigned to an investigator for review. The investigator will contact the reporter using the contact information supplied. The reporter may be asked to supply more information or material. The reporter will also be notified if eSafety decides not to investigate the report, or if the report does not meet the threshold required to take action.

eSafety investigates reports of illegal and restricted online content as quickly as possible but prioritises reports about seriously harmful content, such as images and videos showing the sexual abuse of children.

Possible outcomes of an investigation

Cyberbullying: Removal of harmful content, issuing a notice requiring the person responsible to refrain from further cyberbullying and/or apologise, issuing fines or penalties for services or platforms that don’t remove content, further legal action.

Adult cyber abuse: Removal of harmful content, fines or penalties for services or platforms that don’t remove content, fines or penalties for the person responsible if they don’t remove the content, further legal action.

Image-based abuse: Removal of intimate images and videos, fines, penalties or other regulatory action against the person responsible.

Illegal and restricted content: Removal of illegal content, removal or restriction of access to content that is inappropriate for children, referral of content to law enforcement agencies for further investigation.

In serious cases the Australian Federal Police or the relevant overseas law enforcement agencies may become involved.

eSafety can refer material it considers to be of a sufficiently serious nature to an Australian police force for investigation.

eSafety regulatory powers and enforcement

Regulatory powers

eSafety has a number of regulatory tools available to it under the OSA, which allow it to restrict or remove access to harmful material in Australia. These powers include:
  • Removal notices for service providers: requiring the provider of an online service to remove or take all reasonable steps to remove or stop hosting online material that meets the criteria for child cyberbullying, image-based abuse, adult cyber abuse or illegal and restricted content within 24 hours (or longer as directed).
  • Removal notices for users of a service (‘end-users’): requiring the end-user to take all reasonable steps to remove online material that meets the criteria for image-based abuse and adult cyber abuse within 24 hours (or longer as directed).
  • End-user notices: requiring an end-user who is sharing cyberbullying material targeting a child to take specific steps including removing the material, refraining from sharing further cyberbullying material and apologising to the child.
  • Link deletion notices: requiring a provider of an internet search engine service to stop providing a link to class 1 material within 24 hours (or longer as directed).
  • App removal notices: requiring a provider of an app distribution service to stop enabling end-users in Australia to download from the service an app facilitating the sharing of class 1 material within 24 hours (or longer as directed).
  • Blocking notices: requiring an internet service provider to take steps to disable access to abhorrent violent conduct material.
  • Remedial notices: a written notice requiring the recipient to take all reasonable steps to remove restricted content from a service, or place it behind a restricted access system, within 24 hours or a longer timeframe as specified by eSafety.

Enforcement

Enforcement action is available to eSafety in a number of circumstances under the OSA. These range from informal to formal action or seeking civil penalties in court. eSafety’s Compliance and Enforcement Policy is available here: Compliance and Enforcement Policy.pdf (esafety.gov.au). It outlines that eSafety takes a graduated approach, where appropriate, to compliance and enforcement that strives to balance the protection of Australians with ensuring no undue burden is imposed on online service providers and individuals. The types of decisions that eSafety makes in exercising its compliance and enforcement functions include:
  • whether it is appropriate or desirable to exercise its discretion to take no action;
  • whether to commence an investigation;
  • whether compliance or enforcement action is appropriate in the circumstances;
  • what is the most effective way to facilitate the removal of harmful material;
  • whether to direct regulatory action towards an individual responsible for harmful material or conduct;
  • what, if any, investigative and/or information gathering powers should be used and how; and
  • whether extending the time for compliance with a notice, direction or similar action under the OSA (for example, the 24-hour period to comply with a removal notice) is appropriate.
The following are the types of enforcement action that eSafety may take, following an alleged breach of the OSA:
  • Formal warning: places an end-user or online service provider on notice where they have breached a civil penalty provision or otherwise failed to comply with certain provisions under the OSA.
  • Infringement Notice: sets out the particulars of an alleged contravention of the OSA and specifies a penalty that can be paid in lieu of further action being taken.
  • Enforceable undertaking: a formal promise to act, or refrain from acting, in a particular manner to ensure compliance with the OSA. Once eSafety accepts an undertaking, it becomes enforceable by a court.
  • Injunction: an order made by a Court compelling or restraining specific action by a person. The eSafety Commissioner has statutory power to apply to the Federal Court of Australia seeking an injunction pursuant to s 165 of the OSA.
  • Civil penalty proceedings: a court order requiring a person who is found to have contravened a civil penalty provision of the OSA to pay the Commonwealth a penalty. A civil penalty order is the most serious enforcement option available to eSafety. Generally, a civil penalty order will be sought by eSafety where the person has caused significant harm, has engaged in multiple contraventions or other compliance and enforcement options have been ineffective.

Contemporary issues

Social media age restrictions

The Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth) introduces a mandatory minimum age of 16 for accounts on certain social media platforms, forming one part of a broader strategy to create safer digital spaces for everyone.

The change aims to strengthen existing measures for protecting young users, especially where there are particular risks associated with accessing potentially harmful social media content and features such as persistent notifications and alerts that have been found to have a negative impact on sleep, stress levels and attention.

The onus is on the applicable service providers to introduce systems and processes that ensure people under the minimum age cannot create or keep a social media account. This means there will be no penalties for age-restricted users who gain access to an age-restricted social media platform, or for their parents or carers.

Service providers that don’t comply with the new minimum age will face civil penalties, and the penalties for breaches of industry codes or standards have been increased to ensure providers take their responsibilities under the OSA seriously.

Technology facilitated abuse (TFA) and domestic violence

Family, domestic and sexual violence is a pressing national policy priority within Australia. Tech-based abuse (sometimes known as ‘technology-facilitated abuse’) is behaviour that uses an online space or digital technology to threaten, intimidate, bully, harass, humiliate or coerce someone.

Women are likely to experience tech-based abuse more frequently than men, and to have to deal with it over longer periods of time. The abuse is often more severe in nature and more damaging in its psychological impact, leaving women twice as likely (26.3%) to fear for their safety compared to men (12.6%).

Digital technology use has provided an opportunity for perpetrators to obtain the power to exert greater control, and to monitor, stalk and harass victim-survivors beyond the physical space (Duerksen and Woodin 2019; Harris 2018; Harris and Woodlock 2022). This often includes tech-based coercive control, as well as cyberstalking and image-based abuse. Perpetrators of gender violence TFA can use a range of digital technologies, including mobile phones, social media services, global positioning system (GPS) tracking devices and online accounts (such as email), to control, abuse, track and intimidate victim-survivors.

Examples include:
  • Controlling your online communication
  • Restricting or controlling your access to devices and online accounts
  • Financially abusing you using technology
  • Harassing or threatening you online or with a digital device
  • Sharing or threatening to share an intimate image or video of you online without your consent, also known as ‘image-based abuse’ or ‘revenge porn’
Section 474.17A of the Criminal Code Act 1995 (Cth) makes it unlawful to use the internet to transmit sexual material without consent, with a penalty of 6 years imprisonment.

Tech-based coercive control

Tech-based coercive control is almost always a factor in family and domestic violence. It can be part of sexual violence. Coercive control is an ongoing pattern of behaviour used to control another person through manipulation, pressure and fear.

Tech-based coercive control can be used to:
  • undermine your self-worth, confidence and independence
  • cut you off from social supports such as friends, family, services and money
  • pressure or threaten you to make you do things, or stop doing things
  • track where you are going and what you are doing
  • ‘gaslight’ you to make you unsure about what is real
  • isolate you so you feel trapped and unable to leave the relationship.
People who are at greater risk of experiencing technology-facilitated abuse as part of family, domestic and sexual violence include: women and girls, Aboriginal and Torres Strait Islander women, women from culturally and linguistically diverse backgrounds, women with disability, LGBTQI+ people and women in rural areas.

If you, or someone you know, is experiencing tech-based domestic, family or sexual violence, you can find steps to help on the eSafety website.

If you think your phone or device is being tracked or monitored, it may be best to make contact from a trusted person’s device instead. If an abusive person learns that you are seeking help and information, their behaviour may get worse, so it’s a good idea to ask a support worker to help you.

eSafety's online safety checklist can help if you're experiencing tech-based domestic, family or sexual violence. Follow eSafety's steps for reporting online abuse if it's safe to do so. It is usually best to make a safety plan first.

Find out how to access family and domestic violence leave. All employees are entitled to 10 days of paid family and domestic violence leave each year. This includes full-time, part-time and casual employees experiencing ‘tech abuse’.

Cyberstalking

Cyberstalking is when a person uses digital technology to keep constant track of you online in a way that makes you feel uncomfortable, worried or threatened. Cyberstalking behaviour can include:
  • Constantly checking in on someone and trying to get their attention, even when they make it clear that they are not interested;
  • Making repeated unwanted contact with someone by calling, emailing, texting, messaging, or asking inappropriate questions;
  • Repeatedly sending, posting or sharing unwanted sexual requests, sexual or offensive content, abusive comments or false accusations to or about someone;
  • Monitoring someone’s movements using location technologies that are built into the operating systems of phones and fitness apps or using tracking devices or spyware;
  • Following or contacting someone across multiple online accounts and making it known they cannot hide;
  • Accessing or hacking someone’s online account;
  • ‘Gaslighting’ a person by changing their environment in small ways that are difficult to prove to others, such as using remotes to turn internet-connected devices on and off within their home.
Section 474.17 of the Criminal Code Act 1995 (Cth) makes it unlawful for a person to use the internet in a way that a reasonable person would regard as being, in all the circumstances, menacing, harassing or offensive. A person could face a maximum penalty of 5 years imprisonment.

Section 189 of the Criminal Code Act 1983 (NT) also makes it unlawful for a person to stalk another (including cyberstalking), with a maximum penalty of 2 years' imprisonment.

Sexual extortion

Sexual extortion or ‘sextortion’ is a form of blackmail where someone threatens to share a nude or sexual image or video of another person unless that person gives in to their demands. If you’re under 18, the best way to get help is to report it to the Australian Centre to Counter Child Exploitation (ACCCE). If you’re 18 years or older, report it to any platforms or services where the blackmailer contacted you. If your intimate image or video is shared, you can report it to eSafety.

Sexual extortion can be perpetrated by individuals, usually for financial profit, sexual gratification or, in the case of domestic and family violence, to exercise control over the victim. Sexual extortion is also used by organised crime syndicates to obtain money and/or sexual content that can be sold or bartered.

In the NT, section 205 of the Criminal Code Act 1983 (NT) makes it unlawful for any person to publish, directly or indirectly, or threaten to publish, defamatory material concerning another person with the intent to extort any property from that person or another, or with the intent to induce any person to give, confer, procure or attempt to procure any property or benefit. A person liable under section 205 faces a maximum penalty of 3 years' imprisonment.

Doxing

Doxing is an abbreviation for ‘dropping documents’. This is when a person shares or publishes someone’s personal details online, such as their home address, email address or phone number, without their consent.

The information that is doxed may be sourced through publicly available information, research of public records or through unauthorised access to private databases and computer systems (hacking).

Unlike defamation, doxing does not have to reveal something untrue or damaging about an individual; the information is usually accurate, whether or not it has been sourced lawfully.

Section 276G of the Criminal Code Act 1983 (NT) makes it an offence to unlawfully obtain confidential information from any register, document, computer or other repository of information with intent to cause loss to a person, with intent to publish it to a person who is not lawfully entitled to have or to receive it, or with intent to use it to obtain a benefit or advantage. The maximum penalty for this offence is 3 years' imprisonment.

In 2024, the Australian Government passed the Privacy and Other Legislation Amendment Act 2024 (Cth). It introduces two new ‘doxing’ criminal offences to the Criminal Code Act 1995 (Cth):
  • An offence under section 474.17C of the Criminal Code where someone makes available another person’s personal data in a way that a reasonable person would regard as being menacing or harassing. The maximum penalty for this offence is 6 years' imprisonment.
  • An offence under section 474.17D of the Criminal Code where someone makes available the personal data of one or more members of a group, and does so because of a belief that the group is distinguished by certain attributes such as race, religion, nationality or sexuality. The conduct must also be such that a reasonable person would regard it as menacing or harassing. The maximum penalty for this offence is 7 years' imprisonment.
