Social media is a key communication tool for children to engage with each other and the outside world. Thousands of businesses use social media to provide a platform to interact with children and young people and to allow children to communicate with each other. The issue of cyber-bullying has long been a concern for children, parents and the operators of these social media platforms.
On 24 March 2015, the Federal Parliament of Australia passed the Enhancing Online Safety for Children Act 2015 (the Act), with support from all major Australian political parties.
The Act seeks to enhance online safety for children through the establishment of a Children’s e-Safety Commissioner (Commissioner) and the implementation of a complaints system to remove cyber-bullying material targeted at Australian children from social media sites, such as Facebook. Failure to comply with the Act can result in significant fines.
Social media is entrenched in modern life. Companies rely on it to attract, advertise to, retain and communicate with their clients and the world at large. At an individual level, it is a primary method of communication between people of all ages, and particularly between children and young people. Social media takes many forms: from ubiquitous applications like Facebook, Twitter and Instagram, to “in game” social forums that are a regular part of online gaming, to simple chat rooms, blogs and even websites that allow people to communicate with each other in “comments” sections.
The Act establishes the office of the Children’s e-Safety Commissioner to administer a complaints system, monitor compliance, and require organisations to remove social media posts containing cyber-bullying material. The Commissioner can also seek injunctions and levy fines.
The Act applies to any social media service – that is, any electronic service with a primary purpose of enabling social interaction between 2 or more end-users, where end-users can post material on the service.
The Commissioner is an independent statutory office within the Australian Communications and Media Authority (ACMA). Alastair MacGibbon has been announced as Australia’s first Commissioner. Mr MacGibbon was the founder of the Australian Federal Police’s High Tech Crime Centre.
The Commissioner’s primary role is to administer a complaints system for cyber-bullying material targeted at Australian children. In conjunction with this, the Commissioner also oversees the compliance of social media services with the Act’s basic online safety requirements (see below for more detail).
The Commissioner is also responsible for promoting online safety for children; coordinating activities of federal government departments, authorities and agencies in this regard; and reporting to the Minister for Communications (Minister) on children’s online safety issues.
The Act sets out basic online safety requirements, and there is an expectation that each social media service will comply with them. While the Commissioner can publish a statement of non-compliance on its website, non-compliance with these requirements is not otherwise enforceable.
A complaint can be made by or on behalf of an Australian child if the complainant believes cyber-bullying material targeted at an Australian child is accessible to, or has been delivered to, one or more end-users of a social media service.
“Cyber-bullying material targeted at an Australian child” is material that an ordinary reasonable person would conclude was likely intended to have the effect of seriously threatening, seriously intimidating, seriously harassing or seriously humiliating a particular Australian child, regardless of whether the particular child accessed the material.
A complaint to the Commissioner can only be made if a complaint has already been made to the relevant service, and evidence of that complaint must be provided to the Commissioner. The Commissioner will only proceed in cases where the service has not removed the offending material within 48 hours of receiving the original complaint.
The Commissioner has powers to investigate each complaint and conduct the investigation as it sees fit. Following investigation, the Commissioner can give a notice seeking removal of the material to the social media service on which it was posted (whether a tier 1 or tier 2 service) or to the end-user who posted it.
The difference between tier 1 and tier 2 is described further below.
If a tier 2 service or end-user who receives a notice fails to remove the relevant material within 48 hours, the Commissioner can take enforcement action. For a tier 2 service, that can include a fine of up to $17,000, an enforceable undertaking or an injunction. The Commissioner can also issue a formal warning and publish a statement of non-compliance on its website.
The key difference between tier 1 and 2 services is the fact that the Commissioner can take enforcement action against a tier 2 service, but not a tier 1 service.
A social media service can apply for tier 1 status, and that status will be granted if the Commissioner is satisfied the service meets the basic online safety requirements.
A social media service will only be determined to be a tier 2 service if the Minister makes a declaration to that effect by legislative instrument. This will only occur if it is a large social media service and the Commissioner makes a recommendation it should be categorised as such, or if the service requests tier 2 status.
The new legislation has important implications for both social media providers and corporations and institutions that work with children.
For companies that provide social media services (in any of their many forms), there are a range of matters that must be acted upon to ensure compliance with the Act.
For organisations that work with children, it will be important to understand what can be done to prevent cyber-bullying, in order to minimise harm to children.
The content of this publication is for reference purposes only. It is current at the date of publication. This content does not constitute legal advice and should not be relied upon as such. Legal advice about your specific circumstances should always be obtained before taking any action based on this publication.