The Online Safety Act 2021 (OSA) introduced significant regulatory standards that impact a broad spectrum of businesses and digital services. With mandatory compliance measures commencing on 22 December 2024, it’s crucial for businesses to prepare now.
New online safety measures require organisations to take reasonable steps to proactively minimise unlawful or harmful activity occurring on their services and to meet the regulator's Basic Online Safety Expectations. Boards should ask management how their risk registers and related controls will be affected.
Australia’s online safety framework
The OSA is supported by a framework of industry codes and standards for eight segments of Australia’s online industry. These include social media, Internet Service Providers (ISPs), app stores, equipment providers, search engines, hosting services, and the categories of “Relevant Electronic Services” (RES) and “Designated Internet Services” (DIS).
While industry codes applying to ‘Class 1’ content are already in place for the initial six segments, mandatory regulatory standards for ‘Class 1 content’ for RES and DIS commence on 22 December 2024. The RES and DIS categories include a wide range of ‘everyday’ businesses that may be unaware of their online safety obligations.
Industry codes for ‘Class 2’ content, applicable to all segments, are currently being drafted and will be presented to the eSafety Commissioner in December 2024.
Class 1 and Class 2 content
Class 1 content includes child abuse imagery, pro-terror content and other content that would be refused classification in Australia. Class 2 content includes material rated R18+ or X18+, and other content deemed harmful or unsuitable for Australian children.
It is important to note that the OSA not only regulates sites providing this content, but also imposes preventative compliance and risk assessment obligations on a wide range of businesses with a digital presence. Many websites that offer features such as customer reviews, customer chat or virtual product ‘try on’ options will have to demonstrate how they manage and mitigate online safety risks.
Businesses covered by the RES and DIS Standards
Some of the features and services covered by the Standards include:
- Relevant Electronic Services
- Designated Internet Services
The Basic Online Safety Expectations
In addition to the Codes and Standards, the Commonwealth government introduced a set of Basic Online Safety Expectations (BOSE) (currently in force) applying to social media services, RES and DIS. The definition of ‘social media service’ is broad, capturing product review sites and comment functions on some websites, as well as platforms like Facebook or TikTok.
The BOSE require organisations to take reasonable steps to:
- ensure users can use digital services safely
- proactively minimise unlawful or harmful activities occurring on their services
- consult with the eSafety Commissioner and adhere to guidance on reasonable steps to ensure safe use
The BOSE also require organisations leveraging generative AI tools to consider user safety and to incorporate safety features in design and implementation.
Questions all directors should ask management
- Have we allocated responsibility and accountability for online safety compliance?
- Have we assessed how the OSA applies to our technology stack and our website?
- Can management provide the Board with a comprehensive overview of our digital services and their categorisation under the OSA?
- Have we assessed whether we comply with the OSA Standards for relevant electronic services and designated internet services commencing in December 2024?
- Are we meeting all Basic Online Safety Expectations applicable to our organisation?
- Is online safety included in our risk register?
- Have we designated accountability and responsibility to implement, measure and monitor proactive controls to minimise online safety risks?
By proactively addressing these considerations, organisations will be well placed to meet the upcoming regulatory requirements and to foster a safer online environment for customers, staff and stakeholders.
More about Carolyn Hough GAICD
Carolyn Hough GAICD is the CEO and Founder of Policy Australia, a boutique consultancy firm focused on integrated policy, public affairs, compliance and strategy advice. She is an expert adviser on digital transformation and online safety, and a member of the Steering Committee for the Commonwealth Attorney-General’s copyright and AI reference group.