Summary
- New NSW WHS laws require your business to assess and manage psychosocial risks created by AI and digital work systems, including workload, monitoring and decision-making tools.
- You remain responsible for risks caused by these systems, even if a third-party provider supplies them, and regulators expect active risk management beyond policies.
- Failure to comply can lead to significant penalties, compensation claims and reputational damage, particularly as enforcement in this area increases.
- This guide explains how NSW WHS laws apply to AI and workplace stress for Australian businesses and outlines practical compliance steps.
- LegalVision, a commercial law firm that specialises in advising clients on employment law and workplace safety, provides this practical overview.
Tips for Businesses
Identify any digital systems that allocate work or monitor performance and assess how they affect workload and employee wellbeing. Speak regularly with staff, adjust systems that create pressure or reduce autonomy, and document your actions. Focus on practical changes to how work is performed rather than relying solely on policies or training.
AI and digital systems can now be workplace hazards under Australian law. NSW businesses must assess whether the technology they use to manage work creates psychological harm, such as stress, unreasonable workloads or discriminatory outcomes. This article outlines your legal obligations and the practical steps you should take to manage risk.
NSW Expands WHS Duties to Cover AI and Digital Systems
The Work Health and Safety Amendment (Digital Work Systems) Act 2026 changes how NSW businesses must approach workplace technology. The law makes it clear that AI, algorithms and automated systems can create workplace risks you must manage.
If an algorithm creates impossible workloads, you remain legally responsible regardless of whether you designed the software yourself or purchased it from a vendor.
National Psychosocial Hazard Regulations
Victoria joined the rest of Australia in implementing psychosocial hazard regulations in December 2025. This means all Australian businesses must actively manage work-related mental health risks.
SafeWork NSW recently issued a prohibition notice to pause a major restructure at the University of Technology Sydney due to “serious and imminent risk of psychological harm” from inadequate consultation, as reported by Employment Hero.
James True, who heads the Employment Law team at LegalVision, says the regulatory focus has shifted: “Historically, businesses didn’t pay much attention to their safety obligations unless they were in a safety-critical industry, maybe construction, mining, or oil and gas. Now there have been legislative changes which have brought psychosocial hazards expressly into the hazards that a business must control.”
Written Policies Are Not Enough
You cannot satisfy your WHS obligations simply by drafting policies or running training sessions. Regulators now expect structural changes to how work is performed.
“Telling people to be safe is not enough,” True explains. “Whereas historically a business might have said, ‘Look, we’ve got a WHS policy, we’ve got grievance policies’, that approach really isn’t viable anymore. Employers are now required to make changes to the way that work is performed.”
SafeWork NSW’s AI WHS Scorecard identifies three common design flaws:

| Design Flaw | Description |
| --- | --- |
| Fragmented Design | Productivity tools that bombard workers with automated feedback without considering personal circumstances. |
| Low Job Control | Workflow systems that eliminate worker autonomy and professional judgment. |
| Machine-Paced Work | Systems that impose relentless rhythms without allowing breaks or recovery time. |
In most cases, the risk does not come from the technology itself, but from how you design and use it.
Surveillance and Management Practices Create Liability
Excessive monitoring can create psychosocial risks, even where it is introduced with good intentions. As True explains, “It can lead to things like unnecessary surveillance. Surveilling an employee in circumstances where you might be going a little bit further than you need to, that poses a risk to that employee’s health and safety, and those things can happen unintentionally when you start letting AI run riot in the business.”
These risks often emerge where your systems do not give managers clear visibility or control over workload. As True explains, “If you’ve got a system where managers are not really aware of people’s workload, and don’t have a meaningful way to interrogate it and change it, you might be exposing your employees to psychosocial hazards. If you don’t have anything structurally which looks at acknowledging or rewarding employees for the good work they’re doing, then you’re exposing yourself in relation to safety obligations.”
When managing performance, you must ensure your approach reflects current legal expectations. As True warns, “If your approach to feedback is rooted in 1950s philosophy, then it’s something the law is going to have limited sympathy for you on, and you’re going to expose your business. This is where regulators are interested in taking up these cases. They’re interested in prosecuting businesses for not keeping people safe at work.”
Financial Consequences of Non-Compliance
In 2024-25, the average cost of a psychological injury claim in NSW reached $288,542, according to Employment Hero’s analysis. These claims accounted for 38% of the total cost of all workers’ compensation claims despite representing only 12% of claim volumes.
Beyond formal penalties, non-compliance drives employee turnover. “You’ve got the risk of people leaving. You will find new generations of employees really value the environment they’re working in and have a limited tolerance to being subjected to anything other than a reasonable cultural environment, and won’t stay,” True says.
What You Should Do
Assess Your Digital Systems
You do not need complex systems or external consultants to start managing psychosocial risks. However, you must take practical steps to assess how your business uses digital systems and how they affect your workers.
True emphasises that your obligations are not absolute, but you must still act. “It’s important when it comes to safety obligations that they’re not absolute. You’ve ultimately got to take reasonable steps to keep people safe.”
Understand Employee Experience
You should take steps to understand how your employees are experiencing these systems in practice. As True explains, “Compliance can be as limited here as occasionally surveying your employee, taking an interest in your workers to ask how they are, giving them an opportunity to raise issues that they might be facing at work.”
Document and Demonstrate Compliance
Finally, you should document the steps you take, even if they are simple. This can help demonstrate that your business is actively managing risk. True also notes, “The critical thing for any small business is to just make sure you’re doing something at a minimum, because then you’ve always got an argument that you have actually taken some reasonable steps.”
For more information on how AI and digital systems impact workplace safety, see Employment Hero’s coverage of these regulatory changes.
As an employer, understand your essential employment obligations with this free LegalVision factsheet.
Key Takeaways
- NSW WHS laws require you to manage risks from AI and digital systems.
- You remain responsible where these tools create excessive workloads or unfair outcomes.
- Psychosocial hazards are regulated across Australia and apply to businesses of all sizes.
- You must go beyond policies and make practical changes to how work is designed and managed.
- Regular check-ins, reviewing systems and documenting your actions can help demonstrate compliance and reduce legal risk.
LegalVision provides ongoing legal support for businesses through our fixed-fee legal membership. Our experienced employment lawyers help businesses manage contracts, employment law, disputes, intellectual property, and more, with unlimited access to specialist lawyers for a fixed monthly fee. To learn more about LegalVision’s legal membership, call 1300 544 755 or visit our membership page.
Frequently Asked Questions
Do small businesses need to comply with these laws?
Yes. WHS obligations apply to all businesses, regardless of size. You must assess and manage psychosocial risks created by digital systems, including AI and automated tools, even if you operate a small business or use third-party software.
Are written policies and training enough to comply?
No. Policies alone are not sufficient. You must make practical changes to how work is designed and managed, including reviewing digital systems, monitoring workloads and addressing risks that may impact employee mental health.
What are psychosocial hazards?
Psychosocial hazards are factors in the workplace that can cause psychological harm. These include excessive workloads, poor management practices, lack of support, low job control and systems that create stress or unfair working conditions.
How can my business demonstrate compliance?
You can demonstrate compliance by taking reasonable steps to identify and manage risks. This includes reviewing your digital systems, speaking with employees about workload and stress, making adjustments where needed and documenting the actions you take to address psychosocial hazards.