How to use an ethical framework for AI

October 1, 2025

Whether you’re a dedicated AI advocate or a confirmed AI sceptic, you need to tread carefully. Because while AI can introduce a lot of good to your business, it can also bring a lot of risk.

It’s safe to say AI is not a fad. It’s here to stay and it will be an increasingly important part of how we work and play. You only have to look at the investments going into AI – and where that investment is coming from – for proof it’s not going anywhere.


As business owners, we’ve got a decision to make about AI. It’s not if, but when, we use it. Some of you will already be experimenting with it, some of you may have it embedded into your business across a few different areas. Others may have steered well clear up until now.

Wherever you are on your adoption curve, one thing needs to be kept in mind above everything else: security, safety and the ethical use of AI across the business. For this, you need to create a framework that enables everyone to use AI responsibly and ethically.

Because just as AI and the ways it can be used are constantly evolving, so too are the security implications, loopholes and illegal uses. And you need to sense-check AI’s biases, too.

Implementing AI safely and securely is essential for any business – and while you can retrofit some processes and protocols, it’s better to consider it all when you’re right at the beginning. Here’s how you can build an ethical AI framework in your business.

Many of us have played around with AI in our personal lives – we’ve signed up for a ChatGPT account or some other AI tool. Which is great in terms of familiarising ourselves with what’s possible. What you want to avoid, however, is different people within the business bringing different AIs to work and uploading company information. For one, it’s bad business practice. For another, you’ve got no visibility or control over what information is going into a platform. No AI should be used by your people unless you’ve authorised it and it’s a company account that you have access to. Otherwise, you risk sensitive information being uploaded to who knows where – and there’s no taking it back.

On many ‘free’ iterations of AI platforms, the small print will read something like ‘your interactions will be used to train our AI models’. In other words, the platform will store everything you upload and use that information as it wishes. Which, on some level, is fine if it’s a generic interaction you’re having. If it’s more sensitive information – customer details or business strategy, for example – you need to be more careful.

The likes of ChatGPT and Gemini are what’s called genAI – generative AI – and as such, they learn and develop. Unlike Excel, where the same formula gives you the same answer today as it did yesterday and will in 10 years, genAI offers no such guarantee. You’re not guaranteed to get the same answer to the same query today and tomorrow. In fact, it’s unlikely you will.

There is a real opportunity for AI to help every day – from regulations to manuals, knowledge can be uploaded that you and your team can access, ask questions of and get guidance on. However, it’s vitally important to remember that any AI model can hallucinate, get things wrong or interpret something incorrectly. And, ultimately, you’re responsible. So never take the AI’s answers as gospel.

When you’re implementing AI, having the right guardrails in place is important. What’s it going to be used for, and what are you going to put into it? What’s your process for checks? If you’re using it to assess CVs when interviewing new recruits, what have you done to ensure there’s no inherent bias in the candidates the AI is selecting?

In September 2024, the Australian Government published a set of voluntary guardrails it advises businesses to work to:

1. Establish, implement and publish an accountability process including governance, internal capability and a strategy for regulatory compliance. Guardrail one creates the foundation for your organisation’s use of AI. Set up the required accountability processes to guide your organisation’s safe and responsible use of AI, including: an overall owner for AI use; an AI strategy; and any training your organisation will need.
2. Establish and implement a risk management process to identify and mitigate risks. Set up a risk management process that assesses the AI impact and risk based on how you use the AI system. Begin with the full range of potential harms, drawing on information from a stakeholder impact assessment (guardrail 10). You must complete risk assessments on an ongoing basis to ensure the risk mitigations are effective.
3. Protect AI systems, and implement data governance measures to manage data quality and provenance. You must have appropriate data governance, privacy and cyber security measures in place to protect AI systems. These will differ depending on use case and risk profile, but organisations must account for the unique characteristics of AI systems such as: data quality; data provenance; and cyber vulnerabilities.
4. Test AI models and systems to evaluate model performance and monitor the system once deployed. Thoroughly test AI systems and AI models before deployment, then monitor them for potential behaviour changes or unintended consequences. You should perform these tests according to clearly defined acceptance criteria that consider your risk and impact assessment.
5. Enable human control or intervention in an AI system to achieve meaningful human oversight across the lifecycle. It is critical to enable human control or intervention mechanisms as needed across the AI system lifecycle. AI systems are generally made up of multiple components supplied by different parties in the supply chain. Meaningful human oversight will let you intervene if you need to and reduce the potential for unintended consequences and harms.
6. Inform end users regarding AI-enabled decisions, interactions with AI and AI-generated content. Create trust with users. Give people, society and other organisations confidence that you are using AI safely and responsibly. Disclose when you use AI, its role and when you are generating content using AI. Disclosure can occur in many ways. It is up to the organisation to identify the most appropriate mechanism based on the use case, stakeholders and technology used.
7. Establish processes for people impacted by AI systems to challenge use or outcomes. Organisations must provide processes for users, organisations, people and society impacted by AI systems to challenge how they are using AI and to contest decisions, outcomes or interactions that involve AI.
8. Be transparent with other organisations across the AI supply chain about data, models and systems to help them effectively address risks. Organisations must provide information to other organisations across the AI supply chain so they can understand: the components used, including data, models and systems; how it was built; and how to manage the risk of using the AI system.
9. Keep and maintain records to allow third parties to assess compliance with guardrails. Organisations must maintain records to show that they have adopted and are complying with the guardrails. This includes maintaining an AI inventory and consistent AI system documentation.
10. Engage your stakeholders and evaluate their needs and circumstances, with a focus on safety, diversity, inclusion and fairness. It is critical for organisations to identify and engage with stakeholders over the life of the AI system. This helps organisations identify potential harms and understand if there are any potential or real unintended consequences from the use of AI. Deployers must identify potential bias, minimise negative effects of unwanted bias, ensure accessibility and remove ethical prejudices from the AI solution or component.

From industry.gov.au
