Law Society of Scotland
Employment: AI – programmed for inequality?

Guidance is expected later this year from the Equality & Human Rights Commission on how to avoid the risk of artificial intelligence tools producing discriminatory outcomes. It has much to cover
16th May 2022 | Elouisa Crichton

Artificial intelligence (“AI”) is everywhere, and is now used so frequently that we may not even think about it. Algorithms, automated decision-making and machine-learning technologies are increasingly used by organisations to help make decisions about individuals. Alongside the many benefits, there are legal issues to be wary of. We focus on where AI may breach the Equality Act 2010 in the workplace and beyond. Data protection considerations are outside the scope of this article.

The recently released Commission on Race & Ethnic Disparities report (the “CRED report”) acknowledged that AI can contribute to racial inequality. The Government has announced that the Office for AI will publish a white paper later this year on regulating AI, covering issues such as how to address any potential racial bias in algorithmic decision-making, and the Equality & Human Rights Commission (“EHRC”) will provide guidance on how to apply the Equality Act to any AI decision-making. We expect this will go wider than race and consider all protected characteristics (age, sex, race, religion, sexual orientation, gender reassignment, marriage/civil partnership, pregnancy/maternity and disability).

Impact on employers

Bias and discrimination can unfortunately be found in AI tools, often stemming unintentionally from the humans who program them, or from inherent bias in the datasets used to “train” the technology. For example, in the work context, an automated recruitment system could discriminate if it:

  • favours one gender over another (for example, scoring language more typically used by male candidates more highly than language more commonly used by female candidates);
  • values length of service in past roles disproportionately over experience and skills (which could lead to age discrimination risks); or
  • does not recognise overseas qualifications on a par with those from the UK (potentially exposing an employer to race discrimination claims).
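One practical way to surface risks of this kind is a selection-rate audit: compare how often the tool shortlists candidates from different groups. The sketch below is illustrative only; the function name and numbers are invented, and the 0.8 threshold is the US EEOC "four-fifths" rule of thumb rather than a test under the Equality Act, which requires a broader analysis.

```python
# Hypothetical adverse-impact audit for an automated recruitment screen.
# All names and figures are illustrative, not from the article.

def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's.

    A ratio below 0.8 is often treated as a red flag under the US
    EEOC 'four-fifths' rule of thumb; Equality Act analysis is wider,
    but the ratio is a useful first screen for a tool's outcomes.
    """
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

# Example: the tool shortlists 30 of 100 female applicants
# and 45 of 100 male applicants.
ratio = adverse_impact_ratio(30, 100, 45, 100)
print(round(ratio, 2))  # 0.67 — below 0.8, so the tool warrants scrutiny
```

An audit like this does not prove discrimination, but a low ratio tells an employer where to look before a claim does.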

There are examples of individuals whose disability affects their ability to complete multiple choice tests satisfactorily, even though they can answer the same questions in free text. Any automated decision-making process that does not build in triggers for human checks and reasonable adjustments for disabled individuals could breach the Equality Act.
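The "trigger for human checks" idea can be sketched very simply: the automated screen should route a candidate to human review, rather than auto-reject, whenever an adjustment has been requested or the assessment format may disadvantage them. The field names and threshold below are hypothetical.

```python
# Minimal sketch (hypothetical names) of an automated screen with a
# human-review trigger, rather than automated rejection, for candidates
# who may need reasonable adjustments.

def screen(candidate):
    """Return 'advance', 'reject' or 'human_review' for one application."""
    # Trigger a human check whenever an adjustment is requested or the
    # candidate has flagged the test format as inaccessible.
    if candidate.get("adjustment_requested") or candidate.get("format_flagged"):
        return "human_review"
    # Otherwise apply the (illustrative) automated score threshold.
    return "advance" if candidate["score"] >= 70 else "reject"

print(screen({"score": 40, "adjustment_requested": True}))  # human_review
print(screen({"score": 85}))                                # advance
```

The point is structural: the automated path must have an exit to a human before a rejection is final.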

We know that diverse teams work well, but that does not always play out in recruitment decisions. AI may recommend a candidate whose selection surprises an organisation. This could leave recruiters wondering whether the AI tool has got it wrong, or whether it has instead shone a spotlight on bias in the human decision-making process that had gone unchecked until now.

In an employment setting, AI often has significant consequences for individuals and it is not uncommon to see claims arising from its use.

Impact on society

The Equality Act also applies beyond employment, covering education, the public sector and the provision of services to clients, customers and the public. AI’s influence can be more opaque when it is used in society, with the impact often being small for individuals but far-reaching when we zoom out.

For example:

  • AI can be used to filter applications for a range of services, from education courses to financial products.
  • US research highlighted the disparate impact of facial recognition technologies on women and certain racial groups, some having an error rate of more than 30% for darker-skinned women compared to less than 1% for lighter-skinned males.
  • Concerns have been raised about targeted online advertising for products, jobs and services: for example, AI tools resulting in mechanic jobs being advertised predominantly to male users, and nursery nurse jobs predominantly to female users.

Taken in isolation, an individual may not realise that a face recognition error has occurred, or that seeing (or not seeing) a particular advert is an example of discrimination; but across the population we can see the scope for far-reaching consequences, as access to opportunities and support is influenced in part by protected characteristics.
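The scale of that "zoomed out" effect is easy to put in figures. Using the error rates from the US research cited above (more than 30% versus less than 1%), and invented population figures purely for illustration:

```python
# Illustrative only: how per-person error rates scale across a population.
# Error rates echo the cited US research; the group sizes are invented.

def expected_misidentifications(population, error_rate):
    """Expected number of people misidentified at a given error rate."""
    return int(population * error_rate)

# Equal-sized groups of one million people each:
print(expected_misidentifications(1_000_000, 0.30))  # darker-skinned women
print(expected_misidentifications(1_000_000, 0.01))  # lighter-skinned men
```

On those assumptions, the same tool produces roughly 300,000 expected errors in one group and 10,000 in the other: invisible case by case, stark in aggregate.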

What next?

AI can be a mechanism for achieving more equitable decision-making and reducing bias. To ensure AI is a positive tool, we need big-picture safeguards in place. The European Network of Equality Bodies produced a report highlighting the role of equality bodies, such as the EHRC, in regulating and ensuring that AI tools produce non-discriminatory outcomes. The UK Government’s response to the CRED report stated that the EHRC will advise on safeguards and produce guidance to explain how to apply the Equality Act to algorithmic decision-making. The further guidance to be issued by the EHRC and Office for AI will hopefully assist employers in their use of AI, and also provide a safety net for society.

The Author

Elouisa Crichton, counsel, Dentons UK, Ireland & Middle East LLP
