ICO - Information Commissioner's Office

03/31/2026 | Press release | Distributed by Public on 03/31/2026 03:08

Here’s what jobseekers need to know about automated recruitment decisions

From helping managers to review CVs to scoring online assessments, more employers than ever are turning to automation and AI in recruitment.

Seven in ten employers (70%) anticipate increasing their use of AI and automation in their recruitment process over the next five years, according to a survey from the Institute of Student Employers last year.

It's a top priority for us. Insight from our recent focus groups suggested jobseekers don't always know how AI and automation are being used. As more employers turn to technology, we are speaking to businesses to make sure proper safeguards are in place to protect people.

Our AI specialist, Declan McDowell-Naylor, explains what jobseekers need to know.

What is automated decision making (ADM)?

'Automated decision making' in recruitment means algorithms are used to help decide what happens to your job application. Instead of a person manually reviewing every CV, some of the hiring process is handled by automated technology.

This can speed up the recruitment process, so it can be useful when there is a high volume of applications or varied work experience - for example, with graduate or early careers roles.

So what does it look like? Here's an example:

  1. You complete a job application - this may involve submitting your CV or answering questions with a chatbot.
  2. The employer's system analyses your application, looking for certain words, skills, or qualifications.
  3. It may score, rank, or filter your application and use this information to make a decision.
  4. If you score or rank highly, you may be progressed to interview stage with the hiring manager.
  5. If your score is below a set threshold, your application may be rejected before a human recruiter can review it.
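The steps above can be sketched as a toy keyword filter. This is purely illustrative: the keywords, weights and threshold below are invented, and real recruitment systems are far more sophisticated than simple keyword matching.

```python
# Illustrative only: a simplified keyword-scoring filter of the kind
# described in the steps above. All keywords, weights and the threshold
# are invented for the sake of example.

# Hypothetical terms the employer's system looks for, with weights.
KEYWORDS = {"python": 3, "degree": 2, "teamwork": 2, "communication": 1}
THRESHOLD = 4  # applications scoring below this are filtered out


def score_application(text: str) -> int:
    """Score an application by summing the weights of keywords it mentions."""
    lowered = text.lower()
    return sum(weight for kw, weight in KEYWORDS.items() if kw in lowered)


def triage(applications: dict[str, str]) -> tuple[list[str], list[str]]:
    """Rank applicants by score, then split into progressed vs rejected."""
    ranked = sorted(applications,
                    key=lambda name: score_application(applications[name]),
                    reverse=True)
    progressed = [n for n in ranked
                  if score_application(applications[n]) >= THRESHOLD]
    rejected = [n for n in ranked
                if score_application(applications[n]) < THRESHOLD]
    return progressed, rejected
```

The key point for jobseekers is the last line of the logic: anyone under the threshold is rejected automatically, which is exactly the kind of decision the safeguards below are designed to cover.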

Is this legal?

Yes, employers can use ADM in the hiring process if they have a valid reason. In fact, the law has changed to make it easier for organisations to innovate and use automation in this way. It can make it more efficient for everyone involved - decisions can be quicker and more consistent.

But safeguards must be in place to protect your data protection rights and ensure nobody is unfairly impacted by automated decisions.

It's our job at the ICO to make sure employers are taking steps to protect you and your personal information if they wish to use ADM.

That's why we've spoken to more than 30 employers about the use of automation in recruitment and published a new report and draft guidance today. This sets out our expectations for organisations, so they can make sure all automated decisions are lawful, fair and transparent.

We also wrote to 16 organisations likely to be using ADM to make decisions about jobseekers, and they have now committed to acting on our recommendations to improve practices.

What are my rights?

When decisions are made without a real person involved, you have the right to:

  • A fair outcome: We've heard concerns about the potential for bias and discrimination against people with certain characteristics when automated decisions are made. We expect organisations to test regularly for bias and make sure they are mitigating it - so you can trust that the employer is treating all candidates fairly.
  • Know if ADM is being used: Employers need to tell you if automated decisions are being made about your application. They should clearly explain how any automated tools work and how they may impact the application process.
  • Challenge a decision and ask for a 'human review': If you believe an automated decision is wrong, you can ask how the decision was reached. You have the right to contest the decision, explain your point of view and request a real person to take a second look.

You can find more information about how to challenge automated decisions on the ICO's website.
