What is fair when it comes to hiring, digital assessment and disability?

Organisations are quick to say they have established diverse teams; however, for many people living with a disability, it can be a fight just to get a foot in the door.

Though Donald Trump has only just begun his second term as president of the US, taking over from former president Joe Biden, he has already enacted a number of policies that threaten global security and peace of mind. One example is his decision to repeal Biden’s Safe, Secure and Trustworthy Development and Use of Artificial Intelligence executive order, which worked to ensure the safe development of advanced AI tech.

In a move that has been criticised by the Associated Press as a “sweeping dismantling of the federal government’s diversity and inclusion programmes”, Trump’s administration has also issued an executive order stating that all federal diversity, equity and inclusion (DEI) staff are to be put on paid leave and eventually dismissed, in an effort to “forge a society that is colourblind and merit-based”.

For many people living with a disability, accessing the workplace is already a challenge, even with DEI measures in place to protect their interests. According to the Center for Democracy and Technology, that challenge often begins at the hiring stage, when people with different capabilities are subjected to a frequently biased digitised recruitment process.

In late 2024, the organisation released its report, Screened Out: The Impact of Digitised Hiring Assessments on Disabled Workers, for which it spoke to a diverse group of people living with a variety of disabilities to examine their experiences of AI-powered recruitment, its impact and how the risks might be reduced.

Key findings

One of the report’s main findings was that participants believe digitised hiring processes to be inherently biased and to present significant accessibility barriers. While AI recruitment comes with a medley of benefits, such as cost effectiveness and timeliness, those who took part in the research said that many of the tests are designed with a flawed standard in mind, one that doesn’t take differing abilities into consideration.

Computer-based assessments, such as personality and cognitive tests, as well as AI-scored interview videos, are among the most common forms of digital assessment described in the report. Many of the participants believe that this application method only confirms the biases we as a society have already cultivated.

“They’re consciously using these tests knowing that people with disabilities aren’t going to do well on them and are going to get self-screened out, either because they don’t finish the tests or because they do so horribly,” stated one report participant. Another expressed the belief that the tests are made intentionally difficult so that a refusal to hire can be attributed to ‘merit’ rather than disability status.

Because the tests have the potential to exclude people with cognitive, auditory or sight issues, and many of the jobs advertised don’t offer accommodations, some felt that only those with an invisible disability that does not affect performance could benefit from a digitised assessment. The overall view, however, was that regardless of an applicant’s intentions, the tests are designed to reveal potential disabilities.

What can be done? 

Companies retain the right to hire whomever they want, via whatever process or method they choose. However, the report highlighted a number of ways in which organisational leaders can ensure that their systems are fair and free of bias.

Firstly, employers should assess their own organisation to determine whether digitised hiring is absolutely necessary, or whether an alternative, more transparent system could achieve the same goal without introducing potential risks or discrimination. If organisations decide that a digitised method is still preferred, it should be built around core skills and job relevance.

Assessments should also follow existing accessibility guidelines and should make accommodations for whoever may need them, so that people of diverse backgrounds are given equal opportunities. That may include access to e-readers, auditory aids, improved images and a smarter, more informative layout.

Lastly, though some employers may feel that it negates the point of AI-powered technology, companies that are unsure about the overall fairness of digitised assessments could introduce a degree of human oversight. This could take the form of using digitised tests as an add-on, rather than treating them as the entire evaluation.

Ultimately, digital assessments, while they have their merits, are not infallible when it comes to selecting potential employees. Organisations that truly want to hire based on merit, while still maintaining much-needed DEI initiatives, should ensure that their assessment process doesn’t slam the door shut on people before they have even had a chance to show what they are capable of.
