Authors: Juyoun Han and Patrick Lin (i). Contact: EandBLaw.com
Key points:
Artificial intelligence (AI) in job hiring and recruiting is a cause for concern as a new form of discrimination in the workplace.
Despite its convenience, AI can also be biased on the basis of race, gender, and disability, and it can be used in ways that exacerbate systemic discrimination in the workplace.
A common example of AI can be found on LinkedIn, a website that connects job seekers with employers and recruiters.
People with disabilities face significant disadvantages in the workforce. According to the U.S. Equal Employment Opportunity Commission (EEOC) (1), disability discrimination was the most common claim among all employment discrimination charges filed in 2019 (33.4%), closely followed by claims of race and sex discrimination. Today, a new form of discrimination in the workplace is a cause for concern: bias driven by artificial intelligence (“AI”).
What is Artificial Intelligence?
Artificial intelligence (2) is a branch of computer science that develops computers and machines to mimic intelligent human behavior. Common examples of AI in daily life include “Siri” and “Alexa”. AI is also built into assistive technologies such as Seeing AI, AVA, Voiceitt, and smart wheelchairs, to name a few.
How is artificial intelligence used in hiring and how does it affect people with disabilities?
AI is also widely used in job hiring and recruiting. According to Glassdoor, AI recruitment tools (3) are used across a variety of industries, from Allstate to Hilton to Five Guys Burgers. A common example of AI can be found on LinkedIn, a website that connects job seekers with employers and recruiters. LinkedIn’s AI (4) suggests jobs that members might be interested in based on their profiles and work experience, and also suggests connections with potential employers. Other examples of AI recruitment tools include text search technology, used to review large volumes of applications; facial analysis technology, used to scan applicants’ facial expressions and body language during video interviews; and voice scanner technology, used to evaluate an applicant’s language, tone, and word choices.
Despite its usefulness, AI can also be biased on the basis of race, gender, and disability, and it can be used in ways that exacerbate systemic discrimination in the workplace. For example, researchers have found that assessing facial movement and voice in applications “can be massively discriminatory” against many people with disabilities (5) that significantly affect facial expression and voice, such as deafness, blindness, speech disorders, and the effects of stroke. Online personality tests and web-based neuroscience games used in AI recruitment tools can also screen out people with mental illness. (6)
In general, AI recruitment tools are programmed to identify an employer’s preferred characteristics based on the employer’s existing pool of employees. This means that if people with disabilities are not represented in that pool, the AI recruitment tool can learn to weed out applicants with disabilities. In essence, the AI treats “underrepresented features as undesirable features”. As a result, “people with disabilities – like other marginalized groups – run the risk of being excluded,” says Alexandra Givens (7), President and CEO of the Center for Democracy and Technology. To overcome bias, AI recruitment tools need to be trained on more diverse data that includes employees with disabilities. People with disabilities are currently underrepresented in the workforce, and it is not surprising that technology emulates this pattern. “If an algorithm’s training data is not diverse, it can anchor existing exclusion patterns in extremely harmful ways,” Givens wrote in an article for Slate. (8)
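The training-data dynamic described above can be sketched in a few lines of Python. This is a hypothetical illustration, not any vendor’s actual algorithm: the data, feature names, and scoring rule are invented to show how a model fit to a biased hiring history reproduces that bias.

```python
# Hypothetical illustration: a naive screening model trained on a biased
# historical hiring record learns to penalize a feature that is
# underrepresented among past hires.

# Each record: (uses_assistive_tech, years_experience, was_hired)
historical = [
    (0, 5, 1), (0, 3, 1), (0, 4, 1), (0, 6, 1),
    (0, 2, 0),
    (1, 5, 0), (1, 6, 0),  # qualified disabled applicants were never hired
]

def hire_rate(records, feature_value):
    """Share of past applicants with the given feature value who were hired."""
    subset = [r for r in records if r[0] == feature_value]
    return sum(r[2] for r in subset) / len(subset)

def score(applicant):
    """A 'model' that scores an applicant by the historical hire rate of
    their group simply reproduces whatever bias the data contains."""
    return hire_rate(historical, applicant[0])

print(score((0, 5)))  # applicant without the feature: 0.8
print(score((1, 5)))  # applicant with the feature, identical experience: 0.0
```

Two applicants with identical experience receive very different scores solely because one belongs to a group absent from past hires; real systems do this less transparently, through learned feature weights rather than an explicit group lookup, which is why diverse training data and bias audits matter.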
Finding solutions through legal counsel
The ADA limits an employer’s ability to make disability-related inquiries during the hiring process. Employers who use AI hiring tools to obtain information about an applicant’s disability and weed out qualified applicants could be held liable under the ADA as well as state and local human rights laws. According to Bloomberg, the U.S. Equal Employment Opportunity Commission is already investigating at least two potential claims (9) involving discriminatory decisions made by AI tools in recruitment, promotions, and other workplace decisions.
State and local governments are proposing and enacting laws that regulate the use of AI recruitment tools and investigating the discriminatory effects such tools can cause. Illinois pioneered the Artificial Intelligence Video Interview Act, which requires employers to notify applicants, explain how the technology works, and obtain consent before using AI hiring tools. New York City is currently considering a bill (10) that would require vendors of AI recruitment tools to conduct an annual bias audit. While we wait for lawmakers to pass laws promoting AI accountability, attorneys are taking action in court to tackle discrimination caused by AI hiring tools. (11)
Juyoun Han is an attorney at Eisenberg & Baum LLP, where she heads the firm’s AI Fairness and Data Privacy practice. As a trial attorney, Juyoun has represented deaf and hard of hearing clients in courts across the country, advocating for equal treatment of people with disabilities in the workplace, in hospitals, in law enforcement settings, and in prisons. Patrick Lin is a second-year law student at Brooklyn Law School, where he is Vice President of Legal Hackers and a contributor to the Brooklyn Law Review. Before studying law, Patrick worked as a technology consultant in the field of data management and regulatory compliance.
- 1 – https://www.eeoc.gov/newsroom/eeoc-releases-fiscal-year-2019-enforcement-and-litigation-data
- 2 – https://www.merriam-webster.com/dictionary/artificial%20intelligence
- 3 – https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/
- 4 – https://engineering.linkedin.com/blog/2018/10/an-introduction-to-ai-at-linkedin
- 5 – https://benetech.org/wp-content/uploads/2018/11/Tech-and-Disability-Employment-Report-November-2018.pdf
- 6 – https://www.abajournal.com/news/article/do_job_personality_tests_discriminate_eeoc_probes_lawyers_complaint_filed_o
- 7 – https://cdt.org/staff/alexandra-reeve-givens/
- 8 – https://slate.com/technology/2020/02/algorithmic-bias-people-with-disabilities.html
- 9 – https://news.bloomberglaw.com/daily-labor-report/punching-in-workplace-bias-police-look-at-hiring-algorithms
- 10 – https://legistar.council.nyc.gov/LegislationDetail.aspx?ID=4344524&GUID=B051915D-A9AC-451E-81F8-6596032FA3F9
- 11 – https://www.eandblaw.com/services/ai-fairness-and-data-privacy/
(i) Source / Reference: Juyoun Han and Patrick Lin. Disabled World makes no guarantees or warranties in connection therewith. Content may have been edited for style, clarity, or length.
Journal: Disabled World. Language: English. Authors: Juyoun Han and Patrick Lin. Electronic publication date: 2020-11-25. Last revised: 2020-11-25. Reference title: “Disability Bias in AI Hiring Tools and Legal Protection”, Source: Disabled World. Summary: The trend toward artificial intelligence (AI) in job hiring and recruiting is a cause for concern as a new form of discrimination in the workplace. Retrieved 2020-12-03 from https://www.disabled-world.com/disability/legal/ai-hiring.php – reference category number: DW # 317-13899.