© 2024 Connecticut Public


U.S. employers are using AI to essentially reduce workers to numbers in the workplace

STEVE INSKEEP, HOST:

We've spent a lot of time debating the future of artificial intelligence and the way it may be used to control people. Our next guest contends the future is now. Ifeoma Ajunwa wrote a book called "The Quantified Worker." She says employers have spent generations trying to reduce their employees to statistics. They've gained more power to do this in recent decades using computers, and they've gained even more power in recent years with artificial intelligence. She's been following this trend since long before AI caught the world's imagination in recent months.

IFEOMA AJUNWA: I have thought to myself, it is already everywhere. People just have not been paying attention. And I also think that the conversation has been hijacked a bit by AI doomsday scenarios - right? - of this far-off horizon when AI becomes sentient. And the fact is we now have AI that's being used as a tool by humans to quantify workers in the workplace, and we need to pay attention to that because it's happening right now.

INSKEEP: Well, let's talk about some of the ways that this is happening, according to your research. Maybe we could start with hiring. How is artificial intelligence being used in hiring?

AJUNWA: Well, AI technologies are being used as part of automated hiring platforms. So it runs the gamut from simple AI systems that just parse your resume - searching for keywords - to the extreme end of automated video interviewing. And that's when humans are actually being interviewed by AI.

INSKEEP: Which makes a judgment as to whether this person is the right kind of person we want to hire.

AJUNWA: So, yes, AI technologies are being deployed to make these consequential decisions about access to employment.

INSKEEP: Well, if somebody gets past the artificial intelligence gatekeeper and gets a job, they may encounter AI in the workplace, according to your writing. You use a phrase I hadn't seen before - mechanical manager. What is a mechanical manager?

AJUNWA: So for me, that's what I'm calling these AI technologies in the workplace. It's really the idea that we're now delegating functions in the workplace that we previously thought could only be done by a human - we're now delegating it to AI technologies. So productivity applications, for example, are being used to quantify how productive a worker is and really to distill that productivity to numbers. So those productivity applications can count, for example, how many keystrokes someone is typing per hour in the workplace. They can count how many emails you've sent per day. They can even track how many conversations you've had with your fellow coworkers.

INSKEEP: Are you saying this technology gives an employer more power of surveillance?

AJUNWA: Yes, certainly. These technologies have been referred to, perhaps tongue-in-cheek, as bossware - right? - because they basically enable your boss to watch over your shoulder at all times without necessarily being physically present. So consider that productivity apps can track you whether you're in the workplace, working from home or even working on an airplane.

INSKEEP: What are you hearing from workers about this?

AJUNWA: Well, I'm hearing a lot of concern. You know, one worker shared with me that he felt he was discriminated against during an automated video interview because, although he spoke English fluently, English was not his first language. The AI system had trouble with his diction and graded him poorly for that. There's also the issue of productivity or surveillance apps being misused by employers in the service of discrimination or harassment. In one case in California, a woman found out that her supervisor was using her productivity app to track her over the weekend, when she was supposed to be off work - even though she had turned the app off, it could never truly be turned off. The manager was using it to track her, then harassing her and telling her things about her personal life. So there is a lot of concern that because there is really no regulation guiding how employers can use these technologies, there is ripe opportunity for misuse.

INSKEEP: It's easy to see the case for how this technology could be abused to harass someone as you just described. But you've also twice referred to discrimination, which maybe some people have a little harder time conceiving because they might think of the machine as an inanimate object and objective. What is the way in which a computer becomes a racist?

AJUNWA: There's actually a word for this, and it's called automation bias. We tend to see automated processes as less prone to bias, but that's not actually true. You have to remember that it is still human beings who are coding these AI systems, and even the way they are trained is the result of human decisions about which training data to feed them. If you think of a company that has historically excluded women or minorities, then the available training data for any automated hiring system at that company is not going to include those demographics. Your automated hiring system is going to learn from that limited training data and will then replicate the very patterns you're trying to fight against.
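[The mechanism Ajunwa describes can be illustrated with a toy model - a purely hypothetical sketch, not any real vendor's system. A naive keyword scorer trained only on resumes of past hires from one homogeneous pool learns that pool's vocabulary and ranks equally qualified candidates who describe their experience differently lower.]

```python
from collections import Counter

def train_keyword_model(past_hire_resumes):
    """Learn which terms appeared in past hires' resumes.

    A deliberately naive stand-in for the resume-parsing systems
    described in the interview: it simply counts words.
    """
    counts = Counter()
    for resume in past_hire_resumes:
        counts.update(resume.lower().split())
    return counts

def score(resume, model):
    """Score a resume by how often its words appeared among past hires."""
    return sum(model[word] for word in resume.lower().split())

# Hypothetical training data: every past hire came from the same
# background, so the model only ever saw one vocabulary.
past_hires = [
    "captain lacrosse team ivy league finance internship",
    "lacrosse club ivy league banking internship",
]
model = train_keyword_model(past_hires)

# Two candidates with comparable experience; the second uses
# different vocabulary and is ranked lower by the learned model.
a = score("lacrosse captain finance internship", model)
b = score("community college accounting experience night shift", model)
print(a > b)  # the model replicates the historical hiring pattern
```

[The point of the sketch: no one coded "exclude this group," yet the exclusion emerges from the training data alone - which is why Ajunwa argues for auditing.]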

INSKEEP: I guess this makes me wonder about the larger question - can a system like that be fixed? Can it be done right?

AJUNWA: I think that is an important question. So part of what I push for in my book is the proposal that there should be a federal mandate for the auditing of automated hiring systems because that's the way we're going to surface whatever bias that they have. And then I would advocate that companies should then be given safe harbor, so a period of time during which they may fix the problems without necessarily being liable.

INSKEEP: How would you respond to somebody who feels that this is already moving too fast to control?

AJUNWA: Ninety-nine percent of us have to work for a living. We don't have a choice on that matter. So it behooves the government to ensure that access to employment is fair, is equal opportunity. And it behooves the government to also ensure that the experience of the workplace is not one of subordination and domination, but one that allows individual workers to flourish and reach their full potential.

INSKEEP: Ifeoma Ajunwa is the author of "The Quantified Worker: Law And Technology In The Modern Workplace." Thanks so much.

AJUNWA: Thank you so much.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.
