
How Ageism + Cultural Misunderstanding could get you fired in the age of automated decisions

Imagine if… an AI system declared you a “flight risk”, turned a machine-learning-based employee monitoring system on your car, wearables, phone, and social media activity, and got you fired right when you needed the money for your kids’ college tuition, your elderly mother’s care, and a now-impending early retirement.



You were the pride of the financial firm you had worked at for 30 years. You came straight out of college and right into the firm. As one of the first Latina women to be hired there, you won award after award and landed over half of the firm’s biggest clients. You fought three times harder than any man or white woman – working nights and weekends, crying when you missed your children’s first steps, music recitals, and quinceañera preparations. But the sacrifice would be worth it, because you would teach your kids the biggest lesson of all: that if you worked hard, you could achieve your dreams. You rose to the ultimate symbol of career achievement, Partner.

But then fortunes took a turn for the worse, and you couldn’t imagine why or what had happened to cause it. Suddenly all that hard work meant nothing. A new performance management algorithm that the company’s human resources department was trying out flagged you as a “flight risk”. This in turn triggered an AI system that monitored your keystrokes, your health insurance-provided fitness tracker, and your phone – including your calendar appointments, job-hunting apps, internet searches, and, most importantly, where you went at all times.

Ultimately, after intense scrutiny, the system decided that you were not trying to leave for a competitor; the performance management algorithm concluded the competition would not hire you at the age of 50. However, because of all the recent visits you made to clients at their homes instead of their offices (because of the Mexican holiday Día de los Muertos), the algorithm decided you were trying to steal the firm’s clients to launch a business of your own. As additional proof, it noted that you had also recently subscribed to the “50’s plus Entrepreneur” group on a social media site and had been actively posting. No one told you what it found or why it decided you were a flight risk; there was no legal requirement for them to do so. You worked in an “at-will” employment state. You could be fired at will.

Not only would you not get the retirement celebration and the acknowledgement of a fantastic career from the firm and your colleagues, but your career would now come to an abrupt and disreputable end. Instead of a huge party, you were forced to hand over everything the firm had ever given you in your 30 years of service and were escorted out by guards. Because there was data to be gleaned from your fitness tracker, your car, and most of all your phone and laptop, those would now be held by the firm as evidence in case of a wrongful-termination lawsuit.

You felt powerless and crumpled into the rideshare you were forced to take because you no longer had a company car. On the ride home, all you could think was: how did this go so terribly wrong? Then grief turned to panic as you wondered how you would continue to support your grown children and your mother, who all lived with you. You were not prepared for an early retirement. All your savings had gone into paying for either college or constant care for your mother’s dementia. How could this happen?

As you sat holding your mother’s hand, tears streaming down your face, you replayed in your mind everything you had done in the month since the algorithm was implemented. You had cultivated deep and meaningful relationships within the Mexican business community by being there for their daughters’ quinceañeras and, more recently, Día de los Muertos (Day of the Dead in English). In Mexican tradition, friends and family gather on this day to pray for and remember lost loved ones. While celebrating with them at their houses, you discovered many of them subscribed to a new group called “50’s plus Entrepreneur”, so you joined as well. But you couldn’t fathom how any of this was relevant to your current situation. You dismissed it.

There was no time to look back; you didn’t have time or money to hire an attorney anyway. You had to keep moving forward. That meant getting a new job at an age when you were just ready to think about retiring. This time you would put money away for retirement. You would accept whatever job you had to in order to keep paying the bills.

Questions to think about. Take the Quiz.


1) Would the employee monitoring algorithm have flagged unusual activity around Day of the Dead if someone of Mexican heritage had helped develop it?

2) Are there data sources that could have been used to flag this holiday in the algorithm? (See the sketch after this quiz.)

3) Is it the usual practice of data scientists to look for such things before releasing an algorithm to be used? Is there a legal requirement?

4) Do data scientists have a standard operating model for algorithm development?

5) Will the algorithm ever get updated with this information? Was there any mechanism for feeding this information back to the algorithm?

6) Did the company have an appeals process for our 50-year-old flight risk?

7) Could the company have done anything differently in its implementation of both the performance management and employee monitoring algorithms?

8) Do you think the algorithms were tested on diverse groups before release?

9) If the financial firm continues using the algorithms as-is, what issues do you see the firm encountering in the future as more and more minority employees become subject to firing?
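
To make Questions 1 and 2 concrete, here is a hypothetical sketch, in Python, of how a naive rule-based “flight risk” score might be built from monitoring signals. Nothing in it comes from the firm in the story or from any real vendor – the signal names, weights, and thresholds are all invented – but it shows how behavior that is perfectly ordinary in one culture (home visits during Día de los Muertos, joining a community entrepreneur group) can look identical to “preparing to leave” when no one with that cultural context helps design, test, or review the rules.

```python
# Hypothetical illustration only: a naive rule-based "flight risk" score.
# All signal names, weights, and thresholds below are invented for this sketch.

from dataclasses import dataclass


@dataclass
class MonthlyActivity:
    off_site_client_visits: int      # client meetings held somewhere other than their offices
    entrepreneur_group_member: bool  # subscribed to an entrepreneurship group on social media
    job_board_searches: int          # searches on job-hunting apps
    cultural_holiday_observed: bool  # e.g., Día de los Muertos fell in this month


def flight_risk_score(a: MonthlyActivity) -> float:
    """Return a score in [0, 1]; higher is treated as 'more likely to leave or compete'."""
    score = 0.0
    if a.off_site_client_visits > 3:
        # Home visits during Día de los Muertos trip this rule, even though they
        # reflect relationship-building rather than client poaching.
        score += 0.4
    if a.entrepreneur_group_member:
        # Joining a community group is read as intent to start a rival firm.
        score += 0.4
    if a.job_board_searches > 0:
        score += 0.2
    return score


def flight_risk_score_with_context(a: MonthlyActivity) -> float:
    """Same rules, but off-site visits are discounted when a recognized cultural
    holiday explains them (one possible answer to Question 2)."""
    score = flight_risk_score(a)
    if a.cultural_holiday_observed and a.off_site_client_visits > 3:
        score -= 0.4
    return max(score, 0.0)


if __name__ == "__main__":
    partner = MonthlyActivity(
        off_site_client_visits=6,        # visits to clients' homes for the holiday
        entrepreneur_group_member=True,  # joined the same group as her clients
        job_board_searches=0,
        cultural_holiday_observed=True,
    )
    print(flight_risk_score(partner))               # 0.8 -> flagged, with no appeal
    print(flight_risk_score_with_context(partner))  # 0.4 -> below the invented alert threshold
```

Whether a real system is rule-based like this sketch or a trained model, the design question is the same: the people who choose the features, labels, and thresholds decide which behaviors look “risky”, which is why Questions 1 and 8 matter.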

Inspired by news stories:

IBM can predict with 95% accuracy which employees will quit

Workers have a Big Secret, Their Age

Bosses can monitor your every step and move

Previous Case Study:

When AI keeps you from getting your dream job


1 Comment


Sharvari Dhote
Sep 02, 2020

Very interesting case study. I have seen case studies in many places; however, I liked the quiz idea. It really forces you to think about the ethical dilemma. A good way to make people aware of AI ethics through quizzes.
