Humiliated by automated business? $84 glitch by Virgin Mobile royally screws professor in mortgage deal

I am a single dad and an untenured professor with an average, by no means impressive, five-figure salary. I work hard, and I love my job. My son and I live comfortably enough in a small rented apartment. By no means am I part of the 1%. As someone who earned a PhD, I was a debt-ridden student, then junior faculty, until my late 30s. Like many of us, I made some quick decisions with credit cards and joint accounts when I was younger, and didn’t always play the credit game right. I learned my lessons, saved up, paid my bills, invested some, got help from my family, and dreamt of owning a home.

Just as I was about to turn 40, my dream came true: a young couple accepted my offer on a sunny upper duplex in West Montreal. The local bank was pleased to see me bring a good sum for the down payment (about 40% of the property value), and happily submitted my mortgage application. After their algorithm crunched my life history into a game of risk-obsessed ones and zeros, the bank computer ‘approved’ the mortgage, but asked for a much larger down payment. My credit file, it turned out, contained a ‘delinquent’ item.

I thought this would be easily sorted. The ‘delinquent’ item was an $84 cell phone bill I had already paid, for a Virgin Mobile account I had closed four months prior. Again, I thought this would be easily sorted. I quickly found out that Virgin had failed to report my payment to the credit bureau. My account now being closed, I couldn’t access the zero-balance statement required by my bank. I was asked to produce proof that my account had in fact never been delinquent. I spent the next two days speaking to a dozen exploited call centre workers in the Philippines, being bumped back in the call queue by naïve customer service reps, and being hung up on mid-conversation after explaining my story to five different “supervisors”.

The bottom line is nail-bitingly depressing. None of the discharge emails sent to me by accounts receivable ever reached my inboxes (yes, I did check my junk folders!). Real humans in Virgin stores cannot help, as “only the accounts receivable team can access billing information”. The flustered call centre workers in the Philippines, in the meantime, explain that only “the system” can generate these emails, which take “less than one hour” to be delivered. After “confirming” four separate times that the discharge email had been sent, they explained there was “nothing else they could do”.

“Surely,” I implored, “you are a human being connected to the internet, talking to me, a real human being. You have information on a screen in front of you showing that my account was paid — there has to be a way for you to send this to me.”

My desperate plea to have a screenshot sent via personal email was met with further rehearsed apologies from the system-bound workers. As it stands, I have missed the bank’s deadline for producing my proof of payment.

Many of us have had similar experiences. The usual response after a few days of call centre fatigue is simply to give up. Three weeks before the Virgin incident, for example, my girlfriend and I had decided to demand accountability and compensation from Avis after being hit with impossible-to-comprehend extra charges, including a late-return fee initially applied because of a “mistake in the system”. We had managed to get the fee removed, but not without wasting over two hours of our precious vacation time on two trips to the rental counter. Many anonymous forms and automated emails later, we were informed that “all was in order” with our file. No explanation. No compensation. We capitulated.

I won’t capitulate this time.

If you’ve read this far, please share my story and join me in the campaign to demand transparency and accountability from big automated corporations.

Last year in Psychology Today, I wrote an article on the automated systems and algorithms that regulate most of our lives with no accountability whatsoever. I share the main points of the article below.

Have you ever had a job, credit card, mortgage, rental, travel visa, scholarship, or study permit application turned down without explanation? Have you ever been detained at airport security or immigration, or been denied boarding? Have you ever spent hours on the phone navigating automated menus, only to speak to underpaid, overworked, flustered attendants who do not understand your language, cannot spell your name right, and tell you there is nothing they can do?

Algorithms are responsible for most of the decisions made on our behalf in modern life, from travel and credit cards to mortgages, schools, and job applications. They also control what we see and whom we associate with in our social media newsfeeds, and are now widely acknowledged to influence us well beyond our consumption habits, into the deeply personal realm of our political decisions, and even our dating and mate choices.

Cathy O’Neil, a former math professor and data scientist turned activist, has denounced these algorithms as Weapons of Math Destruction (WMDs). After working as a hedge fund statistician and witnessing the disastrous economic effects of bad algorithmic predictions, O’Neil has spearheaded a campaign to educate the broader public on the flaws of WMDs while encouraging us to demand transparency and accountability from the governments and corporations that rely on flawed algorithms to manage our lives.

O’Neil points out that efficiency-improving algorithms often turn into Weapons of Math Destruction through two fundamental design flaws, both illustrated in the short sketch that follows this list:

1) The predictions they generate from prior data (e.g., the statistical likelihood of someone repaying a loan) are not impartial. Risks in these models are defined according to preset criteria (e.g., living in a certain neighborhood) that are not free of moral and cultural assumptions. These assumptions tend to reflect, and thereby reinforce, the existing status quo (e.g., poverty and systemic racism).

2) Governments and companies are almost never transparent about their algorithms. They do not divulge the criteria and method used to generate predictions, and do not provide detailed reports to citizens and customers. They are in no way accountable for the consequences of how opaque datasets generated about our every trait and move are used to control what we can see and do.
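
To make these two flaws concrete, here is a deliberately crude sketch of a scoring system of the kind O’Neil describes. Every criterion, weight, and threshold below is invented for illustration; no real lender publishes its model, which is precisely flaw number two.

```python
# Toy risk-scoring sketch. All criteria, weights, and thresholds are
# invented for illustration; this is not any real lender's model.

# Flaw 1: "risk" is defined by preset criteria chosen by the designer.
# Here, simply living in a poorer postal code weighs as heavily as an
# actual missed payment, so the score mirrors existing inequality.
RISK_WEIGHTS = {
    "missed_payment": 30,         # past behaviour
    "high_risk_postal_code": 30,  # a proxy for poverty, not behaviour
    "short_credit_history": 15,
}

def risk_score(applicant: dict) -> int:
    """Sum the weights of every criterion the applicant triggers."""
    return sum(w for flag, w in RISK_WEIGHTS.items() if applicant.get(flag))

def decide(applicant: dict) -> str:
    # Flaw 2: the applicant only ever sees the verdict, never the score,
    # the criteria, or the weights behind it.
    return "approved" if risk_score(applicant) < 40 else "denied"

# Two applicants with identical payment histories, different addresses:
print(decide({"missed_payment": True}))                                 # approved
print(decide({"missed_payment": True, "high_risk_postal_code": True}))  # denied
```

Two people with identical payment histories get opposite verdicts because of where they live, and neither is told why.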

In her recent book on WMDs, O’Neil recounts one teacher’s harrowing battle to obtain an explanation for her dismissal from a DC public school after a decision based on a computer-generated evaluation score. “It’s an algorithm, and it’s complicated,” the teacher kept being told.

But the problem, rather, is that these algorithms are most often too simple.

When algorithms do not take into account (let alone seek to address!) the many socio-economic, poverty-related issues that contribute to consistently low test scores in poor school districts, for example, and when those test scores are then used as grounds for dismissing teachers, WMDs are reinforcing, not fixing, a problem. The same systemic problem underpins the infamous use of predictive-policing software to allocate police resources in poor neighborhoods in certain cities: more patrols in a neighborhood produce more recorded incidents, which in turn justify sending even more patrols.
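
That feedback loop is simple enough to simulate. In the hypothetical sketch below, two districts have identical true crime rates, but district A starts out slightly over-policed; the amplification factor ALPHA is my own assumption for illustration, not a figure from any real system.

```python
# Hypothetical predictive-policing feedback loop. The amplification
# factor ALPHA is an assumption for illustration, not measured data.

ALPHA = 1.3      # assumed: patrols uncover more than their share of incidents
share_a = 0.55   # district A starts with 55% of patrols; true crime is equal

for year in range(1, 9):
    # Patrols generate records; records steer next year's patrols.
    recorded_a = share_a ** ALPHA
    recorded_b = (1 - share_a) ** ALPHA
    share_a = recorded_a / (recorded_a + recorded_b)
    print(f"year {year}: district A gets {share_a:.0%} of patrols")
```

Even with equal underlying crime, district A’s patrol share climbs from 55% to nearly 80% in eight simulated years, because the record keeps confirming the bias that produced it.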

The e-scores generated by the invisible third-party companies that sell our browsing data (including personal email and messaging, and medical and psychological information) to predatory schools, insurers, and businesses of all kinds are similarly opaque.

Demanding transparency

Some systems, however, have made an effort to provide transparency. FICO credit scores may be problematic when they are used as a moral measure of a person’s worth, or of their eligibility for jobs, housing, or even dating, but they do provide each of us with a copy of our report and an explanation of the score. Discrepancies can be contested and eventually (if very slowly) corrected. The system may not be fair by some standards, but it is at least transparent and contestable.

It is time for other companies to follow the FICO model. Why don’t Virgin, Avis, Google, Facebook, Netflix, and Amazon provide us with detailed reports on all our e-scores, including a breakdown of how they are measured, what they are used for, and whom they are sold or given to?

If you have ever asked yourself the questions above, the way forward for all of us lies in demanding transparency and accountability. Only then will we have a say in the criteria used to measure and rank us as high or low risks that cost companies money, and as high or low potentials that make them money. Once that, too, becomes apparent, another way forward will be possible.

When we are done uncovering the racist and classist criteria used by WMDs to perpetuate stereotypes, limit our mobility, and reinforce systemic inequalities on a global scale, it will be up to us to demand a better, fairer system — one, we can only hope, that is not driven by an inhumane logic of profit-making for the very few.

Do we have a long way to go? Maybe not.

Please share this post, and help reclaim the use of algorithms and automated systems to generate change from the people, for the people.
