Are Algorithms Hurting Your Finances? What You Need to Know

Imagine if the world's decisions were made without bias. The color of your skin wouldn't affect whether you got a personal loan. You'd get a job interview because of your qualifications instead of your physical attractiveness.

Many people think that computers can provide this objectivity, but the truth isn't so clear-cut.

Computers make decisions using sets of instructions called algorithms. In many cases, it's illegal for these instructions to include factors like race and gender. But human bias can seep into algorithms in subtle ways.

And when biased decisions affect our credit scores, loan applications, and job prospects, they can seriously hurt our finances.

Here's how algorithmic bias happens -- and five groups of people whose finances are seriously affected by it. If you're in one of these groups, you may be feeling the effects of algorithmic bias already.

How does financial algorithm bias happen?

Companies (usually) don't design their algorithms to discriminate against specific groups of people. So how does it happen?

For algorithms to work properly, they need to "practice" on large sets of data. And humans choose which pieces of data go into those sets. This decision is largely subjective.

For example, which data points should a lender analyze to determine if someone is likely to repay their loan? Should it matter whether they have a college degree or which school that degree came from? Should it matter what city they live in?
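
To make that subjectivity concrete, here's a minimal sketch in Python -- with made-up borrowers and hypothetical feature names, not any real lender's pipeline -- of the moment a human decides which data points a lending model is allowed to learn from:

```python
# Illustrative only: invented borrowers and a hand-picked feature list.

# The training set: rows describing past borrowers and whether they repaid.
past_borrowers = [
    {"income": 48000, "has_degree": True,  "city": "Hartford", "repaid": True},
    {"income": 39000, "has_degree": False, "city": "Laredo",   "repaid": True},
    {"income": 52000, "has_degree": True,  "city": "Hartford", "repaid": False},
]

# This line is the subjective human decision described above: which columns
# become "features" the model is allowed to learn from.
FEATURES = ["income", "has_degree", "city"]   # include "city"? That's a choice.

def to_training_rows(records, features):
    """Keep only the chosen features, plus the repayment label."""
    X = [[r[f] for f in features] for r in records]
    y = [r["repaid"] for r in records]
    return X, y

X, y = to_training_rows(past_borrowers, FEATURES)
print(X[0])  # [48000, True, 'Hartford'] -- only the chosen features survive

# Whatever model trains on (X, y) can only ever reflect the data and the
# feature list chosen here -- including any bias baked into either one.
```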

Ray Walsh, a data privacy expert at ProPrivacy, talked to The Ascent about data bias. "Any time that human actors are charged with overseeing data used to train up algorithms, this creates a potential pain point, because it produces a need to trust those specific actors," he said.

Algorithmic black boxes

Most algorithms are initially designed by people, but machine learning systems can update themselves as they churn through new data. Algorithms that were originally transparent become opaque, creating a "black box." These algorithms still need training sets, though. And the combination of potentially biased training data and an opaque process can cause problems.

"It is vital to understand that AI algorithms do not actually have any intelligence and are not capable of independent thought," Walsh said. "Instead, algorithms perform tasks based on the data they were fed to train them up. If that data is corrupt, prejudiced, or otherwise tainted, the AI will output results that discriminate."

Companies might not even know that their algorithms are biased. And that can have a big effect on your finances. Especially if you belong to any of these groups.

Algorithms could be hurting your credit score if ...

… you're one of the millions of Americans with no credit score

Lenders are legally prohibited from basing their decisions on an applicant's race, color, religion, national origin, sex, marital status, or age. Automated credit scoring means an actual human being may never learn any of these things about you when you apply for a credit card (or another type of loan).

Instead, a computer takes the data from your application, combines it with what it learns from your credit report, and spits out a decision. An algorithm decides whether to give you a card, how high your credit line will be, and your interest rate. Your credit score is a big factor in those decisions.
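
Here's a toy version of that kind of decision, sketched in Python. The thresholds, numbers, and function name are invented for illustration -- they don't come from any real lender:

```python
# Hypothetical, simplified card decision: application data plus
# credit-report data in, an approval, credit line, and rate out.

def card_decision(credit_score, annual_income, existing_debt):
    """Return (approved, credit_line, apr) from a few numeric inputs."""
    if credit_score is None or credit_score < 620:
        return (False, 0, None)                    # no score or low score: declined
    credit_line = min(int(annual_income * 0.15), 10_000)
    apr = 29.9 - (credit_score - 620) * 0.05       # better score, lower rate
    if existing_debt > annual_income * 0.5:
        credit_line //= 2                          # heavy debt load halves the line
    return (True, credit_line, round(apr, 1))

print(card_decision(credit_score=None, annual_income=45_000, existing_debt=5_000))
# (False, 0, None) -- with no score at all, the applicant never even gets to "maybe"
```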

But millions of Americans don't have credit scores. The data that feeds traditional credit scores largely comes from payments made on credit cards, mortgages, and personal loans. So credit scoring algorithms are biased in favor of people who engage in these borrowing activities and against those who don't. And whether you get approved for a financial product may depend on your credit score.

… you live in a credit desert

People who are credit invisible either have no credit record at all or files too sparse to generate a score. Credit invisibility is eight times more common in lower-income census tracts than in upper-income ones. The problem is even more pronounced in rural areas -- at all income levels.

If you live in a credit desert, you may find it especially difficult to get a loan or a credit card. And that makes it harder to get out of the credit desert.

Even if you have decent credit, it's not hard to imagine that an algorithm could penalize you for living in a place where many other people have little or no credit.

… you're looking for a job

How much you earn is one of the biggest determinants of your financial well-being -- even a small salary increase early in life can make a big difference to your lifetime earnings. And while algorithms might not decide how much you get paid, they play a big role in determining whether you get a job.

For many positions, when you submit a resume online, a program scans it for keywords and decides whether to send your information to a human hiring manager.
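
In its simplest form, that screening step can look something like this. The keyword list and passing threshold below are invented for illustration:

```python
# A bare-bones, hypothetical keyword screen for incoming resumes.

REQUIRED_KEYWORDS = {"python", "sql", "project management"}

def passes_screen(resume_text: str, min_hits: int = 2) -> bool:
    """Forward the resume to a human only if enough keywords appear."""
    text = resume_text.lower()
    hits = sum(1 for kw in REQUIRED_KEYWORDS if kw in text)
    return hits >= min_hits

resume = "Led project management for a data team; built SQL reporting."
print(passes_screen(resume))  # True -- two of the three keywords matched
```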

What happens when companies train their hiring algorithms on the characteristics of past hires who have performed well? They end up hiring more of the same type of people and passing over equally or better-qualified candidates who have different characteristics.

Here's an example. Say a hiring algorithm learns that candidates with degrees from the University of Connecticut are highly valuable because they've performed well in the past. Now, if you didn't graduate from UConn, you'll be less likely to make it to the next step in the hiring process.
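
Here's a rough, hypothetical sketch of that feedback loop, with invented hiring data:

```python
# Illustrative only: a screening score that learns "school" from past hires
# and then penalizes everyone who didn't attend a familiar school.

from collections import Counter

# Past high performers the model trains on -- mostly UConn grads, because
# that's who the company hired before.
past_high_performers = ["UConn", "UConn", "UConn", "State U", "UConn"]

school_counts = Counter(past_high_performers)
total = sum(school_counts.values())

def school_score(school: str) -> float:
    """Score a candidate's school by how often it appears among past hires."""
    return school_counts.get(school, 0) / total

print(school_score("UConn"))   # 0.8 -- familiar school, high score
print(school_score("Howard"))  # 0.0 -- never hired from here, so zero score

# Each new UConn hire pushes the UConn share higher, so candidates from
# other schools score even worse next round: bias compounding over time.
```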

Bias and homogeneity can become self-perpetuating even when a computer helps make decisions. People from groups that have been underrepresented in the workforce -- or even at a single company -- can have fewer job opportunities, and their lifetime earning potential may be curtailed.

… you're part of a "high medical risk" group

The health care industry provides another example of how biased data can cause an algorithm to produce unequal treatment. Because our health has such a profound impact on our ability to earn a living and build savings -- not to mention our ability to enjoy our lives and care for our loved ones -- this source of bias, detailed in the October 2019 issue of Science, is a serious problem.

An algorithm that insurers use relies on health care spending as a proxy for which patients need extra attention from their medical providers, and it has led to racial disparities in treatment, particularly between black and white patients. A systemic problem in U.S. society -- inadequate access to care for black Americans -- becomes self-perpetuating thanks to a biased algorithm.
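
A simplified, made-up illustration shows how ranking patients by dollars spent rather than by actual need can push the sicker patient down the list:

```python
# Invented numbers, for illustration of the proxy problem only.

patients = [
    # name,       chronic conditions (true need),  past spending (the proxy)
    ("Patient A", 4,                               3_000),  # sicker, less access to care
    ("Patient B", 2,                               9_000),  # healthier, but spends more
]

# Rank by the proxy the algorithm actually uses: dollars spent.
by_spending = sorted(patients, key=lambda p: p[2], reverse=True)
print([p[0] for p in by_spending])  # ['Patient B', 'Patient A']

# Rank by the thing we actually care about: health need.
by_need = sorted(patients, key=lambda p: p[1], reverse=True)
print([p[0] for p in by_need])      # ['Patient A', 'Patient B']

# The sicker patient who historically spent less -- often because of worse
# access to care -- drops down the list, so the gap in care widens.
```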

Lower-quality health care has a direct effect on how much you earn. If you're missing work due to chronic health conditions or an inability to afford medical care, your earnings can take a big hit. And more of those earnings may, in turn, go back to your health care bills.

… you're a woman

While your gender may not be explicitly included in the algorithms that determine your credit score or whether you get a job interview, it does play into other algorithms that can affect your finances.

Take a look at advertising algorithms, for example. These can -- and often do -- choose which ads you see based on your gender. When researchers conducted a field test to look for bias in who saw a STEM career ad, they found that women were less likely to see it, even though the ad was designed to be gender-neutral. The reason: it costs more to advertise to young women than to men, so an algorithm that optimizes for cost shows the ad to fewer of them.
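
A toy calculation -- with invented costs -- shows how a delivery system that simply chases the cheapest views can end up showing a gender-neutral ad almost entirely to men:

```python
# Hypothetical numbers only; real ad auctions are far more complex.

BUDGET = 100.00                     # total ad spend
COST_PER_VIEW = {"women": 0.25,     # young women are a pricier audience
                 "men":   0.10}

def views_if_cost_optimized(budget, costs):
    """Spend the whole budget on whichever audience is cheapest per view."""
    cheapest = min(costs, key=costs.get)
    return {group: (round(budget / cost) if group == cheapest else 0)
            for group, cost in costs.items()}

print(views_if_cost_optimized(BUDGET, COST_PER_VIEW))
# {'women': 0, 'men': 1000} -- the ad is neutral, but the delivery isn't
```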

Could we eliminate algorithmic bias in the future?

Efforts to improve the credit-scoring system have been underway for years.

FICO, Finicity, and Experian, for example, have created an alternative credit score called UltraFICO™. It evaluates alternative data points, such as banking activity, that traditional credit scores don't use. The Petal Visa® credit card considers applicants with no traditional credit score. And Experian Boost helps applicants instantly raise their FICO® scores by adding phone and utility bill payments to their credit history.

Products like these broaden opportunities for borrowers (and potentially increase lenders’ and credit scorers’ profits, of course) by analyzing previously unconsidered data. Borrowers who would otherwise be excluded can now be included thanks to newer algorithms.

"In a way, it's sort of back to the future," David Shellenberger, Vice President of Scores and Predictive Analytics at FICO, told The Ascent. "We're using bank-statement-type information that was prevalent before credit bureaus became so prevalent. The technology that has really enabled this to occur is that, as opposed to photocopying your bank statements, that information is now accessible through open APIs and open banking."

Breaking into the black box

Of course, companies are also on the hook for policing their own algorithms.

"What's essential is that lenders are able to interpret both the inputs and outputs of any model to ensure it's not perpetuating bias," said Jay Budzik, Chief Technology Officer at Zest AI, a company that uses alternative data to help those deemed unscorable gain access to loans.

"AI models make this analysis slightly more complicated because they use more data and churn through millions of data interactions," Budzik said. "But they are proven to provide wider access for communities that have been locked out of housing, credit, and other opportunities because of discriminatory barriers."

Living with algorithmic bias

Because most algorithms are kept secret, it's hard to know if you've suffered because of algorithmic bias. But if you're part of any of the groups above, there's a good chance that you have.

Unfortunately, there's not much you can do about it at the moment. But if you believe that you may have been discriminated against because of a financial algorithm, it's worth keeping an eye on initiatives like UltraFICO™, Experian Boost, and the Petal Visa®.

These efforts may help financial institutions eliminate some of the algorithmic bias they're perpetuating today.
