Quick background
Credit scoring started as simple point systems built on payment history and debt levels. Over the past two decades, lenders added machine learning and nontraditional data (like rent or alternative income signals) to improve predictions. These newer models can boost accuracy but also risk inheriting historical discrimination embedded in the data (CFPB; Federal Reserve).
In my practice I’ve seen well-intentioned models deny credit to otherwise creditworthy applicants because the model overweighted narrow historical signals. That’s why transparency and governance matter.
How these algorithms actually work
- Lenders train models on past lending outcomes and applicant features (credit accounts, payment history, income, public records).
- Features that correlate with repayment get weight; models then score new applicants to predict default risk.
- When correlated features are also correlated with race, neighborhood, or socioeconomic status, the model’s outputs can produce disparate impacts even if race is not an input. Regulators call this indirect or proxy discrimination (Equal Credit Opportunity Act; CFPB guidance).
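The proxy mechanism above can be shown with a toy simulation (all numbers and features are invented for illustration, not drawn from any real lender's model): two groups have identical payment histories, but one group scores lower on a hypothetical neighborhood-correlated "proxy" feature, so a model that never sees group membership still approves the groups at different rates.

```python
import random

random.seed(0)

def make_applicant(group):
    # Hypothetical synthetic data: both groups have the same payment-history
    # distribution, but group B tends to score lower on a neighborhood-level
    # proxy feature because of historical patterns in where people live.
    payment_history = random.gauss(0.7, 0.1)            # identical for both groups
    proxy = random.gauss(0.6 if group == "A" else 0.4, 0.1)
    return payment_history, proxy

def score(payment_history, proxy):
    # The model never receives the group label, only correlated features.
    return 0.6 * payment_history + 0.4 * proxy

def approval_rate(group, n=10_000, cutoff=0.55):
    approved = sum(score(*make_applicant(group)) >= cutoff for _ in range(n))
    return approved / n

rate_a = approval_rate("A")
rate_b = approval_rate("B")
print(f"Group A approval rate: {rate_a:.1%}")
print(f"Group B approval rate: {rate_b:.1%}")
print(f"Approval ratio (B/A):  {rate_b / rate_a:.2f}")
```

Even with identical repayment signals, the B/A approval ratio falls well below parity, which is exactly the disparate-impact pattern regulators look for.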
Real-world evidence and consequences
Researchers and regulators have documented disparities in scores and pricing across demographic groups. The CFPB and academic studies show Black and Hispanic borrowers are more likely to receive lower scores or higher prices for similar financial profiles (CFPB research). That can mean higher interest rates, larger down payments, or outright denials — outcomes that widen wealth gaps.
Who is most affected
- People with thin or no credit files (young adults, recent immigrants).
- Residents of historically redlined neighborhoods whose financial patterns differ from the training data.
- Borrowers with nontraditional income or employment paths (gig workers, entrepreneurs).
Those groups often face higher false-negative rates (denied credit when they would have repaid), which reduces access to mainstream loans and pushes some toward costlier alternatives.
Practical steps for borrowers
- Review your adverse action notice and request the specific reasons for the denial; lenders must provide them under ECOA/FCRA. Targeting those stated reasons helps you prioritize fixes. (See CFPB resources.)
- Build a credit record lenders can see: report rent and utility payments where possible; consider a secured card or becoming an authorized user to establish history. (See how rent and utility reporting can improve scores: “How Rent and Utility Reporting Can Improve Personal Credit Scores”.)
- Shop lenders: community banks, credit unions, and fintechs using alternative underwriting may weigh cash flow or business performance instead of strict score cutoffs. Explore alternatives like “Alternative Underwriting: Using Cash Flow Instead of Credit Scores.”
- Document income and business performance clearly; provide bank statements and tax forms to supplement automated reviews.
What lenders and policymakers can do
- Test models for disparate impact using holdout samples and counterfactual analysis (for example, re-scoring applicants with a suspect feature set to a reference value and checking for decision flips).
- Remove or de-emphasize proxy variables that correlate with protected classes.
- Use explainable AI methods so underwriting decisions are interpretable and auditable.
- Expand safe, privacy-protected data sources (e.g., verified rent or bill payment data) that improve fairness.
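The disparate-impact testing in the first bullet can be sketched as a minimal "four-fifths rule" screen, a common first-pass fairness check. The data and function names here are illustrative; a real fair-lending review involves statistical testing and legal analysis, not just this ratio.

```python
def approval_rates(decisions, groups):
    """Approval rate per group from paired (decision, group) records."""
    rates = {}
    for g in set(groups):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return rates

def four_fifths_check(rates, reference):
    """Flag groups approved at under 80% of the reference group's rate."""
    flagged = {}
    for group, r in rates.items():
        if group == reference:
            continue
        ratio = r / rates[reference]
        if ratio < 0.8:
            flagged[group] = round(ratio, 2)
    return flagged

# Toy audit data: 1 = approved, 0 = denied, with each applicant's group.
decisions = [1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
groups    = ["A", "A", "A", "A", "A", "A", "B", "B", "B", "B", "B", "B"]

rates = approval_rates(decisions, groups)
flagged = four_fifths_check(rates, reference="A")
print(rates)    # per-group approval rates
print(flagged)  # groups failing the four-fifths screen
```

A screen like this belongs in routine model monitoring: run it on holdout decisions every retraining cycle, and treat a flagged group as a trigger for deeper counterfactual analysis rather than as a verdict.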
Many lenders are piloting these steps, but progress varies. Federal regulators and the CFPB have issued guidance encouraging fair lending review of automated systems.
Common misconceptions
- “AI eliminates bias.” No — algorithms reflect the data and objectives humans set. Without active fairness checks, they can entrench bias.
- “Only banks cause bias.” Fintechs, credit bureaus, and data brokers all influence model inputs and can contribute to unfair outcomes.
Quick FAQ
- What can I do if I suspect discrimination? File a complaint with the CFPB and consult a fair-lending attorney or community organization; collect your adverse action notice and supporting documents.
- Can models be fixed? Yes. With targeted re-training, feature selection, and fairness constraints, models can reduce disparate impacts while retaining predictive power.
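To make the "can models be fixed?" answer concrete, here is a hypothetical before/after sketch of feature selection: dropping a proxy feature (and rescaling the cutoff so overall approvals stay comparable) shrinks the approval gap between groups when the remaining feature carries the genuine repayment signal. All weights, distributions, and cutoffs are invented for illustration.

```python
import random

random.seed(1)

def applicant(group):
    # Hypothetical data: identical true repayment signal, unequal proxy feature.
    history = random.gauss(0.7, 0.1)
    proxy = random.gauss(0.6 if group == "A" else 0.4, 0.1)
    return history, proxy

def rate(weights, group, cutoff, n=10_000):
    # Approval rate under a linear score with the given feature weights.
    w_hist, w_proxy = weights
    approved = sum(
        w_hist * h + w_proxy * p >= cutoff
        for h, p in (applicant(group) for _ in range(n))
    )
    return approved / n

# Original model leans on the proxy; the adjusted model drops it and
# rescales the cutoff so overall approval volume stays comparable.
gap_before = rate((0.6, 0.4), "A", 0.55) - rate((0.6, 0.4), "B", 0.55)
gap_after = rate((1.0, 0.0), "A", 0.58) - rate((1.0, 0.0), "B", 0.58)
print(f"Approval gap with proxy feature:    {gap_before:.1%}")
print(f"Approval gap without proxy feature: {gap_after:.1%}")
```

In practice the fix is rarely this clean, because proxies can be entangled with genuinely predictive features; that is why re-training is typically paired with explicit fairness constraints and validation on held-out data.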
Professional disclaimer
This article is educational and does not replace legal or personalized financial advice. For case-specific guidance, consult a qualified attorney or certified financial professional.
Sources and further reading
- Consumer Financial Protection Bureau (CFPB): https://www.consumerfinance.gov/
- Federal Reserve research and policy statements: https://www.federalreserve.gov/
- Equal Credit Opportunity Act (summary and rights): see CFPB resources
Internal resources
- Alternative underwriting: Alternative Underwriting: Using Cash Flow Instead of Credit Scores
- Rent and utility reporting: How Rent and Utility Reporting Can Improve Personal Credit Scores

