HOUSING IT
What the Algorithm Misses, Unjustly So, Again and Again
January 22, 2026
There are many ways to discover that housing algorithms measure the wrong things. Some people learn it when choosing between a child’s medication and rent results in a permanent eviction filing on their record. Others learn it when years of housing policy expertise don’t translate to passing an automated screening system. Different starting points, different circumstances—same destination: a system that optimizes for patterns instead of people.
I learned it both ways. As someone who experienced algorithmic rejection while working inside the systems that deploy these algorithms. And as someone who helped permanently house thirteen people after those same systems said no.
Marcus was one of those thirteen people.
What the algorithm saw:
Eviction filing from 18 months prior
Credit score: 510
Risk assessment: DENY
What I knew: The eviction filing came when Marcus chose to pay for his daughter’s asthma medication instead of rent during a particularly difficult month. The case was dismissed, but the filing stayed on his record. He was a father who prioritized his child’s ability to breathe over a landlord’s ledger.
What happened: Through a community-based housing initiative, we gave Marcus the chance that standard screening wouldn’t. Eighteen months later, he had never missed a rent payment, earned a promotion at work, and his daughters were thriving in stable schools for the first time in years.
The algorithm had measured his past. We measured his capacity.
Perhaps I write this because I am insecure, or because I have been there before.
I applied for housing while working in housing policy. I understood screening systems because I’d helped implement them. I knew the credit thresholds, the background check protocols, the risk assessment formulas.
And I was rejected. The algorithm measured credit damage from caregiving costs and employment transitions—the same life circumstances millions of working Americans navigate. It saw patterns that correlated with risk in its training data. It didn’t see consistent rent payment history, stable employment, or a deep understanding of housing systems.
Different route from Marcus. Same destination: DENIED.
What does it mean when the screening system rejects both the father prioritizing his daughter’s health and the professional administering housing programs? When it fails across income levels, education levels, and life circumstances?
It means the algorithm isn’t measuring housing stability. It’s measuring conformity to historical patterns—patterns built on decades of discriminatory lending, healthcare inequity, and economic instability.
The Pattern Across Different Routes
That first year, we permanently housed thirteen people—every one of them rejected by standard algorithmic screening. They included a practicing architect relocating from California, two truck drivers navigating wage insecurity, a CNA with two children (one with significant health needs), recent college graduates, a traveling nurse, a couple working while homeless, and a family displaced by storm damage.
Thirteen different routes to algorithmic rejection. Zero evictions once housed. Zero failures. The algorithm was 0 for 13.
Beyond direct placement, we engaged nine additional community members as an advisory and working group—residents, family members, and stakeholders directly impacted by housing instability. They participated in program design, attended closings, and helped shape how the work evolved. We also provided emergency shelter when needed.
Twenty-two people served in different capacities that first year—because housing solutions require community leadership, not just professional administration.
What Gets Lost When Housing Becomes Strictly Data
When housing decisions are automated through algorithmic screening, we lose the ability to see what systems can’t measure:
The algorithm sees a credit score below threshold. What it misses: years of consistent rent payment despite economic shocks.
The algorithm sees an eviction filing. What it misses: a parent choosing their child’s health over a landlord’s timeline.
This isn’t about making emotional decisions or ignoring financial sustainability. Property owners have legitimate interests in reliable tenants. The question is: what actually predicts housing stability? The data from our thirteen permanent placements suggests the algorithm is measuring the wrong things.
I-It vs. I-Thou: What Algorithms Can’t See
The philosopher Martin Buber described two ways we relate to the world: I-Thou and I-It.
I-Thou is relationship. It’s seeing another person fully—their context, their capacity, their humanity. It’s presence, not transaction.
I-It is different. It’s seeing another person as object, as means to an end, as data point. It’s necessary sometimes—we can’t have deep relationships with everyone. But it becomes dangerous when institutions enforce I-It relationships where I-Thou should exist.
Housing should be I-Thou. It’s about people finding home, building stability, and creating community. It requires seeing capacity, understanding context, and recognizing that someone’s ability to maintain housing can’t be reduced to a credit score.
But algorithmic screening enforces I-It relationships. When Marcus walked into that housing office, the algorithm saw: credit score 510, eviction filing, risk assessment DENY. It couldn’t see Thou—the father navigating impossible choices, the worker building toward stability, the human being whose capacity the numbers couldn’t capture.
This isn’t about abandoning data. It’s about whether our systems preserve the possibility of a relationship even as they process information. Can we build accountability mechanisms that allow us to see both the pattern AND the person? Can we measure what actually matters without reducing people to data points?
That’s the question algorithmic screening forces us to answer.
The Burning House
Dr. Martin Luther King Jr. feared we were “integrating into a burning house”—gaining access to systems already failing.
Today, we’re integrating housing into algorithmic systems—replacing human judgment with automated screening that encodes historical bias in code.
When algorithms reject both those experiencing poverty and those working to address it, when they fail across demographic and economic lines, when their predictions prove wrong more often than they prove right—we’re not building better systems. We’re automating the failures of the old ones.
The house isn’t just burning from historical discrimination. It’s burning from the belief that complex human decisions can be reduced to credit scores and pattern matching.
Building Systems That Measure What Matters
The solution isn’t abandoning data—property owners need sustainable models. But we need systems that measure what actually predicts housing stability.
At AVANTKOFA, we’re building solutions grounded in these principles:
Transparency by default: People should know when algorithms assess them, what factors matter, and how decisions are made. The current black-box approach—where applicants receive only “DENIED” with no explanation—prevents meaningful appeal or improvement.
Community oversight built in: The communities most affected by algorithmic screening should have power in how these systems are designed and deployed. Not just consultation—actual decision-making authority.
Human override always available: There must be pathways for human judgment to supersede algorithmic decisions when context matters. Marcus’s situation required someone who could understand that an eviction filing during a medical crisis says nothing about future housing stability.
Alternative assessment models: We should measure what actually predicts success: rental payment history weighted more heavily than credit scores, employment trajectory over current income snapshots, and references from community members who can speak to character and capacity. A rough sketch of one such model appears after this list.
Right to explanation and appeal: Anyone denied housing based on an algorithmic assessment should receive a clear explanation of the decision factors and a meaningful opportunity to contest errors or provide context.
These aren’t radical proposals. They’re basic accountability measures for any system with this much power over people’s lives.
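To make the alternative-assessment idea concrete, here is a minimal sketch, in Python, of what a screening model built on these principles could look like. Everything in it is hypothetical: the factor names, weights, and thresholds are illustrative assumptions I chose for the example, not a tested instrument and not anything AVANTKOFA has deployed.

```python
# Hypothetical sketch: weight rental payment history, employment trajectory,
# and community references above a raw credit score, keep a human-review path,
# and attach a written explanation to every decision.
# All factor names, weights, and thresholds are illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class Application:
    months_rent_paid_on_time: int      # e.g. 34 of the last 36 months
    months_observed: int
    months_employed_last_24: int
    credit_score: int
    community_references: int          # references able to speak to capacity
    eviction_filings_dismissed: int    # filings that were dismissed, like Marcus's


@dataclass
class Decision:
    score: float
    recommendation: str                # "approve" or "human_review" -- never a bare "deny"
    explanation: list = field(default_factory=list)


def assess(app: Application) -> Decision:
    explanation = []

    # Rental payment history carries the most weight (assumed 50%).
    rent_ratio = app.months_rent_paid_on_time / max(app.months_observed, 1)
    explanation.append(
        f"Rent paid on time {app.months_rent_paid_on_time}/{app.months_observed} months"
    )

    # Employment trajectory over a snapshot of current income (assumed 25%).
    employment_ratio = app.months_employed_last_24 / 24
    explanation.append(f"Employed {app.months_employed_last_24} of the last 24 months")

    # Community references (assumed 15%) and credit score (assumed 10%).
    reference_signal = min(app.community_references, 3) / 3
    credit_signal = min(max((app.credit_score - 300) / 550, 0), 1)

    score = (0.50 * rent_ratio + 0.25 * employment_ratio
             + 0.15 * reference_signal + 0.10 * credit_signal)

    # A dismissed eviction filing is context for a reviewer, not an automatic denial.
    if app.eviction_filings_dismissed:
        explanation.append("Dismissed eviction filing on record: route to human review for context")
        return Decision(score, "human_review", explanation)

    recommendation = "approve" if score >= 0.6 else "human_review"
    return Decision(score, recommendation, explanation)


# Example: a Marcus-like applicant -- strong rent history, dismissed filing, 510 credit.
marcus_like = Application(
    months_rent_paid_on_time=34, months_observed=36,
    months_employed_last_24=22, credit_score=510,
    community_references=2, eviction_filings_dismissed=1,
)
print(assess(marcus_like).recommendation)  # -> "human_review", not an automatic DENY
```

The point of the sketch isn’t the particular weights. It’s that the factors are legible, the denial path always runs through a person, and the explanation travels with the decision.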
What Comes Next
Housing insecurity is reaching crisis levels. The systems we’ve built to address it—including algorithmic screening meant to remove bias—are often perpetuating the problem.
We can build better. Not by choosing between data and humanity, but by designing systems that measure what actually matters.
This is the work of closing the Affect Gap™—building systems that see capacity, not just patterns. That measure what matters, not just what’s measurable. The Essential Servant™ navigating algorithmic rejection is experiencing the same structural friction as the first responder receiving a pin for a life they couldn’t save. Different contexts. Same gap.
The thirteen people we housed didn’t need the algorithm to be more lenient. They needed it to measure the right things. They needed someone to see their capacity, not just their patterns. They needed a system designed for housing stability, not pattern conformity.
If you’re a housing authority, state agency, or private sector leader ready to move from analysis to action, let’s build these alternatives together. The proof of concept exists. The framework works. The question is whether we have the will to implement it.
This essay connects to what I call the Affect Gap™—the structural dissonance between human capacity and systemic measurement. When algorithms measure the wrong things, they widen the gap between what people can do and what systems allow them to become.
I explore this fully in my forthcoming book, Cause and Affect: Why the Essential Servant Is Burning Out—and How to Stop It. The CaaS™ (Capacity-as-a-Service) Framework offers a structural response—for individuals navigating precarity and for institutions ready to measure what actually matters.
At AVANTKOFA, we’re building the civic infrastructure that closes these gaps—housing, workforce, and digital systems designed for human capacity, not pattern conformity.
Subscribe to Forward Notes (Substack) for more, or reach out: hello@antoinemwilliams.com


