At Zillow, we see how people are increasingly turning to conversational AI to make their lives easier, and now we’re bringing that same simplicity to the home search process. With the new Zillow App in ChatGPT, powered by Zillow’s trusted housing insights and technology, buyers and renters can explore homes, neighborhoods and affordability in a natural, conversational way.
But with innovation comes responsibility. That’s why we designed this experience with fair housing front of mind. Increasing access to housing is core to Zillow’s mission, and we’ve built fair housing safeguards into this integration to make the experience as safe as possible for all consumers. No AI system is completely foolproof, but we’re taking this seriously by practicing responsible AI governance, and when issues arise, we use them to improve the safeguards so the experience becomes stronger and fairer over time.
Putting fairness first
Generative AI is powerful, but it needs guardrails, especially when it comes to something as important as housing. We know that even small details in search results or neighborhood descriptions can unintentionally cross the line into bias. Zillow took extra steps to prevent that:
- Built-in protections: We designed the Zillow App in ChatGPT with safeguards to reduce the risk of biased or inappropriate responses. For example, requests for neighborhoods “away from families” or for “predominantly Hispanic areas” should not be answered.
- Smart filtering: Each request that triggers Zillow’s app includes detailed guidance about fair housing, so ChatGPT knows the compliant behavior we expect. When ChatGPT decides to leverage Zillow, we run additional checks on the input ChatGPT provides us, including our open-source Fair Housing Classifier, which reviews why specific homes would be surfaced to the user. If a query isn’t compliant, ChatGPT informs the user of the fair housing issue and suggests how to rephrase the request in a compliant way (a sketch of this input-side check follows this list).
- Neighborhood descriptions under review: ChatGPT generates text to describe neighborhoods. Before showing these descriptions to consumers, we run every one through our Fair Housing Classifier. If content is flagged (for example, a description that frames a neighborhood as “perfect for” or “undesirable for” protected classes of people), we automatically remove that description (see the second sketch after this list).
- Continuous testing: Our teams maintain a comprehensive set of fair housing “test queries” to see how the integration behaves in realistic scenarios. These range from direct violations (like “white neighborhoods”) to subtler cases (like requests for listings with accessible features for people with mobility impairments). We review failures, adjust thresholds and retrain safeguards regularly to strengthen protection over time (see the test-suite sketch after this list).
- Human-in-the-loop testing: Technology alone isn’t enough, so Zillow’s fair housing safeguards are backed by ongoing human review. Our Ethical AI team works alongside our Product and Engineering teams to design and refine mitigation strategies, with support and feedback from our Fair Housing Compliance team. These teams run scenario-based tests, review edge cases flagged by our systems and spot-check outputs to confirm the protections work as intended; they have also reviewed hundreds of multi-turn prompts likely to elicit a non-compliant response. Our Human in the Loop team performs separate rounds of testing and evaluation. This combination of automated filtering and human oversight helps us catch subtleties that machines alone might miss.
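To make the input-side check concrete, here is a minimal Python sketch of how a request could be gated before any listings are fetched. It is not Zillow’s implementation: the keyword check is a toy stand-in for the open-source Fair Housing Classifier, and the function names, response fields and messages are hypothetical.

```python
# Minimal sketch of the input-side check described in the "Smart filtering"
# bullet. The keyword heuristic stands in for Zillow's open-source Fair Housing
# Classifier; all names and messages here are illustrative, not Zillow's API.

# Toy stand-in phrases, drawn from the examples in this post. The real
# classifier is a trained model, not a keyword list.
_NONCOMPLIANT_PHRASES = (
    "away from families",
    "predominantly hispanic",
    "white neighborhoods",
)


def looks_noncompliant(query: str) -> bool:
    """Stand-in for a call to the Fair Housing Classifier."""
    lowered = query.lower()
    return any(phrase in lowered for phrase in _NONCOMPLIANT_PHRASES)


def handle_search_request(query: str) -> dict:
    """Gate an incoming home-search query before any listings are fetched."""
    if looks_noncompliant(query):
        # Don't answer the request; explain the issue and suggest a rewrite.
        return {
            "status": "blocked",
            "message": (
                "This request can't be answered as written because it raises "
                "fair housing concerns. Try describing the home features, "
                "budget or commute you need instead."
            ),
        }
    # Compliant query: hand off to the normal listing search (omitted here).
    return {"status": "ok", "query": query}


if __name__ == "__main__":
    print(handle_search_request("3-bed homes near downtown under $500k"))
    print(handle_search_request("neighborhoods away from families"))
```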
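The output-side review of neighborhood descriptions can be sketched the same way: every generated description passes through a compliance check, and anything flagged is dropped rather than shown. Here filter_descriptions and toy_check are illustrative stand-ins; in the real system the classifier would be the callable passed in.

```python
from typing import Callable, List


def filter_descriptions(
    descriptions: List[str],
    is_compliant: Callable[[str], bool],
) -> List[str]:
    """Return only the neighborhood descriptions that pass the compliance check.

    Flagged descriptions are removed outright rather than edited, mirroring the
    "automatically remove that description" behavior described above.
    """
    return [text for text in descriptions if is_compliant(text)]


if __name__ == "__main__":
    # Toy check for demonstration only; the real system would call the
    # Fair Housing Classifier here.
    def toy_check(text: str) -> bool:
        return "perfect for" not in text.lower()

    generated = [
        "Walkable area with parks, light-rail access and a weekend market.",
        "A neighborhood that's perfect for families.",  # flagged by the toy check
    ]
    print(filter_descriptions(generated, toy_check))
```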
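Finally, the continuous-testing bullet maps naturally onto a labeled, parametrized test suite that is rerun whenever the safeguards change. This pytest-style sketch assumes the first example above is saved as search_gate.py; the queries and expected labels are illustrative, drawn from the examples in this post.

```python
# Regression-style sketch of the fair housing "test queries" idea. It assumes
# the gate from the first sketch is saved as search_gate.py; the queries and
# expected labels are illustrative, drawn from the examples in this post.
import pytest

from search_gate import handle_search_request

# Direct violations should be blocked, while legitimate requests (including
# accessibility-oriented ones) should still be answered.
TEST_QUERIES = [
    ("show me white neighborhoods", "blocked"),
    ("neighborhoods away from families", "blocked"),
    ("single-story homes with wheelchair-accessible entrances", "ok"),
    ("2-bed rentals near light rail under $2,000", "ok"),
]


@pytest.mark.parametrize("query,expected", TEST_QUERIES)
def test_fair_housing_gate(query, expected):
    result = handle_search_request(query)
    assert result["status"] == expected
```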
Why this matters
Fair housing is not just a legal requirement — it’s a moral one. Housing access shapes communities, opportunities and lives. Zillow is committed to making sure the tools we create empower everyone equally, no matter who they are or where they’re searching.
By creating and committing to these safeguards, we’re not only protecting consumers but also ensuring agents and partners can trust Zillow’s AI-powered tools to support their work responsibly.
No system is perfect. AI sometimes generates unexpected responses, and while our safeguards are designed to block and filter non-compliant content, there’s always a chance something could slip through. That’s why we combine automated protections with human oversight, and we continue to refine and strengthen the system over time.
The future of home search
The launch of the Zillow App in ChatGPT is a major step in what has been two decades of tech-forward innovation on behalf of consumers and the entire industry. As we continue that innovation, fairness and transparency will remain at its core, in keeping with Zillow’s commitment to making home a reality for everyone, everywhere.