
Federal housing regulators need to exercise stronger oversight of facial recognition and other technology tools used in public housing operations, a federal watchdog says in a new report, warning that without clearer rules, the technology could fuel discrimination against renters.
While “property technology tools” used for advertising, tenant screening, rent-setting, and facial recognition can improve safety, they pose risks related to discrimination and privacy that should be addressed by the Department of Housing and Urban Development (HUD), the Government Accountability Office (GAO) reported on August 11.
GAO explained that “potential renters may struggle to understand, and owners to explain, the basis for screening decisions made by algorithms,” and that facial screening systems “might misidentify individuals from certain demographic groups,” while landlords “might use surveillance information without renter consent.”
While public housing agencies have taken some steps to address risks through the court system, GAO officials said that agencies require more “specific direction on key operational issues,” including privacy risks and data sharing.
“More detailed written direction could provide public housing agencies additional clarity on the use of facial recognition technology and better address tenant privacy concerns,” said GAO.
Facial recognition technology is often used for building security, GAO explained, allowing entry after comparing a renter’s face to a database of stored images using artificial intelligence-powered computer vision.
However, that technology can also produce high error rates, for Black women in particular, and has raised concerns among privacy advocacy groups that surveillance data could be used without renter consent.
“In the rental housing context, such inaccuracies could result in frequent access denials for some individuals,” GAO said of error rates for minority communities.
Those findings aren’t new – the National Institute of Standards and Technology found in 2019 that Asian and Black communities had between 10 and 100 times the rate of false positives compared to white communities. Indigenous Americans had the highest rate of false positives across tested demographics, with NIST suggesting that “more diverse training data” could help prevent those outcomes.
Other factors that may contribute to errors, GAO said, include “poor lighting, facial expressions, and obscured facial features,” as well as data quality, since “outdated or low resolution images used for comparison … could also impact” a camera’s accuracy.
To address these concerns, GAO said that HUD should provide additional written direction to public housing agencies on the use of facial recognition technology, direction that could “define what constitutes renter consent, and address data management and accuracy concerns.”
HUD officials told GAO that the agency has no plans to issue additional guidance on facial recognition technology, saying instead that public housing agencies must have autonomy in deciding whether to implement the technology.
Specific risks related to tenant screening include inaccurate information that leads to unwarranted rental application denials, algorithmic decision-making that isn’t transparent about what data is used or how it is weighted, and overreliance on criminal history and eviction records, which can disproportionately impact Black and Hispanic applicants, GAO said.