The Housing Bill: What You Need to Know

When God closes a door, he opens a window. Recent Colorado legislation concerning the use of technology to make housing decisions is a good example. While the industry celebrated the defeat of a bill that would have flatly banned the use of rent-setting algorithms, it may have missed that some of that bill’s effects could be visited upon it anyway in the form of a more general bill targeting artificial intelligence.

The Colorado Fair Housing Act was first introduced and passed by the Colorado General Assembly in 1959, making Colorado the first state to enact such a law, nine years before the federal Fair Housing Act of 1968. The 1959 law primarily prohibited discrimination based on race, creed, color, and national origin or ancestry in both public and private housing. Over the years, the Colorado Fair Housing Act has undergone several amendments to expand its protections. Notably, subsequent legislative efforts included additional protections for sex, marital status, religion, and physical handicap. In more recent years, the Act has been further updated to include protections against discrimination based on sexual orientation, gender identity, and familial status, keeping the law in line with evolving societal norms and legal standards.

Landlords and the Housing Bill

In the 2024 legislative session, the Colorado House sought to expand the Act again with House Bill 24-1057. The bill would have banned landlords from using algorithms to set rent prices, particularly when those tools rely on non-public data or facilitate collusion among landlords to fix prices. The proposal was driven by concerns that such algorithms could exacerbate housing affordability problems by artificially inflating rents and reducing competition in the rental market.

The bill defined “algorithmic device” extremely broadly: a device that uses algorithms to perform calculations on data concerning local or statewide rent amounts in order to advise a landlord on what to charge. The definition was not limited to, for instance, algorithms that produce discriminatory results. It reads like an attempt to prevent landlords from easily finding the prevailing rent or quickly determining the maximum rent a market will bear.

Despite initial momentum, the bill faced significant opposition from industry groups, including the Colorado Apartment Association and technology companies, which argued that the algorithms help landlords manage vacancies and optimize rent prices based on market conditions. The Colorado Senate added amendments that weakened the bill, including language permitting algorithms to be used so long as the underlying data was available to the public for a reasonable charge. The amendments also added exceptions for trade publications and for data related to affordable housing program guidelines. The House did not accept these changes, and the bill failed.

Or did it?

In the same session, Colorado amended the Colorado Consumer Protection Act to regulate the use of artificial intelligence for various purposes. Access to housing is one of the regulated uses. Under Senate Bill 24-205, those who develop or deploy AI to make decisions with a material legal or similarly significant effect on the provision or denial of services to any consumer, or on their cost or terms, in fields including education, employment, lending, health care, housing, and insurance, must take steps to avoid discrimination on the basis of protected classes such as age, gender, ethnicity, genetic information, or veteran status. Such developers and deployers must use “reasonable care” to protect against discrimination, including through internal testing and monitoring, and must report certain information to the Colorado Attorney General. The bill’s requirements do not take effect until 2026.

The AI Bill

The AI bill could potentially be a back door to imposing at least some of the requirements from the failed algorithmic housing bill. At least arguably, an algorithm that sets rental prices is making decisions about the provision of housing. So, while these algorithms are not per se illegal, they will now have to be carefully vetted to make sure they are not reporting information in a discriminatory manner. It is conceivable that the demographic data of the different regions feeding into the algorithm might become the subject of inquiry by regulators interested in ferreting out discriminatory practices.

The two measures also differ in enforcement. The AI bill has no private right of action; enforcement is entrusted entirely to the Colorado Attorney General. Presumably, more detailed regulations will be issued that give industry further guidance on what is and is not covered by the bill. And it remains to be seen whether the particular software being used by landlords falls within one of the AI bill’s many exceptions for traditional forms of software.

The fate of these two bills demonstrates a new reality in which algorithms and AI are involved in many different fields. Stakeholders should be mindful that even bills not directly targeted at their industries or interests can still have a profound effect when they regulate the systems those industries rely on. This complexity, and the possibility of connections in unexpected places, is precisely why it is so important to have a skilled attorney representing your entity.