In May 2024, Colorado amended its Consumer Protection Act through Senate Bill 24-205 to add regulations for artificial intelligence. The law extends some of Colorado’s existing protections against algorithmic discrimination and adds a substantial number of disclosure and reporting requirements for those who develop and deploy AI.

A General Outline of the New Law

The new law primarily concerns the use of artificial intelligence to discriminate on the basis of protected characteristics such as age, gender, ethnicity, genetic information, or veteran status. It takes effect February 1, 2026, and applies to developers of “high-risk artificial intelligence systems,” defined generally as systems that make decisions with a material legal or similarly significant effect on the provision or denial of services to any consumer, or on their cost and terms, in fields including education, employment, lending, health care, housing, insurance, and, interestingly, “legal service.”

The exclusions from “high-risk” artificial intelligence systems cover many kinds of traditional software, including anti-fraud, anti-virus, and anti-malware technology, video games, calculators, networking, spell checkers, spreadsheets, and other common software types. The law also appears to exclude consumer chat bots so long as they are “subject to an accepted use policy that prohibits generating content that is discriminatory or harmful.” Developers of covered systems must provide documentation about how they worked to avoid algorithmic discrimination and about how to monitor the system’s output for discrimination. They must also disclose to the Colorado Attorney General the known or foreseeable risks of discrimination.

Those who deploy “high-risk” systems must use reasonable care to protect consumers from algorithmic discrimination. If a deployer adopts certain internal policies and provides consumers with certain disclosures, it gains a rebuttable presumption that it used reasonable care. Certain deployers are excluded, however, including those with fewer than fifty full-time employees that do not use their own data to train the artificial intelligence, so long as something like an impact assessment of the use of the technology has been completed.

Deployers who make artificial intelligence available to consumers directly must disclose that the consumer is interacting with AI unless “it would be obvious to a reasonable person that the person is interacting with an artificial intelligence system.”

Importantly, the law does not seem to provide for any private right of action. So, for now, the state of Colorado alone will enforce its provisions.

An Advance Over Existing Laws

While the general scope of the new law is broad, it is not as unprecedented as some news sources suggest. Colorado was already fairly advanced compared to other states in regulating the use of algorithms or AI to make decisions. Under the Colorado Privacy Act, C.R.S. s. 6-1-1301, et seq., certain businesses that collect data on Colorado consumers were already prohibited from using consumers’ personal data in violation of laws prohibiting unlawful discrimination. And consumers already had the right to opt out of profiling in furtherance of decisions producing legal or similarly significant effects.

And, under C.R.S. s. 10-3-1104.9, the Colorado Division of Insurance was already empowered to enact regulations mitigating the discriminatory impact of algorithms and predictive models used by insurance companies. The new law, then, extends an existing trend in Colorado of treating with suspicion algorithms of any kind that bake in prejudice or bias.

Use Artificial Intelligence with Caution

While this new law does not take effect until February 2026, those seeking to implement artificial intelligence in their businesses, or to create new software that does so, should be keenly aware of the law and make sure the systems they build will comply with it. This is in addition to existing concerns about the unpredictability and lack of confidentiality involved in using generative AI. If you are thinking of developing such a system and your business operates in any of the fields targeted by this new law, such as health care, housing, law, or education, you should consult with legal counsel.