In 2024, the Colorado legislature enacted SB24-205 [1] governing Consumer Protections for Artificial Intelligence (the “Act”) and established a regulatory framework aimed at lowering the risk of algorithmic discrimination in artificial intelligence-based decision-making technology. As one source explains, “[a]lgorithmic bias occurs when systematic errors in machine learning algorithms produce unfair or discriminatory outcomes…reflect[ing] or reinforc[ing] existing socioeconomic, racial and gender biases.” [2] The Act applies broadly to “developers” and “deployers” of “high-risk artificial intelligence systems,” including employers, and was originally to be implemented by February 1, 2026. During a Special Legislative Session in August of 2025, however, the General Assembly amended the Act by extending the compliance deadline to June 30, 2026 [3] to allow industry leaders and legislators additional time to address concerns about the Act before it is implemented.
This blog discusses: (1) the Act as currently written, focusing on its application to employers; [4] (2) recent efforts to revise the law before employers are required to comply with its terms; and (3) steps employers should consider taking before implementation of the Act.
SB24-205: The Act and its Application to Employers
The Act requires a “deployer” of a “high-risk artificial intelligence system” to use reasonable care to protect “consumers” from any known or reasonably foreseeable risks of “algorithmic discrimination” in the system. [5] It also requires a deployer who deploys any kind of AI system intended to interact with consumers to make certain disclosures, alerting them to the fact that they are interacting with an AI system. [6] The Act gives the state attorney general rule-making authority to implement the Act and exclusive authority to enforce its requirements. [7] Violation of the Act constitutes a deceptive trade practice under the Colorado Consumer Protection Act (the “CCPA”). [8] The CCPA allows the attorney general to impose significant monetary penalties for each violation. [9]
Definitions
Under the Act, a “deployer” is defined as a “person doing business in this state that deploys a high-risk artificial intelligence system.” [10] An “artificial intelligence system” means “any machine-based system that, for any explicit or implicit objective, infers from the inputs the system receives how to generate outputs, including content, decisions, predictions, or recommendations, that can influence physical or virtual environments.” [11] A “high-risk artificial intelligence system” means “any artificial intelligence system that, when deployed, makes, or is a substantial factor in making, a consequential decision.” [12] As applicable to employers, a “consequential decision” means “a decision that has a material legal or similar significant effect on the provision or denial to any consumer of, or the cost or terms of … employment or an employment opportunity.” [13] A “consumer” is defined as “an individual who is a Colorado resident.” [14] “Algorithmic discrimination” means:
…any condition in which the use of an artificial intelligence system results in an unlawful differential treatment or impact that disfavors an individual or group of individuals on the basis of their actual or perceived age, color, disability, ethnicity, genetic information, limited proficiency in the English language, national origin, race, religion, reproductive health, sex, veteran status, or other classification protected under the laws of this state or federal law. [15]
Given the definitions in the Act, employers using AI to assist them, for example, in making employment decisions related to hiring, compensating, retaining, promoting, or terminating employees, will likely need to comply with the Act.
Rebuttable Presumption
The Act creates a rebuttable presumption that a deployer used reasonable care if the deployer complied with specified provisions in the Act, including but not limited to:
- Implementing a risk management policy and program for the high-risk AI system, which includes the specific, detailed information required by the Act; [16]
- Completing an impact assessment of the high-risk system that includes the specific, detailed information required by the Act; [17]
- Annually reviewing the deployment of each high-risk system deployed by the deployer to ensure that the high-risk system is not causing algorithmic discrimination; [18]
- Notifying a consumer, before the decision is made, via a specific, detailed notice that includes all the information required by the Act, if the high-risk system makes, or will be a substantial factor in making, a consequential decision concerning the consumer; [19]
- Providing a consumer with an opportunity to correct any incorrect personal data that a high-risk system processed in making a consequential decision; [20]
- Providing a consumer, in certain situations, with an opportunity to opt out of the processing of personal data concerning the consumer for purposes of certain types of profiling; [21]
- Providing a consumer with an opportunity to appeal, via human review if technically feasible, an adverse consequential decision concerning the consumer arising from the deployment of a high-risk system; [22]
- Making a publicly available statement summarizing the types of high-risk systems that the deployer currently deploys, how the deployer manages any known or reasonably foreseeable risks of algorithmic discrimination that may arise from deployment of each of these high-risk systems, and the nature, source, and extent of the information collected and used by the deployer; [23] and
- Disclosing to the attorney general, within 90 days after the discovery, that the high-risk system has caused algorithmic discrimination. [24]
Affirmative Defense
The Act also provides an affirmative defense for a deployer, or other person, to an action brought by the attorney general if: (a) the deployer discovers and cures a violation of the Act as a result of certain feedback, adversarial testing or red teaming, or an internal review process; and (b) the deployer, or other person involved in a potential violation, is otherwise in compliance with a nationally or internationally recognized risk management framework for artificial intelligence systems that the Act or the attorney general designates. [25]
SB25B-004: Efforts to Revise the Law Before it is Implemented
When Governor Polis signed SB24-205 into law, he urged the General Assembly to meet with stakeholders and industry representatives to work on revising the bill before its implementation. Over the next year, stakeholders and legislators discussed and negotiated potential amendments to the Act, but were unable to reach an agreement during the 2025 Legislative Session. Before the session ended, the Governor and other state leaders asked the General Assembly to extend the Act’s implementation deadline to January 2027 [26], but the General Assembly declined to do so. In August of 2025, Governor Polis called the General Assembly back into session to, among other things, consider the fiscal and implementation impact of SB24-205 on consumers, businesses, and state and local governments. [27] During the Special Session, legislators passed SB25B-004, extending the implementation deadline of the Act to June 30, 2026. Stakeholders and legislators are expected to continue discussions about amendments in the months to come and during the 2026 Legislative Session. Some industry leaders have expressed support for protecting Coloradans from unfair or biased uses of AI but believe the Act needs to be “clear, practical, and focused on real risks” to avoid negative impacts on the Colorado economy. [28]
Next Steps for Employers
Employers have until June 30, 2026, to prepare for the implementation of the Act and should consider taking steps now to ensure they are able to timely comply with its requirements. As a starting point, an employer would be wise to do the following:
- Familiarize yourself with the Act and stay informed about proposed amendments and changes to the law;
- Watch for new rules established by the state attorney general under the Act and consider engaging in the rule-making process to make your voice heard;
- Consult an attorney to determine whether the Act applies to your business and whether your business is complying with the Act;
- Inventory all AI being used by your business;
- Determine whether any AI used by your business constitutes a high-risk system under the Act;
- Evaluate the terms and conditions of any AI user agreements to identify potential risk-shifting opportunities and liabilities;
- If your business is using a high-risk system to make employment decisions, develop and implement a plan to take advantage of the rebuttable presumption by complying with the steps outlined in the section above;
- Develop and implement a plan for alerting consumers and employees that they are interacting with an AI system used by your business for any purpose; and
- Determine whether your business is complying with a nationally or internationally recognized risk management framework for AI systems that the Act or the attorney general designates, and take measures to discover and correct violations of the Act.
Our Team
If you have questions about SB24-205 and its potential effect on your business, please contact BHGR’s Employment Group today.
This article is informational only. The information provided on this website does not, and is not intended to, constitute legal advice; instead, all information, content, and materials available on this site are for general informational purposes only. Information on this website may not constitute the most up-to-date legal or other information. Readers of this website should contact their attorneys to obtain advice with respect to any particular legal matter. No reader, user, or browser of this site should act or refrain from acting based on information on this site without first seeking legal advice from counsel in the relevant jurisdiction. Only your individual attorney can provide assurances that the information contained herein—and your interpretation of it—is applicable or appropriate to your particular situation. All liability with respect to actions taken or not taken based on the contents of this site is hereby expressly disclaimed.
- https://leg.colorado.gov/sites/default/files/2025a_144_signed.pdf.
- https://www.ibm.com/think/topics/algorithmic-bias#:~:text=Algorithmic%20bias%20occurs%20when%20systematic,socioeconomic%2C%20racial%20and%20gender%20biases.
- https://leg.colorado.gov/sites/default/files/2025b_004_signed.pdf
- This blog does not address the ways in which the Act may be applicable to employers who are also developers of AI systems for purposes of the Act.
- See https://leg.colorado.gov/sites/default/files/2024a_205_signed.pdf at § 6-1-1703.
- See id. at § 6-1-1704.
- See id. at §§ 6-1-1706, 6-1-1707.
- See id. at § 6-1-105.
- See Colo. Rev. Stat. § 6-1-112.
- https://leg.colorado.gov/sites/default/files/2024a_205_signed.pdf at § 6-1-1701(6).
- Id. at § 6-1-1701(2).
- Id. at § 6-1-1701(9)(a). A high-risk AI system does not include tools that perform narrow procedural tasks, detect decision-making patterns or deviations from prior decision-making patterns and are not intended to replace or influence a previously completed human assessment without sufficient human review, or technologies that are not used for making consequential decisions, such as anti-virus software, cybersecurity, databases, firewalls, and a list of other types of technology. See id. at § 6-1-1701(9)(b).
- Id. at § 6-1-1701(3)(b).
- Id. at § 6-1-1701(4).
- Id. at § 6-1-1701(1)(a). Algorithmic discrimination does not include “the offer, license or use of a high-risk artificial intelligence system by a…deployer for the sole purpose of: (A) The…deployer’s self-testing to identify, mitigate or prevent discrimination or otherwise ensure compliance with state and federal law; or (B) Expanding an application, customer, or participant pool to increase diversity or redress historical discrimination.” Id. at § 6-1-1701(1)(b)(I). Algorithmic discrimination also does not include certain acts or omissions by private clubs or establishments that are not open to the public. See id. at § 6-1-1701(1)(b)(II).
- See id. at § 6-1-1703(2).
- See id. at § 6-1-1703(3).
- See id.
- See id. at § 6-1-1703(4).
- See id.
- See id.
- See id.
- See id.
- See id. at § 6-1-1703(7).
- See id. at § 6-1-1706.
- https://www.colorado.gov/governor/news/mayor-johnston-governor-polis-attorney-general-weiser-senator-bennet-congressman-neguse
- https://www.colorado.gov/governor/news/governor-polis-calls-special-session-address-budget-hole-created-federal-bill
- https://completecolorado.com/wp-content/uploads/Letter-on-SB24-205-_-Special-Session-2.1.1.docx-1.pdf
