Colorado AI Law: What Every Nonprofit Leader Needs to Know About SB 205 (Before It’s Too Late)

Your nonprofit might already be on a collision course with Colorado’s new AI law. And you don’t even know it.

That AI chatbot on your website? The resume screening tool you started using? The donor prediction software your development team loves?

All of them could trigger compliance requirements under Colorado’s groundbreaking AI legislation.

If you’re an Executive Director or board member at a Colorado nonprofit, this isn’t theoretical. It’s coming for you. And the clock is ticking.

This guide answers every burning question about the Colorado AI Act (CAIA), also known as SB24-205, with specific implications for nonprofits.

Is the Colorado AI Law Delayed?

Yes. The Colorado AI Act has been delayed.

Governor Jared Polis signed SB 25B-004 on August 28, 2025, pushing the effective date from February 1, 2026, to June 30, 2026.

Here’s the backstory that matters:

Governor Polis had concerns from day one. The same day he signed the original law in May 2024, he wrote a letter urging lawmakers to “fine tune” it. He worried about stifling innovation and driving businesses out of Colorado.

Multiple attempts to amend the law failed in 2025. SB 25-318, which would have narrowed the law significantly, died in committee. A last-minute attempt to insert a delay into an unrelated bill also failed.

Finally, during a special legislative session in August 2025, lawmakers couldn’t agree on substantive changes. So they compromised: delay the whole thing until June 2026 and try again during the regular 2026 session.

The bottom line for nonprofits: You have until June 30, 2026, to get compliant. But don’t let that extra time make you complacent. The law WILL take effect. And the requirements aren’t simple.

When Did the Colorado AI Act Pass?

Governor Jared Polis signed SB24-205 into law on May 17, 2024.

Colorado became the second U.S. state to pass major AI consumer protection legislation, following Utah’s more limited law in March 2024.

Senate Majority Leader Robert Rodriguez introduced the bill. It passed with bipartisan support, though it attracted criticism from both tech industry groups and consumer advocates (for opposite reasons).

The Colorado AI Act was modeled partly on the European Union’s AI Act, though it’s narrower in scope. It focuses specifically on preventing “algorithmic discrimination” in AI systems that make “consequential decisions” affecting people’s lives.

Key dates to remember:

  • Law signed: May 17, 2024
  • Original effective date: February 1, 2026
  • Delay bill signed: August 28, 2025
  • New effective date: June 30, 2026

Did SB25-276 Pass in Colorado?

Yes, SB25-276 passed. But it has nothing to do with AI.

This is a common point of confusion. SB25-276 is about protecting civil rights related to immigration status. Governor Polis signed it on May 23, 2025.

If you’re looking for AI-related bills from 2025, here’s what actually happened:

SB 25-318 (Artificial Intelligence Consumer Protections): FAILED. This bill would have significantly revised the original Colorado AI Act by narrowing definitions, expanding exemptions for small businesses, and delaying implementation to January 2027. Senate Majority Leader Rodriguez introduced it, then withdrew support when stakeholders couldn’t reach consensus.

SB 25B-004: PASSED. This special-session bill simply delayed the original law’s effective date from February 1, 2026, to June 30, 2026. No substantive changes to the law itself.

The Colorado AI Act (SB24-205) remains intact. It will take effect as originally written, just five months later than planned.

What Is the New Law in Colorado for 2025?

The Colorado Artificial Intelligence Act (CAIA), also called SB24-205, is the landmark AI legislation taking effect June 30, 2026.

Here’s what it actually does:

The Core Purpose

The law exists to prevent “algorithmic discrimination.” That’s defined as any use of an AI system that results in unlawful differential treatment or impact based on protected characteristics like:

  • Age
  • Race
  • Color
  • Disability
  • Ethnicity
  • Sex
  • Religion
  • National origin
  • Veteran status
  • And other classifications protected under Colorado or federal law

What Triggers Compliance?

The law applies to “high-risk AI systems” that make or substantially influence “consequential decisions” in these areas:

  1. Employment (hiring, firing, promotions, assignments)
  2. Education (enrollment, opportunities, assessments)
  3. Healthcare services (diagnosis, treatment recommendations)
  4. Housing (applications, tenant screening)
  5. Financial and lending services (loans, credit decisions)
  6. Insurance (coverage, rates, claims)
  7. Legal services (case outcomes, representation)
  8. Government services (benefits, eligibility)

If your nonprofit uses AI in ANY of these areas, you’re probably covered.

Who Has to Comply?

The law distinguishes between two roles:

Developers: Organizations that build or substantially modify AI systems. Think Microsoft, Salesforce, or the company that created your donor management software.

Deployers: Organizations that USE AI systems to make consequential decisions. This is the category most nonprofits fall into.

Even if you’re just using off-the-shelf software, you’re a “deployer” under this law.

What Deployers Must Do

If you’re a nonprofit that deploys high-risk AI systems, you must:

  1. Implement a risk management program (modeled on NIST’s AI Risk Management Framework)

  2. Complete impact assessments before deployment, annually thereafter, and within 90 days of substantial modifications (this cadence is sketched in code after the list)

  3. Notify consumers when AI is used to make consequential decisions about them

  4. Provide appeals via human review when technically feasible for adverse decisions

  5. Report discrimination to the Attorney General within 90 days if you discover your AI system is causing or likely to cause algorithmic discrimination

  6. Post public statements on your website describing what high-risk AI systems you use and how you manage discrimination risks
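
The cadence in item 2 is concrete enough to encode. Here’s a minimal sketch of the review schedule; the function name and example dates are ours, and we’re reading “annually” as 365 days. Treat it as an illustration, not legal guidance:

```python
from datetime import date, timedelta

def next_assessment_due(deployed: date,
                        last_assessment: date | None = None,
                        modified: date | None = None) -> date:
    """Earliest upcoming impact-assessment deadline under the cadence above:
    before deployment, annually thereafter, and within 90 days of any
    substantial modification."""
    if last_assessment is None:
        return deployed  # the first assessment must be done before deployment
    candidates = [last_assessment + timedelta(days=365)]  # annual review
    if modified is not None:
        candidates.append(modified + timedelta(days=90))  # post-modification review
    return min(candidates)

# Example: assessed June 1, 2026, then substantially modified September 15, 2026.
print(next_assessment_due(date(2026, 6, 30), date(2026, 6, 1), date(2026, 9, 15)))
# 2026-12-14 -- the 90-day post-modification deadline arrives first
```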

The Penalties Are Real

Violations are treated as unfair trade practices under Colorado’s Consumer Protection Act.

Maximum penalty: $20,000 per violation.

And violations are counted separately for each consumer or transaction affected. If your resume screening tool discriminated against 100 applicants? That’s potentially $2 million in penalties.
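
That multiplication is worth making concrete. A back-of-the-envelope sketch (the $20,000 cap is the statutory maximum; whether every affected consumer counts as a separate violation is a case-by-case question):

```python
MAX_PENALTY_PER_VIOLATION = 20_000  # statutory maximum per violation, USD

def worst_case_exposure(affected_consumers: int) -> int:
    """Worst-case exposure if each affected consumer counts as one violation."""
    return affected_consumers * MAX_PENALTY_PER_VIOLATION

print(f"${worst_case_exposure(100):,}")  # $2,000,000 -- the figure above
```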

There’s no private right of action (individuals can’t sue you directly), but the Colorado Attorney General has exclusive enforcement authority and has signaled this will be a priority.

Does This Apply to Colorado Nonprofits?

Yes. Nonprofits are not exempt from the Colorado AI Act.

This is the part that catches most nonprofit leaders off guard. They assume “we’re small” or “we’re mission-driven” offers protection.

It doesn’t.

The law applies to any organization that does business in Colorado and deploys high-risk AI systems. Your 501(c)(3) status is irrelevant.

Common Nonprofit AI Uses That Trigger Compliance

Most nonprofit leaders don’t think of themselves as “deploying AI.” But look at this list:

Employment decisions:

  • Resume screening software
  • Skills assessment tools
  • Performance review platforms with algorithmic components
  • Scheduling optimization tools that affect assignments

Donor/client services:

  • Chatbots that answer questions
  • Eligibility screening tools for services
  • Donor prediction and scoring algorithms
  • Automated grant application review

Program delivery:

  • Case management systems with algorithmic recommendations
  • Housing placement tools
  • Healthcare navigation assistants
  • Educational assessment platforms

If any of these sound familiar, you need to pay attention.

The Small Business Exemption (Read the Fine Print)

There IS a partial exemption for small deployers. But it’s narrower than you think.

To qualify, you must meet ALL of these criteria:

  1. Employ fewer than 50 full-time employees
  2. NOT use your own data to train or fine-tune the AI system
  3. Only use the AI system for its intended purposes (as disclosed by the developer)
  4. Provide consumers with the developer’s impact assessment

What the exemption actually covers:

If you qualify, you’re excused from:

  • Maintaining a risk management program
  • Conducting your own impact assessments
  • Creating public statements about your AI use

What you’re NOT excused from:

Even small nonprofits must still:

  • Exercise a “duty of care” to prevent algorithmic discrimination
  • Notify consumers when AI makes consequential decisions about them
  • Provide human appeal options for adverse decisions

And here’s the kicker: if you use your own data to customize, train, or fine-tune any AI tool (even a commercial one), you lose the exemption entirely.

That donor database you exported to “improve” your AI predictions? You may have just voided your exemption.
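
Because the exemption is all-or-nothing, it’s easiest to reason about as a strict AND over the four criteria. A minimal sketch, with field names that are ours rather than the statute’s:

```python
from dataclasses import dataclass

@dataclass
class DeployerProfile:
    # Illustrative field names, not statutory language.
    full_time_employees: int
    trains_with_own_data: bool             # any customizing or fine-tuning with your data
    uses_only_intended_purposes: bool      # as disclosed by the developer
    shares_developer_impact_assessment: bool

def qualifies_for_small_deployer_exemption(d: DeployerProfile) -> bool:
    """All four criteria must hold; failing any single one voids the exemption."""
    return (
        d.full_time_employees < 50
        and not d.trains_with_own_data
        and d.uses_only_intended_purposes
        and d.shares_developer_impact_assessment
    )

# The donor-database scenario above: a 30-person shop that exported its own
# data to improve predictions has trains_with_own_data=True -- exemption gone.
print(qualifies_for_small_deployer_exemption(
    DeployerProfile(30, True, True, True)))  # False
```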

How Nonprofits Should Prepare

You have until June 30, 2026. Here’s what to do now:

Step 1: Audit Your AI Use (This Week)

List every software tool your organization uses. Then ask:

  • Does it use AI, machine learning, or algorithmic decision-making?
  • Does it affect hiring, programs, services, or donor interactions?
  • Does it make recommendations or decisions that affect individuals?

Most nonprofit leaders are shocked at how many tools qualify.
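
If a spreadsheet feels too loose, the same audit fits in a short script. A minimal sketch, with made-up tool names and an output file we invented (ai_audit.csv):

```python
import csv

# One row per software tool; the three columns mirror the questions above.
AUDIT_QUESTIONS = [
    "uses_ai_or_ml",         # AI, machine learning, or algorithmic decisions?
    "touches_covered_area",  # hiring, programs, services, donor interactions?
    "affects_individuals",   # recommendations or decisions about people?
]

tools = [
    {"tool": "Resume screener", "uses_ai_or_ml": True,
     "touches_covered_area": True, "affects_individuals": True},
    {"tool": "Website chatbot", "uses_ai_or_ml": True,
     "touches_covered_area": True, "affects_individuals": False},
]

with open("ai_audit.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["tool", *AUDIT_QUESTIONS])
    writer.writeheader()
    writer.writerows(tools)

# Anything answering yes to all three questions deserves a closer look.
flagged = [t["tool"] for t in tools if all(t[q] for q in AUDIT_QUESTIONS)]
print(flagged)  # ['Resume screener']
```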

Step 2: Categorize Your Risk (This Month)

For each AI tool, determine:

  • Is it making “consequential decisions” under the law?
  • Are you a small deployer who qualifies for exemptions?
  • Have you customized or trained it with your own data?
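
Continuing the sketch from the exemption section, this step’s categorization collapses into a rough triage function (the labels are ours, not the law’s):

```python
def triage(makes_consequential_decisions: bool,
           qualifies_for_exemption: bool,
           trained_with_own_data: bool) -> str:
    """Rough compliance triage for one AI tool; illustrative labels only."""
    if not makes_consequential_decisions:
        return "Likely not high-risk: document the reasoning and revisit annually."
    if trained_with_own_data:
        return "Full deployer duties: own-data training voids the small-deployer exemption."
    if qualifies_for_exemption:
        return "Partial relief: duty of care, consumer notice, and appeals still apply."
    return "Full deployer duties: risk program, assessments, notices, and disclosures."

print(triage(True, True, False))
# Partial relief: duty of care, consumer notice, and appeals still apply.
```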

Step 3: Contact Your Vendors (Next Quarter)

Developers are required to provide deployers with documentation including:

  • Impact assessments
  • Descriptions of training data
  • Known limitations and discrimination risks
  • Guidance for compliant deployment

Request this documentation now. If a vendor can’t provide it, that’s a red flag.

Step 4: Build Your Compliance Infrastructure (By Q1 2026)

If you don’t qualify for exemptions, you’ll need:

  • A written risk management policy
  • Documented impact assessments for each high-risk system
  • Consumer notification procedures
  • Human appeal processes
  • Public disclosure statements
  • Training for staff who use AI systems

This isn’t something you slap together the week before the deadline.

Step 5: Get Expert Help

This law is complex. The Colorado Attorney General is still developing implementation rules. The definitions are evolving.

Consider working with:

  • Legal counsel familiar with AI governance
  • Compliance consultants who specialize in nonprofits
  • AI ethics specialists who can audit your systems

The cost of professional help is far less than $20,000-per-violation penalties.

What Happens If You Do Nothing?

Let’s be direct about the stakes.

Scenario 1: Enforcement action

The Attorney General identifies your nonprofit as using non-compliant AI. You receive a notice of violation. You have 60 days to cure. If you don’t (or can’t), enforcement proceedings begin. Your nonprofit faces fines, injunctions, and public disclosure of your violations.

Your donor trust evaporates overnight.

Scenario 2: Discrimination complaint

A job applicant, program participant, or client believes your AI system discriminated against them. They file a complaint. Even if you ultimately prevail, the investigation consumes staff time, legal fees, and organizational attention for months.

Meanwhile, your mission suffers.

Scenario 3: Competitive disadvantage

Larger, better-resourced nonprofits get compliant first. They start advertising their responsible AI practices. Funders begin asking about AI governance in grant applications. You’re scrambling to catch up while they’re winning trust.

The nonprofits that take this seriously early will have a significant advantage.

The Opportunity Hidden in This Law

Here’s what most people miss:

The Colorado AI Act isn’t just a compliance burden. It’s a framework for using AI responsibly.

The impact assessments force you to actually understand how AI is affecting your clients. The risk management programs create accountability. The public disclosures build trust.

Nonprofits that embrace this thoughtfully will:

  • Catch discrimination issues before they become scandals
  • Build stronger relationships with communities they serve
  • Earn funder confidence in their AI governance
  • Set the standard for ethical technology use in the sector

The law is coming regardless. The question is whether you’ll treat it as a threat or an opportunity.

Your Next Step

Don’t wait until June 2026.

Start your AI audit this week. One spreadsheet. Every software tool. Three questions: Does it use AI? Does it affect people? Who’s responsible for compliance?

That’s it. Just start.

The nonprofits that move first will be the nonprofits that thrive in this new regulatory environment.

The ones that wait will be scrambling, stressed, and exposed.

Which one will you be?


This guide was prepared by ETS AI Consulting to help Colorado nonprofit leaders understand their obligations under the Colorado AI Act. For a personalized assessment of your organization’s AI compliance needs, contact us at [contact information].

Last updated: December 2025. This is general information, not legal advice. Consult qualified legal counsel for guidance specific to your situation.


Frequently Asked Questions

Q: Is there a minimum revenue threshold for the Colorado AI Act?

No. Unlike Colorado’s privacy law, there’s no minimum revenue or number of consumers. If you deploy high-risk AI in Colorado, you’re covered regardless of size.

Q: What if we only use AI for internal operations, not client-facing services?

Employment decisions are specifically covered. If you use AI for hiring, performance reviews, scheduling, or other employment-related decisions affecting Colorado residents, you’re subject to the law.

Q: Are HIPAA-covered entities exempt?

Partially. Healthcare entities covered by HIPAA are exempt only when providing AI-generated recommendations that require provider action and aren’t classified as high-risk. This is a narrow exemption with many caveats.

Q: What counts as “training AI with your own data”?

If you export your donor database, client records, or other organizational data to customize, fine-tune, or improve an AI system, you’re likely training it with your own data. This voids the small business exemption.

Q: Can we just stop using AI to avoid compliance?

You can, but you’d be handicapping your organization. The better approach is compliant AI adoption that gives you competitive advantages while meeting legal requirements.