

The Takeaway
On August 1, 2025, Illinois Governor Pritzker signed HB 1806, the Wellness and Oversight for Psychological Resources Act, into law. The Act restricts the use of artificial intelligence (AI) by licensed professionals who provide therapy and psychotherapy services to Illinois residents. It prohibits using AI to provide therapy or psychotherapy to clients and limits AI to administrative and supplementary tasks; supplementary tasks that involve client information require the client’s informed consent. Violations can result in civil penalties, including fines of up to $10,000.
Professionals providing mental health services in Illinois or to Illinois residents should update their policies and procedures governing their use of AI to ensure proper compliance.
The Act
Passed unanimously in both chambers of the Illinois General Assembly, HB 1806 seeks to “safeguard individuals seeking therapy or psychotherapy services by ensuring these services are delivered by qualified, licensed, or certified professionals … and protect consumers from unlicensed or unqualified providers, including unregulated artificial intelligence systems.”[1] The new law applies directly to individuals, corporations, and entities that provide therapy or psychotherapy services and regulates the use of AI in providing mental health services.
People Affected
The Act prohibits any “individual, corporation, or entity” from providing or offering therapy or psychotherapy services, including “through the use of Internet-based artificial intelligence,” unless those services are conducted by a licensed professional.[2]
A “licensed professional” includes: clinical psychologists, clinical social workers, social workers, professional counselors, clinical professional counselors, marriage and family therapists, professional music therapists, advanced practice psychiatric nurses, and other professionals authorized to provide therapy services in Illinois.[3] The Act specifically excludes religious counseling, peer support, and self-help and educational resources that don’t claim to offer therapy or psychotherapy services.[4]
Restricted Actions
The Act prohibits using artificial intelligence to:
- make independent therapeutic decisions
- directly interact with clients for “therapeutic communication”
- generate therapeutic recommendations or treatment plans without review and approval by a licensed professional
- detect emotions or mental states of patients[5]
Under these restrictions, a licensed professional may use AI to generate recommendations or treatment plans for a client, but the professional must review and approve them before using or recommending them.
Permitted Actions
Although the Act restricts AI, it also clarifies permissible uses.
A professional may use AI for “administrative” and “supplementary” support as long as the professional maintains full responsibility for the AI’s content and data.[6]
The Act defines “administrative support” as tasks that don’t involve therapeutic communication, such as scheduling, sending reminders, and processing billing and insurance claims.[7] It defines “supplementary support” as tasks that don’t involve therapeutic communication but do interact with client information, such as preparing and maintaining client records, analyzing anonymized data to track client progress, and identifying and organizing client resources or referrals.[8]
This distinction matters: if AI is used for supplementary support, such as recording or transcribing a client’s session, the professional must obtain the client’s written informed consent before using the AI system. The professional also must tell the client that AI will be used and specify its purpose.[9]
Enforcement
The Act is enforced by the Illinois Department of Financial & Professional Regulation (IDFPR), which is responsible for investigating suspected violations. Anyone violating the Act may be fined up to $10,000 (depending on the circumstances surrounding the violation).
Other States’ Approaches
Other states have also passed legislation regulating the use of AI in providing mental health services. However, Illinois is the first state to specifically regulate the use of AI systems by licensed professionals providing therapy services to clients.
Utah
Utah passed legislation in March 2025 that requires suppliers of AI chatbots for mental health purposes to disclose that the chatbot is AI, not a human.[10] The law defines “mental health chatbots” as AI meant “to engage in interactive conversations with a user … similar to the confidential communications an individual would have with a licensed mental health therapist” and that the supplier represents “can or will provide mental health therapy.”[11] Unlike Illinois, Utah permits AI systems to provide mental health services so long as they disclose their nature.
Nevada
In June 2025, Nevada prohibited AI providers from offering systems that claim to be “capable of providing professional mental or behavioral health care,” those that “[simulate] human conversation in order to obtain professional mental or behavioral health care,” or programs that would “constitute the practice of professional mental or behavioral health care if provided by a natural person.”[12]
This law bars AI companies and systems from representing their software as capable of providing mental health care or advertising any feature of their system as mental health care. Like Illinois, Nevada prohibits AI from claiming to provide therapy services offered by licensed professionals. However, the Nevada law also applies restrictions to school resources (such as school counselors, psychologists, and social workers).[13]
New York
New York’s recent budget bill included a provision that requires operators of AI systems to “provide a clear and conspicuous notification to a user at the beginning of any AI companion interaction [that] states… the user is not communicating with a human.”[14] While not directly targeting mental health services, this provision does require anyone who may use AI systems in providing therapy services in New York to make the proper disclosures to clients. Additionally, the New York bill requires AI systems to include reasonable protocols for “detecting and addressing suicidal ideation or expressions of self-harm expressed by a user” and to provide crisis service information.[15]
Implications for Professionals
Based on the new requirements, professionals and companies providing mental health services to clients in Illinois should:
- Conduct a review of AI uses in their practice. Compile a comprehensive list of the AI-assisted tasks currently in use and classify each as administrative, supplementary, or therapeutic. For therapeutic uses, discontinue AI until guidelines are in place to ensure compliant use.
- Create or update written consent forms. If using AI for any supplementary tasks, update or create informed consent forms that clearly explain how and why artificial intelligence is being used. Ensure all current and future clients are informed of the new requirements.
- Update communications and advertising. Verify that all communications, marketing, and advertising comply with the Act. Ensure that any entity providing an AI system for self-help does not claim to provide mental health services or therapy.
Ensuring compliance with the Act’s requirements is especially important for telehealth services providing therapy or psychotherapy services to clients who could be based in Illinois. The Act applies to all providers serving Illinois residents, regardless of the provider’s location.
[1] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 5, https://www.ilga.gov/legislation/PublicActs/View/104-0054; https://idfpr.illinois.gov/content/dam/soi/en/web/idfpr/news/2025/2025-08-04-idfpr-press-release-hb1806.pdf
[2] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 20(a), https://www.ilga.gov/legislation/PublicActs/View/104-0054
[3] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 10
[4] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 35
[5] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 20(b)
[6] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 15(a)
[7] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 10
[8] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 10
[9] Wellness and Oversight for Psychological Resources Act, Pub. Act 104-0054, 2025 at Section 15(b)
[10] H.B. 452, 2025 Gen. Sess. (Utah 2025), https://le.utah.gov/~2025/bills/static/HB0452.html
[11] H.B. 452, 2025 Gen. Sess. (Utah 2025), https://le.utah.gov/~2025/bills/static/HB0452.html
[12] Assemb. B. 406, 83d Leg., Reg. Sess. (Nev. 2025), https://www.leg.state.nv.us/App/NELIS/REL/83rd2025/Bill/12575/Text#
[13] Assemb. B. 406, 83d Leg., Reg. Sess. (Nev. 2025), https://www.leg.state.nv.us/App/NELIS/REL/83rd2025/Bill/12575/Text#
[14] S. 3008-C, 2025 N.Y. Laws ch. 20 Art. 47 (2025), https://nyassembly.gov/2025budget/bills2025/enacted/A3008C.pdf
[15] S. 3008-C, 2025 N.Y. Laws ch. 20 Art. 47 (2025), https://nyassembly.gov/2025budget/bills2025/enacted/A3008C.pdf
- Maria E. Ceriotti, Associate
- Meg L. Fowler, Partner