2026 State AI Bills That Could Expand Liability, Insurance Risk
In a December article, we examined trends in enacted legislation regarding artificial intelligence, focusing on 2025 laws that created new private rights of action for alleged AI-related damages. In this article, we highlight trends for 2026 based on legislation that has been introduced but not yet enacted. These proposed laws provide insight into evolving and potentially new avenues for liability. Underwriters and claims professionals should stay informed of these developing trends and competing priorities.
Although numerous states have introduced legislation specifically addressing AI, the Trump administration issued an executive order in December that seeks to "sustain and enhance the United States' global AI dominance through a minimally burdensome national policy framework for AI."[1] The executive order directs the secretary of commerce to evaluate state AI laws and identify those that conflict with the administration's policy. An "AI Litigation Task Force" established by the attorney general is directed to challenge such laws.
The executive order also seeks to establish a regulatory framework that preempts state laws and to withhold funding for certain programs from states whose AI-related laws conflict with the administration's policy. At this early stage, it is unclear whether the administration can successfully preempt state legislation via executive order, and states have continued to introduce, evaluate and pass AI-related legislation.
This article provides a brief overview of notable, liability-expanding trends emerging in state legislatures focusing on the following issues: nonconsensual intimate deepfakes; AI companions; data privacy; chatbot disclosures; dynamic and individualized pricing; policing; elections; and healthcare.
Nonconsensual Intimate Deepfakes
Several states have introduced legislation that would create private rights of action for those depicted in synthetic media in intimate or sexual manners.[2] These bills continue a trend we identified in legislation already enacted in other states.[3]
A New York bill currently making its way through committee would establish a private right of action for individuals depicted in deepfake intimate images and sexually explicit material.[4]
A person depicted in such material would be permitted to bring a civil lawsuit against any individual who, for the purpose of harassing, annoying or alarming the depicted person, disseminates, publishes or threatens to disseminate or publish the material, where the depicted person had a reasonable expectation of privacy. The private right of action extends to websites that host or transmit such material. The bill would permit plaintiffs to seek punitive damages in addition to compensatory damages and injunctive relief.
A similar Virginia bill would provide a right of action to any person depicted in synthetic media by expanding the code provisions related to defamation, slander and libel.[5] Although the bill would permit actual damages and other relief, such as punitive damages, defendants may mitigate damages by issuing an apology before litigation or by proving the truth of the material. It remains unclear whether synthetic media intended to depict a real-life event or person would be considered false for purposes of liability.
Alaska's proposal mirrors the Virginia bill but goes further by classifying undisclosed use of synthetic media as "defamation per se" and exempting internet service providers and broadcasters from liability.[6] The bill also offers a distinct definition of synthetic media, requiring that the media cause a "materially different understanding."
AI Companions
Although statutes may define the term "AI companion" differently, they generally refer to AI systems, typically chatbots, that are designed to simulate a sustained human-like relationship with a user.[7] Some states already have enacted legislation intended to protect children from sensitive topics that AI chatbots may discuss.[8] We expect this trend to continue in 2026.
For instance, Michigan's Legislature is considering a bill that would permit covered minors and their guardians to bring civil actions for damages, including punitive damages, arising from chatbots that are capable of encouraging self-harm, drug use, violence, illegal activities or disordered eating; offering mental health therapy; engaging in erotic or sexually explicit interactions; prioritizing validation of the user's beliefs or desires over factual accuracy or safety; or optimizing engagement in ways that override required safety guardrails.[9]
Although the bill's private right of action requires minor children to suffer actual harm, the state attorney general may impose a civil penalty regardless of whether any covered minor is actually harmed.
Florida's Legislature is working on the "Artificial Intelligence Bill of Rights."[10] Among other things, this bill would regulate companion chatbots for minors.
The bill would require parental consent for minors to create or maintain accounts, and would permit parents to access all interactions between a minor and a chatbot; limit chatbot use, including total time spent and the ability to interact with third parties; and receive notifications of a child's expressed intent to self-harm or harm others. It also seeks to prevent chatbots from producing or sharing harmful materials or encouraging harmful conduct.
In addition to state enforcement, the bill would provide minors and their guardians with the right to sue for up to $10,000 per violation within one year of the violation. Punitive damages are reserved only for state enforcement.
Data Privacy
States also are addressing data privacy concerns arising from AI.[11] For example, Vermont is considering a bill that provides a limited private right of action for violations, including violations of rights to access personal data, transparency rights, rights to correct inaccuracies and to delete data, and rights to opt out of targeted advertising, the sale of personal data, and certain types of profiling.[12]
Although the bill grants the state attorney general the power to enforce the law, it also provides Vermont consumers with a private right of action for certain provisions, with available remedies including statutory and punitive damages. The bill places several restrictions on private actions, requiring consumers to notify the Vermont attorney general and issue a demand letter before initiating a lawsuit. It also would exempt companies with less than $25 million in revenue from private lawsuits.
Chatbot Disclosure
Minnesota's Legislature, among others,[13] is considering a bill that aims to protect consumers by requiring businesses to disclose when individuals are communicating with artificial intelligence.[14] The bill also would require companies to provide an opt-out, ensuring that consumers can always choose to interact with a human.
In addition to authorizing the Minnesota attorney general to enforce the act, the bill would permit individuals to sue for violations of the law and to recover actual damages and statutory damages capped at $1,000 per violation, among other relief.
Dynamic and Individualized Pricing
States are increasingly addressing issues concerning consumer pricing aided by AI systems. For instance, AI systems may rely on personal data to determine how items are priced for a specific consumer.
New York is considering a bill titled the "Protecting Consumers and Jobs from Discriminatory Pricing Act."[15] The bill targets personalized algorithmic pricing in retail establishments. If enacted, it would ban electronic shelving labels, personalized pricing, pricing based on class data and all personalization for minors.
It exempts financial services companies and provides a private right of action for consumers, employees and labor organizations. The bill specifically would permit class actions and would permit plaintiffs to recover actual damages or statutory damages of no less than $5,000 per violation, as well as treble damages, disgorgement and other relief.
A Minnesota bill under consideration would prohibit any person from using AI to adjust, fix or control product prices in real time based on market demands, competitor prices, inventory levels, customer behavior, or other factors used to determine or set prices for a product.[16] Although the bill limits enforcement to the Minnesota attorney general, the state's Private Attorney General Act effectively permits individuals to enforce the law.[17]
A North Carolina bill would address rent-fixing facilitated by algorithmic pricing, including AI-driven tools.[18]
The bill would preclude real estate lessors or their agents from paying for or subscribing to "coordinating functions" that use "nonpublic competitor data" and that could facilitate price-fixing. "Coordinating function" is defined to include activities such as collecting, analyzing or processing rental data and recommending rental prices or terms using computational systems or algorithms, including AI.
The bill would permit aggrieved parties to bring legal action seeking damages, including punitive or treble damages. The bill also would invalidate forced arbitration agreements.
Policing
An emerging area of state legislation concerns the use of AI in policing.[19]
South Carolina has introduced a bill that would set rules for how state and local law enforcement agencies may use certain new technology systems, including AI-powered vehicle tracking.[20] The bill would require data security and retention measures and would prohibit the use of AI to track vehicles based on appearance. It also includes criminal penalties and a civil right of action permitting South Carolina residents to sue for violations concerning their data, with damages and injunctive relief available.
Elections
States already have enacted laws regarding the use of AI in elections and election material, and this trend is expected to continue in 2026 as legislatures review additional proposals.[21]
For instance, Massachusetts has introduced a bill that attempts to "protect against election misinformation."[22] The bill would prohibit any communication within 90 days before an election that contains verifiably false information about the date, time or place of an election; requirements for registering or voting; election certifications; or endorsements by political parties, officials or organizations.
The bill covers synthetic media and AI-generated misinformation. It would provide a private right of action permitting individuals whose voice or likeness appears in such materially deceptive election-related communications to sue violators for damages, among other relief.
Healthcare
States already have enacted legislation addressing the use of AI in healthcare, and this trend is expected to continue.[23]
Certain bills address safe staffing and seek to prohibit the use of AI to perform direct care that would otherwise be performed by a registered nurse.[24] These bills may include whistleblower protections for nurses, permitting private rights of action.[25] Other bills would limit the use of AI in talk therapy, though they do not tend to provide a private right of action.[26] Additional bills address disclosure requirements, mandating transparency when AI is used in the healthcare setting.[27]
Conclusion
As states continue to introduce and refine AI legislation despite the White House's opposition, we expect many of the bills to fall within the above-discussed trends.
Lawmakers often provide individuals with private rights of action, expanding the avenues for liability and redress. These legislative efforts are not uniform, however: each state's approach varies in its enforcement mechanisms, available remedies, statutes of limitations and evidentiary standards, and some bills restrict enforcement to state attorneys general.
Collectively, these trends will reshape the liability landscape for companies incorporating AI solutions into their business operations. New private rights of action authorized under AI-related statutes signal expanding liability exposures, so underwriters should consider adding AI-specific policy terms, and claims professionals should enhance their knowledge of this emerging and expanding area of liability.
[1] Exec. Order No. 14,365, 90 Fed. Reg. 58499 (Dec. 11, 2025).
[2] E.g., S.B. 6278, 2025-26 Reg. Sess. (N.Y. 2025); H.B. 697, 2024 Sess. (Va. 2024); S.B. 33, 34th Leg., 1st Sess. (Alaska 2025); H.B. 2123, 103d Gen. Assemb., 2023-24 Sess. (Ill. 2023); S.B. 25-288, 75th Gen. Assemb., 1st Reg. Sess. (Colo. 2025).
[3] See Ken Ryan et al., 2025 State AI Laws Expand Liability, Raise Insurance Risks, Law360 (Dec. 17, 2025), https://www.law360.com/articles/2419598.
[4] S.B. 6278, 2025-26 Reg. Sess. (N.Y. 2025).
[5] H.B. 697, 2024 Sess. (Va. 2024).
[6] S.B. 33, 34th Leg., 1st Sess. (Alaska 2025).
[7] J.B. Branch, Intimacy on Autopilot: Why AI Companions Demand Urgent Regulation, Tech Policy Press (Apr. 10, 2025), https://www.techpolicy.press/intimacy-on-autopilot-why-ai-companions-demand-urgent-regulation/.
[8] See Ryan et al., supra.
[9] S.B. 760, 103d Leg., 2025 Reg. Sess. (Mich. 2025).
[10] S.B. 482, 2026 Sess. (Fla. 2026).
[11] E.g., S.B. 468, 2025-26 Reg. Sess. (Cal. 2025); S. 2516, 194th Gen. Ct. (Mass. 2025); Legis. B. 642, 109th Leg., 1st Sess. (Neb. 2025).
[12] H. 208, 2025-26 Reg. Sess. (Vt. 2025).
[13] E.g., S.B. 243, ch. 677, 2025-26 Reg. Sess. (Cal. 2025); H.B. 3021, 104th Gen. Assemb., 2025-26 Sess. (Ill. 2025); S. File 1886, 94th Leg., 2025-26 Sess. (Minn. 2025).
[14] S. File 1886, 94th Leg., 2025-26 Sess. (Minn. 2025).
[15] Assemb. B. A9396, 2025-26 Reg. Sess. (N.Y. 2025).
[16] S. File 3098, 94th Leg., 2025-26 Sess. (Minn. 2025).
[17] Minn. Stat. §8.31, subd. 3a (2025).
[18] H.B. 970, 2025-26 Sess. (N.C. 2025).
[19] See, e.g., Assemb. B. A7172, 2025-26 Reg. Sess. (N.Y. 2025); Assemb. B. A9253, 2025-26 Reg. Sess. (N.Y. 2025); H.B. 249, 2024 Sess. (Va. 2024).
[20] H. 4675, 126th Sess. (S.C. 2025).
[21] See Ryan et al., supra.
[22] H. 76, 194th Gen. Ct. (Mass. 2025).
[23] See Ryan et al., supra.
[24] See, e.g., H.B. 62, 33d Leg. (Haw. 2025); H. File 2289, 94th Leg., 2025-26 Sess. (Minn. 2025).
[25] H. File 2289, 94th Leg., 2025-26 Sess. (Minn. 2025).
[26] See, e.g., Assemb. B. A9106, 2025-26 Reg. Sess. (N.Y. 2025); H.B. 525, 136th Gen. Assemb., 2025-26 Reg. Sess. (Ohio 2025).
[27] See, e.g., H. 1210, 194th Gen. Ct. (Mass. 2025); S.B. 2259, 104th Gen. Assemb., 2025-26 Sess. (Ill. 2025).