KHI - Kansas Health Institute Inc.

02/25/2026 | Press release | Distributed by Public on 02/25/2026 16:22

Legislative Approaches to Artificial Intelligence Advisory Bodies

Structure, Scope and Authority

Feb 25, 2026

By Shelby C. Rowell, M.P.A.

As artificial intelligence (AI) expands into more sectors and domains, conversations about its implications for states are increasing nationwide. Policymakers are considering what AI may mean for governance, economic development, public services and public trust, while also examining questions related to transparency, human oversight, bias and data privacy.

Federal activity, including multiple executive actions in 2025, has further shaped the evolving AI policy landscape. Among these actions, a Dec. 11, 2025, executive order established a national AI policy framework and directed federal agencies to evaluate state AI laws for potential conflict with federal objectives. As federal AI policy continues to evolve, states are exploring legislative approaches to address emerging needs within their jurisdictions. To help Kansas prepare for emerging technologies and remain responsive to this evolving landscape, Kansas lawmakers are considering establishing an AI task force through House Bill (HB) 2592.

As states consider how to design these advisory bodies, questions arise about their composition, responsibilities and role within the broader policymaking process. This blog examines legislative approaches to AI advisory bodies, with a specific focus on Kansas lawmakers' consideration of an AI task force under HB 2592, introduced on Jan. 29, 2026, by Representative Nick Hoheisel.

Kansas HB 2592: Themes on Task Force Structure and Hearings

HB 2592 would establish the Kansas Task Force on Artificial Intelligence and Emerging Technologies to study the impacts, risks, workforce implications and governance considerations associated with artificial intelligence and related technologies. As drafted, the task force would serve in an advisory capacity, providing findings and recommendations to inform future legislative or administrative action. The bill also outlines membership categories and reporting requirements, including a defined timeline for delivering written reports to the Legislature.

The House Legislative Modernization Committee held a bill hearing on HB 2592 on Feb. 9, 2026, during which the Kansas Health Institute (KHI) provided neutral testimony summarizing how other states have structured AI advisory bodies.

Separate from the statutory task force proposed in HB 2592, Representative Sean Willcott described an existing Kansas Legislative Artificial Intelligence Task Force during testimony on Feb. 11. He characterized the group as a non-voting, informational resource intended to build a knowledge base for legislators by convening members from the Legislature, executive branch, attorney general's office and higher education institutions, as well as private sector representatives. He outlined three areas of focus for the task force: regulation and guardrails, legislative use of AI, and budget or funding considerations. Committee discussion centered on questions of representation, process and transparency, including whether broader bipartisan and geographic participation would strengthen the group's credibility and how the group relates to the statutory task force proposed in HB 2592.

To place Kansas' approach in context, KHI reviewed 2025 AI advisory body legislation from other states.

State Comparison

KHI analyzed 16 bills and resolutions across 13 states and territories from the 2025 legislative session. The bills and resolutions, identified through the National Conference of State Legislatures (NCSL) website, established AI-focused task forces, councils, commissions and work groups. This analysis explored how legislatures are designing AI advisory bodies in statute and identified common elements across states.

Task Force Composition

Across states, legislation typically specifies both membership categories and appointment authority for AI task forces in statute, often combining legislative representation and executive branch participation alongside technical experts. For example, Texas (HB 3808, failed - adjournment) would have established an AI advisory council with both legislative and executive branch appointees, with the executive branch specifically appointing subject matter experts in ethics, AI systems, law enforcement usage of AI and constitutional rights. Mississippi (SB 2426, enacted) establishes the Artificial Intelligence Task Force, with legislative co-chairs and executive agency representation, including the state's information technology agency and the attorney general. In both cases, composition is defined in statute to ensure cross-branch participation and access to subject-matter expertise.

Scope and Focus Areas

AI advisory bodies are generally charged with a broad scope of study rather than a single issue area, including the impacts and risks of AI, governance and regulatory considerations, workforce and economic implications, and the use of AI within governmental operations. In many cases, one of the core functions of advisory bodies is to provide recommendations for policy. For example, legislation in New Jersey (A 4400, pending) would direct its Artificial Intelligence Council to study AI and assess the advantages and disadvantages of state agencies procuring, developing and implementing AI.

Duration

Most AI advisory bodies proposed or established in 2025 artificial intelligence legislation were structured as time-limited bodies, rather than permanent entities. Such legislation commonly specified an expiration date or otherwise linked the duration of the advisory body to the completion of required reports or recommendations. Across the bills analyzed, task force duration most often ranged from 12 to 24 months, with many terminating by July 1, 2027, while the longest duration observed among this group of legislation was four years. For example, Maryland (MD 956, enacted) included a provision that would terminate the task force on June 30, 2029. Alternatively, New Jersey (S 3357, pending) would provide that the advisory council terminates upon submission of its report to the governor and Legislature. While the bill does not set a fixed report deadline, it would require the report to be submitted within one year of the council's first meeting.

Deliverables and Timelines

Written reports summarizing findings and recommendations are the most common deliverables. Some states require a single final report, while others mandate recurring annual updates. Kentucky (SCR 142), for example, requires submission of findings and legislative recommendations by a specific date. In contrast, Virginia (SB 621) and West Virginia (H 3187) require annual reporting to legislative and executive leadership. Some states establish phased timelines with interim milestones, while others rely on a single reporting deadline. In most cases, statutes clearly define when information must be delivered to policymakers.

Authority and Limitations

Across the bills reviewed, AI task forces are generally established as advisory bodies, with formal authority limited to studying artificial intelligence, reviewing or assessing its use and implications, and making recommendations to inform future legislative or administrative action. For example, Texas (HB 3808) directs its advisory council to study agency AI systems and review inventories but leaves implementation authority with agencies and the Legislature. West Virginia (H 3187) requires agencies to provide administrative support but limits the task force to recommending best practices and potential legislation. New Jersey (S 3357) authorizes interaction with state agencies while confining the council's role to study and recommendations. In short, these bodies may gather information and coordinate across government, but they do not possess rulemaking or enforcement authority.

KHI Artificial Intelligence Policy Resources

In addition to analyzing state legislation, KHI and its partners have developed evidence-informed resources, such as Developing Artificial Intelligence (AI) Policies for Public Health Organizations: A Template and Guidance, to support public-sector organizations as they consider artificial intelligence governance and oversight. These resources reflect many of the same structural considerations embedded in state advisory bodies, such as clarifying accountability, defining oversight roles, establishing transparency practices, and incorporating risk assessment and workforce literacy into policy discussions.

Looking Ahead

The design of AI advisory bodies, including questions of structure, scope, duration and authority, represents one approach legislatures are using to examine emerging technologies. HB 2592 reflects Kansas' consideration of such an advisory model within a broader federal and state policy environment that continues to develop.

As policymakers consider potential next steps, comparative analysis of other states' approaches and access to evidence-informed governance resources may help inform ongoing discussions. KHI will continue to monitor artificial intelligence legislative developments and provide nonpartisan analysis to support informed decision-making.

About Kansas Health Institute

The Kansas Health Institute supports effective policymaking through nonpartisan research, education and engagement. KHI believes evidence-based information, objective analysis and civil dialogue enable policy leaders to be champions for a healthier Kansas. Established in 1995 with a multiyear grant from the Kansas Health Foundation, KHI is a nonprofit, nonpartisan educational organization based in Topeka.

Learn More About KHI
