State laws governing AI in advertising and media are expanding quickly as policymakers try to curb deceptive synthetic content, protect consumers, and preserve election integrity. The bills you listed fall into three major categories: (1) political advertising and election‑related deepfakes, (2) commercial and real‑estate advertising transparency, and (3) broader consumer‑protection rules for AI‑generated media. The landscape is fragmented, but California’s 2024–2026 legislative package provides the clearest model, with other states following in narrower ways.
Political advertising and election‑related synthetic media
Several states now regulate AI‑generated or digitally altered political content, especially deepfakes.
California AB 2355 — Political advertisements using AI
AB 2355 amends the Political Reform Act to require specific disclosures when political advertisements use artificial intelligence. It authorizes the Fair Political Practices Commission to enforce the disclosure requirement and impose penalties for violations.
California AB 2839 — Deceptive media in election advertising
AB 2839 strengthens prohibitions on distributing materially deceptive audio or visual media of a candidate within 120 days before an election (and, for certain content, for a period after it) unless the media includes a clear disclosure that it has been manipulated. It expands candidate remedies and provides for expedited court review.
California AB 2655 — Election‑related deepfake protections
AB 2655 is part of a three‑bill package (with AB 2839 and AB 2355) enacted to combat AI‑generated deceptive election content, including deepfakes. It requires large online platforms to block or label materially deceptive election‑related content during specified pre‑election periods and to provide channels for reporting such content.
New York A216 and New York SB 8420A
New York’s proposals follow the same trend: requiring disclosures for AI‑generated political content, prohibiting deceptive synthetic media in election communications, and creating civil remedies for candidates harmed by deepfakes. (These summaries reflect the bills’ general policy pattern rather than a close reading of the enacted text.)
Florida HB 919, Texas HB 366, Wisconsin AB 664, Michigan HB 5141, Nevada AB 73, New Mexico HB 182
These states have introduced or enacted bills targeting AI‑generated political ads, synthetic impersonation, or deepfake election interference, generally requiring disclosures or prohibiting deceptive uses. (Specific bill texts were not reviewed here; each fits the national movement toward election‑integrity protections modeled on California’s 2024 laws.)
Commercial and real‑estate advertising transparency
Some states regulate AI‑generated or digitally altered images in commercial advertising, especially real estate.
California AB 723 — Digitally altered images in real‑estate advertising
AB 723 requires any digitally altered image used in real‑estate advertising to include a disclosure and to provide the original, unaltered image in the same advertisement. It applies broadly to real‑estate listings, rental ads, and property marketing.
This law is notable because it directly targets AI‑generated or AI‑enhanced images, making California one of the first states to impose sector‑specific commercial disclosure rules for synthetic media.
Cross‑state themes
Across these bills, several regulatory patterns are emerging:
- Mandatory disclosure when AI or synthetic media is used in political or commercial advertising.
- Prohibitions on deceptive deepfakes, especially those involving candidates within a defined pre‑election window.
- Civil and administrative enforcement, including candidate‑initiated lawsuits and penalties from election‑oversight agencies.
- Sector‑specific rules, such as real‑estate advertising requirements under AB 723.
- Growing bipartisan concern about AI‑generated misinformation, consumer deception, and erosion of trust in media.
How these laws fit together
California’s package (AB 2355, AB 2839, AB 2655, AB 723) provides the most complete model:
- Political ads → disclosure + deepfake prohibitions
- Commercial ads → transparency for AI‑altered images
- Enforcement → administrative penalties + private remedies
Other states are adopting narrower versions, focusing primarily on election‑related synthetic media or consumer‑protection disclosures.
