Bill Summary
The "Protect Elections from Deceptive AI Act" would prohibit the knowing distribution of materially deceptive audio or visual media generated by artificial intelligence (AI) relating to candidates for federal office. The legislation addresses concerns that AI-generated content could distort public perception of candidates during elections.
Key provisions include:
1. **Prohibition**: It prohibits the knowing distribution of materially deceptive AI-generated media of a candidate with the intent to influence an election or solicit funds. This covers content that misrepresents a candidate's appearance, speech, or expressive conduct in a way that would mislead a reasonable person.
2. **Definitions**: The act defines "deceptive AI-generated audio or visual media" as AI-produced content that either alters original media or generates wholly inauthentic media that appears authentic, such that a reasonable person would come away with a fundamentally different understanding of the depicted candidate's appearance, speech, or conduct.
3. **Exemptions**: Certain entities are exempt, including broadcasters and publications that carry the media as part of bona fide news coverage and clearly disclose questions about its authenticity. Content that constitutes satire or parody is also excluded.
4. **Legal Recourse**: Candidates whose likeness or voice is misrepresented can seek injunctive relief and damages against the distributors of such deceptive media. The act establishes a civil action process for violations, with the plaintiff bearing the burden of proving a violation by clear and convincing evidence.
5. **Defamation Implications**: A violation of this act is considered defamation per se, simplifying the legal process for affected individuals.
Overall, the legislation seeks to safeguard the integrity of electoral processes by addressing the challenges posed by advanced AI technologies in media creation.
Possible Impacts
Here are three examples of how the "Protect Elections from Deceptive AI Act" could affect people:
1. **Candidates for Federal Office**: The legislation provides candidates with protection against the distribution of materially deceptive AI-generated media that could misrepresent them. If a political opponent or third party were to release a manipulated video or audio clip that paints a candidate in a negative light, this candidate would have the legal right to seek injunctions and damages. This could help maintain the integrity of the electoral process and ensure that candidates are not unfairly harmed by misleading representations.
2. **Media and News Organizations**: The legislation carves out exemptions for media outlets, allowing them to carry deceptive AI-generated media as part of bona fide news coverage if they clearly disclose that there are questions about the media's authenticity. This could shape how journalists and news organizations approach election reporting: they may need to exercise greater care in their broadcasting practices, ensure transparency about the authenticity of the media they present, and develop new editorial policies for AI-generated content.
3. **Voters and the General Public**: For voters, the legislation aims to reduce the likelihood of encountering misleading AI-generated media that distorts their understanding of candidates and issues. By prohibiting the distribution of such media, it could make it easier for voters to form judgments based on accurate representations of candidates, supporting a more informed electorate and limiting the manipulation of public opinion. However, questions about what counts as "materially deceptive" content may persist, and voters will still need to critically evaluate the sources of the media they consume.
[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[H.R. 5272 Introduced in House (IH)]
119th CONGRESS
1st Session
H. R. 5272
To prohibit the distribution of materially deceptive AI-generated audio
or visual media relating to candidates for Federal office, and for
other purposes.
_______________________________________________________________________
IN THE HOUSE OF REPRESENTATIVES
September 10, 2025
Ms. Johnson of Texas (for herself, Mr. Fitzpatrick, Ms. Houlahan, and
Mr. Tony Gonzales of Texas) introduced the following bill; which was
referred to the Committee on House Administration
_______________________________________________________________________
A BILL
To prohibit the distribution of materially deceptive AI-generated audio
or visual media relating to candidates for Federal office, and for
other purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Protect Elections from Deceptive AI
Act''.
SEC. 2. PROHIBITION ON DISTRIBUTION OF MATERIALLY DECEPTIVE AI-
GENERATED AUDIO OR VISUAL MEDIA PRIOR TO ELECTION.
(a) In General.--Title III of the Federal Election Campaign Act of
1971 (52 U.S.C. 30101 et seq.) is amended by adding at the end the
following new section:
``SEC. 325. PROHIBITION ON DISTRIBUTION OF MATERIALLY DECEPTIVE AI-
GENERATED AUDIO OR VISUAL MEDIA.
``(a) Definitions.--In this section:
``(1) Covered individual.--The term `covered individual'
means a candidate for Federal office.
``(2) Deceptive ai-generated audio or visual media.--The
term `deceptive AI-generated audio or visual media' means an
image, audio, or video that--
``(A) is the product of artificial intelligence or
machine learning, including deep learning techniques,
that--
``(i) merges, combines, replaces, or
superimposes content onto an image, audio, or
video, creating an image, audio, or video that
appears authentic; or
``(ii) generates an inauthentic image,
audio, or video that appears authentic; and
``(B) a reasonable person, having considered the
qualities of the image, audio, or video and the nature
of the distribution channel in which the image, audio,
or video appears--
``(i) would have a fundamentally different
understanding or impression of the appearance,
speech, or expressive conduct exhibited in the
image, audio, or video than that person would
have if that person were hearing or seeing the
unaltered, original version of the image,
audio, or video; or
``(ii) would believe that the image, audio,
or video accurately exhibits any appearance,
speech, or expressive conduct of a person who
did not actually exhibit such appearance,
speech, or expressive conduct.
``(3) Federal election activity.--The term `Federal
election activity' has the meaning given the term in section
301(20)(A)(iii).
``(b) Prohibition.--Except as provided in subsection (c), a person,
political committee, or other entity may not knowingly distribute
materially deceptive AI-generated audio or visual media of a covered
individual, or in carrying out a Federal election activity, with the
intent to--
``(1) influence an election; or
``(2) solicit funds.
``(c) Inapplicability to Certain Entities.--This section shall not
apply to the following:
``(1) A radio or television broadcasting station, including
a cable or satellite television operator, programmer, or
producer, or a streaming service that broadcasts materially
deceptive AI-generated audio or visual media prohibited by this
section as part of a bona fide newscast, news interview, news
documentary, or on-the-spot coverage of bona fide news events,
if the broadcast clearly acknowledges through content or a
disclosure, in a manner that can be easily heard or read by the
average listener or viewer, that there are questions about the
authenticity of the materially deceptive AI-generated audio or
visual media.
``(2) A regularly published newspaper, magazine, or other
periodical of general circulation, including an internet or
electronic publication, that routinely carries news and
commentary of general interest, and that publishes materially
deceptive AI-generated audio or visual media prohibited under
this section, if the publication clearly states that the
materially deceptive AI-generated audio or visual media does
not accurately represent the speech or conduct of the covered
individual.
``(3) Materially deceptive AI-generated audio or visual
media that constitutes satire or parody.
``(d) Civil Action.--
``(1) Injunctive or other equitable relief.--
``(A) In general.--A covered individual whose voice
or likeness appears in, or who is the subject of, a
materially deceptive AI-generated audio or visual
media, including content distributed as part of a
Federal election activity, distributed in violation of
this section may seek injunctive or other equitable
relief prohibiting the distribution of materially
deceptive AI-generated audio or visual media in
violation of this section.
``(B) Precedence.--An action under this paragraph
shall be entitled to precedence in accordance with the
Federal Rules of Civil Procedure.
``(2) Damages.--
``(A) In general.--A covered individual whose voice
or likeness appears in, or who is the subject of, a
materially deceptive AI-generated audio or visual
media, including content distributed as part of a
Federal election activity, distributed in violation of
this section may bring an action for general or special
damages against the person, committee, or other entity
that distributed the materially deceptive AI-generated
audio or visual media.
``(B) Attorney's fees and costs.--In addition to
any damages awarded under subparagraph (A), the court
may also award a prevailing party reasonable attorney's
fees and costs.
``(C) Rule of construction.--Nothing in this
paragraph shall be construed to limit or preclude a
plaintiff from securing or recovering any other
available remedy.
``(3) Burden of proof.--In any civil action alleging a
violation of this section, the plaintiff shall bear the burden
of establishing the violation through clear and convincing
evidence.''.
(b) Effect on Defamation Action.--For purposes of an action for
defamation, a violation of section 325 of the Federal Election Campaign
Act of 1971, as added by subsection (a), shall constitute defamation
per se.
(c) Severability.--If any provision of this Act, or an amendment
made by this Act, or the application of such provision to any person or
circumstance, is held to be invalid, the remainder of this Act, or an
amendment made by this Act, or the application of such provision to
other persons or circumstances, shall not be affected.