Bill Summary
The "Protect Elections from Deceptive AI Act" would prohibit the distribution of materially deceptive AI-generated audio or visual media relating to candidates for federal office, particularly in the period before an election. The legislation defines "deceptive AI-generated audio or visual media" as an image, audio, or video produced with artificial intelligence that misrepresents a candidate's appearance, speech, or conduct while appearing authentic to a reasonable person.
Key provisions include:
1. **Prohibition**: It is illegal for individuals or entities to knowingly distribute such deceptive media with the intent to influence elections or solicit funds.
2. **Exceptions**: Broadcasters, streaming services, and regularly published news outlets may distribute such media as part of bona fide news coverage if they clearly disclose that its authenticity is in question or that it does not accurately represent the candidate. Media that constitutes satire or parody is also exempt.
3. **Civil Actions**: Candidates whose voice or likeness appears in deceptive media may seek injunctive relief against its distribution and sue for general or special damages; a prevailing party may also recover reasonable attorney's fees and costs. Plaintiffs must prove a violation by clear and convincing evidence.
4. **Severability**: If any part of the law is found invalid, the remaining provisions will still stand.
This legislation is part of an effort to maintain the integrity of elections and ensure that voters are not misled by manipulated media.
Possible Impacts
The "Protect Elections from Deceptive AI Act" could affect people in several ways, including:
1. **Enhanced Protection for Candidates**: Candidates running for federal office would have more protection against the dissemination of misleading AI-generated media that could damage their reputations or misrepresent their positions. This could lead to a more equitable electoral process where candidates are not subjected to unfair manipulation through deceptive media.
2. **Legal Recourse for Harmed Individuals**: Candidates whose likenesses or voices are used in materially deceptive AI-generated media could bring civil actions for damages. This provision lets candidates hold distributors accountable for harmful misinformation, potentially leading to greater accountability in media and political campaigning.
3. **Increased Public Awareness and Media Literacy**: The legislation may encourage news outlets and media platforms to clearly label and disclose the authenticity of AI-generated media, promoting a culture of transparency. As a result, the public might become more critical consumers of media, improving overall media literacy and reducing the impact of deceptive information in political discourse.
These impacts collectively aim to foster a more truthful electoral environment, though they also raise questions about the balance between regulation and free expression, especially in the context of satire and parody.
[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[S. 1213 Introduced in Senate (IS)]
119th CONGRESS
1st Session
S. 1213
To prohibit the distribution of materially deceptive AI-generated audio
or visual media relating to candidates for Federal office, and for
other purposes.
_______________________________________________________________________
IN THE SENATE OF THE UNITED STATES
March 31, 2025
Ms. Klobuchar (for herself, Mr. Hawley, Mr. Coons, Ms. Collins, and Mr.
Bennet) introduced the following bill; which was read twice and
referred to the Committee on Rules and Administration
_______________________________________________________________________
A BILL
To prohibit the distribution of materially deceptive AI-generated audio
or visual media relating to candidates for Federal office, and for
other purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Protect Elections from Deceptive AI
Act''.
SEC. 2. PROHIBITION ON DISTRIBUTION OF MATERIALLY DECEPTIVE AI-
GENERATED AUDIO OR VISUAL MEDIA PRIOR TO ELECTION.
(a) In General.--Title III of the Federal Election Campaign Act of
1971 (52 U.S.C. 30101 et seq.) is amended by adding at the end the
following new section:
``SEC. 325. PROHIBITION ON DISTRIBUTION OF MATERIALLY DECEPTIVE AI-
GENERATED AUDIO OR VISUAL MEDIA.
``(a) Definitions.--In this section:
``(1) Covered individual.--The term `covered individual'
means a candidate for Federal office.
``(2) Deceptive ai-generated audio or visual media.--The
term `deceptive AI-generated audio or visual media' means an
image, audio, or video that--
``(A) is the product of artificial intelligence
technology that uses machine learning (including deep
learning models, natural language processing, or any
other computational processing techniques of similar or
greater complexity), that--
``(i) merges, combines, replaces, or
superimposes content onto an image, audio, or
video, creating an image, audio, or video that
appears authentic; or
``(ii) generates an inauthentic image,
audio, or video that appears authentic; and
``(B) a reasonable person, having considered the
qualities of the image, audio, or video and the nature
of the distribution channel in which the image, audio,
or video appears--
``(i) would have a fundamentally different
understanding or impression of the appearance,
speech, or expressive conduct exhibited in the
image, audio, or video than that person would
have if that person were hearing or seeing the
unaltered, original version of the image,
audio, or video; or
``(ii) would believe that the image, audio,
or video accurately exhibits any appearance,
speech, or expressive conduct of a person who
did not actually exhibit such appearance,
speech, or expressive conduct.
``(3) Federal election activity.--The term `Federal
election activity' has the meaning given the term in section
301(20)(A)(iii).
``(b) Prohibition.--Except as provided in subsection (c), a person,
political committee, or other entity may not knowingly distribute
materially deceptive AI-generated audio or visual media in carrying out
a Federal election activity or of a covered individual for the purpose
of--
``(1) influencing an election; or
``(2) soliciting funds.
``(c) Inapplicability to Certain Entities.--This section shall not
apply to the following:
``(1) A radio or television broadcasting station, including
a cable or satellite television operator, programmer, or
producer, or a streaming service that broadcasts materially
deceptive AI-generated audio or visual media prohibited by this
section as part of a bona fide newscast, news interview, news
documentary, or on-the-spot coverage of bona fide news events,
if the broadcast clearly acknowledges through content or a
disclosure, in a manner that can be easily heard or read by the
average listener or viewer, that there are questions about the
authenticity of the materially deceptive AI-generated audio or
visual media.
``(2) A regularly published newspaper, magazine, or other
periodical of general circulation, including an internet or
electronic publication, that routinely carries news and
commentary of general interest, and that publishes materially
deceptive AI-generated audio or visual media prohibited under
this section, if the publication clearly states that the
materially deceptive AI-generated audio or visual media does
not accurately represent the speech or conduct of the covered
individual.
``(3) Materially deceptive AI-generated audio or visual
media that constitutes satire or parody.
``(d) Civil Action.--
``(1) Injunctive or other equitable relief.--
``(A) In general.--A covered individual whose voice
or likeness appears in, or who is the subject of, a
materially deceptive AI-generated audio or visual
media, including content distributed as part of a
Federal election activity, distributed in violation of
this section may seek injunctive or other equitable
relief prohibiting the distribution of materially
deceptive AI-generated audio or visual media in
violation of this section.
``(B) Precedence.--An action under this paragraph
shall be entitled to precedence in accordance with the
Federal Rules of Civil Procedure.
``(2) Damages.--
``(A) In general.--A covered individual whose voice
or likeness appears in, or who is the subject of, a
materially deceptive AI-generated audio or visual
media, including content distributed as part of a
Federal election activity, distributed in violation of
this section may bring an action for general or special
damages against the person, committee, or other entity
that distributed the materially deceptive AI-generated
audio or visual media.
``(B) Attorney's fees and costs.--In addition to
any damages awarded under subparagraph (A), the court
may also award a prevailing party reasonable attorney's
fees and costs.
``(C) Rule of construction.--Nothing in this
paragraph shall be construed to limit or preclude a
plaintiff from securing or recovering any other
available remedy.
``(3) Burden of proof.--In any civil action alleging a
violation of this section, the plaintiff shall bear the burden
of establishing the violation through clear and convincing
evidence.''.
(b) Severability.--If any provision of this Act, or an amendment
made by this Act, or the application of such provision to any person or
circumstance, is held to be invalid, the remainder of this Act, or an
amendment made by this Act, or the application of such provision to
other persons or circumstances, shall not be affected.