Bill Summary
The **Algorithm Accountability Act** would amend Section 230 of the Communications Act of 1934 to limit liability protections for certain social media platforms that use recommendation-based algorithms. Under the bill, covered platforms must exercise reasonable care in the design, training, testing, deployment, operation, and maintenance of those algorithms to prevent reasonably foreseeable bodily injury or death attributable, in whole or in part, to the algorithms' design or performance.
Key provisions include:
1. **Duty of Care**: Platforms must exercise reasonable care so that their algorithms do not cause reasonably foreseeable bodily injury or death to users or others, with exceptions for chronological (or reverse-chronological) sorting and for responses to user-initiated searches; the search exception ends once the user navigates beyond the initially populated results.
2. **Loss of Liability Protection**: A platform that violates the duty of care loses its liability protection under Section 230(c)(1), exposing it to civil suits for damages.
3. **Private Right of Action**: Individuals who suffer bodily injury, or the legal representatives of minors, disabled persons, or persons who die, can sue the platform in federal district court for compensatory and punitive damages.
4. **Invalidation of Arbitration Agreements**: Any predispute arbitration agreements or waivers that limit a user’s ability to sue in such cases will be considered invalid.
5. **Definitions**: The Act provides definitions for "recommendation-based algorithms" and "social media platforms," clarifying which services are covered.
Overall, the Act aims to enhance accountability for social media companies regarding the impacts of their algorithmic systems on user safety.
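The statutory line between exempt and covered curation is essentially technical: sorting by timestamp is exempt, while ranking driven by a user's personal data falls within the Act's definition of a "recommendation-based algorithm." The sketch below illustrates that distinction only; the Act contains no code, and the data model and field names here are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    topic: str

@dataclass
class User:
    # Personal data of the kind named in the Act's definition:
    # "preferences, interests, behavior, or characteristics".
    interests: set = field(default_factory=set)

def chronological_feed(posts):
    # Exempt under the Act's exception: sorting chronologically or
    # reverse-chronologically uses no personal data of the user.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def recommended_feed(posts, user):
    # Covered: ranking based on the user's personal data is a
    # "recommendation-based algorithm" and triggers the duty of care.
    return sorted(posts, key=lambda p: p.topic in user.interests,
                  reverse=True)
```

The same set of posts can thus produce different feeds, and only the personalized ordering would subject the provider to the bill's duty of care.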
Possible Impacts
Here are three examples of how the "Algorithm Accountability Act" could affect people:
1. **Increased Legal Recourse for Users**: The bill allows individuals who suffer bodily injury or death attributable to a platform's recommendation-based algorithm to sue the platform for compensatory and punitive damages. Users harmed by algorithmically promoted content, such as incitement to violence or dangerous misinformation, could hold platforms accountable in court rather than being barred by Section 230.
2. **Potential Changes in Content Moderation Policies**: Social media companies may react to the new liability risks by altering their content moderation practices and algorithms. To avoid lawsuits, platforms might become more stringent in their monitoring and filtering of content, potentially leading to more aggressive removal of posts or accounts that could be deemed harmful. This could affect users' experiences on these platforms, either by limiting exposure to harmful content or, conversely, by limiting free expression if platforms err on the side of caution.
3. **Impact on Smaller Platforms**: The Act specifically targets larger, for-profit social media platforms that utilize recommendation-based algorithms. Smaller platforms, especially those with fewer than one million registered users, are exempt from this legislation. This could lead to a market advantage for smaller platforms, as they may not face the same level of liability and regulatory scrutiny. Users seeking alternative social media experiences may gravitate towards these smaller platforms, affecting the overall dynamics of social media usage and competition.
[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[S. 3193 Introduced in Senate (IS)]
119th CONGRESS
1st Session
S. 3193
To amend section 230 of the Communications Act of 1934 to limit
liability protection under that section for certain social media
platforms, and for other purposes.
_______________________________________________________________________
IN THE SENATE OF THE UNITED STATES
November 18, 2025
Mr. Curtis (for himself and Mr. Kelly) introduced the following bill;
which was read twice and referred to the Committee on Commerce,
Science, and Transportation
_______________________________________________________________________
A BILL
To amend section 230 of the Communications Act of 1934 to limit
liability protection under that section for certain social media
platforms, and for other purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Algorithm Accountability Act''.
SEC. 2. LIMITATION OF LIABILITY PROTECTION FOR CERTAIN SOCIAL MEDIA
PLATFORMS.
(a) In General.--Section 230 of the Communications Act of 1934 (47
U.S.C. 230) is amended--
(1) by redesignating subsection (f) as subsection (g); and
(2) by inserting after subsection (e) the following:
``(f) Algorithmic Product Design Accountability.--
``(1) Duty of care in algorithmic design.--
``(A) In general.--A provider of a social media
platform shall exercise reasonable care in the design,
training, testing, deployment, operation, and
maintenance of a recommendation-based algorithm on the
social media platform to prevent bodily injury or death
described in subparagraph (B) that a reasonable and
prudent person would agree was--
``(i) reasonably foreseeable by the
provider; and
``(ii) attributable, in whole or in part,
to the design characteristics or performance of
the recommendation-based algorithm.
``(B) Covered bodily injury or death.--Bodily
injury or death described in this subparagraph, with
respect to a social media platform, is bodily injury to
or the death of a user of the social media platform, or
bodily injury or death inflicted by a user of the
social media platform upon another person, that arises
from the operation of the recommendation-based
algorithm.
``(C) Exception.--
``(i) In general.--Subparagraph (A) shall
not apply to the ranking, ordering, promotion,
recommendation, amplification, or similar
curation of content that is effectuated--
``(I) by sorting information
chronologically or reverse
chronologically; or
``(II) to respond to an individual
search for content on the social media
platform initiated by a user.
``(ii) Exception limited to initial
search.--Nothing in clause (i)(II) shall be
construed to limit the applicability of
subparagraph (A) to a provider of a social
media platform, with respect to the activities
of a recommendation-based algorithm, after a
user of the social media platform navigates
beyond the initially populated search results.
``(D) First amendment protections.--Nothing in
subparagraph (A) shall be construed to authorize the
Commission to enforce that subparagraph based on the
viewpoint of a user of a social media platform or of an
information content provider expressed by or through
any speech, expression, or information protected by the
First Amendment to the Constitution of the United
States.
``(2) Enforcement.--
``(A) Loss of liability protection.--Subsection
(c)(1) shall not apply to a provider of a social media
platform that violates paragraph (1)(A) of this
subsection.
``(B) Private right of action.--If a person suffers
bodily injury or death as the result of a violation of
paragraph (1)(A) by the provider of a social media
platform, and the bodily injury or death meets the
requirements under clauses (i) and (ii) of that
paragraph and paragraph (1)(B), the person or, in the
case of a minor or disabled person who suffers a bodily
injury or any person who dies, the legal representative
of such a person, may bring a civil action in a
district court of the United States of competent
jurisdiction against the provider for compensatory and
punitive damages.
``(3) Invalidity of predispute agreements and waivers.--
``(A) In general.--No predispute arbitration
agreement or predispute joint-action waiver (as those
terms are defined in section 401 of title 9, United
States Code) shall be valid or enforceable with respect
to a dispute arising under this subsection.
``(B) Applicability.--Any determination as to the
scope or manner of applicability of subparagraph (A)
shall be made by a court, rather than an arbitrator,
without regard to whether an agreement described in
that subparagraph purports to delegate such
determination to an arbitrator.
``(4) Relationship to other laws.--Nothing in this
subsection or any regulation promulgated thereunder shall be
construed to prohibit or otherwise affect the enforcement of
any Federal law or regulation or State law or regulation that
is at least as protective of users of social media platforms as
this subsection and the regulations promulgated thereunder.
``(5) Severability.--If any provision of this subsection or
the application of such provision to any person or circumstance
is held to be unconstitutional, the remainder of this
subsection and the application of the provision to any other
person or circumstance shall not be affected.
``(6) Definitions.--In this subsection:
``(A) Recommendation-based algorithm.--The term
`recommendation-based algorithm' means, with respect to
a user of a social media platform, a fully or partially
automated system used to rank, order, promote,
recommend, amplify, or similarly curate content,
including other users, hashtags, or posts, based on the
personal data of the user, including the preferences,
interests, behavior, or characteristics of the user.
``(B) Social media platform.--The term `social
media platform'--
``(i) means a for-profit interactive
computer service that--
``(I) permits a user to establish
an account or create a profile for the
purpose of allowing the user to create,
share, or view content through the
account or profile; and
``(II) primarily serves as a
service through which a user described
in subclause (I) interacts with
content; and
``(ii) does not include an interactive
computer service--
``(I) that serves fewer than
1,000,000 registered users;
``(II) that is--
``(aa) an email program;
``(bb) an email
distribution list;
``(cc) a wireless messaging
service; or
``(dd) an online messaging
service, the predominant or
exclusive function of which is
direct messaging, meaning
messages are transmitted from
the sender to a recipient and
not posted within the
interactive computer service or
publicly;
``(III) that is a private platform
or messaging service used by an entity
solely to communicate with others
employed by or affiliated with the
entity;
``(IV) that is a teleconferencing
or video conferencing service that
allows reception and transmission of
audio or video signals for real-time
communication, provided that the real-
time communication is initiated by
using a unique link or identifier to
facilitate access; or
``(V) that is an internet-based
platform whose primary purpose is--
``(aa) to allow users to
post product reviews, business
reviews, or travel information
and reviews;
``(bb) internet commerce,
which may include providing a
comment section;
``(cc) to allow users to
stream music, audiobooks, or
podcasts; or
``(dd) news or sports
coverage.''.
(b) Technical and Conforming Amendments.--
(1) Trademark act of 1946.--Section 45 of the Act entitled
``An Act to provide for the registration and protection of
trademarks used in commerce, to carry out the provisions of
certain international conventions, and for other purposes'',
approved July 5, 1946 (commonly known as the ``Trademark Act of
1946'') (15 U.S.C. 1127), is amended, in the definition
relating to the term ``Internet'', by striking ``section
230(f)(1) of the Communications Act of 1934 (47 U.S.C.
230(f)(1))'' and inserting ``section 230 of the Communications
Act of 1934 (47 U.S.C. 230)''.
(2) Title 18, united states code.--Section 2421A of title
18, United States Code, is amended--
(A) in subsection (a), by striking ``(as such term
is defined in defined in section 230(f) the
Communications Act of 1934 (47 U.S.C. 230(f)))'' and
inserting ``(as that term is defined in section 230 of
the Communications Act of 1934 (47 U.S.C. 230))''; and
(B) in subsection (b), by striking ``(as such term
is defined in defined in section 230(f) the
Communications Act of 1934 (47 U.S.C. 230(f)))'' and
inserting ``(as that term is defined in section 230 of
the Communications Act of 1934 (47 U.S.C. 230)''.
(3) Webb-kenyon act.--Section 3(b)(1) of the Act entitled
``An Act divesting intoxicating liquors of their interstate
character in certain cases'', approved March 1, 1913 (commonly
known as the ``Webb-Kenyon Act'') (27 U.S.C. 122b(b)(1)), is
amended by striking ``(as defined in section 230(f) of the
Communications Act of 1934 (47 U.S.C. 230(f))'' and inserting
``(as defined in section 230 of the Communications Act of 1934
(47 U.S.C. 230))''.
(4) Title 31, united states code.--Section 5362(6) of title
31, United States Code, is amended by striking ``section 230(f)
of the Communications Act of 1934 (47 U.S.C. 230(f))'' and
inserting ``section 230 of the Communications Act of 1934 (47
U.S.C. 230)''.