The executive order that US President Donald Trump issued after Twitter fact-checked two of his posts, directing a review of the laws that govern the Internet, has produced its recommendations, and they could have an impact on the Internet around the world.
Trump ordered a review of Section 230 of the Communications Decency Act, added to the law in 1996, to clarify the immunity it affords Internet platforms by treating them as distributors of information rather than as publishers of it.
Although it is an American law, most of the big tech companies and social media platforms it governs are also American, so its enforcement can have a global impact as those companies change the way they run their platforms to comply with it.
The review that Trump ordered in May was carried out and the US Department of Justice “has concluded that the time has come to realign the scope of section 230 with the realities of the modern Internet” and believes that there is a “productive middle ground” between those who argue for the repeal of the law and those who argue it should be left alone.
The Department has identified what it calls “measured but concrete proposals that address many of the concerns raised about section 230.”
Current law states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
This law has been credited with helping create the open, modern Internet, but it was written years before social media existed, and it currently allows sites like Facebook and Twitter to avoid liability for the content that appears on their platforms.
However, much of Trump’s executive order focused on what sites remove rather than what they allow to remain, and the order stemmed from perceived censorship against him after Twitter prompted viewers of his tweets to “get the facts” about what they were reading.
“Immunity should not extend beyond its text and purpose to provide protection for those who purport to provide users a forum for free and open speech, but in reality use their power over a vital means of communication to engage in deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints,” the executive order issued in May said.
THE MODERATOR’S DILEMMA
The last of the proposed reforms involves “explicitly” overruling a 1995 court decision that created what is known as the “moderator’s dilemma.”
In that case, an online message board was held liable for defamatory remarks a user made about the president of Jordan Belfort’s fraudulent brokerage firm, Stratton Oakmont.
The message board promoted itself as family-friendly and had removed user posts from its site in the past, but because it did not remove the defamatory comments, it was held responsible for them.
This created the dilemma: websites could try to moderate everything posted on their site, making them liable for anything that slipped through, or they could moderate nothing at all and act as mere containers of information, removing things only if they violated the law or the site’s own policies, but without exercising editorial oversight over the content or ideas they spread, the way traditional book or newspaper publishers do.
That dilemma is what Section 230 was supposed to address, but the Internet and the businesses for which the law was introduced have changed significantly since it was added 24 years ago; while the law has a major influence on how social media companies operate, it predates social media as we know it today by several years.
THE PROPOSED REFORMS
The proposed reforms advise that the law “protect responsible online platforms” and not “immunize from liability an online platform that purposely facilitates or solicits third-party content” that breaks the law; this would include things like hacking sites or sites where people share illegal material such as child exploitation material.
The Department also wants to add a clear definition of when a platform’s moderation decision was made “in good faith,” which would restrict platforms to removing only content that violates their rules.
Good faith would be assessed against a platform’s own “plain and particular” terms of service (meaning that anyone reading the terms would be informed in advance of what the platform can and cannot do, so content could not be removed for no stated reason).
The Department has also recommended “exclusions” to specifically address child abuse, terrorism, and cyberbullying and allow victims to “seek civil redress.”
It also recommended replacing the “vague terminology” of the law’s “otherwise objectionable” language with clauses that specifically identify content that is “unlawful” or “promotes terrorism.”
The United States government would also gain a new power to police the Internet around the world in order to “protect citizens from harmful and illegal conduct.”
The Department also wants to change the law so that big tech companies cannot use it as a defense when they are subject to antitrust investigations, which is a separate issue.
“The avenues for engaging in both online commerce and speech have been concentrated in the hands of a few key players,” read the reforms recommended by the Department.
“It doesn’t make much sense to allow the big online platforms (particularly the dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not third party speech.”
The reforms still have to be voted on in Congress before they become law, and the Congress that eventually votes on them may be different than the current one.
American citizens head to the polls to elect (or re-elect) a president and representatives to Congress in November.
Proposed Amendments to Section 230
The US Department of Justice identified four areas for reform:
1. Incentivize online platforms to tackle illegal content
The first category of potential reforms is intended to incentivize platforms to address the growing amount of illicit content online, while preserving the core of Section 230’s immunity for defamation claims.
a. Bad Samaritan Carve-Out. First, the Department proposes denying Section 230 immunity to truly bad actors. The title of Section 230’s immunity provision – “Protection for ‘Good Samaritan’ blocking and screening of offensive material” – makes clear that Section 230 immunity is intended to incentivize and protect responsible online platforms. It therefore makes little sense to immunize from liability an online platform that purposefully facilitates or solicits third-party content or activity that would violate federal criminal law.
b. Carve-Outs for Child Abuse, Terrorism, and Cyber-Stalking. Second, the Department proposes exempting from immunity specific categories of claims that address particularly egregious content, including (1) child exploitation and sexual abuse, (2) terrorism, and (3) cyber-stalking. These targeted carve-outs would halt the over-extension of Section 230 immunity and allow victims to seek civil redress in causes of action far removed from the original purpose of the statute.
c. Case-Specific Carve-Outs for Actual Knowledge or Court Judgments. Third, the Department supports reforms to make clear that Section 230 immunity does not apply in a specific case where a platform had actual knowledge or notice that the third-party content at issue violated federal criminal law, or where the platform was provided with a court judgment that the content is unlawful in any respect.
2. Clarify the federal government’s enforcement capabilities to address illegal content
A second category of reform would increase the government’s ability to protect citizens from harmful and illicit conduct. These reforms would make clear that the immunity provided by Section 230 does not apply to civil enforcement actions brought by the federal government. Civil enforcement by the federal government is an important complement to criminal prosecution.
3. Promotion of competition
A third reform proposal is to clarify that federal antitrust claims are not covered by Section 230 immunity. Over time, avenues for engaging in both online commerce and speech have become concentrated in the hands of a few key players. It makes little sense to enable large online platforms (particularly dominant ones) to invoke Section 230 immunity in antitrust cases, where liability is based on harm to competition, not on third-party speech.
4. Promotion of open discourse and greater transparency
A fourth category of potential reforms aims to clarify the original text and purpose of the statute to promote free and open speech online and foster greater transparency between platforms and users.
a. Replace the Vague Terminology in (c)(2). First, the Department supports replacing the vague catch-all “otherwise objectionable” language in Section 230(c)(2) with “unlawful” and “promotes terrorism.” This reform would focus the broad blanket immunity for content moderation decisions on Section 230’s core objective of reducing online content harmful to children, while limiting a platform’s ability to remove content arbitrarily or in ways inconsistent with its terms of service simply by deeming it “objectionable.”
b. Provide a Definition of Good Faith. Second, the Department proposes adding a statutory definition of “good faith,” which would limit immunity for content moderation decisions to those made in accordance with plain and particular terms of service and accompanied by a reasonable explanation, unless such notice would impede law enforcement or risk imminent harm to others. Clarifying the meaning of “good faith” should encourage platforms to be more transparent and accountable to their users, rather than hide behind blanket Section 230 protections.
c. Explicitly Overrule Stratton Oakmont to Avoid the Moderator’s Dilemma. Third, the Department proposes clarifying that a platform’s removal of content pursuant to Section 230(c)(2) or consistent with its terms of service does not, on its own, render the platform a publisher or speaker of all the other content on its service.
Source: US Department of Justice.