Free Republic

[Algorithms] Google responsible for online terrorism? SCOTUS hears Section 230 case
JPOST ^ | Feb 20, 2023 | Michael Starr

Posted on 02/21/2023 6:52:27 AM PST by Conservat1

Gonzalez v. Google centers on the contention that YouTube’s algorithms engendered the radicalization and recruitment of Islamic State members and affiliates, ultimately making the social media platform liable for attacks such as the November 2015 Paris attacks, which cost the lives of 130 people, including the petitioners’ daughter, Mexican-American college student Nohemi Gonzalez.

The case touches on a core legal provision for the Internet, Section 230, under which social media platforms and other interactive computer services are not held liable for content posted by their users.

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” reads the provision...

(Excerpt) Read more at jpost.com ...


TOPICS: News/Current Events; War on Terror
KEYWORDS: algorithms; antisemitism; darshanleitner; gonzalezvsgoogle; google; googleterrorism; isis; islamicstate; jihad; section230; shurathadin; socialjustice; socialmedia; terroristfunding; youtube
https://www.inn.co.il/news/593163

Historic hearing in the US Supreme Court: Will the immunity of social networks be removed?

An Israeli lawsuit by the Shurat HaDin organization on behalf of the family of Nohemi Gonzalez, who was murdered in Paris by ISIS, is set to cause an international uproar.

Channel 7 | 30 Shvat 5783 | 21.02.23

Two precedent-setting cases pending before the US Supreme Court will today test the extent of the immunity granted by the US legislature to social networks. They follow a lawsuit filed by the Shurat HaDin organization, led by attorney Nitsana Darshan-Leitner, which claims that the social media giants have ceased to be mere "bulletin boards" and therefore cannot disclaim responsibility for terrorist content online.

One of the cases is that of the late Nohemi Gonzalez, an American student who was murdered in Paris in 2015 by a terrorist affiliated with ISIS. Her family is represented by the Shurat HaDin organization and by the American attorneys Robert Tolchin and Prof. Eric Schnapper, who has worked for years against the phenomenon of incitement and the exploitation of social networks by terrorist elements.

The Gonzalez family claims that the terrorist organizations take advantage of publication on social media in order to spread extreme content, including beheading videos, and to encourage and recruit fighters to join their ranks and carry out terrorism. The case attacks the networks' use of "algorithms" that push and recommend posts to users, thereby amplifying incitement and demonstrating that the media giants have control over the content published on their platforms.

Ahead of the hearing, some one hundred "friend of the court" briefs were submitted by states, senators, giant media companies, and human rights organizations. From Israel, a brief was submitted by former security establishment officials Moshe (Bogie) Ya'alon, Roni Elshich, Ya'akov Amidror, Tamir Hyman, Yossi Koperverser and Haim Tomer, stating that social networks play a decisive role in encouraging and assisting terrorist attacks around the world. In addition, the administration submitted its own position supporting the removal of immunity where the networks' "algorithms" are concerned.

Attorney Nitsana Darshan-Leitner, president of the Shurat HaDin organization, said outside the court: "After seven long years in which the case has been litigated in the US courts, the time has come for justice to be done. Social media has become a tool in the hands of the terrorist organizations, without which almost no terrorist attack is carried out. The sophisticated algorithms that generate billions of dollars for social media platforms have become fertile ground for inciting violence and terrorism, for raising resources for terrorist organizations, and for targeting individuals who will join the circle of terrorism and carry out attacks. We hope that the conservative composition of the US Supreme Court will reduce the immunity of the media giants, and that justice, historic justice, will be done for the victims of terrorism."

1 posted on 02/21/2023 6:52:27 AM PST by Conservat1
[ Post Reply | Private Reply | View Replies]

To: Conservat1

It's a thorny issue with no easy answer. I think the legislative branch dropped the ball on addressing it in an intelligent way. The courts are left with either having to set a horrible precedent one way or another, or to legislate on the fly.


2 posted on 02/21/2023 7:00:07 AM PST by AndyTheBear
[ Post Reply | Private Reply | To 1 | View Replies]

To: Conservat1

Were 19th century newspapers responsible when editorials caused public anger?

Are telephone companies responsible when criminals discuss crimes over their networks?


3 posted on 02/21/2023 7:04:53 AM PST by PGR88 (, )
[ Post Reply | Private Reply | To 1 | View Replies]

To: AndyTheBear
Yes, it is a difficult issue, compounded by the Tech Oligarchs’ blatant use of their power for political and ideological purposes.

How are they to be constrained?

Will the cure be worse than the problem?

4 posted on 02/21/2023 7:08:04 AM PST by marktwain
[ Post Reply | Private Reply | To 2 | View Replies]

To: marktwain
Will the cure be worse than the problem?

You can be certain that the "cure" will worsen the problem.

The people who caused the problem can buy and sell (or at least rent) the legislators who pass any laws to "fix" the problem.

5 posted on 02/21/2023 7:23:59 AM PST by flamberge (We don't get the government we vote for. We get the government we will tolerate.)
[ Post Reply | Private Reply | To 4 | View Replies]

To: Conservat1

Will liability just hit large tech companies that we don’t like, or smaller online platforms as well, e.g. (to use a completely random example) freerepublic.com?


6 posted on 02/21/2023 7:25:59 AM PST by JSM_Liberty
[ Post Reply | Private Reply | To 1 | View Replies]

To: AndyTheBear

“Think the legislative branch dropped the ball on addressing it in an intelligent way.”

“Dropped the ball” implies it was an accident.


7 posted on 02/21/2023 7:37:01 AM PST by Boogieman
[ Post Reply | Private Reply | To 2 | View Replies]

To: PGR88

“Are telephone companies responsible when criminals discuss crimes over their networks?”

They certainly would be if they had operators spying on everyone’s conversations (agreed to when signing up for the service), so that they were aware of the crimes being planned and did not report them.


8 posted on 02/21/2023 7:38:15 AM PST by Boogieman
[ Post Reply | Private Reply | To 3 | View Replies]

To: JSM_Liberty

There’s no distinction in free speech laws based on size of the platform, so any changes will probably affect everything.


9 posted on 02/21/2023 7:39:28 AM PST by Boogieman
[ Post Reply | Private Reply | To 6 | View Replies]

To: PGR88

IIRC, in the bad old days of dial-up modems and bulletin boards, the theory was Platform vs. Publisher.
If you just provided a soapbox (like a telephone line), you were considered a Platform and immune.
Once you started exercising editorial control (like a newspaper), you became a Publisher and were liable.
Obviously it was a little more gray than that.


10 posted on 02/21/2023 7:47:53 AM PST by Do_Tar (All my comments are creative or artistic expression.)
[ Post Reply | Private Reply | To 3 | View Replies]

To: marktwain
I think Section 230 has to be replaced with legislation that makes at least these two important distinctions:

1) Forums/platforms that are de facto public squares, on which political speech should not be allowed to be censored or treated differently, versus platforms that are de facto private clubs, where those running the platform can zot people for any reason.

2) Use of platforms to further terrorism, child porn, and other clearly illicit criminal activity, which all platforms should be required to make a reasonable effort to curtail, versus political speech and opinions that one might disagree with (and which Leftists try to conflate with the former), which all large de facto public-square platforms should not be allowed to curtail.

11 posted on 02/21/2023 8:03:07 AM PST by AndyTheBear
[ Post Reply | Private Reply | To 4 | View Replies]

To: AndyTheBear

The only way the applicable law could be constitutional would be for the law to mandate that the venue not filter content on a given subject selectively. In other words, Google would have to delete ALL terrorism-related posts in order to delete ANY terrorism-related posts on one side or the other. Otherwise Google would be discriminating against chosen speech on behalf of the overall community. Google would be acting as a government, and governmental censorship is unconstitutional.


12 posted on 02/21/2023 8:27:07 AM PST by nagant
[ Post Reply | Private Reply | To 2 | View Replies]

To: nagant
....ALL terrorism related posts in order to delete some terrorism related posts on one side or the other...

Seems to me all social media platforms should report and delete ALL terrorism posts they can find. But for posts that are not illicit, a platform that is not a de facto public square should be able to do whatever it wants. If you don't want some annoying guy on your Minecraft server, you should be able to kick him off. However, if your Minecraft server somehow grows into a major avenue of political debate, then you should lose that right, because it would interfere with freedom of speech. If the annoying guy calls for terrorism, you should be required to make some reasonable effort to kick him off and report him, and the same goes for a non-annoying guy or anyone else. Certainly the effort should be even-handed, but it should be aimed at all illicit use, not just some.

13 posted on 02/21/2023 8:48:23 AM PST by AndyTheBear
[ Post Reply | Private Reply | To 12 | View Replies]

To: Conservat1

Digital ID to use the internet incoming.


14 posted on 02/21/2023 9:06:07 AM PST by Revel
[ Post Reply | Private Reply | To 1 | View Replies]

To: All

Justice Ketanji Brown Jackson stupidity:

Justice Ketanji Brown Jackson questioned Blatt about Congress’ original intent in passing Section 230, suggesting that the law was never meant to insulate tech platforms from lawsuits linked to algorithmic content recommendations.
https://www.cnn.com/business/live-news/supreme-court-gonzalez-v-google-2-21-23/index.html


15 posted on 02/21/2023 9:42:58 AM PST by Postel
[ Post Reply | Private Reply | To 14 | View Replies]

To: Conservat1

Still not seeing the oral arguments for this posted yet. Should be in the next couple of hours.


16 posted on 02/21/2023 11:35:57 AM PST by zeugma (Stop deluding yourself that America is still a free country.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: PGR88
Are telephone companies responsible when criminals discuss crimes over their networks?

Do telephone companies censor people's conversations based on content, or potential content?

17 posted on 02/21/2023 11:37:47 AM PST by zeugma (Stop deluding yourself that America is still a free country.)
[ Post Reply | Private Reply | To 3 | View Replies]

To: zeugma
Do telephone companies censor people's conversations based on content, or potential content?

NO. So ask yourself: why do Fed.gov and leftists now insist that social media companies censor content?

So they can insist on a role in that censorship.

Are we playing into their game?

18 posted on 02/21/2023 11:46:09 AM PST by PGR88 (, )
[ Post Reply | Private Reply | To 17 | View Replies]
