Yeah, it's a tough issue to tackle without actually modifying the legislation, because doing away with 230 entirely would end most social media full stop. The issue to me is the underlying question of whether the "algorithm" any platform employs reaches the level of editorial control, automated though it may be. When platforms are choosing what to highlight for particular users, what to hide from other users, and promoting or shadowbanning content across the entire platform (which is a step beyond mere "moderation"), eventually I become less amenable to the argument that they're just passive distributors of user-generated content. I've seen first-hand what Twitter and YouTube will fill your feed with, both as a new user and after you click on just one "wrong" video/tweet. In the context of terrorist groups and other illegal activities, I'd say they absolutely steer some users toward those things via the algorithm, which arguably does provide material support... but I'd also say that 230 as it stands almost certainly covers them.

I don't think this is an issue that will be well tackled via the courts, and it sounds like the Court agrees with me. But it's an issue that I think does need to be tackled, because these companies have proven they have zero desire to tackle it themselves. They will 100% accept a couple of genocides in the name of "engagement."

The argument that most sites make is not that they are necessarily promoting particular content so much as categorizing it. Since the amount of content is exceedingly large and complex, they have to use various algorithms to categorize it, and sometimes those algorithms will get it wrong, but they should not be punished for the attempt. Therefore, what they are doing is not promoting or recommending content; they are simply showing you more videos of a particular category that you've already shown interest in. I know YouTube in particular takes great care to never actually say it is recommending any particular video. On the surface, I think it's at least a reasonable argument. Unfortunately, I think the actual reality of it is deeper than that.

1. Like many things in reality, it is instead a morass of grey area, and I have yet to personally come up with a definition of one or the other that is satisfactory; I think the Supreme Court and most other players are in the same boat. I also think that most social media sites are exploiting the difficulty of defining one over the other, and frequently end up doing some version of stochastic terrorism in the pursuit of profit as a result of this ambiguity.

2. It's not really the same thing: sites promoting/recommending UGC is more akin to a newsagent displaying a newspaper (or a book store putting certain books front and centre) than to the publishing of letters to the editor. Further, sites aren't recommending content in the traditional sense - all they are saying is "users that interacted with our site like you do also interacted with this content", rather than "we approve of this content". Also, having had a quick search, newspapers aren't automatically liable for the contents of letters to the editor - court cases have gone either way depending on the circumstances; they generally needed to have knowingly approved false info to be held liable. You also need to consider the medium and how it is used, as different laws apply to different situations: with a newspaper, a letter to the editor has to be manually approved, but content on the internet isn't. It's a similar issue with radio/TV - live broadcasts generally carry less liability than pre-recorded or scripted content, since with live content you don't have full control, whereas a script would have been approved beforehand.
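As an aside, the "showing you more of a category you've already shown interest in" logic the sites describe can be sketched as a toy example. Everything here is hypothetical - the item names, categories, and `next_items` function are made up for illustration, and real platforms obviously use far more elaborate models:

```python
# Toy sketch of category-based surfacing: rank unseen items by how often
# their category appears in the user's own history. Note there is no
# per-item endorsement anywhere -- which is exactly the sites' argument.
from collections import Counter

def next_items(watch_history, catalog):
    """Return unseen items, most-watched categories first.

    catalog maps item id -> category; watch_history is a list of item ids.
    """
    # Count how often each category shows up in what the user already watched.
    category_counts = Counter(catalog[item] for item in watch_history)
    unseen = [item for item in catalog if item not in watch_history]
    # Sort purely by the user's own category counts -- no editorial input.
    return sorted(unseen,
                  key=lambda item: category_counts[catalog[item]],
                  reverse=True)

catalog = {"v1": "cooking", "v2": "cooking", "v3": "politics",
           "v4": "cooking", "v5": "politics"}
print(next_items(["v1", "v3", "v4"], catalog))  # → ['v2', 'v5']
```

The sketch also shows why the argument gets murky: the feed ordering is still a choice the platform made, even though no human ever "approved" item `v2` - which is the grey area the thread is arguing about.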