"The white house and social media companies care more about censoring views they don't like than the facilitation of child porn - and rape - on their platforms"
By Alex Berenson
Jun 7, 2023
Instagram has a problem with child sexual abuse.
Instagram - and its parent company, Meta Platforms, which also owns Facebook - do not seem to care.
The Wall Street Journal ran a devastating piece today on child pornography and rape networks that Instagram does not merely tolerate but facilitates.
—
The piece is filled with ugly revelations from top to bottom. Near its end, the reporters note that “Instagram’s [automated] suggestions were helping to rebuild [a pedophile] network that the platform’s own safety staff was in the middle of trying to dismantle.”
Even worse, they reveal that Instagram would allow users to see posts it knew might be harmful or illegal after a brief warning:
(“See results anyway.” Unbelievable but true, this is real.)
—
I’d call the piece an exposé, except that other outlets have run similar attacks on Instagram’s enabling of child sexual abuse for years, without any effective response from Instagram, Meta, or Facebook. If anything, the problem appears to have worsened since 2020, when school closures left children prey to abusive adults.
[N.S.: The schools had already been grooming children for years to become promiscuous.]
The Journal article makes clear that the problem is not that the users posting this content are sophisticated or technologically savvy. They are not using encryption or even trying very hard to hide the content:
The pedophilic accounts on Instagram mix brazenness with superficial efforts to veil their activity, researchers found. Certain emojis function as a kind of code, such as an image of a map—shorthand for “minor-attracted person”—or one of “cheese pizza,” which shares its initials with “child pornography.”
The users don’t try harder to hide what they’re doing because they can’t afford to - they’re chasing new buyers and users. Instagram’s virtue, for them, is that it is wide open.
—
But why doesn’t Instagram try harder?
Assuming the answer is not that Meta and Instagram are run by a pedophile cabal - and let’s all hope that’s not the answer - the reason is that they don’t have to. The previous stories generated a day or two of bad press and then vanished.
Meanwhile, Section 230 of the Communications Decency Act, the infamous Section 230, gives social media companies essentially complete immunity for user-generated content.
Even a 2018 law called the Fight Online Sex Trafficking Act - which, as its name implies, was meant to increase the legal liability the companies face - has hardly pierced 230’s legal veil.
Last year, the federal 9th Circuit dismissed a claim from women who said the bulletin-board site Reddit had allowed images of them being sexually abused as minors to circulate. And on May 30, the Supreme Court declined to hear the case - again refusing to set any limits on Section 230 and the protection it gives the companies.
—
(Smile for the camera, kiddo!)
SOURCE: https://alexberenson.substack.com/p/on-section-230-and-instagrams-child
—
This issue incenses me not just because I have three kids but because I know personally that social media platforms can move quickly to ban content when it bothers them. Instagram has repeatedly taken down posts of mine that are nothing more than screenshots of my Substack articles reporting on the mRNAs.
But I am far from alone. During Covid, Instagram and Facebook heavily censored anti-lockdown posts. Facebook even banned posts on the lab-leak theory until late May 2021.
Instead of putting the same effort into stopping child pornography - and even the use of their networks to set up real-world physical sexual abuse of minors - Facebook and Instagram appear to be doing the minimum possible, relying on automated systems that match images against an existing database of known child sexual abuse photos and videos.
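For readers wondering what that matching actually involves: the standard approach reduces each uploaded image to a fingerprint and checks it against a database of fingerprints from already-catalogued abuse images. The Python sketch below is a minimal illustration of the idea, assuming a plain SHA-256 exact match; real deployments such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the hash lists are maintained by clearinghouses such as NCMEC. The names here are hypothetical, but the structural weakness is the same either way: material not already in the database matches nothing.

import hashlib

# Hypothetical database of fingerprints for already-catalogued images.
# (Real systems use perceptual hashes, not plain SHA-256, so that minor
# edits to an image still match; SHA-256 is used here for illustration.)
KNOWN_ABUSE_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # Reduce an upload to a fixed-length fingerprint.
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    # Flag an upload only if it matches a previously catalogued image.
    # The weakness: brand-new material - or, with an exact hash like this
    # one, material altered by a single pixel - matches nothing and passes.
    return fingerprint(image_bytes) in KNOWN_ABUSE_HASHES

This design catches recirculated images cheaply, which is exactly why it qualifies as the minimum possible: the system never has to evaluate anything new.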
Facebook may have concluded that using human moderators to examine images and hashtags would expose it to legal liability for pornography. Worse, it may have decided that setting strict automated limits would risk making it harder for the bikini models who have some of Instagram’s largest audiences to post new glamour [sic] shots.
["glamour" is british media slang for softcore pornography.]
(21,559 likes. Willow Hand is 24, but you get the point. So does panda_wants_gummibears.)
—
The great irony here is that Section 230 explicitly allows social media companies to move against sexually abusive content - its subsection (c)(2) permits “good faith” bans of “obscene” material.
But the companies would rather rely on the broader protections the federal 9th Circuit and other courts have said the law’s subsection (c)(1) gives them. Courts interpret 230 as allowing the companies both to censor content and users whenever they like and to avoid any liability for the content they do allow.
They have the best of both worlds, and they use it. At this point, unless the Supreme Court restricts Section 230, it seems that only boycotts - and possibly legal and congressional investigations of top executives - will cause Meta and Instagram to tighten their rules against pedophiles.
Legal immunity is a hell of a drug.
2 comments:
Another offshoot of wokeism. On a scale of 1-10 on the Sodom and Gomorrah scale - with 1 being puritan and 10 meaning citizens of Sodom and Gomorrah would be shocked and disgusted by current perversities - we're close to a 10, especially people in government and the minority populations in general.
--GRA
“'cheese pizza,' which shares its initials with 'child pornography.'”
Oh, CP - the letters. Even I get it. How clever.