[Discussion] The Inconceivable Power of Trolls in Social Media

This is a follow-on to the nearly two-year-old topic "Trouble at the Kool-Aid Point." The intention is to provide a place to discuss the unreasonable power social media trolls have over women and minorities, with a primary focus on video games (though other examples are certainly welcome).

bnpederson wrote:
Reaper81 wrote:
SpacePPoliceman wrote:
Reaper81 wrote:

As an aside, I found Waypoint’s analysis of that “Losing My Son to the Alt-Right” piece (or whatever it was called) to be terrible. They should have included a developmental psychologist or pediatrician on that cast to really help inform the discussion.

Disagreed. A developmental psychologist or pediatrician wouldn't have been able to account for how contrived and suspicious much of that piece was.

I think either would have been more than capable of discussing a parent’s motivations to lie about, minimize, or justify a child’s behaviors. They’re also qualified to discuss the behaviors of the child directly... more qualified than a bunch of non-parent bougie art critics. ;-)

The one who brought up the article, and later had misgivings about the framing and credence given to the child, is a parent.

Klepek’s daughter is like 2.

Stengah wrote:
OG_slinger wrote:
Jayhawker wrote:

IMAGE(https://i.imgur.com/GXU7tHJ.png)

And Trump tweeted the video last night...

How does that motherf*cker still have a Twitter account?

Because Jack Dorsey is a cowardly fascist sympathizer.

Y'all can spread the hate around, because Facebook also refused to take down the video.

WaPo wrote:

But Facebook, where the video appeared to gain much of its audience, declined Friday to remove the video, even after Facebook’s independent fact-checking groups, Lead Stories and PolitiFact, deemed the video “false.”

“We don’t have a policy that stipulates that the information you post on Facebook must be true,” Facebook said in a statement to The Washington Post.

The company said it instead would “heavily reduce” the video’s appearances in people’s news feeds, append a small informational box alongside the video linking to the two fact-check sites, and open a pop-up box linking to “additional reporting” whenever someone clicks to share the video.

“We don’t have a policy that stipulates that the information you post on Facebook must be true” is going to be up there with "I was just following orders."

Reaper81 wrote:

I think either would have been more than capable of discussing a parent’s motivations to lie about, minimize, or justify a child’s behaviors.

Our cultural lionization of certain ideals of parenthood is outside the realm of this discussion. Suffice it to say that large parts of that story reeked of bullsh*t, and reeked in a way that specifically buoys the odious thinking the story was supposedly critiquing; it doesn't take an expert to understand why a parent would push that bullsh*t forward, nor does any motivation make it right.

If more specificity is justified, I'm happy to provide it, but this thread moves very fast, and the latest doctored video looks to be what people want to discuss, so I'm also happy to let this be my final statement on the matter.

An Indiana Man Who Vandalized A Synagogue With Nazi Symbols Admitted How Far-Right Figures Radicalized Him

An Indiana man who pleaded guilty to defacing a synagogue with Nazi symbolism detailed to federal agents his road to radicalization, including meeting with members of the far-right group Identity Evropa and being inspired by the writings of a former Breitbart editor and the Nazi propaganda site Stormfront.

Nolan Brewer, 21, of Eminence, pleaded guilty last week to conspiring to violate the civil rights of Congregation Shaarey Tefilla. He was sentenced to three years in prison.

In July 2018, Brewer and his then-17-year-old wife, Kiyomi Brewer, drove 50 miles from their home to the synagogue, spray-painted a Nazi flag and iron crosses on a Dumpster enclosure, and lit a fire on the ground. Prosecutors said they originally planned to break into the synagogue and destroy it with homemade bombs and napalm they brought along, but they got scared.

In an interview with FBI agents, Brewer said they wanted to send a message to Jews as a race. He cited bogus statistics, aiming to back up the racist conspiracy theory that Jews have undue political influence.

“I guess it’s just .... back down or something like that,” Brewer told the FBI, describing the message of the vandalism. He also said he wanted to make news headlines, and was proud word of the attack reached Vice President Mike Pence, who condemned it.

Brewer told FBI agents he wanted to “scare the hell out of them,” prosecutors said, and send “a message of like, get out I guess.”

His defense attorneys acknowledged that Brewer had latched onto pseudointellectual arguments for white supremacy. Evidence submitted to the court included racist memes he had shared and selfies in which he wore the iron cross associated with Nazi Germany. His phone wallpaper was an image of a swastika.

“It is clear that he has adopted beliefs based on ‘alt-right’ or white nationalist propaganda,” the defense attorneys said.

The details were first reported by data scientist Emily Gorcenski, who does extensive research on the far-right.

As his attorneys sought a lighter sentence, they outlined how a young man from a small town, one who’d recently graduated from community college and had no history of criminal or behavioral issues, became radicalized.

They blamed his teenage wife, who they said had a troubled upbringing and would spend hours chatting on Discord, an app that had become popular among white supremacists. She then shared articles with her husband.

“According to Nolan, she began with rightwing yet mainstream views such as those presented on Fox News. She then moved on to writing by Ben Shapiro and articles on Breitbart News which bridged the gap to the notorious white supremacist and anti-Semitic propaganda site Stormfront.”

Shapiro didn’t return requests for comment.

Kiyomi was charged as an adult by the state of Indiana, her lawyer, Kevin Karimi, told BuzzFeed News. She pleaded guilty to arson, he said, and received probation but no jail time.

“Ms. Brewer was in fact a minor at the time of the incident,” Karimi said. “The fact that my client wasn't charged federally speaks for itself.”

“That said, the gravity of her actions was not lost upon my client. The anti-Semitic crimes committed across our country are sickening,” Karimi said. “This case helped fuel legislation in Indiana to make 'hate crimes' further punishable by law.”

Brewer told FBI agents how Kiyomi had long chats on Discord and Telegram with someone she thought was a Romanian “identitarian” white nationalist called “Asbestos Peter,” who convinced them to attack a synagogue and told them which supplies to buy.

FBI: What, what did Peter tell you the message was he was trying to send here?

Brewer: Uhhh he, he… said it was mainly trying to rile up—rally up people. See if there were any underlying groups that wanted to see if—were waiting to see if anybody was out there in Indiana. Just to see if something could come of it and people would become more active which… yeah

Much of their communication with white supremacists took place online, but Brewer also described becoming a member of Identity Evropa, a neo-Nazi group that took part in the Unite the Right rally violence at Charlottesville.

Brewer told the FBI he and his wife met for dinner at a local restaurant with a “lovely couple” and others from the group, but he glossed over its violent recent history. After the 2017 violence left a woman dead, Identity Evropa members were told not to participate in the 2018 Unite the Right rally, he acknowledged.

“It’s just be proud that you’re European, it’s, it’s an identitarian movement, it’s nothing political,” Brewer told the FBI, adding he and his wife gave $100 to the group.

But Ben Shapiro throws a hissy fit if you dare label him as "alt-right".

Quintin_Stone wrote:

But Ben Shapiro throws a hissy fit if you dare label him as "alt-right".

Or point out the constantly increasing number of acts of violence carried out by people who openly cite him as an influence.

They blamed his teenage wife, who they said had a troubled upbringing and would spend hours chatting on Discord, an app that had become popular among white supremacists.

Not... DISCORD?!?!

Did she also use an "Android" smartphone, popular with white supremacists? Did she eat at "McDonald's" restaurants, also popular with white supremacists? Does she wear the typical footwear of white supremacists, the "sneaker"?

What these kids did is horrible, but come on, BuzzFeed "news."

BadKen wrote:
They blamed his teenage wife, who they said had a troubled upbringing and would spend hours chatting on Discord, an app that had become popular among white supremacists.

Not... DISCORD?!?!

Did she also use an "Android" smartphone, popular with white supremacists? Did she eat at "McDonald's" restaurants, also popular with white supremacists? Does she wear the typical footwear of white supremacists, the "sneaker"?

What these kids did is horrible, but come on, BuzzFeed "news."

I blame Wumpus, the new Pepe.

Eh, Discord does in fact knowingly host, and decline to remove, a ton of alt-right and white-power communities. I have no qualms about blaming them, YouTube, Reddit, etc.

It's interesting to be so defensive about Discord in a thread full of criticisms of many other online platforms.

ruhk wrote:
Quintin_Stone wrote:

But Ben Shapiro throws a hissy fit if you dare label him as "alt-right".

Or point out the constantly increasing number of acts of violence carried out by people who openly cite him as an influence.

To be fair, I don't think it's accurate to call Ben alt-right, though I suspect the only thing keeping him out of that camp is their anti-Semitism, and not much else.

That said, it does point out what others have noticed: that it's a surprisingly short journey from Ben Shapiro and others like him to the far right.

Prederick wrote:
ruhk wrote:
Quintin_Stone wrote:

But Ben Shapiro throws a hissy fit if you dare label him as "alt-right".

Or point out the constantly increasing number of acts of violence carried out by people who openly cite him as an influence.

To be fair, I don't think it's accurate to call Ben alt-right, though I suspect the only thing keeping him out of that camp is their anti-Semitism, and not much else.

That said, it does point out what others have noticed: that it's a surprisingly short journey from Ben Shapiro and others like him to the far right.

I think it's fair to label him as alt-right considering that he's partly responsible for bringing it out of the shadows of fringe conservatism. There's far more overlap between Shapiro's talking points and typical alt-right ones than there are differences. Realistically, the only major difference is that most alt-right conservatives are broadly anti-Semitic, whereas Shapiro limits his anti-Semitism to Jews who don't fit the narrow definition of Judaism that he practices.

Yeah, giving Shapiro some benefit of the doubt for being Jewish is like giving Milo some for being gay.

Bullsh*t.

Aggrieved guy with a history of harassment and/or domestic violence?

Talk about fringes: a black Trump superfan and New Yorker who’s a NE Patriots “fan”. I guess the fandom makes sense, since that’s Trump’s favorite team.

Returning to our theme of "YouTube does things"

On YouTube’s Digital Playground, an Open Gate for Pedophiles

Christiane C. didn’t think anything of it when her 10-year-old daughter and a friend uploaded a video of themselves playing in a backyard pool.

“The video is innocent, it’s not a big deal,” said Christiane, who lives in a Rio de Janeiro suburb.

A few days later, her daughter shared exciting news: The video had thousands of views. Before long, it had ticked up to 400,000 — a staggering number for a video of a child in a two-piece bathing suit with her friend.

“I saw the video again and I got scared by the number of views,” Christiane said.

She had reason to be.

YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.

YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.

The result was a catalog of videos that experts say sexualizes children.

“It’s YouTube’s algorithm that connects these channels,” said Jonas Kaiser, one of three researchers at Harvard’s Berkman Klein Center for Internet and Society who stumbled onto the videos while looking into YouTube’s impact in Brazil. “That’s the scary thing.”

The video of Christiane’s daughter was promoted by YouTube’s systems months after the company was alerted that it had a pedophile problem. In February, Wired and other news outlets reported that predators were using the comment section of YouTube videos with children to guide other pedophiles.

That month, calling the problem “deeply concerning,” YouTube disabled comments on many videos with children in them.

But the recommendation system, which remains in place, has gathered dozens of such videos into a new and easily viewable repository, and pushed them out to a vast audience.

YouTube never set out to serve users with sexual interests in children — but in the end, Mr. Kaiser said, its automated system managed to keep them watching with recommendations that he called “disturbingly on point.”

Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations.

So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.

On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.
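To make the mechanism a bit more concrete: YouTube's real recommender is a huge proprietary system, but the grouping effect the researchers describe falls out of even the simplest "viewers who watched X also watched Y" logic. Here's a toy Python sketch, with entirely invented watch histories and video IDs, of how repeated "up next" hops can walk a viewer from an innocuous video into a tight cluster:

```python
# Toy item-to-item collaborative filtering, the general family of technique
# behind "up next" recommendations. NOT YouTube's actual algorithm; all
# histories and video IDs below are invented for illustration.
from collections import Counter

# Hypothetical watch histories: each user is a list of video IDs.
histories = [
    ["fitness_vlog", "swim_lesson", "kids_pool_day"],
    ["swim_lesson", "kids_pool_day", "kids_gymnastics"],
    ["kids_pool_day", "kids_gymnastics", "kids_dressup"],
]

def co_watch_counts(histories):
    """Count how often each ordered pair of videos shares a watch history."""
    pairs = Counter()
    for history in histories:
        for a in history:
            for b in history:
                if a != b:
                    pairs[(a, b)] += 1
    return pairs

def recommend(video, pairs, n=5):
    """Return the videos most often co-watched with `video`, best first."""
    scores = {b: count for (a, b), count in pairs.items() if a == video}
    return sorted(scores, key=scores.get, reverse=True)[:n]

# Follow "up next" hops from an innocuous starting point. Each hop moves
# toward whatever a small cluster of users watches together, which is the
# grouping effect the researchers describe.
pairs = co_watch_counts(histories)
current, seen = "fitness_vlog", {"fitness_vlog"}
while True:
    candidates = [v for v in recommend(current, pairs) if v not in seen]
    if not candidates:
        break
    current = candidates[0]
    seen.add(current)
    print("up next:", current)
```

No individual hop looks alarming, but the chain ends wherever the densest cluster of co-watching sits, and the reporting above suggests a motivated audience can supply exactly that cluster.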

So there's been quite the kerfuffle over on YouTube for the last week and change, regarding popular conservative YouTuber Steven Crowder and Vox Media journalist Carlos Maza.

In what may be the most perfect summation of the cack-handed way YT tries to handle these things, YouTube has...

At first said it wouldn't take any action against Crowder, then announced that it was banning Nazi ideology from the service and demonetizing videos from many YouTubers, including Crowder, Austrian "identitarian" Martin Sellner, and others.

Well, sort of. Because apparently YT says Crowder can be remonetized if he stops selling t-shirts on his channel. (I assume they mean the ones that say "Socialism is for f*gs".)

This is a grossly under-detailed overview of what's happening, but it's all a clusterf*ck over there, and between this and the above pedophile thing, it's amazing to watch YouTube try to protect its sweet, sweet revenue-generating ecosystem while being incapable of dealing with the environment that same ecosystem has created and incentivizes.

If I wanted to be nice, I guess I could say that they're caught between a rock and a hard place, but I think it's far more accurate to simply say "they're completely making it up as they go along and, like most of Silicon Valley, were entirely unprepared for trolls and bad-faith actors repeatedly abusing their services."

It seems like they're going a bit further than just shutting down Nazi and other extremist videos. YouTube announced a big step today against extremist and misinformation videos: they're changing the terms of service.

YouTube is changing its community guidelines to ban videos promoting the superiority of any group as a justification for discrimination against others based on their age, gender, race, caste, religion, sexual orientation, or veteran status, the company said today. The move, which will result in the removal of all videos promoting Nazism and other discriminatory ideologies, is expected to result in the removal of thousands of channels across YouTube.

So while the Guardian article is true, it seems to me that changing the terms of service and the broader new definition of disallowed content may actually be significant.

This move will also ban videos denying "well-documented violent events" - so no more bullsh*t about child actors at school shootings. Also no more flat earthers or "bleach cures autism" madness.

However, as Pred pointed out, they're not doing sh*t about Crowder's nonsense. So I guess we'll believe the policy when we see real action.

As I read that post I could only imagine the ways in which prominent white YouTubers would cry “reverse racism” and have legit stuff taken down.

So like you say: we’ll see.

I don't think it would affect flat earthers and other deniers of well-documented but non-violent truths.

I don't care what you fools say. The Earth is flat and being held up by four elephants standing on a giant space turtle that is the mortal enemy of the dead lights that takes the form of a clown but is really a giant space spider. Everyone knows this and just denies it to give me a headache forcing me to drink bleach to get rid of it.

BadKen wrote:

However, as Pred pointed out, they're not doing sh*t about Crowder's nonsense. So I guess we'll believe the policy when we see real action.

They rolled back on this yesterday. Apparently there was a petition among YouTube's LGBTQ+ employees demanding the company remove all references to Pride from the site until such time as it put its money where its mouth is and actually took a stand against his homophobic abuse.

So Crowder's vids are now demonetised. Doesn't stop him selling his "merch" though...

4,397th verse, same as the first.

Mr. Cain, 26, recently swore off the alt-right nearly five years after discovering it, and has become a vocal critic of the movement. He is scarred by his experience of being radicalized by what he calls a “decentralized cult” of far-right YouTube personalities, who convinced him that Western civilization was under threat from Muslim immigrants and cultural Marxists, that innate I.Q. differences explained racial disparities, and that feminism was a dangerous ideology.

“I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging,” he said. “I was brainwashed.”

Over years of reporting on internet culture, I’ve heard countless versions of Mr. Cain’s story: an aimless young man — usually white, frequently interested in video games — visits YouTube looking for direction or distraction and is seduced by a community of far-right creators.

Some young men discover far-right videos by accident, while others seek them out. Some travel all the way to neo-Nazism, while others stop at milder forms of bigotry.

The common thread in many of these stories is YouTube and its recommendation algorithm, the software that determines which videos appear on users’ home pages and inside the “Up Next” sidebar next to a video that is playing. The algorithm is responsible for more than 70 percent of all time spent on the site.

The radicalization of young men is driven by a complex stew of emotional, economic and political elements, many having nothing to do with social media. But critics and independent researchers say YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.

“There’s a spectrum on YouTube between the calm section — the Walter Cronkite, Carl Sagan part — and Crazytown, where the extreme stuff is,” said Tristan Harris, a former design ethicist at Google, YouTube’s parent company. “If I’m YouTube and I want you to watch more, I’m always going to steer you toward Crazytown.”

In recent years, social media platforms have grappled with the growth of extremism on their services. Many platforms have barred a handful of far-right influencers and conspiracy theorists, including Alex Jones of Infowars, and tech companies have taken steps to limit the spread of political misinformation.

YouTube, whose rules prohibit hate speech and harassment, took a more laissez-faire approach to enforcement for years. This past week, the company announced that it was updating its policy to ban videos espousing neo-Nazism, white supremacy and other bigoted views. The company also said it was changing its recommendation algorithm to reduce the spread of misinformation and conspiracy theories.

With two billion monthly active users uploading more than 500 hours of video every minute, YouTube’s traffic is estimated to be the second highest of any website, behind only Google.com. According to the Pew Research Center, 94 percent of Americans ages 18 to 24 use YouTube, a higher percentage than for any other online service.

Like many Silicon Valley companies, YouTube is outwardly liberal in its corporate politics. It sponsors floats at L.G.B.T. pride parades and celebrates diverse creators, and its chief executive endorsed Hillary Clinton in the 2016 presidential election. President Trump and other conservatives have claimed that YouTube and other social media networks are biased against right-wing views, and have used takedowns like those announced by YouTube on Wednesday as evidence for those claims.

In reality, YouTube has been a godsend for hyper-partisans on all sides. It has allowed them to bypass traditional gatekeepers and broadcast their views to mainstream audiences, and has helped once-obscure commentators build lucrative media businesses.

It has also been a useful recruiting tool for far-right extremist groups. Bellingcat, an investigative news site, analyzed messages from far-right chat rooms and found that YouTube was cited as the most frequent cause of members’ “red-pilling” — an internet slang term for converting to far-right beliefs. A European research group, VOX-Pol, conducted a separate analysis of nearly 30,000 Twitter accounts affiliated with the alt-right. It found that the accounts linked to YouTube more often than to any other site.

“YouTube has been able to fly under the radar because until recently, no one thought of it as a place where radicalization is happening,” said Becca Lewis, who studies online extremism for the nonprofit Data & Society. “But it’s where young people are getting their information and entertainment, and it’s a space where creators are broadcasting political content that, at times, is overtly white supremacist.”

I visited Mr. Cain in West Virginia after seeing his YouTube video denouncing the far right. We spent hours discussing his radicalization. To back up his recollections, he downloaded and sent me his entire YouTube history, a log of more than 12,000 videos and more than 2,500 search queries dating to 2015.

These interviews and data points form a picture of a disillusioned young man, an internet-savvy group of right-wing reactionaries and a powerful algorithm that learns to connect the two. It suggests that YouTube may have played a role in steering Mr. Cain, and other young men like him, toward the far-right fringes.

It also suggests that, in time, YouTube is capable of steering them in very different directions.

I'm sure YT will hang desperately on to that last sentence. "Sure, our website takes you in four clicks from 'self-help video' to 'children's storybook retelling of the Christchurch massacre to avoid being banned', but over the course of several years, it might lead you in a different direction! Happy Pride Month!"

Fun times with the algorithm for me. One of the channels I enjoy, one Thought Slime, did a video on fallacious defenses of Stephen... Steven... I'm not looking it up, defenses of failed comedian bigot bro Crowder, and now, lo, I'm getting Crowder's videos in my Recommended for You feed. f*cken great.

Yeah, I turned off YouTube recommendations because of that kinda sh*t. Well, as much as one can.

TikTok Has A Predator Problem. A Network Of Young Women Is Fighting Back.

The same mechanics that have turned TikTok into this year’s fastest-growing social media app have brought with them a dark side: sexual predation.

In an era when the failure of social media giants to police their platforms has gone from a scandal to a fact of life, an ad hoc network of young women is springing up to combat the exploitation that seems inseparable from the Chinese-owned app’s explosive success.

One of the most popular kinds of videos from TikTok’s users, who are mostly young and female, is the lip-synch video, where they dance and sing along with their favorite songs. These performances are sometimes sexualized by older men who lurk on the app, sending the young creators explicit messages and, in some cases, remixing the videos and dancing along with them via a TikTok feature called “duet.”

And the platform doesn’t just overlook this kind of conduct, like YouTube — its core mechanics inadvertently facilitate it. Like all social media platforms, TikTok is optimized for engagement, algorithmically steering users to content via a “For You” page that works like if Facebook’s News Feed were curated like Netflix’s landing page. It learns what you like and shows you more and more of it. It also reacts in real time, delivering an endless stream of similar videos, even if you aren’t logged in.

“If some creepy guy just keeps liking videos of younger girls doing similar audios or soundtracks or hashtags, those are going to keep coming up on his ‘For You’ page,” said an 18-year-old user named Liz W., who goes by @bithoeji on the app and spoke on the condition her full last name not be used. “So it's easier for him to find more victims. And I think that's what makes it so easy for predators to come on it and victimize young children.”

To fight this, a DIY effort to police TikTok has emerged, with the women at its heart collecting allegations and evidence of sexual misconduct, blasting it out across YouTube, Instagram, and Twitter, bagging and tagging the older men trying to prey on them. They say they have to protect each other on TikTok because they don’t have faith in the company’s ability to keep its users safe. The result: a Lord of the Flies free-for-all where young users weaponize dubious screenshots, defamatory callout videos, and whisper campaigns via group chats to deliver vigilante justice at dizzying speeds.
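It's worth spelling out how simple that feedback loop is. The sketch below is not TikTok's actual "For You" system; it's just a minimal model with invented categories and numbers, showing that a feed which samples in proportion to past engagement is enough to produce the effect Liz describes:

```python
# Minimal engagement feedback loop. NOT TikTok's actual "For You" system;
# categories, weights, and the simulated viewer are all invented. The point
# is only that engagement-proportional sampling concentrates the feed on
# whatever gets interacted with.
import random

categories = ["cooking", "pets", "teen_dance"]
weights = {c: 1.0 for c in categories}  # start with a uniform feed

def next_video(weights):
    """Sample the next video's category in proportion to past engagement."""
    cats = list(weights)
    return random.choices(cats, weights=[weights[c] for c in cats])[0]

# Simulate a viewer who only ever "likes" one category. Each like boosts
# that category's future exposure, which invites more likes, and so on.
random.seed(0)
for _ in range(500):
    category = next_video(weights)
    if category == "teen_dance":
        weights[category] += 1.0

total = sum(weights.values())
for category, weight in weights.items():
    print(f"{category}: {weight / total:.0%} of the feed")
```

Run it and the liked category ends up dominating the feed almost entirely. That's the dynamic Liz describes: a user's own behavior trains the feed to surface more of the same, which for a predator means more potential victims.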

The thing that amazes me, to this day, is that every single one of these apps and companies is focused on engagement first, second, third, and fourth through one-millionth. User safety clearly was never a concern for them, and it doesn't become even a begrudging one until abusers/criminals/bad-faith actors begin running absolutely rampant on their site and (and this is the actually important part) generating bad press.

See also Capitalism.

Prederick wrote:

The thing that amazes me, to this day, is that every single one of these apps and companies is focused on engagement first, second, third, and fourth through one millionth.

None of them got funding from a "user safety" company.

They were funded by venture capital firms that expect their investment to be repaid many times over and that actively push them to focus on engagement. That's because engagement equals more ad revenue, while "user safety" equals extra costs.