[Discussion] The Inconceivable Power of Trolls in Social Media

This is a follow-on to the nearly two-year-old topic "Trouble at the Kool-Aid Point." The intention is to provide a place to discuss the unreasonable power social media trolls have over women and minorities, with a primary focus on video games (though other examples are certainly welcome).

OG_slinger wrote:

That's because engagement equals more ad revenue while "user safety" equals extra costs.

Sort of equals more ad revenue. The perverse thing about the current state of startups is that a lot of them aren't trying to optimize for making money; they're optimizing for proxies for making money, which may or may not correspond to actual future revenue. Advertisers like engagement because they can measure how much people watched their ads--which is, again, a proxy for how many sales came from the ad.

If you don't have revenue, you tell investors about engagement because you can measure that and it's a big number that goes up. But it's not a given that you can turn that into money. Twitter spent 11 years with sky-high engagement before they managed to have their first profitable quarter. Many companies collapse before they ever get there.

One big recent example of wrong-headed metric-chasing is the "pivot to video," where a lot of news and media sites stopped writing articles and started making videos because Facebook stats said videos had higher engagement and higher ad revenue. Only it turned out that Facebook had screwed up the metrics: the average view time didn't count the people who just scrolled past the videos, and media companies lost big-time.

I'm somewhat upset how much of our culture we've sold off to startups in exchange for bad metrics and content that most people don't want.

Gremlin wrote:

I'm somewhat upset how much of our culture we've sold off to startups in exchange for bad metrics and content that most people don't want.

A 17-Year-Old Girl Was Murdered. How Did Photos of Her Death Go Viral?

A 17-year-old Instagram celebrity was brutally murdered allegedly by an obsessed male friend who then posted images of the slaying on Instagram, gaming website Discord, and 4chan, prompting an outpouring of shock and horror on social media.

The victim, Bianca Devins, was a 17-year-old so-called “egirl” who lived in Utica, NY, and had a small following on Instagram under the name @escty. Devins also frequently posted on the discussion forum 4chan. Utica Police confirmed Devins’ death.

According to a statement sent to Rolling Stone from the Devins family, Bianca was “a talented artist, a loving sister, daughter, and cousin, and a wonderful young girl, taken from us all too soon.” The statement also notes that Devins had just graduated from high school and was looking forward to attending a local community college in the fall. “She is now looking down on us, as she joins her cat, Belle, in heaven,” the statement reads. “Bianca’s smile brightened our lives. She will always be remembered as our Princess.”

This is yet another story of incredibly disturbing, offensive content lying around on Instagram for ages while the company responds in earnest only after media interest.

As I have personally turned this into a broader "the Internet is probably going to kill us all" thread, here's an interesting new study from the Pew Research Center on YouTube.

The media landscape was upended more than a decade ago when the video-sharing site YouTube was launched. The volume and variety of content posted on the site is staggering. The site’s popularity makes it a launchpad for performers, businesses and commentators on every conceivable subject. And like many platforms in the modern digital ecosystem, YouTube has in recent years become a flashpoint in ongoing debates over issues such as online harassment, misinformation and the impact of technology on children.

Amid this growing focus, and in an effort to continue demystifying the content of this popular source of information, Pew Research Center used its own custom mapping technique to assemble a list of popular YouTube channels (those with at least 250,000 subscribers) that existed as of late 2018, then conducted a large-scale analysis of the videos those channels produced in the first week of 2019. The Center identified a total of 43,770 of these high-subscriber channels using a process similar to the one used in our study of the YouTube recommendation algorithm. This data collection produced a variety of insights into the nature of content on the platform:

The YouTube ecosystem produces a vast quantity of content. These popular channels alone posted nearly a quarter-million videos in the first seven days of 2019, totaling 48,486 hours of content. To put this figure in context, a single person watching videos for eight hours a day (with no breaks or days off) would need more than 16 years to watch all the content posted by just the most popular channels on the platform during a single week. The average video posted by these channels during this time period was roughly 12 minutes long and received 58,358 views during its first week on the site.

Altogether, these videos were viewed over 14.2 billion times in their first seven days on the platform.
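A quick back-of-the-envelope check of those figures, as a minimal sketch in Python (the 48,486 hours, 12-minute average, and 14.2 billion views come from the Pew summary above; the video count is inferred from them rather than taken from the report):

```python
# Rough sanity check of the Pew figures quoted above. The exact video count
# isn't given here, so it is approximated from total hours and average length.
total_hours = 48_486    # hours of video posted by 250k+ subscriber channels in week 1 of 2019
avg_minutes = 12        # reported average video length
total_views = 14.2e9    # reported views in the videos' first seven days

videos = total_hours * 60 / avg_minutes    # ~242,000, i.e. "nearly a quarter-million"
years_to_watch = total_hours / 8 / 365     # watching eight hours a day, no days off
views_per_video = total_views / videos     # lands near the reported 58,358

print(f"~{videos:,.0f} videos, ~{years_to_watch:.1f} years to watch it all, "
      f"~{views_per_video:,.0f} first-week views per video")
```

The numbers line up: roughly 242,000 videos, about 16.6 years of continuous eight-hour days, and on the order of 58,000 first-week views per video.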

Prederick wrote:
Altogether, these videos were viewed over 14.2 billion times in their first seven days on the platform.

And roughly 14 billion of those views were probably bots watching them, and then their promoted content, to game the suggested-video predictions.

Have several thousand bots alternating between cat videos and alt-right videos and youtube starts thinking "oh, it turns out that people who like cat videos also like the alt-right". Throw in the other thousands of similar correlations that they want made, and then the thousands of interests doing similar things to game the metrics for their chosen projects, and all of a sudden it starts to explain why youtube viewer statistics make it appear like every single human being on Earth spends 6 hours a day watching youtube.

I mean, either that or they are piping youtube straight into Hell, I'd believe that too.

Yonder wrote:

I mean, either that or they are piping youtube straight into Hell, I'd believe that too.

That's redundant; YouTube's comment section is under every YouTube video.

I'm on book 4 of the Hyperion Cantos, and I'm starting to think Dan Simmons is a time traveler from the future who has seen where all of this is going.

I keep thinking about rereading those.

RE: Today's news

Any politician trying to file a suit against Facebook/Twitter/YouTube et al for "bias" who has nothing to say about 4chan/8chan should, at this point, face nothing less than overwhelming opprobrium whenever they are seen in public.

Prederick wrote:

RE: Today's news

Any politician trying to file a suit against Facebook/Twitter/YouTube et al for "bias" who has nothing to say about 4chan/8chan should, at this point, face nothing less than overwhelming opprobrium whenever they are seen in public.

If 8chan was run by a brown guy his servers would be in pieces buried under six feet of rubble and he himself would have been on the receiving end of a Hellfire missile.

Instead, his little business is patient zero for every right-wing conspiracy theory and is rapidly approaching a triple-digit body count.

OG_slinger wrote:

If 8chan was run by a brown guy his servers would be in pieces buried under six feet of rubble and he himself would have been on the receiving end of a Hellfire missile.

The best-case scenario for the owner of a "jihadchan" linked to this many mass-casualty events is Guantanamo.

Prederick wrote:

I swear, Twitter has become an incredibly efficient machine at ruining my day these days.

YouTube’s Newest Far-Right, Foul-Mouthed, Red-Pilling Star Is A 14-Year-Old Girl

What does a 14-year-old girl dressed in a chador have to say on YouTube to amass more than 800,000 followers?

How about this: “I’ve become a devout follower of the prophet Muhammad. Suffice to say, I’ve been having a f*ck ton of fun. Of course, I get raped by my 40-year-old husband every so often and I have to worship a black cube to indirectly please an ancient Canaanite god, but at least I get to go to San Fran and stone the sh*t out of some gays and the cops can’t do anything about it because California is a crypto-caliphate.”

Or how about, simply, “Kill yourself, f****t.”

Yes, if you want a vision of the future YouTube is midwifing, imagine a cherubic white girl mocking Islamic dress, while lecturing her hundreds of thousands of followers about Muslim “rape gangs,” social justice “homos,” and the evils wrought by George Soros, under the thin guise of edgy internet comedy, forever.

Actually, don’t imagine it. Watch it. It’s already here.

The video is called “Be Not Afraid,” and it may be the clearest manifestation yet of the culture the executives of Alphabet’s video monster are delivering to millions of kids around the world, now via children incubated in that selfsame culture. To understand just how bad things have gotten on the platform, you need to see it for yourself.

Users, and more importantly to YouTube, advertisers, have over the past year started to hold the platform accountable for enabling the exploitation of children and exposing them to disturbing content. But this video reveals an entirely different way the platform is harming kids: By letting them express extreme views in front of the entire world. This is what indoctrination looks like when it’s reflected back by the indoctrinated.

A twenty-minute, unbroken and hyper-articulate tirade ostensibly about ignoring criticism online, “Be Not Afraid” stars a high school freshman from the Bay Area who goes by the name Soph on YouTube. (She edits and scores the videos, which she says are comedic, as well.) Through videos like these, she’s become a rising star — with more than 800,000 followers — in the universe of conspiracy theorists, racists, and demagogues that owes its big bang to YouTube.

The video platform for years has incentivized such content through algorithms favoring sensational videos, and as recent reporting has revealed, has deliberately ignored toxic content as a growth strategy.

Soph’s scripts, which she says she writes with a collaborator, are familiar: A mix of hatred towards Muslims, anti-black racism, byzantine fearmongering about pedophilia, tissue-thin incel evolutionary psychology, and reflexive misanthropy that could have been copied and pasted from a thousand different 4chan posts. Of course, it’s all presented in the terminally ironic style popularized by boundary-pushing comedy groups like the influential Million Dollar Extreme and adopted of late by white supremacist mass shooters in Christchurch and San Diego.

(Soph is even more explicitly hateful on Discord, the gaming chat app, where she recently admitted to writing under the username “lutenant {homophobic slur}” that she hoped for “A Hitler for Muslims” to “gas them all.”)

By now, we’re used to this stuff coming from grown men — some of whom have even used the platform as a launching pad for political aspirations. But Soph is a child. Despite the vitriol of her words and her confidence in delivering them, she’s still just a 14-year-old kid. And hearing this language lisped through braces, with the odd word mispronounced as if read but never before said, is clarifying.

Think of Jonathan Krohn, the conservative child prodigy who addressed CPAC in 2009, at age 13. Today he’s a freelance journalist who writes about extremism for liberal magazines, and has disowned his past views. Or think of Lynx and Lamb Gaede, who became media sensations as 11-year-old white nationalist twin pop singers in the mid-aughts. Today they’ve renounced racism and taken up marijuana legalization activism. Part of being a young person, maybe especially for a rhetorically gifted one, is testing out ideas and identities — even ones we later find anathema. That’s not to excuse anything Soph says; but it is to say children often don’t understand the weight of the words they use. (Neither Soph nor her father responded to requests for comment.)

Interviews with Soph and asides in her videos reveal a young person whose identity is obviously still being formed. She didn’t start as a politics caster but, predictably, as a profane 9-year-old (nine!) game streamer called LtCorbis. Influential YouTubers Pyrocynical and Keemstar promoted her early work, which ripped on YouTube culture and the indignities of being a fifth grader instead of minorities and liberals. (A 2016 Daily Dot story about her bore the unintentionally profound headline, “This sweary, savvy, 11-year-old gamer girl is the future of YouTube.”) In more recent videos, Soph discloses a health issue that kept her out of class for long stretches. She confesses to being unhappy in school. She talks about a move from New York to California. She identifies by turns as “right-wing” and “anarcho-capitalist.” She’s 14, precocious, isolated, and pissed off, a combination that has produced a lot of bad behavior over the years, but not all of it monetized through preroll ads and a Patreon, and not all of it streamed to millions.

YouTube’s kid problem is well-known. From disturbing auto-generated cartoons, to parents who play act violence with their children for clicks, to a network of users exploiting videos of children for sexual content, the company has consistently failed at protecting the young users who are its most valuable assets. But Soph’s popularity raises another, perhaps more difficult question, about whether YouTube has an obligation to protect such users from themselves — and each other.

YouTube Terminated The Account Of A 14-Year-Old Star Over Her Anti-Gay Video

BuzzFeed wrote:

YouTube has removed the account of Soph, a 14-year-old girl who accumulated nearly a million followers through racist and anti-Muslim videos, after she uploaded an anti-LGBTQ video on July 31.

In the video, a 12-minute anti-gay rant titled “Pride and Prejudice,” Soph encourages her followers to “make sure to blame me in your manifestos” — a direct reference to the kind of document posted to 8chan by the Christchurch shooter who killed 51 people in March. On Saturday, the El Paso shooting suspect would also publish a manifesto to 8chan before killing 22 people.

According to YouTube, the channel was removed in accordance with its strike system, under which YouTube terminates accounts that violate its Community Guidelines three times in 90 days; in the case of Soph’s channel, the final strike violated the site’s Community Guidelines on hate speech.

...

After her channel was taken down, Soph tweeted an image of herself with what appears to be an assault rifle, with the caption “youtube headquarters here I come.” She later removed the tweet. “Gun tweet obviosly a joke,” she wrote in a follow-up tweet.

The 'obvios' joke:

IMAGE(https://i.imgur.com/zOgOfHv.png)

Oh man. Right wing comedy just slays me.

White House drafting executive order to tackle Silicon Valley’s alleged anti-conservative bias

Politico wrote:

The White House is circulating drafts of a proposed executive order that would address allegations of anti-conservative bias by social media companies, according to a White House official and two other people familiar with the matter — a month after President Donald Trump pledged to explore "all regulatory and legislative solutions" on the issue.

None of the three would describe the contents of the order, which one person cautioned has already taken many different forms and remains in flux. But its existence, and the deliberations surrounding it, are evidence that the administration is taking a serious look at wielding the federal government’s power against Silicon Valley.

“If the internet is going to be presented as this egalitarian platform and most of Twitter is liberal cesspools of venom, then at least the president wants some fairness in the system,” the White House official said. “But look, we also think that social media plays a vital role. They have a vital role and an increasing responsibility to the culture that has helped make them so profitable and so prominent."

Two other people knowledgeable about the discussions also confirmed the existence of the draft order.

None of the three people could say what penalties, if any, the order would envision for companies deemed to be censoring political viewpoints. The order, which deals with other topics besides tech bias, is still in the early drafting stages and is not expected to be issued imminently.

Ah, the clusterf*ck keeps on f*cking.

Liberal cesspools of venom.

Non-zero chance they try to make calling someone a racist or white supremacist count as "dangerous hate speech," equivalent to the average comments at Stormfront.

Mitch McConnell's campaign discovers that the way Twitter responds to harassment is to shut down the victims, learns the wrong lesson from it:

Ars Technica: Republicans suspend Twitter ad spending after boneheaded video takedown

[...]

McConnell was understandably concerned about these threats and posted the video to his campaign's Twitter account to demonstrate the threatening tone of the protest.

But Twitter has a blanket policy against posting content containing violent threats, regardless of the context. Bizarrely, Twitter's policies don't seem to distinguish videos posted by people making threats from videos posted by targets of those threats. Both types of videos are banned. So Twitter hid the tweet and locked McConnell's account, preventing his campaign from posting new tweets.

"The user was temporarily locked out of their account for a tweet that violated our violent threats policy, specifically threats involving physical safety," Twitter said in a statement to The Washington Post.

I could have told you back in 2014 that Twitter's moderation is context-blind. No anti-conservative bias needed.

I was right that his account was banned, just wrong about the reason. I thought it was due to the tombstones he posted of Merrick Garland and Amy McGrath.

SixteenBlue wrote:

Liberal cesspools of venom.

Just as long as they bring Tom Hardy back.

Four millionth verse, same as the first

When Matheus Dominguez was 16, YouTube recommended a video that changed his life.

He was in a band in Niterói, a beach-ringed city in Brazil, and practiced guitar by watching tutorials online.

YouTube had recently installed a powerful new artificial intelligence system that learned from user behavior and paired videos with recommendations for others. One day, it directed him to an amateur guitar teacher named Nando Moura, who had gained a wide following by posting videos about heavy metal, video games and, most of all, politics.

In colorful and paranoid far-right rants, Mr. Moura accused feminists, teachers and mainstream politicians of waging vast conspiracies. Mr. Dominguez was hooked.

As his time on the site grew, YouTube recommended videos from other far-right figures. One was a lawmaker named Jair Bolsonaro, then a marginal figure in national politics — but a star in YouTube’s far-right community in Brazil, where the platform has become more widely watched than all but one TV channel.

Last year, he became President Bolsonaro.

“YouTube became the social media platform of the Brazilian right,” said Mr. Dominguez, now a lanky 17-year-old who says he, too, plans to seek political office.

Members of the nation’s newly empowered far right — from grass-roots organizers to federal lawmakers — say their movement would not have risen so far, so fast, without YouTube’s recommendation engine.

New research has found they may be correct. YouTube’s search and recommendation system appears to have systematically diverted users to far-right and conspiracy channels in Brazil.

A New York Times investigation in Brazil found that, time and again, videos promoted by the site have upended central elements of daily life.

Teachers describe classrooms made unruly by students who quote from YouTube conspiracy videos or who, encouraged by right-wing YouTube stars, secretly record their instructors.

Some parents look to “Dr. YouTube” for health advice but get dangerous misinformation instead, hampering the nation’s efforts to fight diseases like Zika. Viral videos have incited death threats against public health advocates.

And in politics, a wave of right-wing YouTube stars ran for office alongside Mr. Bolsonaro, some winning by historic margins. Most still use the platform, governing the world’s fourth-largest democracy through internet-honed trolling and provocation.

YouTube’s recommendation system is engineered to maximize watchtime, among other factors, the company says, but not to favor any political ideology. The system suggests what to watch next, often playing the videos automatically, in a never-ending quest to keep us glued to our screens.

But the emotions that draw people in — like fear, doubt and anger — are often central features of conspiracy theories, and in particular, experts say, of right-wing extremism.

As the system suggests more provocative videos to keep users watching, it can direct them toward extreme content they might otherwise never find. And it is designed to lead users to new topics to pique new interest — a boon for channels like Mr. Moura’s that use pop culture as a gateway to far-right ideas.

The system now drives 70 percent of total time on the platform, the company says. As viewership skyrockets globally, YouTube is bringing in over $1 billion a month, some analysts believe.
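To make the mechanism the article describes concrete, here is a toy sketch (assumptions: this is an illustration only, not YouTube's actual system; the `Video` type and `predicted_watch_minutes` field are hypothetical). It just shows why ranking purely by predicted watch time, with no other objective, will keep surfacing whatever holds attention longest, including provocative content:

```python
# Toy illustration: rank candidate videos only by predicted watch time, the factor
# the quoted article says the recommendation system is engineered to maximize
# (among others). Nothing here is YouTube's real code or data.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # what an engagement model might estimate

def recommend_next(candidates: list[Video], k: int = 3) -> list[Video]:
    # Sort by the single engagement proxy, highest first, and keep the top k.
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]

candidates = [
    Video("guitar tutorial", 6.0),
    Video("calm news recap", 4.5),
    Video("outrage-bait conspiracy rant", 11.0),  # fear and anger keep people watching
]
for v in recommend_next(candidates):
    print(v.title, v.predicted_watch_minutes)
```

With only watch time in the objective, the rant sorts to the top every time; that is the dynamic the reporting above attributes to the real system, at vastly larger scale.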

This article also neatly underlines how f*cking useless YouTube's "moderation" is:

Threats of rape and torture filled Ms. Diniz’s phone and email. Some cited her daily routines. Many echoed claims from Mr. Küster’s videos, she said.

Mr. Küster gleefully mentioned, though never explicitly endorsed, the threats. That kept him just within YouTube’s rules.

Facebook bans ads from The Epoch Times after huge pro-Trump buy

Facebook has banned The Epoch Times, a conservative news outlet that spent more money on pro-Trump Facebook advertisements than any group other than the Trump campaign, from any future advertising on the platform.

The decision follows an NBC News report that The Epoch Times had shifted its spending on Facebook in the last month, seemingly in an effort to obfuscate its connection to some $2 million worth of ads that promoted the president and conspiracy theories about his political enemies.

"Over the past year we removed accounts associated with the Epoch Times for violating our ad policies, including trying to get around our review systems," a Facebook spokesperson said. "We acted on additional accounts today and they are no longer able to advertise with us."

Facebook's decision came as a result of a review prompted by questions from NBC News. The spokesperson explained that ads must include disclaimers that accurately represent the name of the ad's sponsors.

The Epoch Times' new method of pushing the pro-Trump conspiracy ads on Facebook, which appeared under page names such as "Honest Paper" and "Pure American Journalism," allowed the organization to hide its multimillion-dollar spending on dark-money ads, in effect bypassing Facebook's political advertising transparency rules. Facebook's ban will affect only The Epoch Times' ability to buy ads; the sock-puppet pages created to host the new policy-violating ads were still live at the time of publication.

Nicholas Fouriezos, a reporter for the website OZY, tweeted about the move Thursday. It was first spotted last week by Lachlan Markay of The Daily Beast.

A recent NBC News investigation revealed how The Epoch Times had evolved from a nonprofit newspaper that carried a Chinese-American religious movement's anti-communism message into a conservative online news behemoth that embraced President Donald Trump and conspiracy content.

It blows my mind that YouTube and Facebook will be the downfall of humanity. Just utterly blows my mind.

Also, iFunny is run by a Russian guy! Seems like a non-zero amount of this modern ultra-nationalist far-right stuff is six degrees of separation from Russia.

Except, more like three degrees. Or less.

Prederick wrote:

Also, iFunny is run by a Russian guy! Seems like a non-zero amount of this modern ultra-nationalist far-right stuff is six degrees of separation from Russia.

Except, more like three degrees. Or less.

iFunny sounds like something Sacha Baron Cohen came up with as a comedy show for a Russian character he portrays.

So, in a hilarious development, a tweet I did for my job this evening ended up getting picked up by the QAnon crew. There's nothing to it, outside of the deranged fever dreams of the QAnoners, but suddenly our account, which on a good day MIGHT get a 10-RT tweet, now has a 400-RT tweet sitting there.

I'm hoping I don't have to explain this to my boss.

Which pizza restaurant do you work for?

iaintgotnopants wrote:

Which pizza restaurant do you work for?

Make 2 intentional -haussered mistakes if you personally know JFK Jr.