[News] The Internet Was a Mistake

A thread for updates on the various ways the internet is destroying everything and the undying hellsites of social media. Let's all laugh at the abyss.

OnlyFans to Block Sexually Explicit Videos Starting in October

The cycle continues. Platforms use NSFW content to build their foundation, secure the bag, then toss sex workers off.

As many online have noted, OnlyFans appears to be Tumblrizing itself, but someone else made a good, if infuriating, point: they'd all have been better off being racist trolls.

And that's not just an OnlyFans thing, it's a thing for all platforms. Across the board, you are far better off, and will be able to make far more money without getting bothered, just by being an abusive, racist troll than you would by being a sex worker.

EDIT: Also, a thing I realized, which I personally found just horrifically, darkly hilarious?

If you are an adult content creator, or a consumer, who is frustrated by banks and payment processors essentially controlling what you can and cannot pay for over the internet...

(And that truly is the crux of the problem here. Yes, it's the platforms, but the major pressure is coming from the banks and payment processors. If Visa/Mastercard looks at your site and goes "Nah, we're not helping you buy stuff," most sites are virtually crippled instantaneously.)

...and you are looking for an alternative...

Well, the most popular, accessible, reasonable one is the one that most of the people I've seen tweeting in support of sex workers absolutely despise. It rhymes with "schmipto flourency."

It's like the purest distillation of "there is no ethical consumption." You're stuck between the vicissitudes of Visa and whatever Elon Musk is sh*tposting about.

That's disappointing. So many places just rip off content. It seemed like this was one place where pornography could actually operate as a legitimate service, with money going directly to creators.

My former employer, PNC Bank, was proud of its ties to child sex trafficking groups like the Catholic Archdiocese of Chicago. But it refused to do any sort of business with any strip club in the state.

I wish there was a way to word this without making my antisemitism sense trigger, but the banks really do control everything.

Ah, looks like the Beeb did an exposé on OnlyFans recently.

That cannot have helped with the pressure.

"A woman who had sex on camera with a man to whom she was showing a property, the BBC investigation showed, in fact had no real estate credentials."

Current Affairs is in the news for discourse reasons right now; however, this article from last year is very, very trenchant.

Paywalls are justified, even though they are annoying. It costs money to produce good writing, to run a website, to license photographs. A lot of money, if you want quality. Asking people for a fee to access content is therefore very reasonable. You don’t expect to get a print subscription to the newspaper gratis, why would a website be different? I try not to grumble about having to pay for online content, because I run a magazine and I know how difficult it is to pay writers what they deserve.

But let us also notice something: the New York Times, the New Yorker, the Washington Post, the New Republic, New York, Harper’s, the New York Review of Books, the Financial Times, and the London Times all have paywalls. Breitbart, Fox News, the Daily Wire, the Federalist, the Washington Examiner, InfoWars: free! You want “Portland Protesters Burn Bibles, American Flags In The Streets,” “The Moral Case Against Mask Mandates And Other COVID Restrictions,” or an article suggesting the National Institutes of Health has admitted 5G phones cause coronavirus—they’re yours. You want the detailed Times reports on neo-Nazis infiltrating German institutions, the reasons contact tracing is failing in U.S. states, or the Trump administration’s undercutting of the USPS’s effectiveness—well, if you’ve clicked around the website a bit you’ll run straight into the paywall. This doesn’t mean the paywall shouldn’t be there. But it does mean that it costs time and money to access a lot of true and important information, while a lot of bullsh*t is completely free.

Maaaan those are bitter words, especially coming from Robinson’s pen. I realize there is no ethical consumption in capitalism but between this and the recent, umm, revelations about what the leftist thinks about corporate management, it sure shakes up my opinion of him.

I don’t want to sully Robinson too much. But you’re not wrong, Prederick, he’s not having a pleasant 2021.

It is a pretty serious issue, and I don't think there are any good solutions that don't raise huge First Amendment issues. Good journalism is no longer economically viable, but government subsidies would bring a host of problems. Bad journalism fills the gap.

Mixolyde wrote:

It is a pretty serious issue, and I don't think there are any good solutions that don't raise huge First Amendment issues. Good journalism is no longer economically viable, but government subsidies would bring a host of problems. Bad journalism fills the gap.

Don't forget the public radio/television funding model. I'm sure that their journalists could get more money elsewhere, but it seems (from the outside at least) that it's not a pittance. (Speaking as a long-time supporter of WNYC.)

ProPublica seems to make it work without a paywall (disclosure: I'm a donor), and I feel like the NYT's best journalism is usually in partnership with them.

Vox is also pretty good.

I don't know who Nathan Robinson is, or what Current Affairs is. Taken in isolation, though, there is a lot of sense in his article about paywalls.

I don't claim to be any kind of authority myself, but I did work in newspapers for many years, and a lot of my closest friends are (or were) journalists and editors. I've seen first-hand what has happened in the internet era, with publications making all their material free online and relying more and more on online ad revenue versus sales and subscriptions.

Case in point: Until recently, a friend worked for one of the largest regional dailies in the UK, owned by the largest newspaper chain. Online ad revenue was so important that there were electronic "leaderboards" installed around the office, showing which articles were performing best online. This was the metric on which management was most focused. If you wrote a few articles in a row that underperformed (i.e., did not get enough clicks), you might be called into the editor's office for a chat.

The implications of this are obvious. Why spend a week working on an in-depth, quality article with multiple sources when you could get 100 times as many clicks and shares with, "Look at this picture of a dildo that was found in a kids' playground!" or some other such rubbish that has no value? Quality is disincentivized. Figuring out ways to get people to click on a link is what it's all about.

So much of journalism has become a race to the bottom: what can we do to get the most clicks, and therefore the most ad revenue? And the consequences are not just dumb clickbait about sex toys under the slide at a park. As Robinson hints at, outrage and hyperbole are a great way to get more clicks, too.

Most people just don't get this, or don't think about it. When you complain about the ad-supported, content-for-free model on so many internet sites, a lot of folks just shrug and say, "Oh, I don't mind looking at a few ads if it means I can access the stuff for free." It's not that simple. It's literally not the same content that you would be getting if the site were not reliant entirely on ad revenue and clicks.

Looks like OnlyFans has reversed its policy change.
Pinned Twitter post

Thank you to everyone for making your voices heard.
We have secured assurances necessary to support our diverse creator community and have suspended the October 1 policy change.
OnlyFans stands for inclusion and we continue to provide a home for all creators.

Apropos of nothing, I will never, ever stop being amused by Internet Dudes who take being blocked as a sign of their Ultimate Victory.

I'm not quite fully blown "Social media is one of the most harmful, destructive inventions in the history of mankind" yet...

....but I'm like 75% of the way there.

I'm 100% at "shutting down Facebook or Twitter is the single greatest thing Zuckerberg or Dorsey could do to help humanity"

Even the non-misinformation stuff is extremely harmful. It is a platform to glorify and amplify narcissists. In normal times it's dangerous. During COVID it is extremely deadly.

They have all the worst aspects of communication (miscommunication, impulsiveness, ease of recycling and proliferation, self-fulfilling narratives, stubbornness, enabling, trivialized empathy, nothing ever dying, legitimization of the untested and unproven...) boiled into one service.

I am an Australian living in Australia. For some reason FB has decided that I really want to see yet another public mournsturbation event of 13 beers being set out on a table to go flat at various bars across America.

They aren't ads, so they are not easily dismissed. They are "Suggested" posts. I am seeing 10+ different versions of this every day.

I wonder if any of these bars left beers out for the other 2,400+

Conspiracy Wall - The memeable trope distracts us from the way conspiracy theories actually spread

Key part, to me:

The end result is always the same: You don’t need to learn the catechism; you only have to want the Good News. All that’s required for buy-in is the final conclusion: the election was rigged, masks aren’t necessary, vaccines are poison. Ultimately, people who subscribe to the kinds of theories peddled by Alex Jones or Marjorie Taylor Greene are not looking for a causal, logical explanation for the state of the world we live in, but rather for permission to feel how they want to feel or embrace the behavior they’re looking to embrace. The less said, the better.

The inalienable right to be the complete asshole you always were.

TIL fake news is like alcohol.

High Court finds media outlets are responsible for Facebook comments in Dylan Voller defamation case

The High Court has dismissed an appeal by some of Australia's biggest media outlets including The Sydney Morning Herald and The Australian, finding they are the publishers of third-party comments on their Facebook pages.
Former Northern Territory detainee Dylan Voller wants to sue the companies over alleged defamatory comments on their Facebook pages in the New South Wales Supreme Court.

But the case had been stalled by the dispute over whether the outlets were the publishers of the material.

The High Court today found that, by running the Facebook pages, the media groups participated in communicating any defamatory material posted by third parties and are therefore responsible for the comments.

Mr GT Chris wrote:

The High Court today found that, by running the Facebook pages, the media groups participated in communicating any defamatory material posted by third parties and are therefore responsible for the comments.

I applaud this. Frankly, I think Facebook should also be held liable. In any first-world country, if a physical newspaper published a "letter to the editor" full of libel and defamation, the newspaper would be sued. The law does not allow them to shrug and say, "It was the letter-writer who said all that stuff. We were just the platform." No - providing the platform in the first place makes you liable. You are responsible for the content you publish.

Really makes no sense to me that social-media companies are not held to these same legal standards.

Troll farms reached 140 million Americans a month on Facebook before 2020 election, internal report shows

MIT Technology Review wrote:

In the run up to the 2020 election, the most highly contested in US history, Facebook’s most popular pages for Christian and Black American content were being run by Eastern European troll farms. These pages were part of a larger network that collectively reached nearly half of all Americans, according to an internal company report, and achieved that reach not through user choice but primarily as a result of Facebook’s own platform design and engagement-hungry algorithm.

The report, written in October 2019 and obtained by MIT Technology Review from a former Facebook employee not involved in researching it, found that after the 2016 election, Facebook failed to prioritize fundamental changes to how its platform promotes and distributes information. The company instead pursued a whack-a-mole strategy that involved monitoring and quashing the activity of bad actors when they engaged in political discourse, and adding some guardrails that prevented “the worst of the worst.”

But this approach did little to stem the underlying problem, the report noted. Troll farms were still building massive audiences by running networks of Facebook pages, with their content reaching 140 million US users per month—75% of whom had never followed any of the pages. They were seeing the content because Facebook’s content-recommendation system had pushed it into their news feeds.

“Instead of users choosing to receive content from these actors, it is our platform that is choosing to give [these troll farms] an enormous reach,” wrote the report’s author, Jeff Allen, a former senior-level data scientist at Facebook.

Joe Osborne, a Facebook spokesperson, said in a statement that the company “had already been investigating these topics” at the time of Allen’s report. “Since that time, we have stood up teams, developed new policies and collaborated with industry peers to address these networks. We’ve taken aggressive enforcement actions against these kinds of foreign and domestic inauthentic groups and have shared the results publicly on a quarterly basis.”

In the process of fact checking this story shortly before publication, MIT Technology Review found that five of the troll-farm pages mentioned in the report remained active.

The report found that troll farms were reaching the same demographic groups singled out by the Kremlin-backed Internet Research Agency (IRA) during the 2016 election, which had targeted Christians, Black Americans, and Native Americans. A 2018 BuzzFeed News investigation found that at least one member of the Russian IRA, indicted for alleged interference in the 2016 US election, had also visited Macedonia around the emergence of its first troll farms, though it didn’t find concrete evidence of a connection. (Facebook said its investigations hadn’t turned up a connection between the IRA and Macedonian troll farms, either.)

“This is not normal. This is not healthy,” Allen wrote. “We have empowered inauthentic actors to accumulate huge followings for largely unknown purposes ... The fact that actors with possible ties to the IRA have access to huge audience numbers in the same demographic groups targeted by the IRA poses an enormous risk to the US 2020 election.”

As long as troll farms found success in using these tactics, any other bad actor could too, he continued: “If the Troll Farms are reaching 30M US users with content targeted to African Americans, we should not at all be surprised if we discover the IRA also currently has large audiences there.”

Allen wrote the report as the fourth and final installment of a year-and-a-half-long effort to understand troll farms. He left the company that same month, in part because of frustration that leadership had “effectively ignored” his research, according to the former Facebook employee who supplied the report. Allen declined to comment.

The report reveals the alarming state of affairs in which Facebook leadership left the platform for years, despite repeated public promises to aggressively tackle foreign-based election interference. MIT Technology Review is making the full report available, with employee names redacted, because it is in the public interest.

Its revelations include:

  • As of October 2019, around 15,000 Facebook pages with a majority US audience were being run out of Kosovo and Macedonia, known bad actors during the 2016 election.
  • Collectively, those troll-farm pages—which the report treats as a single page for comparison purposes—reached 140 million US users monthly and 360 million global users weekly. Walmart’s page reached the second-largest US audience at 100 million.
  • The troll farm pages also combined to form:
    • the largest Christian American page on Facebook, 20 times larger than the next largest—reaching 75 million US users monthly, 95% of whom had never followed any of the pages.
    • the largest African-American page on Facebook, three times larger than the next largest—reaching 30 million US users monthly, 85% of whom had never followed any of the pages.
    • the second-largest Native American page on Facebook, reaching 400,000 users monthly, 90% of whom had never followed any of the pages.
    • the fifth-largest women’s page on Facebook, reaching 60 million US users monthly, 90% of whom had never followed any of the pages.

From Instagram’s Toll on Teens to Unmoderated ‘Elite’ Users, Here’s a Break Down of the Wall Street Journal’s Facebook Revelations

A series of investigative reports being rolled out by the Wall Street Journal is putting a spotlight on the behind-the-scenes actions of Facebook. Ranging from rule exemptions for high-profile users to Instagram's toll on teens' mental health, the Journal's "Facebook Files" expose internal Facebook research that appears to show just how knowledgeable the company was of the platform's "ill effects."

The Journal’s reports, three so far, are based on a review of internal Facebook documents, including research reports, online employee discussions and drafts of presentations to senior management. They reveal that the company has allegedly failed to fix numerous problems that it’s long known about, and in some cases made them worse.

“Time and again, the documents show, Facebook’s researchers have identified the platform’s ill effects. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them,” the Journal reports. “The documents offer perhaps the clearest picture thus far of how broadly Facebook’s problems are known inside the company, up to the chief executive himself.”

Social media companies are the tobacco industry of the 21st century.

That is a phenomenal analogy, dejanzie, and to take it a step further, the "secondhand smoke" equivalent with social media has a much wider and farther reach.

No More Apologies: Inside Facebook’s Push to Defend Its Image

NYT wrote:

Mark Zuckerberg, Facebook’s chief executive, signed off last month on a new initiative code-named Project Amplify.

The effort, which was hatched at an internal meeting in January, had a specific purpose: to use Facebook’s News Feed, the site’s most important digital real estate, to show people positive stories about the social network.

The idea was that pushing pro-Facebook news items — some of them written by the company — would improve its image in the eyes of its users, three people with knowledge of the effort said. But the move was sensitive because Facebook had not previously positioned the News Feed as a place where it burnished its own reputation. Several executives at the meeting were shocked by the proposal, one attendee said.

Project Amplify punctuated a series of decisions that Facebook has made this year to aggressively reshape its image. Since that January meeting, the company has begun a multipronged effort to change its narrative by distancing Mr. Zuckerberg from scandals, reducing outsiders’ access to internal data, burying a potentially negative report about its content and increasing its own advertising to showcase its brand.

The moves amount to a broad shift in strategy. For years, Facebook confronted crisis after crisis over privacy, misinformation and hate speech on its platform by publicly apologizing. Mr. Zuckerberg personally took responsibility for Russian interference on the site during the 2016 presidential election and has loudly stood up for free speech online. Facebook also promised transparency into the way that it operated.

But the drumbeat of criticism on issues as varied as racist speech and vaccine misinformation has not relented. Disgruntled Facebook employees have added to the furor by speaking out against their employer and leaking internal documents. Last week, The Wall Street Journal published articles based on such documents that showed Facebook knew about many of the harms it was causing.

So Facebook executives, concluding that their methods had done little to quell criticism or win supporters, decided early this year to go on the offensive, said six current and former employees, who declined to be identified for fear of reprisal.

“They’re realizing that no one else is going to come to their defense, so they need to do it and say it themselves,” said Katie Harbath, a former Facebook public policy director.

The changes have involved Facebook executives from its marketing, communications, policy and integrity teams. Alex Schultz, a 14-year company veteran who was named chief marketing officer last year, has also been influential in the image reshaping effort, said five people who worked with him. But at least one of the decisions was driven by Mr. Zuckerberg, and all were approved by him, three of the people said.

...

So in January, executives held a virtual meeting and broached the idea of a more aggressive defense, one attendee said. The group discussed using the News Feed to promote positive news about the company, as well as running ads that linked to favorable articles about Facebook. They also debated how to define a pro-Facebook story, two participants said.

That same month, the communications team discussed ways for executives to be less conciliatory when responding to crises and decided there would be less apologizing, said two people with knowledge of the plan.

Mr. Zuckerberg, who had become intertwined with policy issues including the 2020 election, also wanted to recast himself as an innovator, the people said. In January, the communications team circulated a document with a strategy for distancing Mr. Zuckerberg from scandals, partly by focusing his Facebook posts and media appearances on new products, they said.

Ugh