Author Topic: Facebook's Thought Policing.  (Read 1152 times)


Offline Yuri Bezmenov

  • Drunk-assed squadron leader
  • Obsessive Postwhore
  • *****
  • Posts: 6663
  • Karma: 0
  • Communist propaganda is demoralizing the West.
Facebook's Thought Policing.
« on: December 27, 2018, 07:22:10 PM »
This is just part of what's going on in Silicon Valley; Facebook is also acting in coordination with other platforms to squelch "wrongthink".

https://www.msn.com/en-us/news/technology/inside-facebook’s-secret-rulebook-for-global-political-speech/ar-BBRvjsg?ocid=spartanntp

MENLO PARK, Calif. — In a glass conference room at its California headquarters, Facebook is taking on the bonfires of hate and misinformation it has helped fuel across the world, one post at a time.
The social network has drawn criticism for undermining democracy and for provoking bloodshed in societies small and large.
But for Facebook, it’s also a business problem.
The company, which makes about $5 billion in profit per quarter, has to show that it is serious about removing dangerous content. It must also continue to attract more users from more countries and try to keep them on the site longer.
How can Facebook monitor billions of posts per day in over 100 languages, all without disturbing the endless expansion that is core to its business? The company’s solution: a network of workers using a maze of PowerPoint slides spelling out what’s forbidden.
Every other Tuesday morning, several dozen Facebook employees gather over breakfast to come up with the rules, hashing out what the site’s two billion users should be allowed to say. The guidelines that emerge from these meetings are sent out to 7,500-plus moderators around the world.
The closely held rules are extensive, and they make the company a far more powerful arbiter of global speech than has been publicly recognized or acknowledged by the company itself, The New York Times has found.
The Times was provided with more than 1,400 pages from the rulebooks by an employee who said he feared that the company was exercising too much power, with too little oversight — and making too many mistakes.
An examination of the files revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others.
Moderators were once told, for example, to remove fund-raising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook’s internal list of banned groups. In Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months. In India, moderators were mistakenly told to take down comments critical of religion.
The Facebook employees who meet to set the guidelines, mostly young engineers and lawyers, try to distill highly complex issues into simple yes-or-no rules. Then the company outsources much of the actual post-by-post moderation to companies that enlist largely unskilled workers, many hired out of call centers.
Those moderators, at times relying on Google Translate, have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day. When is a reference to “jihad,” for example, forbidden? When is a “crying laughter” emoji a warning sign?
Moderators express frustration at rules they say don’t always make sense and sometimes require them to leave up posts they fear could lead to violence. “You feel like you killed someone by not acting,” one said, speaking on the condition of anonymity because he had signed a nondisclosure agreement.
Facebook executives say they are working diligently to rid the platform of dangerous posts.
“It’s not our place to correct people’s speech, but we do want to enforce our community standards on our platform,” said Sara Su, a senior engineer on the News Feed. “When you’re in our community, we want to make sure that we’re balancing freedom of expression and safety.”
Monika Bickert, Facebook’s head of global policy management, said that the primary goal was to prevent harm, and that to a great extent, the company had been successful. But perfection, she said, is not possible.
“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” Ms. Bickert said. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”
The Rules
The Facebook guidelines do not look like a handbook for regulating global politics. They consist of dozens of unorganized PowerPoint presentations and Excel spreadsheets with bureaucratic titles like “Western Balkans Hate Orgs and Figures” and “Credible Violence: Implementation standards.”
Because Facebook drifted into this approach somewhat by accident, there is no single master file or overarching guide, just a patchwork of rules set out by different parts of the company. Facebook confirmed the authenticity of the documents, though it said some had been updated since The Times acquired them.
The company’s goal is ambitious: to reduce context-heavy questions that even legal experts might struggle with — when is an idea hateful, when is a rumor dangerous — to one-size-fits-all rules. By telling moderators to follow the rules blindly, Facebook hopes to guard against bias and to enforce consistency.
Facebook says the files are only for training, but moderators say they are used as day-to-day reference materials.
Taken individually, each rule might make sense. But in their byzantine totality, they can be baffling.
One document sets out several rules just to determine when a word like “martyr” or “jihad” indicates pro-terrorism speech. Another describes when discussion of a barred group should be forbidden. Words like “brother” or “comrade” probably cross the line. So do any of a dozen emojis.
The guidelines for identifying hate speech, a problem that has bedeviled Facebook, run to 200 jargon-filled, head-spinning pages. Moderators must sort a post into one of three “tiers” of severity. They must bear in mind lists like the six “designated dehumanizing comparisons,” among them comparing Jews to rats.
“There’s a real tension here between wanting to have nuances to account for every situation, and wanting to have a set of policies we can enforce accurately and we can explain cleanly,” said Ms. Bickert, the Facebook executive.
Though the Facebook employees who make the rules are largely free to set policy however they wish, and often do so in the room, they also consult with outside groups.
“We’re not drawing these lines in a vacuum,” Ms. Bickert said.
An Unseen Branch of Government
As detailed as the guidelines can be, they are also approximations — best guesses at how to fight extremism or disinformation. And they are leading Facebook to intrude into sensitive political matters the world over, sometimes clumsily.
Increasingly, the decisions on what posts should be barred amount to regulating political speech — and not just on the fringes. In many countries, extremism and the mainstream are blurring.
In the United States, Facebook banned the Proud Boys, a far-right pro-Trump group. The company also blocked an inflammatory ad, about a caravan of Central American migrants, that was produced by President Trump’s political team.
In June, according to internal emails reviewed by The Times, moderators were told to allow users to praise the Taliban — normally a forbidden practice — if they mentioned its decision to enter into a cease-fire. In another email, moderators were told to hunt down and remove rumors wrongly accusing an Israeli soldier of killing a Palestinian medic.
“Facebook’s role has become so hegemonic, so monopolistic, that it has become a force unto itself,” said Jasmin Mujanovic, an expert on the Balkans. “No one entity, especially not a for-profit venture like Facebook, should have that kind of power to influence public debate and policy.”
In Pakistan, shortly before elections were held in July, Facebook issued its moderators a 40-page document outlining “political parties, expected trends and guidelines.”
Pakistan, one of the world’s largest and most fragile democracies, enforces a media blackout on Election Day. This makes Facebook a center of news and discussion during voting.
The document most likely shaped those conversations — even if Pakistanis themselves had no way of knowing it. Moderators were urged, in one instance, to apply extra scrutiny to Jamiat Ulema-e-Islam, a hard-line religious party. But another religious party, Jamaat-e-Islami, was described as “benign.”
Though Facebook says its focus is protecting users, the documents suggest that other concerns come into play. Pakistan guidelines warn moderators against creating a “PR fire” by taking any action that could “have a negative impact on Facebook’s reputation or even put the company at legal risk.”
In India, Chinmayi Arun, a legal scholar, identified troubling mistakes in Facebook’s guidelines.
One slide tells moderators that any post degrading an entire religion violates Indian law and should be flagged for removal. It is a significant curb on speech — and apparently incorrect. Indian law prohibits blasphemy only in certain conditions, Ms. Arun said, such as when the speaker intends to inflame violence.
Another slide says that Indian law prohibits calls for an independent Kashmir, which some legal scholars dispute. The slide instructs moderators to “look out for” the phrase “Free Kashmir” — though the slogan, common among activists, is completely legal.
Facebook says it is simply urging moderators to apply extra scrutiny to posts that use the phrase. Still, even this could chill activism in Kashmir. And it is not clear that the distinction will be obvious to moderators, who are warned that ignoring violations could get Facebook blocked in India.
‘Things Explode Really Fast’
In the absence of governments or international bodies that can set standards, Facebook is experimenting on its own.
The company never set out to play this role, but in an effort to control problems of its own creation, it has quietly become, with a speed that makes even employees uncomfortable, what is arguably one of the world’s most powerful political regulators.
“A lot of this would be a lot easier if there were authoritative third parties that had the answer,” said Brian Fishman, a counterterrorism expert who works with Facebook.
“Sometimes these things explode really fast,” Mr. Fishman said, “and we have to figure out what our reaction’s going to be, and we don’t have time for the U.N.”
But the results can be uneven.
Consider the guidelines for the Balkans, where rising nationalism is threatening to reignite old violence. The file on that region, not updated since 2016, includes odd errors. Ratko Mladic, a Bosnian war criminal still celebrated by extremists, is described as a fugitive. In fact, he was arrested in 2011.
The slides are apparently written for English speakers relying on Google Translate, suggesting that Facebook remains short on moderators who speak local languages — and who might understand local contexts crucial for identifying inflammatory speech. And Google Translate can be unreliable: Mr. Mladic is referred to in one slide as “Rodney Young.”
The guidelines, said Mr. Mujanovic, the Balkans expert, appear dangerously out of date. They have little to say about ultranationalist groups stoking political violence in the region.
Nearly every Facebook employee who spoke to The Times cited, as proof of the company’s competence, its response after the United Nations accused the platform of exacerbating genocide in Myanmar. The employees pointed to Facebook’s ban this spring on any positive mention of Ma Ba Tha, an extremist group that has been using the platform to incite violence against Muslims since 2014.
But puzzled activists in Myanmar say that, months later, posts supporting the group remain widespread.
The culprit may be Facebook’s own rulebooks. Guidelines for policing hate speech in Myanmar instruct moderators not to remove posts supporting Ma Ba Tha. Facebook corrected the mistake only in response to an inquiry from The Times.
Employees also touted their decision to shut down Facebook accounts belonging to senior military officials in Myanmar.
But the company did not initially notify Myanmar’s government, leading the barred officers to conclude that they had been hacked. Some blamed Daw Aung San Suu Kyi, the country’s de facto civilian leader, and the episode deepened distrust between her and the military, lawmakers say.
The Hate List
Facebook’s most politically consequential document may be an Excel spreadsheet that names every group and individual the company has quietly barred as a hate figure.
Moderators are instructed to remove any post praising, supporting or representing any listed figure.
Anton Shekhovtsov, an expert in far-right groups, said he was “confused about the methodology.” The company bans an impressive array of American and British groups, he said, but relatively few in countries where the far right can be more violent, particularly Russia or Ukraine.
Countries where Facebook faces government pressure seem to be better covered than those where it does not. Facebook blocks dozens of far-right groups in Germany, where the authorities scrutinize the social network, but only one in neighboring Austria.
The list includes a growing number of groups with one foot in the political mainstream, like the far-right Golden Dawn, which holds seats in the Greek and European Union parliaments.
For a tech company to draw these lines is “extremely problematic,” said Jonas Kaiser, a Harvard University expert on online extremism. “It puts social networks in the position to make judgment calls that are traditionally the job of the courts.”
The bans are a kind of shortcut, said Sana Jaffrey, who studies Indonesian politics at the University of Chicago. Asking moderators to look for a banned name or logo is easier than asking them to make judgment calls about when political views are dangerous.
But that means that in much of Asia and the Middle East, Facebook bans hard-line religious groups that represent significant segments of society. Blanket prohibitions, Ms. Jaffrey said, amount to Facebook shutting down one side in national debates.
And its decisions often skew in favor of governments, which can fine or regulate Facebook.

[Photo: © Adam Dean for The New York Times. The ruins of a home set upon by a Buddhist mob in a deadly attack in Sri Lanka last March. Facebook has been accused of accelerating violence in the country.]
In Sri Lanka, Facebook removed posts commemorating members of the Tamil minority who died in the country’s civil war. Facebook bans any positive mention of Tamil rebels, though users can praise government forces who were also guilty of atrocities.
Kate Cronin-Furman, a Sri Lanka expert at University College London, said this prevented Tamils from memorializing the war, allowing the government to impose its version of events — entrenching Tamils’ second-class status.
The View From Menlo Park
Facebook’s policies might emerge from well-appointed conference rooms, but they are executed largely by moderators in drab outsourcing offices in distant locations like Morocco and the Philippines.
Facebook says moderators are given ample time to review posts and don’t have quotas. Moderators say they face pressure to review about a thousand pieces of content per day. They have eight to 10 seconds for each post, longer for videos.
The moderators describe feeling in over their heads. For some, pay is tied to speed and accuracy. Many last only a few exhausting months. Front-line moderators have few mechanisms for alerting Facebook to new threats or holes in the rules — and little incentive to try, one said.
One moderator described an officewide rule to approve any post if no one on hand can read the appropriate language. This may have contributed to violence in Sri Lanka and Myanmar, where posts encouraging ethnic cleansing were routinely allowed to stay up.
Facebook says that any such practice would violate its rules, which include contingencies for reviewing posts in unfamiliar languages. Justin Osofsky, a Facebook vice president who oversees these contracts, said any corner-cutting probably came from midlevel managers at outside companies acting on their own.
This hints at a deeper problem. Facebook has little visibility into the giant outsourcing companies, which largely police themselves, and has at times struggled to control them. And because Facebook relies on the companies to support its expansion, its leverage over them is limited.
One hurdle to reining in inflammatory speech on Facebook may be Facebook itself. The platform relies on an algorithm that tends to promote the most provocative content, sometimes of the sort the company says it wants to suppress.
Facebook could blunt that algorithm or slow the company’s expansion into new markets, where it has proved most disruptive. But the social network instills in employees an almost unquestioned faith in their product as a force for good.
When Ms. Su, the News Feed engineer, was asked if she believed research finding that more Facebook usage correlates with more violence, she replied, “I don’t think so.”
“As we have greater reach, as we have more people engaging, that raises the stakes,” she said. “But I also think that there’s greater opportunity for people to be exposed to new ideas.”
Still, even some executives hesitate when asked whether the company has found the right formula.
Richard Allan, a London-based vice president who is also a sitting member of the House of Lords, said a better model might be “some partnership arrangement” with “government involved in setting the standards,” even if not all governments can be trusted with this power.
Mr. Fishman, the Facebook terrorism expert, said the company should consider deferring more decisions to moderators, who may better understand the nuances of local culture and politics.
But at company headquarters, the most fundamental questions of all remain unanswered: What sorts of content lead directly to violence? When does the platform exacerbate social tensions?
Rosa Birch, who leads an internal crisis team, said she and her colleagues had been posing these questions for years. They are making progress, she said, but will probably never have definitive answers.
But without a full understanding of the platform’s impact, most policies are just ad hoc responses to problems as they emerge. Employees make a tweak, wait to see what happens, then tweak again — as if repairing an airplane midflight.
In the meantime, the company continues to expand its reach to more users in more countries.
“One of the reasons why it’s hard to talk about,” Mr. Fishman said, “is because there is a lack of societal agreement on where this sort of authority should lie.”
But, he said, “it’s harder to figure out what a better alternative is.”

« Last Edit: January 01, 2019, 07:30:45 PM by Sofa King Wasted »

Offline Jesse

  • My mirror shows black (Otherwise known as nigger)
  • Elder
  • Obsessive Postwhore
  • *****
  • Posts: 6000
  • Karma: 110
  • Gender: Male
  • where mountains throne
Re: Facebook's Though Policing.
« Reply #1 on: December 28, 2018, 04:57:03 PM »
I've been banned several times. The longest was 7 days, twice, plus a few 24-hour bans and a few 3-day bans.

All for in jest, making fun of Muslims. Ghey People. Womins. Feminists

Made fun of a bunch of Christians, nothing happened. I don't know if they have an actual person looking at all the complaints or a computer program doing it, but it's so inconsistent.

It really is a bummer that Facebook has such a grip on your balls or titties, but they do. And there really is nothing you can do except try to regulate a private company, which to me is even scarier.
:skywarp:

Offline renaeden

  • Complicated Case of the Aspie Elite
  • Caretaker Admin
  • Almighty Postwhore
  • *****
  • Posts: 26132
  • Karma: 2535
  • Gender: Female
Re: Facebook's Though Policing.
« Reply #2 on: December 28, 2018, 11:05:36 PM »
I don't understand the title of this thread. Is it meant to be Through? Thorough?
Mildly Cute in a Retarded Way
Tek'ma'tae

Offline Minister Of Silly Walks

  • Elder
  • Dedicated Postwhore
  • *****
  • Posts: 4035
  • Karma: 421
Re: Facebook's Though Policing.
« Reply #3 on: December 28, 2018, 11:33:44 PM »
Thought policing is what I believe Pappy means.

Fuck that's a big copy paste. It's nearly as long as one of Lestat's posts, but a fuck of a lot less interesting.
“When men oppress their fellow men, the oppressor ever finds, in the character of the oppressed, a full justification for his oppression.” Frederick Douglass

Offline Calandale

  • Official sheep shagger of the aspie underclass
  • Elder
  • Postwhore Beyond The Pale
  • *****
  • Posts: 41238
  • Karma: -57
  • Gender: Male
  • peep
    • The Game Box: Live!
Re: Facebook's Though Policing.
« Reply #4 on: December 29, 2018, 06:25:39 AM »
Wait, you can get banned from facebook?

How have I not managed this?

Offline Walkie

  • Wooden sword crusader of the Aspie Elite
  • Elder
  • Dedicated Postwhore
  • *****
  • Posts: 3121
  • Karma: 352
Re: Facebook's Though Policing.
« Reply #5 on: December 29, 2018, 07:16:08 AM »
looks like the link's broken already :(

Offline Minister Of Silly Walks

  • Elder
  • Dedicated Postwhore
  • *****
  • Posts: 4035
  • Karma: 421
Re: Facebook's Though Policing.
« Reply #6 on: December 29, 2018, 04:34:28 PM »
Facebook has no responsibility to maintain Facebook as some kind of free speech platform. My Facebook feed sees more than enough islamophobic fake news and various other types of bigotry. I often feel compelled to confront the bs at the possible expense of friendship, and that's how Facebook can lose their revenue stream.

Doesn't really affect me. I don't particularly hate transgender people or Muslims or gays or women, and wouldn't post negative shit about them even if Facebook let me.
“When men oppress their fellow men, the oppressor ever finds, in the character of the oppressed, a full justification for his oppression.” Frederick Douglass

Offline Jesse

  • My mirror shows black (Otherwise known as nigger)
  • Elder
  • Obsessive Postwhore
  • *****
  • Posts: 6000
  • Karma: 110
  • Gender: Male
  • where mountains throne
Re: Facebook's Though Policing.
« Reply #7 on: December 29, 2018, 08:30:36 PM »
Quote from: Calandale on December 29, 2018, 06:25:39 AM
Wait, you can get banned from facebook?

How have I not managed this?
You seem to just keep to yourself there. Get in a heated debate, and somebody will report you
:skywarp:

Offline Calandale

  • Official sheep shagger of the aspie underclass
  • Elder
  • Postwhore Beyond The Pale
  • *****
  • Posts: 41238
  • Karma: -57
  • Gender: Male
  • peep
    • The Game Box: Live!
Re: Facebook's Though Policing.
« Reply #8 on: December 30, 2018, 02:03:12 AM »
Quote from: Jesse on December 29, 2018, 08:30:36 PM
Quote from: Calandale on December 29, 2018, 06:25:39 AM
Wait, you can get banned from facebook?

How have I not managed this?
You seem to just keep to yourself there. Get in a heated debate, and somebody will report you

I get 'reported' enough that my vids get marked as spam. :P

Offline Lestat

  • Pharmaceutical dustbin of the autie elite
  • Elder
  • Obsessive Postwhore
  • *****
  • Posts: 8965
  • Karma: 451
  • Gender: Male
  • Homo stercore veteris, heterodiem
Re: Facebook's Though Policing.
« Reply #9 on: December 30, 2018, 07:39:47 AM »
'Buddhist mob'...now that's one you don't hear too often. I'm not the religious type, but out of the lot of them, buddhism is probably the one I've most respect for.
Beyond the pale. Way, way beyond the pale.

Requiescat in pacem, Wolfish, beloved of Pyraxis.

Offline Phoenix

  • Elder
  • Obsessive Postwhore
  • *****
  • Posts: 6161
  • Karma: 413
  • Gender: Female
Re: Facebook's Though Policing.
« Reply #10 on: December 30, 2018, 02:07:26 PM »
Quote from: Lestat on December 30, 2018, 07:39:47 AM
'Buddhist mob'...now that's one you don't hear too often. I'm not the religious type, but out of the lot of them, buddhism is probably the one I've most respect for.
:lol1:
“To rise, first you must burn.”
― Hiba Fatima Ahmad

Offline Yuri Bezmenov

  • Drunk-assed squadron leader
  • Obsessive Postwhore
  • *****
  • Posts: 6663
  • Karma: 0
  • Communist propaganda is demoralizing the West.
Re: Facebook's Though Policing.
« Reply #11 on: January 01, 2019, 07:35:38 PM »
Quote from: Minister Of Silly Walks on December 29, 2018, 04:34:28 PM
Facebook has no responsibility to maintain Facebook as some kind of free speech platform.

That might be changing in the future. After Patreon's latest shenanigans, there's a push to have Silicon Valley tech giants regulated as public utilities because they have become the main medium through which people communicate. It's possible that the FCC will regulate them in the future; it might depend on a few lawsuits that are in the works right now.

Offline Minister Of Silly Walks

  • Elder
  • Dedicated Postwhore
  • *****
  • Posts: 4035
  • Karma: 421
Re: Facebook's Though Policing.
« Reply #12 on: January 01, 2019, 08:45:01 PM »
Quote from: Yuri Bezmenov on January 01, 2019, 07:35:38 PM
Quote from: Minister Of Silly Walks on December 29, 2018, 04:34:28 PM
Facebook has no responsibility to maintain Facebook as some kind of free speech platform.
That might be changing in the future. After Patreon's latest shenanigans, there's a push to have silicon valley tech giants regulated as public utilities because they have become the main medium through which people communicate. It's possible that the FCC will regulate them in the future, it might depend on a few lawsuits that are in the works right now.

That may well be the case, but I can't see any proposed regulation forcing social media platforms to carry hate speech getting much traction.

Then again, 'Merica never fails to surprise me.
“When men oppress their fellow men, the oppressor ever finds, in the character of the oppressed, a full justification for his oppression.” Frederick Douglass

Offline Lestat

  • Pharmaceutical dustbin of the autie elite
  • Elder
  • Obsessive Postwhore
  • *****
  • Posts: 8965
  • Karma: 451
  • Gender: Male
  • Homo stercore veteris, heterodiem
Re: Facebook's Thought Policing.
« Reply #13 on: January 01, 2019, 09:17:51 PM »
Well it's true, miss K, the very IDEA of a murderous, torch-and-pitchforks mob of ravening buddhists is just outright weird.

A violent buddhist mob, it goes against THE number one most important of a group of core tenets, that of Ahimsa, or non-violence towards others, top of which is basically the same as the 'thou shallt not kill' of the ten commandments, except it wasn't allegedly penned by the same psychotic, genocidal BPD-infested schizophrenic fuckup of a deity who both makes it one of his most sacred laws not to murder; and who enshrines free will as inviolate, whilst at the same time 'hardening Pharaoh's heart' so he will not release the israelites until said deity has unleashed a whole bunch of plagues, economic disasters and grisly curses against the egyptian people as a whole, and who sends down his own personal black-ops wet work angel to slaughter the first born CHILDREN of egypt en-masse.

Genociding children to prove a point, and claiming murder is abhorrent and the most grievous of mortal sins, claiming free will as utterly sacred, and then 'hardening the heart' of a ruler to influence his actions to give himself (I.e god, not Pharaoh) the excuse to unleash plague and disaster among the subjects of the ruler? and punishing adam and eve for eating the fruit of the tree of knowledge of good and evil, that he stuck there, conveniently, as a temptation, whilst blaming satan for tempting them to eat of it? and if that wasn't enough, whilst in the bible, demanding that the son not be punished for the sin of the father, both burdens the species he allegedly created with 'original sin', that all the descendants of Adam and Eve be cursed for the sins of not the father, but the primal ancestor of the race? and cursing the line of Cain, after he committed the first murder, instead of punishing Cain alone? (after getting Cain himself pissed off and miserable by rejecting his sacrifice whilst looking with favour upon that of his brother, Abel, despite never having actually told either of them WHAT they ought to do to please him with their sacrifices? no guidelines, just wing it and then find out if god hates you later?

Sounds like a borderline PD parent to me, as well as a raving narcissist with genocidal tendencies and more than just a touch of schizophrenia thrown in to boot (old vs new testament. Bloodthirsty bastard not much better than the Amalekite god Moloch, to whom babies were burnt alive as human sacrifices, to the new testament, where he's all love and infinite mercy (as long as you do exactly what you are told, or your god of infinite mercy will punish you by letting the devil roast your soul alive until time itself comes to an end)....the judaeo-xtian god/allah (same bugger, different names...shit, if being a schizo, fulminating genocidal narcissistic warlord, tyrant and borderline parent weren't bad enough already, the fucker has multiple personality disorder/dissociative identity disorder too? Gives a whole new perspective to the phrase 'holy shit', no?)

And if the second coming ever happens, everybody better pack considerable supplies of clean underwear, for they are going to be shitting them on a regular basis. At least until the men in white coats have a squad of hench great buggering big orderlies that look like, and probably think like your average side of beef, manage to drag the messiah down to the ground and pump him full of thorazine.

If the bible turns out to be right, we are SO fucking screwed. And god did indeed make us in his own image. A tyrannical, vicious mental case bastard who robs, pillages and just occasionally, rapes. (I don't recall him ASKING the virgin Mary if she wanted to have his kid; no, he just went down there and shagged her then fucked off without so much as buying her a drink, never mind paying child support.)

(lestat says loudly, in a sing-song high-pitched snarky tone of voice) 'BORDEEEeeeeERRRRR---liiiiinnneeeee'
Beyond the pale. Way, way beyond the pale.

Requiescat in pacem, Wolfish, beloved of Pyraxis.

Offline Yuri Bezmenov

  • Drunk-assed squadron leader
  • Obsessive Postwhore
  • *****
  • Posts: 6663
  • Karma: 0
  • Communist propaganda is demoralizing the West.
Re: Facebook's Though Policing.
« Reply #14 on: February 14, 2019, 07:59:28 PM »
Quote from: Minister Of Silly Walks on January 01, 2019, 08:45:01 PM
Quote from: Yuri Bezmenov on January 01, 2019, 07:35:38 PM
Quote from: Minister Of Silly Walks on December 29, 2018, 04:34:28 PM
Facebook has no responsibility to maintain Facebook as some kind of free speech platform.
That might be changing in the future. After Patreon's latest shenanigans, there's a push to have silicon valley tech giants regulated as public utilities because they have become the main medium through which people communicate. It's possible that the FCC will regulate them in the future, it might depend on a few lawsuits that are in the works right now.
That may well be the case, but I can't see any proposed regulation forcing social media platforms to carry hate speech getting much traction.

Then again, 'Merica never fails to surprise me.

Who gets to decide what is and isn't hate speech though??

As it stands right now, silicon valley is defining hate speech as anything that challenges the Leftist dogmatic orthodoxy.

We're already seeing this on Twitter and since many "journalists" source from Twitter, that skews the news in a Leftist direction. Twitter is indirectly creating propaganda this way.