We tend to think we’re more in control of our behavior than we actually are. That is, our brains are operating from habit a large amount of the time, and the idea of the CEO brain consciously deciding our every action is largely an illusion.
This is why emergency exit doors must open outwards. People have died in fires in theaters trying to push on doors that said “Pull”. The crash bar on an emergency exit door is what’s known in design as an affordance.
When you present people with a door handle that affords “pulling”, they will pull even if the sign says “push”.
Now consider the fact that the primary affordance of social media is the “reaction”. Is it a surprise that content that garners a reaction will trend towards the outrageous?
If proactivity is the road to a more fulfilled, more civilly minded life and society, maybe we need to think of our affordances. Because we’ve made it awfully easy to be reactive, and awfully cumbersome to be proactive.
Edit: The comments below correctly point out an error; what I should have said was that there’s a danger in putting a “pull” handle on an emergency door that pushed outward, because of the confusing affordance.
I can't wait for the word "toxic" to drop out of favor. It implies that the person is not just giving a bad opinion, but is in fact fundamentally flawed and dangerous. Arsenic isn't toxic because it had a bad day at work, arsenic is by its nature a deadly poison and you can't change it, only avoid it. The vast majority of people labeled "toxic", though, are just humans with a variety of opinions and beliefs, some you may agree with and some you surely disagree with. They may change these beliefs, but not by being vilified and told they are intrinsically bad.
For, uh, purely scientific purposes I've noticed the posting volume on pr0n subreddits like /r/gonewild and uh, a few dozen similar subreddits, is perhaps 100 to 1000 times the sheer posting volume of a controversial subreddit like /r/the_donald. The sheer volume of relatively R rated pr0n is perhaps 1000 times the volume of everything else.
There seems to be tap dancing around the issue that reddit is a 1960s Playboy magazine fifty years in the future. There are just enough excellent articles to keep the advertisers amused, but the fact has to be faced that 99% of the traffic is young men looking at scantily dressed young women. A little submarine PR controversy about subreddits that statistically nobody reads keeps things looking legit, whereas all the traffic and money is over there at /r/randomsexiness
Just like the old saying about Playboy, I only read it for r/ama not the nekkid ladies.
I'm not complaining; I'm just pointing out that reddit is THE most successful pr0n site out there with the most brilliant strategy I've ever seen. Please don't confuse it, and its achievements within the pr0n industry, with legacy news media or anything like that.
So we've tried:
- Real identities (Facebook comments)
- Voting and self moderation (Reddit, HN, etc.)
- Strong moderation (Reddit, HN)
They all result in toxic comments, trolling, an echo chamber, or worse, a complete lack of participation. There's no real solution to this problem. However, if you create consequences or a cost to commenting, you'd eliminate toxic comments and trolling at the cost of less participation and an echo chamber (though you could argue that not all participation is equal and you're mainly removing poor participants).
There's no perfect way to do this because even if you made a subscription necessary, for instance, you may just create an echo chamber. As part of the solution you'd need to prevent the creation of new accounts to circumvent any punishment received.
I'd say the most straightforward solution is that you have a forum and you get an account. Physical mail is sent to your house in order to get a single account. Then, regular moderation practices would be taken seriously as there's no way to create another. The community would be left with those who care enough to not be banned. The problem is that the moderators themselves may be corrupt or wrong.
This problem is easy to solve: give users the tools to do their own filtering. But services like to pretend that it's a hard problem because they don't want to actually cede that control to their users. The whole point of controlling the communication networks is to control what and how people say and hear.
Just the past week:
- New Yorker publishes this article
- Mayor of London at SXSW called for tighter tech regulation.
- UK Culture Secretary declares social media "broken", states he wants to time-limit children's access to social media and "impose stricter checks on the age limits when children can first set up a social media profile"
- Tim Berners-Lee said "we must regulate tech firms to prevent 'weaponised' web"
- The Economist prints an article "On Twitter, falsehood spreads faster than truth" (reporting on a recent study by MIT’s Laboratory for Social Machines)
- A report is delivered to European Commission: "Final report of the High Level Expert Group on Fake News and Online Disinformation"
- The House of Lords International Relations Committee concluded tech firms were negatively affecting our society
It's worth noting that instead of community management, Reddit has been soft-pivoting to become a new Facebook with features like group chat, new profile designs, and a News Feed-esque view on the official mobile apps.
Incidentally, Facebook itself hasn't solved the problem of its community/fake news, hence the recent algorithmic changes to the News Feed to surface more content from friends, and a public push toward Facebook Groups. To be like Reddit.
It's a never-ending cycle.
Bah, I recently deleted my 5 year account with nearly 4000 comment karma on Reddit.
Worse than being toxic, Reddit is an echo chamber. Many users treat vote counts not merely as validation but as a measure of righteousness or correctness, which conversely means opposing votes are not disagreement but vilification.
I just found myself becoming progressively more depressed contributing to conversations there. In the end it seemed to almost never be about the conversation but about being RIGHT + 1!!!
Since deleting my account a month ago I am a much happier person.
Hacker News has voting as well, but with an incredibly important distinction: the votes are hidden from all but the respective contributors, so that they can never be used as vindication by ignorant people.
Reddit used to be a place where you could post links to anything on the internet (kind of like this place) and comment on it (without signing up with your email). Now it's a place more like Facebook, with algorithms based on your personal taste, geolocation, forced email, shadow banning, banning of politically incorrect subreddits, endless political posts, and "Futurology" and "Uplifting" news articles. Free speech on reddit died a long time ago.
Why do we think we'll succeed at "detoxifying" the internet? To me it seems like wanting everyone to say things that don't differ from what everyone else says. That doesn't happen off the internet.
Off the internet, most people interact with those who are more or less like themselves, so there's less chance of disagreement. For anyone who differs significantly from you, you hold back your real thoughts and/or stew silently. We just have to accept that echo chambers are an inevitability on the internet with censorship and moderation, and discontent is an inevitability otherwise.
If people knew how to talk diplomatically or to get along, this wouldn't be a problem in the first place. Maybe that's something that can be taught, but it's not a problem that gets fixed on the internet, nor one caused by the internet. Fix it at home; this may also take care of the trolling problem.
Rather than censorship, just give readers better filtering tools. This was largely solved on USENET years ago, has been largely solved in email (to a great degree) -- it's just Twitter, FB, etc. which seem to have a hard time solving it.
If you made certain sets of filters the default, or available in groups, and did filtering based on your best beliefs about the desires of readers (based on keywords, senders, and maybe other characteristics of the conversation), it would go a long way.
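As a purely hypothetical sketch of what such reader-side filtering could look like (every name, field, and keyword here is invented for illustration, not taken from any real service's API):

```python
from dataclasses import dataclass, field

@dataclass
class FilterConfig:
    """Hypothetical per-reader preferences; a real system might also
    weigh other characteristics of the conversation."""
    blocked_keywords: set = field(default_factory=set)
    muted_senders: set = field(default_factory=set)

def visible(comment: dict, cfg: FilterConfig) -> bool:
    """Return True if the reader's filters allow this comment."""
    if comment["sender"] in cfg.muted_senders:
        return False
    text = comment["text"].lower()
    return not any(kw in text for kw in cfg.blocked_keywords)

def filter_thread(comments, cfg):
    """Keep only the comments this reader wants to see."""
    return [c for c in comments if visible(c, cfg)]
```

The point of the sketch is that the logic is trivial and runs entirely on the reader's side; default filter sets shared in groups would just be pre-built `FilterConfig` values.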
I can't be the only person who is just not bothered by this kind of thing at all, am I? I'm way more worried about political indoctrination in upper education than trolls on the internet.
Aside from the degenerate cases like “kill all white men” or “liberalism is a mental disorder”, one man’s “toxic comment” is another man’s “common sense”. And as frustrating as it may be to SV digital overlords they don’t get to decide which is which, at least not without unpleasant repercussions down the line.
So there’s really no way to solve this. There is a way to contain it somewhat: create homogeneous bubbles. Which is what we’ve been observing over the past few years.
As long as there are no consequences to being a dick to people over the internet, "toxic internet" is here to stay.
Reddit has zero problem leaving posts that threaten the lives of an entire religious group on its frontpage. If they were serious, certain subs would be nuked.
The only way to detoxify the internet is to get rid of the "social" aspect, starting with (gasp) comments. My online experience has been that much better since adding comment blockers to my browser.
> “Does free speech mean literally anyone can say anything at any time?” Tidwell continued. “Or is it actually more conducive to the free exchange of ideas if we create a platform where women and people of color can say what they want without thousands of people screaming, ‘Fuck you, light yourself on fire, I know where you live’?
This comment, by Reddit's General Counsel, says a lot about their approach. They believe that minorities/women will be subjected to nonstop violent comments unless Reddit pursues proactive and forceful moderation, free speech be damned. I think (hope) that's a very cynical view. The author of the piece does a great job juxtaposing this quote with the results of r/Place, however, in pointing out that it did not devolve into swastikas and "toxic" content when left to its own devices.
But reddit moderation is not just about quality content. I think many commenters here miss this. Moderation of a hugely successful website is not just about the technical issues or the logistics. Huffman, the CEO, is quoted as saying he believes Reddit can sway elections. So now moderation is about power. Imagine the temptation: you're one click away from destroying the popular, impactful subreddit of a political candidate you detest. One click to sway an election, maybe.
> To its devotees, Reddit feels proudly untamed, one of the last Internet giants to resist homogeneity.
Wait, what? Reddit is a huge hivemind, with each subreddit having more or less the same views, since people who don't agree with the narrative of that subreddit are quickly downvoted so much that they can't post more than once a day or so.
I don’t know if you can do this. The problem with social news sites is not that the trolls are winning; it’s that the moderators aren’t properly compensated, because no website has the capability of moderating at that scale. Also, to get advertisers on your platform you need a moderated forum that aligns with your advertisers. The internet has always had free speech, and with that come trolls, by definition. The difference now is that there are competing ideologies associated with websites that want to control vast parts of the internet. We have websites that want your pictures, attention, and comments for free, in exchange for promoting other people’s products. On the site operators’ side, they need to do deals with advertisers who don’t want controversial topics, so the advertisers force discussions on the platform to align with them. Or the site operators do it themselves. The troll problem is not going to be solved just by reddit.
All this centralization has created all these problems. We need to federate our systems, perhaps even at the app level. I worked at OpenTable for a short time, and their architecture was interesting: each PoP in the old version was its own system that pushed data to a central store. Though they’re replacing it with a new generation of centralization, presumably due to engineering maintenance concerns. Similar with their regional frontends; it seems they’re being consolidated.
I am coming to think nested intranets and special-purpose VPNs might be a good approach to global commerce and expression. I love building massive-scale no-touch systems, but they’re never that perfect once they start being used. E.g., you usually end up with more severe and harder-to-solve issues with one massive db than with smaller systems with derived data and links. And on the FE a lot of hard problems go away; beyond i18n and the like, problems such as cyberbullying also become tractable.
Would it be sufficient for every user to just have client side configurable filters? Or is this movement more about denying "toxic" people a platform to begin with?
If it's the latter case, I can't get behind it. Free speech means everyone should have a platform to speak. As society transitions away from using our mouths to using our keyboards, the first amendment implies that speech should be protected on the common medium - whether verbal or text.
> Reddit was also an important part of Trump’s strategy. Parscale wrote—on Reddit, naturally—that “members here provided considerable growth and reach to our campaign.”
So were Hillary's and Bernie's campaigns, each of which had a major presence on Reddit, both much larger than Trump's.
You are making the problem seem worse than it actually is. Account age and inability to delete comments make an anonymous reputation pretty realistic on HN at least. Just in this thread we see examples of toxic comments being downvoted into invisibility (oblivion).
Or is the problem that /r/the_donald exists? Just don't go there if you don't like it. And don't go to /r/trees if you don't like marijuana.
I quit reddit for good on Saturday. There are two major problems with it:
- Downvoting creates echo chambers,
- No accountability for moderators.
I think the downvoting should be removed completely. Downvoting is basically giving people the power of a mini-censor and the people love it.
Subreddits can be completely censored by moderators and there's no way to know this is happening. Maybe reddit could develop a way to judge the behaviour of moderators, such as publicly showing which posts they removed.
I've been part of plenty of internet communities that don't become like this. Even completely public forums and IRC channels. I guess HN isn't as bad as reddit because we have an above-average intelligence crowd, but there are some opinions I am careful not to mention here as well.
I detest this smug mentality, that so called trolling is trivial to identify and differentiate from legitimate, but unpopular opinion.
I feel strongly about it because it creates opportunity for censors, both in the form of moderators and communities themselves, to stifle debate under the guise of "detoxification." This kind of communal policing is ground zero for the echo chamber phenomenon.
We live in a reality where there will always be those who abuse platforms of communication for nefarious purposes. We also live in a reality where we are able to temper our own outrage, and we must if we wish to maintain healthy, open dialogue.
Offense is taken, not given.
This is stupid. Facebook, reddit, twitter are trying to make the internet this beautiful family friendly place. This is bullshit, we as humans are toxic. This is the way we function. This is the way react on things. Censoring all of this might affect things we should not censor. Fuck family friendlyness if it has a chance to affect free speech.
> which brand themselves as strongholds of free speech and in practice are often used for hate speech.
Um, hate speech is free speech.
Since the dawn of social media, as an anthropologist, I find the way it has evolved to be incredibly fascinating.
The original point of social media was to bring people together and build communities. Now, more than a decade removed from its inception, it is doing the exact opposite: tearing communities apart, making people more polarized, and driving huge wedges into the very foundation of this country.
And yet, with all the privacy concerns, people continue to use it more and more, seemingly unaware of the consequences. It's like a train wreck you see coming a mile away, yet you continue to stare, thinking or hoping it's going to get better.
With the benefit of hindsight, I think “free-form platforms” like reddit or youtube may be the road to mediocrity.
Youtube is still the major player even though twitter, Facebook & netflix grew around them. It is less opinionated on how video should be consumed. Less opinionated on what should be consumed. Anyone can post. Anyone can see it.
Problem is, youtube doesn’t really have a purpose that it is trying to serve. At least not an obvious one. It has a lot of content. It’s searchable (I hate that the web is no longer searchable..). It’s the obvious place to put stuff. It’s everywhere. But…
The platform does not distinguish between the lazy mashup with a clickbait headline and the youtuber making original content for people who would really miss it and could not get it anywhere else. No preference. No values. No taste. No goal. Just views. One piece of content is as good as the next. Monetisation reinforced this, as did its partial rollback.
Go to the youtube homepage now. Is the content good? Read the comments. Is the conversation interesting?
In the middle is Facebook. It started off highly opinionated. Use real names (big deal at the time). Connect to your real friends. Watch their videos and read their posts.
Over time though, Facebook has lost opinions. Stuff is stuff. Views, likes, pokes, plays and comments. Features that get users doing more stuff, they win. This caused their political pickle. Political baiting gets lots of views, likes, pokes, plays and comments. So, as they “optimized” for more stuff, that’s the stuff they got.
On the other end of the spectrum are wikipedia and stack overflow (for example). Highly opinionated. Wikipedia is for XYZ. Stackoverflow is for ABC. If you want something else, go someplace else.
HN is opinionated too. It takes effort to avoid comment threads full of bull-baiting, and even more to avoid a flood of one-liners.
So Reddit… Reddit did avoid uniformity, that’s a nice way of putting it. It has values. It has things (e.g. censorship) that it does not want to do. But, I feel it still lacks purpose. What does Reddit actually want, what’s the ideal comment or post or sub reddit?
Personally, I would love more HN-like places, more wikipedia like places, places with a job to do.
gdubs has an excellent comment about what Reddit is doing wrong (promoting impulse gut reactions), and I suspect it’s related. When you don’t have an opinion about what you are trying to do or be, you’ll tend to end up with whatever is cheap. It happened to youtube and it happened to Facebook.
Algorithmic/centralized vs Human/community moderation is the largest divide between Facebook, Twitter, and Google vs. Reddit, HN, Twitch, Mastodon, etc.
I’m thoroughly in favor of the latter. Algorithms can be used as tools to aid human moderators, but context, discretion, and diversity are important. In the case of bad communities, you have a much more tractable problem in terms of tracking/banning/avoiding what goes on —- as both a platform and as a user.
One more opinion for the heap?
People are too sensitive about seeing their snowflake opinions validated. Upvote and downvote do not map to right and wrong. What makes an opinion popular has as much to do with rhetoric and appealing to a million fallacies as it does to facts or logic. This is human nature, not an inherent problem with the internet.
I just started reading a book that got me thinking about this - Conspiracy: Peter Thiel, Hulk Hogan, Gawker, and the Anatomy of Intrigue 
If you can remove just a couple of the worst actors from the internet, does it have an outsized benefit? Are those people defining "acceptable behavior" and by example giving more reasonable people permission to behave that way? Interesting questions regardless of the specifics of the Thiel/Gawker case.
[0 affiliate]: https://www.amazon.com/Conspiracy-Ryan-Holiday/dp/0735217645... [0 non-affiliate]: https://www.amazon.com/Conspiracy-Ryan-Holiday/dp/0735217645
A lot of commenters here are mentioning that better education could help solve this problem. I agree.
What if there were a new type of "teacher" who replaced or augmented regular mods on sites like Reddit?
Universities strive to hire professors who will teach students to be open-minded, consider the sources of information, and just think better in general. The prestige of each university is largely based on this factor. We tend to measure this by looking at the research accomplishments of the professors, or by looking at how many great thinkers/leaders come out of those universities.
What if sites like Reddit did the same thing, and tried to build their prestige based on excellent moderation? Could they be measured by how many great ideas/movements come out of the platform?
How do universities prevent the "echo chamber" effect where a few mods or a few users can take the whole conversation in one direction?
Social media sites are learning the hard way that they're media properties like any other media property, and that they have to have strong editorial control over their media property to be a proper media business.
Reddit is a hive of working-class populism, which is incompatible with any advertising-oriented business. Advertisers don't want their ads next to shitty toxic content. They want their ads next to elite, well produced content. You think Calvin Klein wants their beautiful fashion ads next to a photo of a dead body?
Sure, they can get 10-cent ads for your neighbor's garage sale to place next to the photo of a dead body, but to get the $5 million ad buy from Procter & Gamble, they'll need editors to raise the quality of their content.
Detoxification is central to any ad-based business.
How quickly they learn this is the question for Reddit.
It's not that the "trolls are winning", it's that people are allowing the trolls to bother them. Trolls have always existed; it's our heightened sensitivity and inability to just shrug them off or laugh in the face of their obscenity that's letting them "win".
The article reminded me about another proposal of how to fix reddit: http://chuqui.com/2015/07/fixing-or-replacing-reddit-some-qu... I see a problem with this but there are some valid points. Maybe mastodon.social is heading there:
> So I wouldn’t host the stuff. But I could build an easy to install environment that would be a standardized system that could be installed on effectively any hosting site. Start with WordPress, WordPress’s P2 theme, a forum plug-in, the Disqus commenting system and a couple of weeks of hacking some custom work, and you’d have something that could be easily installed and run by a non-geek on any hosting service that supports WordPress.
> There are big advantages to this: If someone really wants a topic to exist, they can get it going for well under $100 (including domain name) and keep it running for $10-20/mo. Most of these sites will be very low traffic and a lot of them will in fact be pop-up and collapse as people figure out running sites is work and audiences don’t appear by magic — but the good ones will thrive and grow, and for most of these, it’ll be cheap enough to operate that most people can run them out of pocket. By building it as an independent site, though, that person would have the option of doing advertising, or running a Patreon or GoFundMe, or find other ways to pay for hosting the content of the site.
> It also shifts the liability for the existence of the content to the owner and host of the site and away from the central authority. If that person wants to find an offshore hosting service that doesn’t care what the content is, that’s up to them. So you’ve removed the need for a central authority to have to censor to protect its own interests.
> You still need a way for people to find these topic-specific sites. Enter the central authority. It hosts a directory, much like the original Yahoo! directory was. Building something like this is dirt cheap and easy to host, so someone (like, ahem, Reddit) could do so at low cost so that it doesn’t have to host those subreddits any more but could still support them by hosting a directory of them where they could be found.
Back 15-20 years ago, forums didn't like personal abuse or being off-topic. Now, the problem seems to be "dangerous ideas" or "might influence people to believe something I don't". It's changed from staying reasonable to trying to influence real world politics by silencing dissent. Why not have freedom of political ideas on a political forum? Stop worrying about influencing voters in the "wrong" way and meddling in elections and all that nonsense.
Even on HN, there are "wrong" ideas that you can't even hint at, even when they're on topic and you're being non-abusive. It's dominated by aggressively enforced political opinion.
I'm an old BBSer. I was on message forums seeing the entire array of so-called problems of today, which people summarize as "toxicity." I saw the full gamut of what can happen, including court-ordered restraining orders, personal harassment in real life, fights. But I also saw communities of what were (in those days) teenagers and young adults (sometimes older adults) finding self-moderation through trial and error and experience.
We were anonymous in those days and no, it wasn't more toxic, it was less. I credit those days with teaching me to learn to write correctly, to learn a style of approach to the world which is a mix of rational and skeptical, and with being exposed to numerous philosophical and ethical viewpoints that I would NEVER have been exposed to in the actual toxic environment that I experienced as "high school in America."
Now, this was in the mid 90's. Fast forward to the Internet revolution, we start to see the beginnings of message forums online. Fast forward a little bit, we're shocked to find an extremely low level of intelligent engagement, and a high level of emotive hissy-fitting, anger and knee-jerking. Worse, there's a set of interests and focus areas that belong to the masses. My BBS friends and I quickly abandoned our years-long dedication to messaging. It was finito.
Because suddenly a computer nerd niche was exposed to the masses, and the interests of the masses overwhelmed this niche, to the point that intelligent message forum discussion boards became extremely hard to find. RIP intelligent message forums (that I could uncover, anyway), circa early 2000's.
Now, to me Reddit is the pinnacle of that. I've never written a single item on Reddit but I've read hundreds of posts. And to me they are:
-- Very short, one- or two-sentence one-liners: me-too's, "this!", etc. Blanket dismissals. There's nothing inherent about anonymity or any other element of the medium itself that leads to that.
-- Controlled by autocratic "mods" who delete or otherwise punish views that they don't hold themselves. The political ones are classic examples. Newsflash, guys: there's nothing inherent about politics that should lead to that, and it's not the medium that leads to that degree of censorship. Echo chamber, my butt; that's not an echo chamber, that's top-down dictatorial control. And evil at that. An echo chamber is when people self-select (in my view) and bounce and reinforce each other's ideas.
-- Emotiveness. This is number 1. In any discussion, whether about a muscle pain in one subreddit, Trump in another, or altcoins in a third, what you find is that people view the message medium as a chance to express their feelings. And so when one person disagrees, it's almost an attack, or a failure to acknowledge the other's feelings, if you look at it psychologically. But it's perceived as "toxic." What you call toxic is a manifestation of people who are not USED to rational back-and-forths -- they didn't qualify for the debate team, if you know what I mean -- and who think it's about expressing personal feelings. Almost as if they are VOTING on what is true and false.
Folks, I assert this: it's not the anonymous aspect. It's not "toxicity." There's no "cure." There's only people, and levels of education and intelligence (yes, I'm sorry, intelligence plays a role here).
I was going to talk about Facebook but I deleted the paragraph as it's a whole other set of variables in addition to these.
If there's ANY cure it's education. Teach people what argumentation is. Ad hominems being the first thing they should learn is naughty. Next, a code of conduct for MODERATORS. If you have moderators who behave like the Stasi intercepting private correspondence and throwing out the bad ones, that should be number one on the hitlist.
It's about a, education on argument logic, b, ethics on what moderation is (and isn't). Again, my BBS peers learned this when we were 13-18, gradually.
Given that reddit is a Y Combinator alumnus, did anyone at YC contemplate the current mess of the internet back when they applied? I'm curious about the questions asked back then and how they would be answered now
Reddit doesn’t owe anyone anything.
How about asking kids' parents to detoxify their kids rather than solving the problem way down the chain of command?
People are shitty online because they have anonymity. Rather than going ahead and removing that, it might be worth going back even further, to the person themselves.
People need to be nice on the internet; the internet does not need to be nice to people. Detoxifying the internet is no different from censorship. I can say with certainty that as soon as we ask companies and governments to detoxify, it will get misused, [un]intentionally.
I'd argue that a cross-internet reputation service would fix this problem.
1. The reputation system is affected by voting on all sites in which you participate.
2. The history of your participation across sites is viewable in your history (ala reddit, HN)
3. Your reputation is displayed with your participation wherever you participate.
Sites could then set reputation limits, only allowing users above a certain reputation to participate, increasing the cost of toxicity.
The biggest problem would be getting the walled gardens to adopt the system.
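A toy sketch of how a participating site might gate commenting on such a cross-site reputation score (every function name, weight, and threshold here is invented for illustration; a real service would need identity, anti-sybil measures, and much more):

```python
def aggregate_reputation(site_scores: dict[str, float],
                         site_weights: dict[str, float]) -> float:
    """Weighted average of per-site vote scores. Sites the user
    hasn't participated in simply don't contribute; unknown sites
    default to weight 1.0."""
    if not site_scores:
        return 0.0
    total = sum(site_weights.get(s, 1.0) * v for s, v in site_scores.items())
    weight = sum(site_weights.get(s, 1.0) for s in site_scores)
    return total / weight

def may_participate(site_scores: dict[str, float],
                    site_weights: dict[str, float],
                    threshold: float = 0.0) -> bool:
    """A site's reputation limit: only users at or above the
    threshold are allowed to comment."""
    return aggregate_reputation(site_scores, site_weights) >= threshold
```

The per-site weights let each community decide how much it trusts votes from elsewhere, which is one way to soften the walled-garden adoption problem: a site can start by weighting foreign scores near zero and raise them as trust builds.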
I think, somewhat counterintuitively, the ability to "anti-like" or "Report as bad take" could actually make the internet a much more positive place.
The insulation from negative perceptions of those outside your in-group causes polarization and rewards unpopular but extreme opinions.
I'd take toxicity over censorship any day. The media machine & politicians have been going at it recently.
I wonder when being toxic will become illegal.
This, to me, sounds like the zombie apocalypse we were promised.
What are your thoughts on requiring or at least encouraging verified accounts - somehow putting a verified name behind the poster?
I would argue we don't need to detoxify the internet; stopping trolling is missing the point. However, we should stop the spread of misinformation. In fact, sites like reddit and facebook rarely create information; they just consume it blindly without questioning the source.
> Struggle to Detoxify the Internet
The internet isn't toxic.
It's the people who are toxic.
These people are just as noxious in real life even if some of them hide it when their identities are known.
Can anything be done to detoxify the people, or do we just treat them like spam and filter them out?
And what happens next when millions of rabid voices are suppressed?
The toxic people don't cease to exist, we just won't be able to see them as well.
Perhaps we'll find the social and political environment of 2022 to be much darker and more dangerous than 2018.
Decent, moral people avoid people and places like Reddit in their everyday lives, the same way they would while walking down the street.
A news analyst, on NPR a few years ago, called reddit a "Frankenstein's monster they can't control".
Does detoxify mean getting rid of mean speech?
> Some of the conspiracy theorists left Reddit and reunited on Voat, a site made by and for the users that Reddit sloughs off. (Many social networks have such Bizarro networks, which brand themselves as strongholds of free speech and in practice are often used for hate speech. People banned from Twitter end up on Gab; people banned from Patreon end up on Hatreon.) Other Pizzagaters stayed and regrouped on r/The_Donald, a popular pro-Trump subreddit. Throughout the Presidential campaign, The_Donald was a hive of Trump boosterism. By this time, it had become a hermetic subculture, full of inside jokes and ugly rhetoric. The community’s most frequent commenters, like the man they’d helped propel to the Presidency, were experts at testing boundaries. Within minutes, they started to express their outrage that Pizzagate had been deleted.
This is a critical thing that we as a society need to recognize about censorship and political correctness. When we censor people or ostracize them for saying things we don't like, even when what they say is legitimately awful, these people don't disappear, change their minds, or stop voting. They go elsewhere, become more entrenched in their awful beliefs, and because we've pushed them all together, they become more united. They become stronger. And because they become stronger away from us, we don't notice, and we're blindsided when they flex their political power.
Underlying this mistake are some important truths:
1. People who say hateful things are human beings worth engaging with. No, I don't like reading a lot of what people say. It's easy for me to forget that they are human beings with their own struggles and traumas that cause them to believe the awful things they believe. When we dismiss them as trolls, we're dehumanizing them. Such a society doesn't leave room to be wrong and learn--if you're wrong, you're dismissed--and it dismisses the people who are the most dangerously wrong. It's not our responsibility to educate people, but that's irrelevant to the fact that if we don't educate people no one will. We need to engage people who believe awful things, try to understand what needs cause them to believe those things, and try to address those needs with compassion and courage. Truth is the antidote to hate.
2. Free speech doesn't just matter in a legal context. Free speech is protected in the US constitution because it's important in a free society. If we're going to let the discourse of our society move into privately-owned platforms like Reddit/Facebook/Twitter instead of publicly-owned platforms like street corners where newspapers are sold (or, the rest of the internet) then we have to value free speech on those platforms as well.
Too many people are stuck on the idea that the Trump election was an anomaly, that in November Congress will change and in 2020 we'll have a new president. I see no reason to believe this will happen. We have changed nothing about our behavior, and we're hoping that the people who elected Trump will change.
> Huffman can no longer edit the site indiscriminately
There's precisely zero proof of this.
Toxic statements = statements from a different political orientation
Why are people so willing to embrace Orwellian doublethink? My prediction is that such a sword will decapitate undeserving people on all sides of the political aisle.
I think you replied to the wrong comment.
Oh, mai. Where is the tldr; for this one
OK, so how did r/Trees end up focusing on marijuana, and r/MarijuanaEnthusiasts on trees? Was r/Trees first, and then tree lovers did r/MarijuanaEnthusiasts as a joke? Was there a war?
Most people are idiots. Consider that (by definition) half of the population has an IQ below 100. Idiots did different things together 100 years ago, and 1000 years before that, groups of idiots would get together to raid and rape and pillage. They still do that in other places in the world. Now they come onto the internet to send hateful things and porn to one another. Most of the biggest idiots do not go on Reddit. When the idiots on Reddit act up, management cleans it up a little bit. Is there an issue that needs to be fixed here? I think anyone even claiming to have a solution may belong to the former group.
Not one word about the HUGE Antifa presence or doxing by left wing conspirators and agitators. This is exactly why I don't read the New Yorker.
I would consider reading it if they gave light to both sides of this issue. But they don't. This is not journalism.
Edit: u/spez was caught editing user comments on certain subs he publicly disagrees with. How he's still CEO is beyond me. At least he admits he's a troll.
Edit 2 - Downvote all you want, all I'm saying is FACTS. Here's more FACTS: https://motherboard.vice.com/en_us/article/z4444w/how-reddit...
Reddit culture starts at the top, and that means Steve Huffman. He should resign. He's completely in over his head.