Moderation Methods vs Censorship Claims

One of the major points of contention in the Bitcoin community these days revolves around claims of censorship on popular Bitcoin forums. The counterargument is that moderation is simply not censorship. I believe both positions have some validity; let's delve deeper into them.

Bitcoin Forum History

I speak from personal experience here, as someone who has had a handful of posts removed from /r/bitcoin, been attacked by trolls on a variety of subreddits, and been banned from /r/buttcoin for posting inconvenient facts, and who once served as a moderator of a Bitcoin subreddit (/r/bitcoinxt).

My thesis is that Bitcoin is a living protocol that is evolving as humans work to form new points of consensus around what Bitcoin should be. As such, humans should be free to discuss any aspect of Bitcoin and any proposal for modifying aspects of the protocol. My first brush with heavy-handed moderation was when Bitcoin XT was gaining traction:

A few days later I received an explanation for why a post I made about hashpower signaling was removed. Unfortunately, I couldn't comprehend the logic that a proposed protocol change necessarily makes something an alt-coin, given that Bitcoin is whatever set of rules humans decide to agree upon. I'm quite confident that there will be a future version of Bitcoin that is, by (some) definition, an alt-coin.

It went a bit downhill from there as further changes were implemented by moderators.

More recently, one of my comments was removed because it was in reference to the head /r/bitcoin moderator “Theymos.”

The cherry on top was when it was revealed that mentions of censorship are blacklisted by /r/bitcoin’s Automoderator:

If you have plenty of free time and want to learn more of the dramatic details behind the events that have led to this current state of affairs, you can read:

Freedom Of Speech

Let’s be clear: it’s true that users don’t have a “right to free speech” when posting content onto servers that are someone else’s private property. A legal right to free speech only applies to citizens of countries that guarantee one, so from a global perspective the point is moot anyway. Also, when a user creates an account with an online service, they voluntarily agree that their speech may be restricted. Thus, any argument claiming that users have a right to post whatever they want onto sites they do not own is fallacious. However, let’s dig a bit deeper past a user’s legal rights and look at the morality of speech suppression.

If a moderator decides to blackhole someone’s post without even notifying them, they are steering the conversation by deciding which content is acceptable. On the flip side, they are also, in a sense, stealing the user’s time by erasing their efforts. I, as a user, have an expectation that people in the community will be able to read the content I wrote and respond to it if they find value in doing so.

It’s true that the user can always post their content to some other site or even use another medium of communication. Copying content takes relatively little time and effort in the Internet Age. However, if the “moderation” is occurring without the knowledge of the user, they won’t even be aware of the need to find an alternate discussion forum. As such, their content will never have a chance to flourish.

I particularly like the model I see used on subreddits such as /r/AskWomen and /r/BitcoinMarkets where, if a moderator removes a post, they will publicly reply with the reason for the removal. This informs both the user that their speech has been restricted and the wider community about rule enforcement.

The Moderation — Censorship Continuum

Where is the line between censorship and moderation? This seems to be a matter of perspective.

Do you think of forums as an incoming stream of content, some of which may be “spam” that is unwanted by anyone reading the forum?

Or do you instead think of a forum as a wall upon which anyone can write, but if objectionable content is reported by members of the public, a moderator can review their objections and decide whether or not the content is valuable to the community and should remain in it?

I’d argue that the expectation of users who post to online forums is that their content will be readable nearly instantly by those with whom they are conversing, not that it will be sent to a queue where it might possibly be released out into the community if a moderator judges it worthy.

Some people argue that stifling speech is not censorship unless it’s complete and total suppression of information from the rest of the world. However, censorship can broadly mean the suppression of any public communication. As mentioned earlier, if a user has no clue that they need to repost their content to another forum, the content has been suppressed for all intents and purposes.

The goal of moderation is to keep discussions on topic and thus, hopefully, valuable to the participants. The downside to heavy moderation is that it can turn a forum into an echo chamber where dissenting views are deleted. While this does appear to happen in /r/bitcoin, I’d argue that it also happens in the less-moderated /r/btc for a different reason: many /r/btc members harbor resentment toward /r/bitcoin’s moderators and their vision for Bitcoin, so when supporters of those ideals argue for them in /r/btc, they are heavily downvoted by participants rather than removed by moderators. The choice between these diametrically opposed subreddits thus appears to be tyranny of the moderators versus tyranny of the majority. I submit that an ideal forum should be subject to neither.

Lessons Learned by Other Communities

The “technology” subreddit was delisted from the default subreddits by admins after a similar keyword-based system of moderation / censorship was revealed. Coincidentally, several of the blacklisted keywords were “Bitcoin,” “bitcoins,” “dogecoin,” and “MtGox.” Also, some of the moderators who perpetrated this blacklisting added their own usernames to the Automoderator keyword list, presumably to stifle discussion amongst community members about their behavior. This strikes me as particularly insidious and reminiscent of /r/bitcoin’s blacklisting of references to censorship. It caused so much turmoil that the Reddit administrators felt the need to step in.
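For readers unfamiliar with how such keyword blacklists operate, the underlying mechanism is trivially simple. The sketch below is only an illustrative approximation in Python, not AutoModerator's actual code or configuration format; it reuses the /r/technology keywords cited above purely as example data.

```python
# A toy illustration of keyword-based auto-removal -- NOT AutoModerator's
# actual code or configuration, just the general mechanism it implements.
BLACKLIST = {"bitcoin", "bitcoins", "dogecoin", "mtgox"}  # keywords cited above

def should_remove(title: str) -> bool:
    """Return True if any blacklisted keyword appears in the submission title."""
    words = title.lower().split()
    return any(word.strip('.,!?"\'') in BLACKLIST for word in words)

def filter_queue(submissions: list[str]) -> list[str]:
    """Silently drop matching submissions; authors are never notified."""
    return [s for s in submissions if not should_remove(s)]

# Example: the second submission simply never appears in the visible feed.
print(filter_queue(["New GPU benchmarks", "MtGox halts Bitcoin withdrawals"]))
```

The point of the sketch is the silence: nothing in this flow informs the author or the community that anything was removed.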

A particularly interesting case with regard to the necessity of moderation is the Anarcho Capitalism subreddit. The mods of this subreddit are staunch supporters of voluntaryism and believe that the community should be able to regulate itself without the need of any authoritative guiding hand. Their “welcome to this subreddit” post notes:

This sub is lightly moderated, well, really unmoderated. Some of the mods have stated in the past that they don’t want Anarcho-Capitalists to become weak at arguing so they allow free speech on this subreddit. All ideologies are welcome to come and discuss and critique Anarcho-Capitalism.

I’m a fan of this mentality; it reminds me of Andreas Antonopoulos’ “sewer rat” analogy. As a result, sometimes the subreddit is flooded with statists and trolls who do not participate in good faith, resulting in an interesting variety of sentiments displayed by users, such as:

While /r/Anarcho_Capitalism’s quality of discussion can be volatile, the trolls have never been able to sustain a permanent Sybil attack against the community. These Anarcho Capitalists have upheld their ideals, taking the high road when they could have easily chosen to suppress the speech of those with whom they disagree.

Power Dynamics

Another source of contention may be the question of who holds the power. From a technical standpoint, today’s moderators have more power than users. However, I would argue that moderators ought to serve at the behest of the community; their job is to enforce whatever rules the community agrees upon. As such, accountability for moderators is important. Public moderator logs would be useful, though they would need to be tamper-proof, or at least tamper-evident. I recall that /r/btc set out to provide public moderation logs, though at the time of writing they appear to be broken and I haven’t had time to review the code to determine its trustworthiness.
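To make “tamper-evident” concrete, here is a minimal sketch of what a hash-chained public moderation log could look like. The class and field names are my own invention for illustration; this is not /r/btc’s actual implementation, and it assumes nothing beyond the Python standard library.

```python
# Sketch of a tamper-evident (hash-chained) moderation log: each entry commits
# to the hash of the previous entry, so retroactive edits are detectable.
import hashlib
import json
import time

class ModerationLog:
    def __init__(self):
        self.entries = []
        self.last_hash = "0" * 64  # genesis value

    def record(self, moderator: str, action: str, target: str, reason: str) -> dict:
        """Append a moderation action and chain it to the previous entry."""
        entry = {
            "timestamp": time.time(),
            "moderator": moderator,
            "action": action,      # e.g. "remove_post"
            "target": target,      # e.g. a post ID
            "reason": reason,
            "prev_hash": self.last_hash,
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; any altered or deleted entry breaks every later hash."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

A log like this doesn’t stop a moderator from removing content, but it does make it hard for them to quietly rewrite the record of what they removed and why.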

Let’s take a look at the guidelines Reddit admins have set forth for how moderators should act. A few points stick out for me:

  • Try to inform users when you remove their content. You can leave a distinguished comment or mark their post with link flair.
  • Don’t purposely mislead users with custom CSS.
  • Don’t act unilaterally when making major revisions to rules, sidebars, or stylesheets.
  • Don’t take moderation positions in communities where your profession, employment, or biases could pose a direct conflict of interest to the neutral and user driven nature of reddit.

In my experience, moderators of both popular Bitcoin subreddits have broken these rules at some point, but the greatest problem is the final bullet point. I think this is a fatal flaw for any Bitcoin forum: they are all moderated by Bitcoin users who are nearly guaranteed to have some sort of vested interest in seeing their personal vision of Bitcoin promoted.

What’s the Solution?

Thus far I’ve identified a number of problems and a few patches that might help alleviate them. But I think a comprehensive solution won’t be a simple bolt-on tweak to the existing structure of forums. I suspect that this will require a reimagining of how online content is disseminated, much like how Satoshi Nakamoto reimagined trust-based relationships. Our best hope at the moment appears to be Yours; perhaps by fixing the fundamental incentives behind social communications channels, we can build disintermediated platforms that are both censorship resistant and promote quality content above the noise.

Imagine a forum / Reddit-style social site that has no owners, no administrators, no entrenched authoritarian moderators. Imagine if the power were shifted back to the individual users, who could hire and fire personal “moderators” (content curators) to guide discussions in the way each user finds most valuable. Think of it as soft forking rather than hard forking a community: requiring community-wide consensus on moderation can fracture discussions and decrease their value, whereas personal curation lets each user filter their own view without splitting the forum. This is possible today; we have the technology, we just need to build it.
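As a rough illustration of that “personal moderator” idea, here is a sketch of client-side curation over a shared, uncensored feed. Every name here (the function, the curator IDs, the feed format) is hypothetical; this is a design sketch under those assumptions, not an implementation of Yours or any existing platform.

```python
# Sketch of "personal moderation": each user picks their own curators, and
# removals apply only to that user's view of a shared feed -- no post is
# ever globally deleted. Purely illustrative; all names are hypothetical.

def curated_view(posts, curators, removal_lists):
    """Return the posts visible to a user who trusts the given curators.

    posts:         list of (post_id, content) tuples in the shared feed
    curators:      set of curator IDs this particular user chooses to follow
    removal_lists: mapping of curator ID -> set of post IDs that curator hides
    """
    hidden = set()
    for curator in curators:
        hidden |= removal_lists.get(curator, set())
    return [(post_id, content) for post_id, content in posts if post_id not in hidden]

# Example: two users read the same feed but "hire" different curators,
# so they see different views without the community ever splitting.
feed = [(1, "on-topic discussion"), (2, "off-topic spam"), (3, "dissenting view")]
removals = {"curator_a": {2}, "curator_b": {2, 3}}

alice_view = curated_view(feed, {"curator_a"}, removals)  # keeps the dissenting view
bob_view = curated_view(feed, {"curator_b"}, removals)    # hides posts 2 and 3
```

The design choice is the point: moderation becomes a service that individual users opt into and can revoke, rather than an authority that the whole community must either accept or fork away from.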