Why I'm leaving discuss.python.org

(BTW: judging by post IDs, a post was indeed removed between this one and that one.)

And I actively don’t want to be in an environment where leadership feels that way, because I understand how destructive that ideology is.

I know this must be difficult for some people to accept. But the vision of inclusivity you describe - the one in the Code of Conduct - is actively exclusionary.

Including to people who consistently try their hardest to get along with others, believe in FOSS principles, affirm that the author’s identity can never disqualify a piece of code, have “real-world” political views well within the Overton window, and have not even the slightest interest in discussing those views where they would be off topic.

I know that you consider it off the table to “not enforce the PSF CoC”.

I assume that rewriting it is also off the table.

Which is why I’m not offering constructive suggestions on this point. Because there is nothing I could possibly offer in the face of a values conflict this fundamental.

I thought that, because the listing you cite is tacked on to a charter document (not the most obvious place to find a list of members) in a “ConductWG” subsection (which doesn’t even have a proper index page) of the PSF section (labelled as “private”; I can read your direct link, but I have no idea how I was intended to navigate to it) of the wiki (which is very poorly advertised).

The fact that there even is a work group isn’t mentioned on the wiki page about the CoC, never mind that they’re the ones responsible for enforcing it. Also, as of this writing, that article was last edited in 2013. It predates the WG (formed in 2018) and was approved by the Board of Directors, not the WG.

Instead, I found out that the WG exists through the CoC itself. But the CoC does not offer any information about WG members, nor about how to find out who they are. Indeed, it’s written in a way that quite strongly implies that the reader is not intended to know who those members are. If you don’t believe me, I encourage you to re-read it.

If I put “who enforces the PSF Code of Conduct” into a search engine, I don’t get a direct answer. Although I do, admittedly, get your wiki link. But not with a summary that even remotely implies that I’d find the answer at the bottom of the page.

I get the impression that the Board of Directors, the core dev team, etc. believe that typical Python users are well aware of the wiki on the main Python web page. I believe they are extremely mistaken, if so.

Just for a sense of scale: the recent changes page tells me there have been 9 edits to the wiki in the last month, none of which created a new page. That compares to about 6,000 questions asked about Python on Stack Overflow in the same time frame, and more than 300 new topics per month across all sections of this forum.

The wiki is not a suitable way to communicate information of this importance.

It is not reasonable to expect people to understand Python governance when it is communicated this poorly.

Again: openness is indeed severely lacking.

I hope I don’t need to explicitly state my suggestions for improvement here. They follow directly from my grievance.

This in itself is part of the problem.

7 Likes

See? Neurodivergence is a superpower :smile:. You’re doing great.

It’s a good catch. I’m sure it’s this post, which a mod moved from this topic to another:

I posted it in this topic at first by mistake, but found no way to trick Discourse into letting me move it (for example, I deleted it here and then tried to post it in the other topic, but Discourse refused to allow this, complaining that the “new” post was too similar to something I had already posted, never mind that the original had been deleted in full almost immediately).

So there should be a “missing” post ID here, and the timeframe matches.

I would edit the intro on that post to remove the now-purely-confusing obsolete information, but the 12-hour speed bump in the other topic seemingly applies to every action (that is, not just to making new posts, but also to making the tiniest edit to an older post - fixed a spelling error? that’s it for now, pal, come back in 12 hours).

Speaking of which, the 8-hour speed bump in the voting topic managed to kill off posting entirely in just a few days. Why boost it to 12? For most people, I expect 12 is effectively the same as 24.

3 Likes

Let me just say I appreciate what you said in this post and the following ones. That’s a move towards the kind of transparency that I’m hoping for (and it seems others are as well).

Perhaps a separate topic should be started in the Discourse Feedback category to discuss this?

Personally, most of the concerns I have are not with the technical/mechanical issues of Discourse moderation, nor with anything so drastic as not enforcing the CoC. Rather (as I mentioned in my earlier post) they are about interpretations of the CoC, matters that are relevant for moderation but not covered under the CoC (e.g., level of tolerance for off-topic-ness), and similar areas where a reasonable person could read the CoC and still have considerable doubt about how it is applied in practice.

6 Likes

These things get tricky. Jack has a big mouth, and knows it, and is very much at home stirring the pot. He’s also a sweetheart - really! He’s heard far, far worse than what was thrown at him in that topic. You’ll note that he didn’t reply in kind. He’s “flameproof”. If I were a mod at that time, “no harm, no foul” would have been my take. I would have asked Jack offline, but I’m sure he would have sincerely said he laughed it all off.

I wouldn’t expect anyone to get tagged for defaming me either, and indeed would oppose it. In my early Usenet days I was flamed by true masters of the art, and became flameproof the hard way :wink:.

The worst Python-world thread in my memory was on the python-dev mailing list. I couldn’t find it just now, but just as well. Nobody wants to relive it. “Both sides” were vicious.

For which I bear some responsibility, alas. I was the only active python-dev admin for many years, and, as has been mentioned before, Mailman has very crude moderation abilities. I flat-out refused to bar anyone from posting, regardless of which side they were on, because neither side was (to my eyes) behaving one whit better than the other, and both sides were expressing sincerely held opinions.

The powers that be did pick a side, though, and eventually banned one participant. I’ll never understand that one (“it’s secret”), but heard indirectly from mutual friends how devastated the target was. Serious, extended psychological distress. The protracted cheering from the “winning” side was, to my ears, obscene. Schadenfreude should also violate the CoC.

Anyway, I offered to resign my python-dev stewardship if someone more inclined to take a heavy hand volunteered to step up. Which they did, to my relief.

Those posts were so over the top I don’t think mod editing would have done any good (& I wouldn’t have done it anyway). But I’ve seen here how “just” an 8-hour “slow mode” effectively kills discussion cold, and I would have been happy to try different speed-bump values. At least enough to slow things down enough so that I’d have time enough to hold posts and review them, and take a stab at requesting edits before approving.

Don’t know whether anything of that sort would have helped - but it could not possibly have made things worse :frowning_face:. I do think, though, that even a 1-hour speed bump would have helped some. At times, these incensed people were screaming at each other as fast as they could type.

“Life is full of questions. Idiots are full of answers.” - Socrates :wink:.

7 Likes

PSF staff here. I just want to chime in and say that the comments about there being room to improve the discoverability of PSF governance topics have been noted.

12 Likes

Mailman has very crude moderation abilities.

I’ve seen here how “just” an 8-hour “slow mode” effectively kills discussion cold, and I would have been happy to try different speed-bump values. At least enough to slow things down enough so that I’d have time enough to hold posts and review them, and take a stab at requesting edits before approving.

You’re probably aware, and it’s only of historical reference now for this community anyway, but for the sake of other readers I just want to point out that the blunt instrument available on Mailman is “emergency moderation” mode, where all posts are automatically held for moderation. A list owner can easily flick that on and implement a somewhat manual “slow mode” themselves if desired. It does work, but relies on the list owner being attentive enough to catch the start of a flamewar before it gets out of hand, and does imply additional work for the list’s moderators while in effect.
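For anyone who has never administered a Mailman 2 list: that toggle lives on the list’s General Options admin page, and (if memory serves) it can also be flipped from the command line. Here’s a minimal sketch using bin/withlist; the `emergency` attribute name and the exact invocation are from memory rather than checked against the source, so treat it as illustrative only:

```python
# set_emergency.py - drop this somewhere on withlist's import path
# (e.g. Mailman's bin/ directory), then run:
#
#   bin/withlist -l -r set_emergency LISTNAME on   # hold all new posts
#   bin/withlist -l -r set_emergency LISTNAME off  # back to normal
#
# Assumes Mailman 2.1, where (as I recall) the General Options toggle
# is stored on the list object as the boolean attribute `emergency`.

def set_emergency(mlist, state):
    """Turn emergency moderation on or off for the (locked) list."""
    mlist.emergency = (state.lower() == 'on')  # every new post gets held for review
    mlist.Save()                               # persist the change to the list config
    if mlist.emergency:
        print('%s: emergency moderation enabled' % mlist.internal_name())
    else:
        print('%s: emergency moderation disabled' % mlist.internal_name())
```

Even then, as noted above, it only helps if a human is watching closely enough to throw the switch before the thread catches fire.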

1 Like

As noted, I was the only active python-dev admin for years. There was no moderation team. I was it, for a popular list with 24/7 posts from around the world.

There were other admins, but they never did anything - and I didn’t expect them to. They were other Python old-timers who were admins on many other python.org mailing lists too - their real “function” appeared to be to provide emergency recovery services if the “real” admin, say, died.

I thought about freezing the list (“emergency moderation mode”), but thought that was above my pay grade. “Moderation” in the Discourse sense wasn’t part of the job description then, and isn’t something I would agree to do anyway (hence why I asked for someone else to take the job).

Why am I typing any of this? Because those who don’t learn from history are doomed to repeat it, and even in my long-gone Board days I ran on the Institutional Memory ticket :wink:. I know people overwhelmingly don’t care, but still feel it’s one of my proper roles to be open about how things looked to me.

3 Likes

Two points on moderation from somebody who may be considered a seasoned random user from nowhere :slight_smile: – i.e., somebody who is hardly recognizable in the broad Python community, and whose posts at discuss.python.org have been very rare (as they were earlier in the python-dev/python-ideas era).

  1. Unless we are talking about absolute garbage/spam posts, I find it very disheartening to learn that a post may disappear or be modified by a moderator action without any visible sign (by visible, I mean visible at first sight, not after arduous research).

    I am a big fan of moderation, but please be explicit. If you remove something as unacceptable, please leave a mark which makes it clear that something unacceptable has been removed. If you edit someone’s post, please make it clear that it has been edited by a moderator and, preferably, what exactly has been changed.

  2. Is it necessary that flagged posts become unreadable (at least for an ordinary guy like me), so that I cannot unfold them if I really want to read them? That’s annoying and, yes, it’s alienating – as if I were not smart enough to be allowed to know their content. (Especially since, quite often, that content is completely innocent, just a little bit off topic.)

14 Likes

Sorry, I just don’t understand why deletion vs hiding is such a critical point for you? Since this is a resource not under your control, there’s always the possibility that someone will do something with your content that you disagree with. Now hopefully you trust us enough to do the right thing, but at some point this all does come down to trust, since the PSF controls the Discourse account and thus what happens with what’s in the database.

But as I said, if you can find some Discourse setting or something to alleviate your concern you can let us know and we can see if switching it on makes sense.

I personally disagree with that assessment, and I am willing to say the other moderators do as well, since we all chose to become moderators knowing we would be enforcing the PSF CoC.

If you truly think the PSF CoC is that harmful then this place simply might not be for you as the CoC will continue to be enforced. There might be enough of you interested in having Python discussions in a non-CoC space to warrant you all setting up your own forum or mailing list.

Based on what you’re saying here, and thus my interpretation of what “rewrite” means to you, I would say it’s off the table. You might suggest some tweaks to the CoC WG, but wholesale changes that lessen what the CoC protects would very likely never pass (I know I wouldn’t vote to approve anything that weakened the CoC).

Deb already replied about fixing where the information is made public, but I honestly don’t remember anyone asking for this info explicitly, so I suspect no one concerned themselves with moving it off the wiki where it started years ago. (Plus some people in the past have disagreed with your view of the wiki, to the point that when the idea of shutting down the wiki has come up, some people have fought against it.)

Perhaps, but the people who are very bothered by how things are being handled are here and this whole topic was started because of disapproval of how moderation was being handled, so it’s a weird gray area of being on/off-topic.

What Discourse setting needs to be changed to allow for that?

No, because the offensive content would still be available. Think about the situation that someone posted hate speech; that cannot stay up and be available to the public in any form.

2 Likes

Yes, technically someone could delve into the database and delete posts. But currently, I have to assume that any moderator could, at any time, delete any post, for any reason. The solution might not be a technological one; it might be a published policy that posts do not get deleted without leaving behind a residue. But I want to be able to trust that people’s posts are not getting invisibly suppressed.

By definition it is exclusionary. You want to exclude the type of behaviour that you do not want. It is fundamentally impossible to be “inclusive” without being “exclusive”. Is that just grammatical nitpicking? No, because someone has to choose what gets included and what gets excluded. You aren’t being inclusive of everybody, because that’s impossible; and someone had to choose who gets excluded.

Maybe that’s a problem too US-centric for me to understand, but is there a form of hate speech that is so terrifying that you can’t even leave behind an indicative marker showing what was removed?

12 Likes

@brettcannon One thing I would like to see is a 100% clear moderation policy that is also actually followed with regard to always leaving behind a trace when editing a post (i.e. a “Mod edited” comment inside the text of the post) and a mark when moving a post or multiple posts (in both the new thread and the old thread, potentially as part of the moved message).

This latter point in particular is something I have seen brought up a few times, but it doesn’t appear to be followed. An edit without markers has happened in the other thread, and is AFAIK the worst moderation action taken there.

This is something the moderators need to hold themselves to - I don’t see how Discourse settings can help with this.

You might need to be OK with a simple marker “hate speech removed” (with no edit history), but otherwise no, I can’t imagine any example that would have to be purged that much - and I don’t think that is what Brett Cannon meant either.

Note that there does not appear to be a Discourse setting to permanently leave behind a “this post has been deleted” marker without making the content accessible via the edit history - this is something that could probably be brought up on Discourse Meta.

6 Likes

I would be satisfied with that (especially if it included enough metadata that people knew who posted it and when), but if that’s not possible, then yeah, that would have to be raised upstream.

But here’s a question. What happens with email and newsgroups? If it’s such an appalling situation that this MUST be purged, how do you cope with the possibility that it’s already been sent out, and is now in people’s inboxes/newsreaders? I don’t understand how this is suddenly a problem now that we have a web interface to the forum instead of email.

4 Likes

What @Rosuav said.

It’s easy to disagree when you aren’t one of the ones who feels excluded.

I agree wholeheartedly, and in fact I am about to start assembling my departure thread. I have had this in mind since the evening of the 14th, but there’s just been so much else to say and read about it, plus I was in the middle of reading through several previous Packaging marathon threads pseudo-simultaneously.

Because of legal liability? Or because having the option to see something objectionable is inherently harmful? Or because forms of expression that everyone agrees to be socially reprehensible except the fringe minority who express them, somehow empower that minority when they’re allowed that expression? Or because said expression is somehow convincing to others? Or because intimidating others causes them to become more sympathetic to one’s own views, rather than less?

None of those strikes me as especially plausible. As far as I can tell, this rule is entirely deontological.

From what I can tell of American political discourse, absolutely yes, to the point that I don’t feel comfortable trying to give you any more of a hint about what it is, never mind that their country was built to uphold freedom of speech as one of its most sacred principles. (To the extent that people from other countries often wrongly conflate the First Amendment of the US Constitution with the concept of freedom of speech generally, which is a serious and fundamental philosophical error that causes great harm to the discourse.) It seems wrong-headed to me (see above), but it is what it is.

However, it’s worth noting here that Brett is not just “not from the US” but about as proudly Canadian as it gets, at least among members of the core dev team. (It’s kinda nice here overall, actually.)

3 Likes

Yes and no :wink: The US is one of the few countries where the law, in fact, recognizes no such thing as “hate speech”. Legally speaking, it doesn’t exist in the US. “First Amendment” protections are very strong in the US. The most vile speech you can imagine has routinely been protected by US courts.

However, that only applies to the government objecting to speech. Private entities (like the PSF) are free to make up (almost) any definition of “hate speech” they like, and enforce that in their own spaces.

The answer will surely be that nothing substantive can be left behind. But you can’t be given an example because any instance of “hate speech” (however the PSF defines it) is so objectionable that even the most hypothetical of examples could injure someone exposed to it.

5 Likes

FWIW, the way Discourse moderation tooling works is not a hidden secret of some sort. It’s written down and public what tools are available to the moderators and how they work.

9 Likes

Just for the record, I personally have no objection to “community moderation” (i.e. content being auto-hidden by flags) in principle, and I recognize that none of you control how the Discourse software fundamentally works (beyond perhaps tweaking some thresholds or disabling certain features, but I don’t have a problem with those features).

My primary personal objection, and what is driving me out of here, is the overall attitude towards what content is deemed objectionable and what content is not. The lack of transparency doesn’t help, but it isn’t the most important thing for me.

1 Like

We do have that power.

What would that take? Just a stated set of guidelines that the moderators follow?

That’s very true, but I think what we’re disagreeing about is what gets excluded and which exclusion is worse. I apologize if I phrased my response incorrectly by having it in any way suggest that CoC enforcement doesn’t, by definition, mean someone may have their content taken down.

Anyway, I don’t want to belabor the point as it goes down a rabbit hole and I don’t think it’s beneficial to anyone.

A marker that something was removed or why, no (and as I have said, if Discourse has such a mechanism that I’m not aware of, please let me know). But the post I quoted wanted the actual text to still be available, like a hidden comment on GitHub, and I’m saying we can’t universally have that be the only enforcement mechanism we have; content removal needs to be an option.

I simply don’t know if we’re overlooking a setting or a plug-in that could automate that sort of thing.

Because your email copy is private and not left here, at the place of posting, for future readers to come across. And yes, you can mirror everything, but once again that mirror is not this site, where we are saying, “within the confines of discuss.python.org, we enforce the PSF CoC to the best of our abilities”.

I’m happy to explain the thinking behind this, but I do not see any way in which the possibility of removing content from public view isn’t made available to moderators as a tool to use.

IANAL, so I can’t answer the former, but definitely the latter, as we all know someone will eventually see it.

That’s fine, but as I have said, this site was established under the PSF CoC and its enforcement is not going to stop.

5 Likes

Forgive me if I get back to the topic for a brief time :wink::

Steve meant it when he said he was done with posting here, so I’ll confirm (without having been asked to by him, and at the risk of annoying him) that you did the right thing. He knew about the typo, and if he didn’t want it to stand, would have edited the post himself.

To my mind, that he very uncharacteristically left off the trailing “e” says something about his state at the time. History is best served by letting it speak for itself, unaltered.

6 Likes

As long as we know we can trust the moderators. At the moment, that’s far from certain, but maybe it’s something the moderation team can move towards.

1 Like

I’m posting with some trepidation here, as:

  1. This discussion has become very heated and accusatory.
  2. I don’t want to be labelled as speaking from some sort of “privileged white guy” perspective, even though that’s exactly what I am. I’m certainly trying to see things from a broader perspective.
  3. I genuinely do appreciate the work the moderators do, and I absolutely wouldn’t personally be willing to take the amount of criticism they take for doing an unpleasant and stressful job.

Having said all that, yes, I think a significant improvement here would be if there were some more explicit and strictly adhered-to rules that the moderators followed:

  1. Hiding or removal of any post must be accompanied with a separate post describing what was removed and why. Some details are needed, to allow the community to be aware of what went on - for example “Two posts by X were removed because they contained hostile comments directed at another participant”. This post must be made by the moderator who took that action, which adds a level of transparency and accountability.
  2. Moderators should never edit posts under any circumstances. My words are my expression of what I wanted to say. They should not be changed by another person, even with the best of intent. Even innocuous changes like fixing typos are problematic, simply because they undermine the trust that what I typed will remain what other people see[1]. This isn’t an academic report, it’s a community discussion - typos and suboptimal wordings are fine, and don’t need fixing (unless the original poster chooses to do so).
  3. Wherever possible, advice or correction should be done in public, with both the moderator and the poster being identified. Clearly, that’s not always possible, but for many instances, the advice given may well be of benefit to others as well as the offending poster. There’s been an example quoted here of the “wink” emoji being problematic - I have no idea why, and I like to use it as a “not completely serious” indicator, so it would definitely be of value to me if the problem with that emoji had been aired in public rather than via some private communication channel.
  4. I’m reluctant to make this particular point, but I think it would be helpful if there were a channel for community members to raise concerns about moderation issues. It’s at least as difficult for community members to flag problematic moderation as it is for moderators to flag inappropriate posts, and the risk of escalation is far higher (it’s way too easy for a complaint about a bad moderation decision to turn into a witch-hunt - moderators are humans, doing a very difficult job, and they have to be allowed to make mistakes).
  5. It would also be good to have some public clarity on how one becomes a moderator. There have been comments here about “self-appointed custodians”[2] and while I don’t think that’s particularly fair, it does suggest that the community here don’t feel represented by the moderators. Heck, I don’t even know how to find out who the moderators are! I could probably find a way to do so from the Discourse documentation, but it feels like a failure if I have to consult a software manual to know who’s responsible for upholding[3] the values I want to see in my community…

Transparency is critical here. It’s clear from the discussions that the moderators are at risk of losing community trust (if they haven’t already done so) and once lost, trust is really hard to regain. So stronger measures now will avoid worse problems in the future.

I appreciate that the above makes the job of a moderator significantly harder. But surely the power to control who is allowed to say what in a public discussion forum is a significant one, and should be wielded with care and involve a certain cost?

One final point regarding the CoC. I support the CoC, and I accept that this is not the place to be lobbying for changes to it, but I had intended to point out that “assume everyone is acting in good faith” was a core principle - and I was surprised to find that it’s not explicitly stated in the CoC[4]. Regardless of that fact, I think it’s a crucial attitude to foster in any community, online or not, and to be perfectly frank, I don’t think any of the discussions here have lived up to that principle. Whether it’s been too much focus on “worst case scenarios”, or vaguely expressed concerns about “hidden activities”, or the assumption that moderation is about prevention rather than cure, it’s very hard to read this discussion and come away with the impression that we’re a community with shared goals and a respect for everyone’s approach and attitudes. And that honestly says more to me than any CoC rules or moderation policy statements :slightly_frowning_face:

There’s more I could say, but this is all I have the energy for right now. It’s a shame that important discussions like this are so emotionally draining (for all participants) because we end up with polarised, “sound bite” style comments where nuanced views are often too exhausting to express.


  1. I made a typo in that sentence, and I was very close to leaving it in, because I felt the typo lightened the tone a little. A well-meaning fix would have left what I wrote looking a little more blunt than I wanted it to. In the end, I decided the humour was too subtle, so I didn’t do that - but how would a moderator know any of that? ↩︎

  2. sorry, I haven’t got the time to find actual quotes, but that’s a clear impression I’ve got from the discussion ↩︎

  3. Yes, I know “we’re all responsible for upholding the values”. I didn’t want to say “enforcing” because I don’t think moderation should be equated with enforcement. I picked a not-quite-right word that avoided implications I didn’t want to make. I point this out in support of my earlier comment that “suboptimal wordings are fine”. I explain it here, because I’m concerned that people don’t forgive suboptimal wordings as much as they should… ↩︎

  4. or if it is, I missed it… ↩︎

46 Likes