How do you build strong online communities?
The recent history of social media has made me interested in the factors that make online communities successful and healthy, or toxic. This is one of the appeals of Tildes for me. I'm also emotionally invested in seeing a healthy future for the Irish language, which has seen some interesting developments in the internet age but remains in a precarious position as a community language in the country. You can see how these two interests dovetail together. At the moment this is a thought experiment, but later, who knows...
Tips I've got so far:
I've heard that some barriers to entry can increase group loyalty by making members feel slightly "invested" through earning a place in the community
I've also noted that some of the most persistent subcultures operate online but also have a strong in-person element (eg: furries)
There's also the common observation that good moderation is crucial to user experience and therefore group cohesion
Then I got some pointers from the Tildes docs:
- Trust by default, punish abusers
- Focus on user experience, not growth metrics
- Favour deep engagement over shallow/clickbait
- Empower members to make choices
- The golden rule (apply charitable interpretations, don't tolerate bad actors)
So, people of Tildes: what factors do you see as crucial to building and maintaining a strong cohesive online community?
I could write a lot about this. But I also would be repeating a lot of good stuff that is out there. For example, there has been a lot of research on this subject, as well as people on other platforms theorizing about all of this. If you dig deep enough in the archives of /r/theoryofreddit (I am talking well over a decade old posts) you will find a lot of very insightful information. Unfortunately, a lot of that was never put into practice or simply couldn't work on reddit due to how the website itself worked.
Still, some random points come immediately to my mind.
On moderation
Fluff principle
"The Fluff Principle: on a user-voted news site, the links that are easiest to judge will take over unless you take specific measures to prevent it." Source: article by Paul Graham.
What this means is basically the following. Say you have two submissions: one is an in-depth article that takes several minutes to read, the other is an image that can be judged at a glance.
So in the time that it takes person A to read and judge the article, persons B, C, D, E and F have already seen the image and made their judgement. So basically images will rise to the top not because they are more popular, but simply because it takes less time to vote on them, so they gather votes faster.
Ironically, on both HN and Reddit nothing much was ever really done about this. HN still suffers from it as far as fluff comments go. Here on Tildes it is less of an issue because people sort content in various different ways.
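The dynamic is easy to demonstrate with a toy simulation (all the numbers here are made up for illustration): two submissions get the same stream of viewers and the same approval rate, and differ only in how long a viewer needs before they can cast a vote.

```python
import random

def votes_in_window(judge_time_s, window_s=900, viewers=150, approval=0.5, seed=0):
    """Count votes a submission collects during a front-page window.

    A viewer arriving at a random moment can only vote if they finish
    judging the submission (arrival + judge_time_s) inside the window.
    """
    rng = random.Random(seed)
    votes = 0
    for _ in range(viewers):
        arrival = rng.uniform(0, window_s)
        if arrival + judge_time_s <= window_s and rng.random() < approval:
            votes += 1
    return votes

# An image is judged in ~5 seconds; an article needs a ~10-minute read.
image_votes = votes_in_window(judge_time_s=5)
article_votes = votes_in_window(judge_time_s=600)
```

With identical viewers and an identical approval rate, the quick-to-judge submission simply banks its votes sooner, so it climbs the ranking faster even though it is no more popular.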
Getting community feedback
All this to say that when a community is asked for feedback, it would be foolish to blindly implement all feedback. It takes a lot of effort to gather meaningful feedback and turn it into actionable items. Things that need to be considered are "who is the feedback coming from", "will this negatively impact other people", etc, etc.
The 1% rule
There is a thing called the 1% rule, or the 90/9/1 principle. The obvious choice here is to nurture the 1% and 9% groups the most, but doing so without being clear about what behavior you expect from them can have odd side effects. Karma farmers on reddit did belong to the 1%, but a lot of their output was complete crap.
I know you're probably referring to the second-to-last sentence, but I've gotta say that the rest of it is one area where I feel Tildes falls very short. Moderation is quite opaque here.
I've both had and seen comments get removed where absolutely no explanation was provided, and the reasoning for the actions taken were not intuitive. I once posted a heated comment here and got a month-long ban for it, which might suggest that the comment in question was not my first screw-up, but nevertheless I don't actually know that because absolutely nothing was said to me. The whole thing left me feeling extremely unwelcome and I very nearly quit the site entirely. Without getting into further examples, I've also seen long-time, friendly users get permanently banned for reasons I just can't explain.
I do very much agree that transparency and understanding in moderator action is important; people have to learn from their mistakes somehow. I just wish Tildes did that. I have my suspicions as to why it doesn't, and they are kind ones. I in turn would have suggestions to fix it, but I don't feel as though Tildes is very receptive to criticism of moderation generally, and even if it were this thread probably isn't the place for it.
Oh yeah, Tildes isn't perfect in that regard and I was indeed referring to the Deimos responding to feedback and generally being really open about the principles behind Tildes.
I have spent a lot of time in the past talking about the Fluff Principle, and how subreddits needed to do something about it or be flooded with low effort content. It took a lot of years, but after 15 or so on reddit, I realized that Reddit is for low effort content and that moderators making efforts to disallow low effort content were fundamentally misunderstanding what the reddit system is for. I spent a few more years on reddit after that realization hit - that I had been trying to make communities into something that they were not - and then I left.
I think that Tildes approaches content differently, and addresses and pretty much nullifies the Fluff Principle. This shows that while it's not "easy" to do so, it's certainly technically possible. Sites like reddit that reward memes do so because their business model is served by showing you memes - sites like Tildes that reward content do so because their business is served by showing you content.
I don't agree with that assessment of what reddit is for. These days I fully agree that it is a platform fully aiming for fluff content.
That was not always the case, and for a long time it was possible for communities to exist with high quality content. What changed is simply the scale of things, when reddit started to grow they never truly invested in the tooling needed for the higher quality subreddits to keep up with moderation demands.
You have to remember that reddit was founded in 2005, it has a very long history, and we tend to remember only the past few years or so.
When I started work on toolbox in 2012 reddit was already 7 years old. At the time it really was nothing more than a few scripts with some nice to haves for moderation. It only grew into the essential tool it became a few years later because of how massively reddit grew and most mod tool development ceased.
I agree and also disagree with what you have said. I started using reddit almost 19 years ago - my account is 18.5 years old, and I lurked for the first 5 or 6 months of use, so it's been almost 19 on the money for me. I recall what old reddit was like; I used it in every iteration.
I had a paragraph here that talked about learning about the fluff principle around 2010, but it actually just references the article you put in your original comment above, so I can make this brief:
Reddit knew about the Fluff Principle since around 2010, almost 15 years ago. Since they were ineffective at making adjustments to their algorithms to account for the Fluff Principle, they became a site of Fluff. As more and more people came to reddit, the Fluff became overwhelming, and we moderators spent a lot of time working at keeping the fluff at bay.
I think that as time progresses, it becomes more and more clear that the system of reddit rewards Fluff. That is how it is now, but that's also how it has always been. This does not mean that a moderator wrestling high quality content away from Fluff wasn't possible. There were (and still are) loads of great communities on reddit that feature fantastic content, with great mod teams. But they are there in spite of the system, not because of the system. Every subreddit that wants to reward posts that are not just fluff posts has to struggle against the system to achieve that. And I think that's been the case basically forever.
That's what I mean by what I said above, and I think that it is notable that only through tools like the one you made are moderators able to wrestle with the crapgasm that is reddit. And over time, more and more people just want the firehose of memes, so it becomes increasingly difficult for moderators of subreddits of any size to have non-fluff pieces naturally rise in their subreddits.
Yeah that's entirely fair. I suppose I am more or less arguing about how bad the odds have been stacked against communities.
To give an example, a lot of visual fluff used to be external to reddit. It was there and already pretty dominant, but still external links. I still agree that communities for more serious content flourished despite reddit. But at that time you could reasonably argue it was more or less inertia of reddit simply not undergoing much change at all to begin with.
Once reddit started self-hosting images in 2016, and videos about a year later, that is the point where I am more or less fully on board with the idea that reddit fully disregarded serious content in favour of fluff content for growth metrics. That was further reinforced by the choices made for the redesign a little later, where images and videos were made first-class citizens and loaded inline.
As far as comment fluff goes (cheap jokes, etc), no argument there actually. Well, one argument: bringing in Deimos and allowing him to integrate automod. Without automoderator being a native tool it would have been game over for many communities years earlier.
But even there, my understanding is that Deimos had a lot of trouble being allowed to properly work on automod.
I won't add much other than that I think we are in full agreement and just thinking about degrees and timelines of reddit's enshittification.
Deimos was one of the bright lights, and automoderator was (probably... is? I don't know if it's still working) a great tool. There were also lots of other reddit admins over the years who were good people trying to make the site better. There are still some today! But the leadership isn't enabling them to do things that need to be done.
Communities flourished in spite of them for a long time. By design: they wanted to be sure harvesting users wouldn't kill them first.
We saw that happen with all kinds of start-ups: group buys, ride shares, streaming platforms... It's not our fault we thought they were something else, because they sold them to us as those things.
I'm over their site, but I still lament all those little communities that flourished in between the cracks of their hidden intent.
Eh, as I pointed out here, reddit has been around since 2005; it is nearly 20 years old. In that period it drastically changed several times as far as ownership and the direction of the company behind it go. Frankly, now that I am typing it out again, I am amazed we got to enjoy reddit for as long as we did.
For the last few years I do fully agree: it was in spite of the direction it had taken that communities persisted.
Fair, more than fair. If they had kept Reddit to what they originally thought to do it would be a very different story (and not as popular or successful on a wide scale probably). Sorry, the latter years bitterness coloured my happier memories of the place.
That's unfortunately how human brains tend to work. We tend to focus on and more easily remember the negatives. Probably some survival mechanism (you want to avoid negative things) but doesn't help us in the modern world.
I'm reminded of the infamous drama in the atheism subreddit when post thumbnails were disabled to discourage meme posting, which led to the historic comment "Socrates died for this shit and we're taking it too lightly."
I miss the euphoric atheist drama. Tangentially, I feel like I got into being a compulsive popcorn-muncher at the worst possible time. Had I tried to get into online drama-watching any later, my first major drama would've been Gamergate. Reading threads about that is like snacking on stale piss popcorn. Could've closed the tab and logged out. Instead, I got hooked on previous drama and didn't realize that online culture was entering a boring forever war. Had I joined the dramasphere earlier, that would've been that many more years of actually entertaining drama.
"desserts are for after supper".
I enjoy memes, jokes, self-referencing comments etc. a lot, but I also recognize that they need to be restricted to a weekly thread of silly news, or nested under substantially more nutritious comments. They should never be presented with equal weight alongside the main course, just like how one should not serve dessert in the same quantity and at the same time as the main. (Opinion does not apply to ~talk, which I hope remains a more free-wheeling dessert cart alongside the main meal)
Small things, like placing the comment box at the end, help. Not giving dumb rewards for jokes helps. Not having a site-wide cumulative high score for users also helps.
Not showing any form of status or anything tangible for people to distinguish themselves from others, e.g. karma. It's a useless thing, but people give meaning to it, and thus it has a runaway effect where it has to matter. On forums this was fairly often the number of posts (higher = better?) etc. The value is in the content that you provide, and people will remember you if you had anything meaningful to say.
In extreme situations like TikTok or Instagram, likes are everything, and it is demotivating when your posts don't seem to hit a certain expected value; this can directly influence your mental well-being.
In the context of transparency I can see why the 'upvotes' on Tildes give a reason to see why a post is trending or on the front page, but it would be neat for that to be an opt-out thing.
I think not including any kind of low-effort feedback would be a mark against a platform at this point. Displaying karma totals or post counts is obviously unnecessary, but if you make a contribution and receive no response whatsoever there's less incentive to keep contributing. Which for a fledgling community is pretty likely.
I pretty frequently use votes as a filter for what I actually read in online communities I am familiar with, and as a way to begin to judge online communities I am unfamiliar with. I can't speak for all of humanity, but I'm a garden-variety human. Perhaps less commonly, I know for certain I'm susceptible to echo chamber effects, and yet I still do it...
Even in larger, established communities it's important I think. My personal YouTube experience has definitely gone down since dislikes are no longer visible.
I mean if you tell a joke at a funeral you cannot read the room. My point being; growth happens naturally and organically already. Fostering a solid community takes time and not everyone will like the content. But not everything can be to everyone's liking. And that is okay.
Every funeral I've been to (which isn't a ton, but it's more than a few) has had plenty of jokes told.
Everyone grieves differently, but if I were at a funeral with zero jokes told I'd probably wonder if anyone around even liked the person or ever had any good times with them.
You're right, it was a little bit insincere from me to jump to such a conclusion. Bad argument on my part.
I don't know how you made the leap to not being able to read a room when I'm talking about not being able to tell whether the room even has anyone in it.
yea, it was a bad argument, sorry.
I'm the kind of guy who laughs at a funeral. Can't understand what I mean? You soon will.
(I just realized that lyric sounds vaguely threatening)
Grandpa liked to say that if everyone wanted the same thing there'd be billions of dudes on the back porch trying to get into your grandma's knickers... =D
In this regard, the very nature of Tildes works in our favor. Tildes doesn't exist to garner "engagement": it has no KPIs to meet/exceed investor expectations.
Part of the reason why karma was given out on Slashdot was to curate a segment of the userbase who could sometimes be partially trusted with moderation points. Then within that group another hidden metric was used to reward meta-moderation points. These points don't get displayed publicly, and they were spread widely enough, with such small power attached, that there was no point bragging. In fact, low effort garbage would even subtract from one's points in this system.
Not so with Reddit. They intentionally made all these things visible, so users would mass pile on votes in either direction; they gave out shiny baubles for people to chase after and give each other, creating their own celebrities.
There's no algorithm here: posts live or die on their own merits. The most famous Tildes members' comments look exactly like those of a new member. Any semblance of fame only exists in long-time members' minds at best.
It's boring for folks who chase that high. And I like it this way
*edit: Deimos's username is distinguished though, but I think he only uses that account to talk about Tildes anyway. (Aside: I've wondered what his usual commenting account is...)
Even if there is no total karma, I do think that a bit of that exists here on Tildes. Votes on comments and posts do influence what people see if they have things sorted that way. Then there is also the “exemplary” label.
And I certainly have seen comment behavior from people where I couldn't help but feel that they are chasing after votes and/or fishing for an exemplary label.
But, overall, it is indeed a far cry from the whole karma rat race over at reddit.
Perhaps folks (ahem, myself) need a period of time to detox from chasing that high, y'know? And then some of us are more susceptible to that kind of a rush than others, and good site design only goes so far to remediate a personality level thing.
Question for you: do you envision a time in the future where Tildes might grow so much we'd need a suite of mod tools and a team of mods?
To be clear, I did not have you in mind specifically ;) I can't deny being immune to the effect of a comment doing well with votes either.
As far as Tildes needing more tools and more mods. Maybe? I don't think Tildes is growing at a rate where it is something that requires much consideration soon. Effectively, Tildes already has mods, the people who can edit tags and move posts.
It is a difficult question to answer because I am not involved in running Tildes. So I am not sure what sort of work goes on behind the scenes right now. I sometimes do spot a user being banned or a post being deleted, but that is about it. I get the feeling Deimos has a handle on things right now and with the current invite system things probably remain the same for a while. So the most likely scenario where I see a need for more tooling and more mods is one where registration becomes easier.
I've observed that votes on my comments significantly drive my near-future engagement on the site more generally. If something I commented on/posted fails to garner votes, I take it as a sign I may be uneducated on the topic I'm commenting on... and am also much more hesitant to post for a bit.
I do miss avatars, though. I'd like more memory triggers for who these people are
I have this weird thing where I actually like not knowing who people are online, for the most part. I rarely look at usernames, and I like to think it's me unconsciously trying not to judge based on experience and recognition. Come to think of it, maybe it's also because I try to avoid taking too much accountability towards who I'm interacting with... Furthermore, to be completely honest, maybe I'm envious of the eloquent "power" users and secretly wish I was one myself.
I like that angle as well though with a community the size of this one, I think it'd be sort of cool to see more personality for each user displayed.
Does this also apply to less public things that gamify desirable behaviour, like achievements/badges or (this one is definitely controversial) streaks?
I mean, it goes to show what is important in the end, right? If a platform really incentivizes engagement, focuses on user retention, gamifies interactions and has reward structures, then it's safe to bet there's a financial incentive at play behind it all instead of purely fostering constructive discussion.
Based mostly on my experience with the Tildes community, I suspect having some intentional meta* goals for community interaction pretty early on attracts/repels/filters and sets a tone. I'm part of the Great Reddit Diaspora. Tildes was sold to me as an intentionally non-toxic replacement for reddit.
Walking into the site with expectations on me around how I interact with the community greatly increased my investment in and connection to the community.
Non-toxicity doesn't have to be the intended meta goal, rather just an easy example. I'm reminded of a statistic about Howard Stern's morning show in its early(er) days. Listeners who liked Stern's antics listened on average 2 hours a day. Listeners who hated Stern's antics listened on average 4 hours a day. (Those numbers are made up but reflect my memory of the gist of the stat... haters listened 2+ times as long as non-haters). In an online community like that the meta goal could be "A place to let it rip unfiltered. We want your gut reaction."
*I'm possibly mis-using 'meta' -- the comments I read about tildes and the interactions I viewed on the site before I sought out an invite all supported an intentional 'tone' of not just non-toxicity but ?intentional civility? Community tone doesn't strike me as a typical goal of a community, but rather a goal for the members of the community as they commune, so I've dubbed it a meta goal.
I think it mostly comes down to clear rules for moderation with tiers.
My first forum was pretty straightforward. If you crossed a line, you got a warning. You could see your warning level, and it would "cool down" over time. If you went over a certain level, it was an automatic suspension, and then eventually a permaban.
Mods had some ability to circumvent this for extremes.
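A minimal sketch of such a tiered system (the thresholds, severities, and decay rate here are invented for illustration, not taken from any real forum software) might look like:

```python
from dataclasses import dataclass

DECAY_PER_DAY = 1.0        # warning points that "cool down" each day
SUSPEND_AT = 5.0           # points that trigger an automatic suspension
BAN_AFTER_SUSPENSIONS = 3  # suspensions before a permanent ban

@dataclass
class WarningRecord:
    points: float = 0.0
    last_update_day: float = 0.0
    suspensions: int = 0

    def warn(self, now_day, severity=2.0):
        """Apply a warning at day `now_day`; return the resulting action."""
        # Cool down first: points decay linearly with elapsed time.
        elapsed = now_day - self.last_update_day
        self.points = max(0.0, self.points - elapsed * DECAY_PER_DAY)
        self.last_update_day = now_day

        self.points += severity
        if self.points >= SUSPEND_AT:
            self.suspensions += 1
            self.points = 0.0  # slate wiped after serving the suspension
            if self.suspensions >= BAN_AFTER_SUSPENSIONS:
                return "permanent ban"
            return "suspension"
        return "warning"
```

A user who crosses the line occasionally cools back down to zero, while someone racking up warnings faster than they decay escalates through suspension to a ban, roughly the behaviour described above. The mod override for extremes would just be a separate code path.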
I feel the only tricky part is transparency. There are both good and bad reasons to be transparent, and it's a sticking point for a lot of people, especially now that so many hot button issues are political. Someone has "the right views" but is also an asshole, and then it's "well, they got in trouble because the mods don't agree with their views", and it's more just...
Well honestly 90% of modding seems to boil down to that scene from The Big Lebowski. "you're not wrong, you're just an asshole"
That's pretty apt. I probably had to ban or suspend people more times because they couldn't stop being an asshole. The first rule on communities I mod is often “Keep it civil”, not even “be nice” just keeping it civil. Which for a lot of people is still very difficult.
The difficult part of moderation is finding that line between keeping things civil and not going too far in that direction, sanding off all the rough edges and destroying the personality of the place. I've definitely seen 'good vibes only' moderation styles, which come from the admirable goal of keeping things civil, turn into exactly that. But then, the opposite (allowing tons of ragebait engagement) is exactly why most social media is so unusable. It's a very fine line to walk.
Just read all the Tildes docs linked below every Tildes page.
Some things are not in the docs though.
Community seems to emerge in spite of anything you do. If you look for Deimos's past comments you will see that the idea was never for Tildes to be a community with strong emotional bonds. We were supposed to be content driven like Usenet used to be, or like a Hacker News that is not just for STEM people. We are not really supposed to be here every day to meet our buddies; we should be coming here once a week or even less.
A community came about nevertheless.
Ironically, it seems to me that the very fact that Deimos is not really engaged in the community is one of the conditions for Tildes' success. That allows admin enforcement to be impartial and consistent, although still largely informed by the idea of a website that is focused on content rather than emotion or personal connection.
So I think the formula might be: (1) try your best to make non-pretentious Hacker News, (2) fail at that, (3) now you have a great community.
Although I do think we have a fairly cozy community here, it only goes so far. There isn't much scope for close ties so long as most of us are hiding behind aliases, we don't post personal details to maintain opsec, and we don't meet in person.
Some questions to consider: do you remember their real name? Do you have their email address and phone number? Do you ever meet in person? Have you been to their house? Did you invite them to yours?
I think of it as similar to work friendships: some of them survive long after you leave the team, and others end when you switch jobs, however friendly people might have been at the time.
It's okay though. Not all ties need to be close.
I could talk about what makes healthy communities for ages! My search for a better community has brought me through quite a few spaces, including Tildes, and eventually to helping form a community to continue to experiment in the space and test some communal hypotheses on ways things can be made better.
While your question is about 'strong' communities, I'm going to interpret that through a lens which prioritizes the following: cohesion, good/nice behavior, diversity, being a relatively safe space. You may not hold all the same ideals when you think strong, but I'll do my best to point out throughout this post what each intervention and design primarily affects, so you can choose what makes sense for you.
On a high level I think there's a few thoughts which hold pretty universally:
First and foremost a community needs to center the idea that there's a human behind a computer behind every post. This is primarily targeted at keeping a space nice but it's also quite important for cohesion and a sense of community. I believe the most effective policy in this space is to encourage folks to treat others in good faith until proven overwhelmingly otherwise. When people treat others with bad faith or assume trolling or other negative behavior you need to step in and publicly remind them to ask questions and confirm suspicions. Moderators and power users should strive to open comments early by predicting the kinds of questions which will likely arise when questionable content is posted in a community. By far this is one of the trickiest needles to thread because it requires fairly high touch moderation or an engaged community and it can break down easily if you're not keeping a finger on the pulse of the community. It may require proactively sequestering certain conversations in specific communities or threads and for major issues may require regular stickied or high visibility posts from administration or moderation reminding users to be on good behavior or reminding the users of the values of the community.
Secondly I personally believe that a good community always has treasured people. These treasured people are often extraordinarily nice and this helps to reinforce nice behavior within the community and ultimately leads to stronger cohesion because folks are more engaged. Your community needs to center the needs of nice folks over rude ones and you need to pay close attention to this. Losing nice people is much easier than losing people who aren't nice and losing nice people is much more damaging to a community than losing anyone else. This is touched on quite a bit in an excellent blog post which touches on a concept called 'evaporative cooling'. The first point about treating others with good faith goes a long way in helping ensure you keep your nice folks, but it's not the whole story.
Which brings me to the third point: I strongly believe that hyper-specific rules are damaging/bad for a community and that less specific, more interpretable rules are better. The reason for this comes down to an explicit recognition that rules are there to support the community, and not the individual. Most in person communities do not have rules- think about your friend group or an activity club you belong to. These almost never have explicit rules, but if you're a dick to someone in these groups it certainly won't go unpunished. I think that healthy communities have lots of discussions about behavior when folks have conflict and I think that's part of what makes them work. People feel engaged with a community when they feel heard and respected and it's extremely alienating and dehumanizing when someone's being a dick but following all the rules and you get ignored because of it. This is perhaps the most important driver behind long term community engagement- without it folks will interact more cyclically and leave after they get too fed up with bad actors.
Fourth, I think that diversity is extremely important in a community and the only way to ensure adequate diversity is to explicitly be a safe space. This can be tricky to deal with because being a safe space means being flexible on some level around rules. A transgender or black person, for example, may not be in a place to bear the emotional burden required to take potentially offensive content in good faith and you need to be able to adjust moderation tactics to give them space to vent about the harms and injustice they have experienced while gently reminding them and the community of the ideal behavior. You'll also need to manage the emotions of fragile folks in majority groups who may take offense at what these minority folks are saying. This is definitely one of the stronger personal beliefs I have about websites and one that is extremely infrequently centered on the internet, which caters predominately to white men in tech. This is one of the largest mismatches of community ideals I personally experience on this website and it's why I've significantly disengaged from it as compared to the early days. I believe that this mismatch is primarily behind the decline in diversity I've seen here and the biggest issue I have with the community ideals here.
Fifth, I believe that communities cannot grow beyond a certain size and feel strong or cohesive. I do not know what this size is, and I suspect the size is on some level actually a function of how many intermediaries exist- the number of moderators and highly engaged individuals who engage in community discussions and help to keep a community healthy. This is talked about a bit in a blog post which highlights the difference between a village and a train station.
Sixth, moderation needs to be more high touch and involve more conversations and fewer moderator actions. Folks need to directly see, next to and after questionable behavior, moderators and admins stepping in and asking clarifying questions or reminding folks of the appropriate behavior. Sometimes it can be helpful for problematic material to be left up because the visibility of pushback can help to signal community values. A highly responded to comment where everyone is pushing back against what the person shared can be important to a space feeling safe to minorities. Sometimes the material might be too inflammatory and need to be removed. Often this requires discussion amongst moderators and admins on how others would step in and sometimes it's informed directly by how many reports the community makes or reports coming from specific well known individuals such as the nice/treasured ones mentioned above. Sometimes this also means having meta discussions with individuals when they step out of line or just reminding them of how they are expected to behave.
Seventh, moderation should not only be done by moderators. The above point talks about situations in which the community steps up to protect how safe and welcoming a space seems by pushing back against a problematic opinion. This can be done entirely without the intervention of moderators and admins and in fact this models how in person communities work. Multiple people might chime in if someone says something fucked up in person and none of them need to be the organizer for it to be effective. Similarly if someone says something possibly problematic, their friends or close companions in the community might pull them aside and talk to them in depth about how that might come off to others. This kind of behavior should happen regularly and be celebrated when it does; it increases engagement and cohesion and helps to keep a space diverse because it becomes clear to minorities that their safety is prioritized. You need to keep in mind that it's still a community and a safe space does not excuse repeated negative behavior by heavily marginalized folks- the community always needs to come first but there needs to be some flexibility and understanding for human emotions and a lens on equity (which necessitates examining need in depth).
Eighth, you should have a space for admins and moderators to discuss problematic behavior and to double-check with peers whenever they're unsure how to act. This space should exist away from the eyes of non-moderators, so discussion can happen privately, and it should be a safe space for moderators to share opinions on appropriate action, because those opinions will not always align. I find it helps for this space to be open to all moderators regardless of the community; its purpose is to gut-check borderline behavior and to brainstorm with peers on what the appropriate action is. This space will often drive folks to take less action and start more discussions. Often you'll find another moderator has exactly the words you're struggling to find, or the perfect response to specific behavior, which helps reduce the cognitive burden of deciding the right reaction to problematic behavior. It also increases the cohesion and engagement of your moderators by letting them weigh in on community (and specifically moderator) ideals, and it sets precedent for how specific behavior should be treated more globally on the website.
I could probably keep going on about online community health, but instead of positing every hypothesis I have, I'll point you at the philosophical musings available on the documents section of the community I helped to form and run.
In my (very old) experience running a dial-up BBS back in the late '80s/early '90s, you have to decide beforehand what kind of vibe you want and what sort of interactions you want. It helps to write out at least a brief explanation of that, but really, the only solution is active moderation with a human brain, which can get exhausting, as it is fundamentally thankless work. A few bad actors can upset the entire applecart exceedingly easily. Maliciousness isn't even required, just abrasiveness.
One automated feature I've seen on HN that really does help is a simple timer: the deeper you are in a reply chain, the longer you have to wait before you're allowed to post a reply. That didn't apply in my day, since people were limited by the number of minutes they could dial in per day anyway, but it seems to genuinely cool off the intensity of flame wars. dang in general does a pretty good job, considering the personality makeup of the folks who post there. (No shade, just acknowledgement of how tech folks can be. Myself included.)
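For the curious, the depth-based cooldown described above is easy to sketch. This is a minimal illustration, not HN's actual implementation; the function name and the base delay, growth factor, and cap are all hypothetical parameters you'd tune for your own community.

```python
from datetime import timedelta

def reply_cooldown(depth: int, base_seconds: float = 30,
                   factor: float = 1.5,
                   cap_seconds: float = 600) -> timedelta:
    """Delay before a reply is allowed, growing with comment depth.

    Hypothetical parameters: base_seconds is the wait for a reply to a
    top-level comment (depth 0), factor multiplies the wait per level of
    nesting, and cap_seconds bounds the wait so deep threads aren't
    frozen entirely.
    """
    delay = min(base_seconds * (factor ** depth), cap_seconds)
    return timedelta(seconds=delay)
```

With these example numbers, a top-level reply waits 30 seconds, a reply two levels deep waits about a minute, and anything past roughly depth 10 hits the 10-minute cap, which is where flame wars tend to live.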
If your community is based on a shared interest, then the broader the topic of interest, the deeper the engagement between users is likely to be. Deeper engagement leads to users seeing and treating each other like human beings.
This is an observation I've made over ~3 decades of online involvement. If you form a community around a particular interest -- a hobby, a fandom, a support group -- then every interaction you have with other community members will be under the lens of that particular interest. It's very hard to break out of that and get to know other people as people, even if you realize you should try. I've joined Discord servers based on particular interests, and they may have a #general or #off-topic channel, but most of the conversation is still going to revolve around that one thing.
Back in the day, I liked to visit a website that was not about any one thing in particular. It had various sections filled with humor pieces, various kinds of games and brain teasers, film reviews, and the like. It also had a message forum and chat room, and when a new person would enter the fray, we'd have no idea what particular part of the website brought them in. All we really knew was that it was likely they had a similar sense of humor and possibly liked to play games that exercised their brain in similar ways. We had to start from scratch with introductions every time, which naturally led to talking about any and all aspects of our lives. I formed friendships there that are still going strong 25 years later, and I've met all of them in person at one time or another.
I've rarely had a chance to test this theory with a similar style of community. Most communities I can find are based around one specific interest, sometimes extremely specific, and I feel as though I can never break through that barrier to truly humanize the person behind the other keyboard. Tildes is possibly the closest I've gotten since. Most engagement here is about one specific topic or another, but the topics are all across the board and I see different aspects of some of the same users, which is quite nice.
I think a community about the Irish language and life in Ireland has a strong chance of being a broad enough topic to encourage that deeper engagement. You'll know practically nothing about a new user! Fair chance of guessing their nationality, I suppose, but do they live in the city or the country? Are they a movie buff, or do they cook fancy meals, or do they like to hike? Or maybe all three! People can have so many different things that are interesting about them, which leads to many different interesting conversations. I feel as though you shouldn't split the community into too many hyper-specific topics, because going too granular would encourage people to only participate in certain sections. You might want to have a section on Dublin, but not on restaurants in Dublin, because the latter could easily go in the former. Does this make any sense?
Makes a lot of sense, and it's actually a valuable reminder: while it'll always be important to push learning resources to new speakers and learners, it's equally important not to let that become THE primary focus, or we'd squander a chance to bring together a really general-interest community and instead come to be seen as more of a school. I like this advice a lot.
Following the point you make in your first paragraph, I think size is something that should be paid close attention to. Growth should only be a goal when the community and moderation feel they have the capacity for some number of new users, and there should be no hesitation to slam on the brakes if deleterious effects begin to appear. If care is not taken, you can end up with such a high volume of new users that culture and norms get diluted and moderation is overwhelmed, which can send a community into a downward spiral.