Does Tildes have a Warrant Canary?
Previously, reddit had a warrant canary that was removed, and it occurred to me that I hadn't checked to see if Tildes had one at any point.
Not currently. It has been discussed (at least) twice before; you can find Deimos' opinion in the comments. :)
TL;DR: Mixed opinions, probably not worth it.
Thanks!
Deimos's strategy of keeping as little data as possible is a much better privacy defense anyway.
For curiosity's sake, does Tildes have server presence in any of the Five Eyes countries?
Probably doesn't need to be a bot, does it? Just built-in functionality that could be implemented via a PR.
The less bot comments the better, in my view.
Agreed. Bots making comments has never struck me as a semantic/pure use of "discussion". If it's implementing functionality, IMO that's the responsibility of the hosting site; and if it's an extremely in-demand feature, per the principles of open source it'll slowly filter to the top until someone sufficiently motivated works on it.
I don't see the point of having to wait until someone makes it as a built in functionality. Deimos can't go very fast on implementing everything and I see no harm in having community-made simple bots to do the job meanwhile.
We'll have to disagree then. For me, comments are designed for humans, and semantically represent an input from a human. Deimos isn't the only open source contributor, and there's a non-zero amount of work regardless of implementation via site or bot.
A bot can be configured completely independently though, and very easily. I just don't see the problem except for "it doesn't look that nice", which for me is a non-issue.
Have to agree. There are bots that help and bots that annoy. The helpful ones are pretty handy. I've yet to be annoyed enough to ban the concept altogether. But maybe in time.
I think it's likely most of the things bots do will end up as part of the site at some point or other. That said, there may be tasks better suited to implementation in a bot/api framework. We talked about bots before and even discussed giving them specially marked accounts with both expanded and restricted access - such as bypassing posting restrictions yet being unable to vote. We could even require the bots to be open sourced to get accounts, which would help spur development of a few good bots instead of the many bad ones we see on reddit. It'd also cut down on duplication of effort.
Right now since we don't even have the API done bots are a bit premature, but they'll come along eventually. I'm sure we'll figure out how to make it less of a mess/security risk/annoyance here than they've been on reddit where there is no quality control.
Anything open source is better overall. People can inspect the code and suggest fixes when it's transparent, which would help and give a sort of community approach to the whole thing.
At the very least, when someone says they don't like something, they can point to the actual code they have an issue with.
What if there was some way to annotate posts by bots? Approved bots attach to posts small footnotes that contain a link or a little bit of text. It wouldn't clutter up the page like a full comment, there wouldn't be any replies to it, it might even be able to be filtered out by users, and it would be completely clear that it is a bot.
I was thinking that any bots doing this would be approved by mods. Here is a sloppy and rather aggressive mockup of a few elements I thought it could have. For the mobile layout, it could swing in from off-screen like the sidebar does if this seems too cluttered.
But now imagine this kind of scenario with, say, 20 bots. It will surely become a mess, and you'll have to click through a lot of these annotations. That's without mentioning all the web scraping and data collection in general that these bots would require.
Overall, just not worth having.
In an open platform like Tildes, it's a lot more effective to simply modify the server code.
It's mostly just an idea I had for how to handle certain features that people miss from Reddit and that aren't against Tildes's approach, like the bots for converting mobile/desktop Wikipedia links and providing non-AMP links. This solution would give the community an easier way to develop features while keeping them under the control of moderators.
But I guess if it would be more difficult to implement than some way to handle a couple of links then it wouldn't be worth it, unless there is some reason to have any more bots.
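To make the non-AMP-link idea above concrete, here is a minimal sketch of stripping the common Google AMP cache prefix from a URL. The function name and the exact prefix pattern are my own illustration, not anything from the Tildes codebase, and real AMP cache URLs come in more shapes than this single pattern handles.

```python
import re

# Matches the common "google.<tld>/amp/s/" AMP cache prefix.
# This is an illustrative pattern only - AMP cache URLs have
# other forms (e.g. *.cdn.ampproject.org) not covered here.
AMP_PREFIX = re.compile(r"^https?://www\.google\.[a-z.]+/amp/s/")

def strip_amp(url):
    """Return the underlying URL if `url` has the AMP cache prefix."""
    return AMP_PREFIX.sub("https://", url)
```

A bot (or built-in filter) could run submitted links through something like this before they're posted.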
I guess it depends exactly what you're asking about, but overall, yes, I'd definitely like to be doing link canonicalization. It can be a bit trickier to do with links inside comments and text topics than it is with link topics for a few reasons though. Just off the top of my head, some of the difficulties:
Overall, I'm supportive of the idea and I think it's best to "fix" links as much as possible, but there will be some tricky parts to figure out with it.
I'm not sure what you mean by this. The "proper" way to canonicalize a link is to load the url (following all redirects), and check for a
<link rel="canonical">
tag on the final page you end up on. If one is present, its value should be what you canonicalize to, and if not, the final url you ended up at may be the canonicalized version, but not necessarily. In some cases you may be able to canonicalize without actually accessing the url, but that's not a general solution. It also may become wrong at any time if the destination site changes how their urls work, and the transformation can start breaking links instead of fixing them until it's updated.
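The "proper" approach described above can be sketched with the standard library alone. This is an illustrative sketch, not Tildes code: the class and function names are mine, and a production version would need timeouts, charset detection, and error handling.

```python
from html.parser import HTMLParser
from urllib.request import urlopen  # urlopen follows redirects by default

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical" and attrs.get("href"):
                self.canonical = attrs["href"]

def canonicalize(url):
    """Load the url, follow redirects, and return its canonical form."""
    with urlopen(url) as response:
        final_url = response.geturl()  # where the redirects ended up
        parser = CanonicalFinder()
        parser.feed(response.read().decode("utf-8", errors="replace"))
    # Prefer the page's declared canonical URL; fall back to the final
    # URL, which (as noted above) may or may not be the canonical one.
    return parser.canonical or final_url
```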
Yes, but like I said, the problem with that is that it's not a general solution, and can start failing at any time. You have to determine and hard-code transformations for hundreds or thousands of different sites, and then you become responsible for monitoring and maintaining all of those transformations.
If Wikipedia changes something about how their urls work, suddenly the transformation might start breaking every Wikipedia link people try to use on Tildes, and nobody will be able to post a working Wikipedia link until the code is updated. For a frequently-linked site like Wikipedia this will probably be noticed quickly, but for less common ones it might not be obvious what's wrong, especially if the person trying to post the link doesn't realize that Tildes does these transformations.
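The hard-coded-transformation approach being criticized here would look something like the following sketch. The table and function are hypothetical, but they show the maintenance burden: every entry is a hand-written assumption about a site's URL scheme that can silently break.

```python
import re

# Hand-maintained (pattern, replacement) pairs, applied in order.
# Each entry encodes an assumption about a third-party site's URLs;
# if that site changes its scheme, the entry starts breaking links.
URL_TRANSFORMS = [
    # mobile Wikipedia -> desktop Wikipedia
    (re.compile(r"^https://([a-z\-]+)\.m\.wikipedia\.org/"),
     r"https://\1.wikipedia.org/"),
]

def transform(url):
    """Apply every known per-site transformation to `url`."""
    for pattern, replacement in URL_TRANSFORMS:
        url = pattern.sub(replacement, url)
    return url
```

Scaling this to hundreds of sites means hundreds of entries like that one, each needing monitoring, which is the fragility being pointed out.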
To be clear, I'm not saying that we shouldn't do anything like this, but it requires being careful.
The difference is that if regular links start failing, that's the linked-to site's issue. Every site is independent, and not something we can control anyway. But if any transformation starts failing, that's Tildes's problem to detect and fix.
I don't know what Deimos would think about bot annotations, but there was a fair amount of discussion on bots in the distant past. And while I also don't know what Deimos ultimately decided on regarding them (if anything), the most popular idea put forward was forcing all bots to use special bot-specific accounts so they could be easily identified by users and filtered out should users wish to. This would also allow for better monitoring of all the bots active on the site, enforcement of standards regarding their purpose and behavior, and removal of ones that were abusive or more of a nuisance than value-adding, e.g. "spelling correction bots", "random fact bots", etc.
The other major idea was incorporating a CLI (command line interface) into tildes so bots could be interacted with and their functions called through that instead of users having to make public comments to do so, comments which only serve to increase the noise in threads.
And another was to have all bot comments show up like noise labeled comments show up now, automatically collapsed with all their replies visible only on manually uncollapsing them, so they don't take up as much space as comments made by actual users.
Really it should be a PR to Wikipedia to stop using mobile links. Media queries have been around for a long time.
There is a Firefox extension called Redirector that you can use to filter URLs and redirect them using regexps or wildcard expressions.
Though I think we'd rather have filters built in to the posting process for the most common sites, like Wikipedia.