Looks like it’s working. Time for a beer!
Ah, I guess I must have overlooked that part. There are several reasons for not wanting to allow signups.
One is quite simple: cost. Right now this is running on a small, single-core instance. It often stutters (especially when handling video updates), and that’s not an issue, since it just means updates take a little while longer to go out. But you wouldn’t want actual users to have that delay. Right now the costs are quite manageable; if I have to scale up in order to provide a smooth experience for users, not so much.
Most of the other reasons come down to the responsibility of providing a home to any outside users who sign up. I don’t have the interest or time to maintain a community of people, nor to guarantee the uptime such a server would require. It also wouldn’t work: the largest Lemmy instance in existence, lemmy.world, has defederated from this instance, so any users who sign up here would be cut off from the content there. And as you said, any other instance can decide to do the same at any time (in fact, I very much suggest they do so in the FAQ).
I could go on, but I think you get my drift.
Can’t blame you for that. Personally, I still think it excels at content where communication with the OP is irrelevant, like !itookapicture@lemmit.online, !todayilearned@lemmit.online or !dataisbeautiful@lemmit.online. And by far the best example of this, judging by subscriber counts, is NSFW content.
Nope. That would be very hard to implement, and probably very confusing and disliked by other Lemmy users.
I don’t know how the karma thresholds work behind the scenes, but might I suggest the bot do a “top” sort for a time window instead? Like it would only repost the top content from the past 6 hours. This would also help get more quality content and avoid reposting low-effort/low-quality posts.
This is effectively already kind of how it works. For each subreddit, the bot periodically (anywhere between every 30 minutes and every 12 hours, based on subscriber count and posts per day) requests the “hot” content feed. It then checks whether each post has at least 20 upvotes and an 80% upvote ratio. Those numbers are configurable, but that’s what they’re currently set to - I believe they’re a good balance between filtering out the complete garbage and not missing good content.
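To make that concrete, here’s a minimal sketch of that filtering rule in Python. The field names (`ups`, `upvote_ratio`) follow Reddit’s listing JSON, but the constants and helper are illustrative, not the bot’s actual code.

```python
# Minimal sketch of the filtering rule described above; the constants and the
# helper function are illustrative, not the actual Lemmit implementation.

MIN_UPVOTES = 20          # "at least 20 upvotes"
MIN_UPVOTE_RATIO = 0.80   # "an 80% upvote ratio"

def should_mirror(post: dict) -> bool:
    """Return True if a post from the 'hot' listing clears both thresholds."""
    return (
        post.get("ups", 0) >= MIN_UPVOTES
        and post.get("upvote_ratio", 0.0) >= MIN_UPVOTE_RATIO
    )

# Example posts shaped like entries in Reddit's listing JSON.
posts = [
    {"title": "Decent content", "ups": 154, "upvote_ratio": 0.93},
    {"title": "Complete garbage", "ups": 5, "upvote_ratio": 0.55},
]
print([p["title"] for p in posts if should_mirror(p)])  # ['Decent content']
```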
@criitz@reddthat.com @Halosheep@lemm.ee @Cagi@lemmy.ca @Hanabie@sh.itjust.works @SatanicNotMessianic@lemmy.ml @WallsToTheBalls@lemmynsfw.com @cypherpunks@lemmy.ml @DogMuffins@discuss.tchncs.de: First of all: those are some wonderful usernames. Secondly: I have taken your concerns to heart and made some changes. See my update here: https://lemmy.ml/post/6190779.
Funnily enough, the initial intention was to have the bot check up on everything it posted, to see if it got deleted. That way, it would outsource moderation to Reddit. I never got around to that, and am not sure I ever will.
So for now, handing out moderation to others is a good workaround. In order for me to make you a mod, you’ll need to leave a comment in the community, and mention me @admin@lemmit.online.
Actually, checking out the subreddit in question, that’s exactly the kind of content I want to avoid on here. Most of the posts there are meant to invite discussion, either with the OP or with other members. You’d be better off starting a new community on your own instance.
Voila:
```
2023-10-07 17:23:54,906 - root - INFO - Community Boise is ENABLED, has 67 subscribers and 0 posts per day.
```
I understand your argument, and fully agree. There are over 800 communities that I have to check though, so mistakes will be made.
No problem, as far as I’m concerned, all suspensions are negotiable. Which one was it? I’ll re-enable it straight away.
These communities were deemed too interactive for Lemmit (mostly inviting discussion with the OP or the community), and will not be receiving updates from the bot.
If anyone builds this as an optional feature, I’ll gladly accept a merge request for it. I don’t think I will be running it on this instance though.
I think @tubbadu@lemmy.one wrote something to that effect (I’m still a mess with making proper links on here :/)
And I also found something else that was written in Java (not JavaScript).
The downside of using the RSS feed is that it doesn’t contain the whole body, which my scraper does fetch.
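For illustration, a rough sketch of that tradeoff, assuming the third-party feedparser library and a hypothetical feed URL: the RSS entry supplies the title, link and a summary, but the full body still has to be fetched from the linked page.

```python
# Illustrative only: the feed URL is a placeholder and this is not the bot's
# actual scraper. The RSS entry gives metadata and a summary; the full post
# body has to be fetched and extracted from the linked page separately.
import urllib.request

import feedparser  # third-party: pip install feedparser

FEED_URL = "https://example.com/feed.rss"  # hypothetical feed

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    summary = entry.get("summary", "")  # often truncated or HTML-only
    with urllib.request.urlopen(entry.link) as resp:
        full_page = resp.read().decode("utf-8", errors="replace")
    # A real scraper would parse full_page to pull out the complete body here.
    print(entry.title, len(summary), len(full_page))
```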
If that’s what happens, that’s what happens. ¯\_(ツ)_/¯
I’m just here to offer a service for people who Do like it.
Yups. A combination of scraping and RSS, with a bit of client-side throttling thrown in to stay under the radar.
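As a sketch only (the real bot’s timing and endpoints aren’t shown here), client-side throttling can be as simple as sleeping between requests:

```python
# An assumed, minimal take on client-side throttling: a fixed pause between
# requests. The delay value and helper are illustrative, not Lemmit's code.
import time
import urllib.request

DELAY_SECONDS = 5  # assumed pause between requests

def fetch_throttled(urls):
    """Fetch each URL in turn, pausing between requests to stay under the radar."""
    pages = []
    for url in urls:
        with urllib.request.urlopen(url) as resp:
            pages.append(resp.read())
        time.sleep(DELAY_SECONDS)
    return pages
```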
Yups. It’s all done by one bot though, so you’ll just have to block that to get rid of them.
I’ll consider your opinion.