Keep in mind that the upcoming Lemmy update (the one replacing websockets) will probably fix this.
Yes I really hope so!!
Maybe this is a dumb question, but why would replacing websockets speed things up? I read the Wikipedia page on it, but I guess I don’t understand it fully.
In general, websockets scale badly because the server has to keep a connection open, plus a fair amount of state, for every client. You also can't really cache websocket messages the way you can cache normal HTTP responses. Not sure which of those reasons apply to Lemmy, though.
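Roughly, the difference looks like this. This is just a toy Rust sketch of the general idea, not Lemmy's actual code, and all the names are made up:

```rust
use std::collections::HashMap;

// With websockets, the server has to hold something like this in memory for
// every connected client, for as long as their tab stays open.
struct WsSession {
    subscribed_communities: Vec<String>,
}

// With plain HTTP, each request stands alone, so identical GETs can be served
// straight from a cache keyed on the URL and the server keeps no per-client state.
fn cache_key(method: &str, path: &str) -> String {
    format!("{method} {path}")
}

fn main() {
    let mut ws_sessions: HashMap<u64, WsSession> = HashMap::new();
    ws_sessions.insert(
        1,
        WsSession { subscribed_communities: vec!["announcements".into()] },
    );
    println!("open websocket sessions held in memory: {}", ws_sessions.len());
    println!(
        "cacheable key: {}",
        cache_key("GET", "/api/v3/post/list?community_name=announcements")
    );
}
```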
Judging by the conversations on GitHub, the caching problem is definitely part of it.
Thanks for the explanation!
I really hope someone is doing some level of performance testing on those changes to make sure they actually fix the performance issues.
Just hopping into the chain to say that I appreciate you and all of your hard work! This place—Lemmy in general, but specifically this instance—has been so welcoming and uplifting. Thank you!
At least the "reply" button goes away, so I don't end up double-, triple-, or even duodecuple-posting! Thanks for all the hard work that must be going on behind the scenes right now!
I kept getting a timeout message from Jerboa, which led me to think my post hadn't gone through. So I ended up submitting the same joke to the Dad Jokes sub three times. Which actually is how a dad might tell that joke.
Lemmy is now your digital dadlife assistant.
I think in that case it's a feature not a bug.
I get this occasionally with Jerboa too. I had assumed it was because I'm on Mint and the connection is shoddy, but maybe it's an issue with the client.
Maybe related, but I've noticed that upvoting/downvoting has similar delays.
Same. It would be good to fix this
I can't up/down vote at all.
Been noticing this in the app I'm working on. Pretty much all POST requests fail to return a response and just time out after 60 seconds. A quick refresh shows that the new items do successfully get created, though.
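For what it's worth, here is a rough sketch of how a client might cope with this in the meantime. It's Rust with the reqwest crate (blocking and json features) and serde_json; the endpoints are from the public Lemmy v3 HTTP API, but the IDs are fake and auth is omitted, so treat it as an illustration rather than working app code:

```rust
use std::time::Duration;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::builder()
        .timeout(Duration::from_secs(60))
        .build()?;

    // The write may time out even though the server eventually commits it.
    let result = client
        .post("https://lemmy.world/api/v3/comment")
        .json(&serde_json::json!({ "post_id": 123, "content": "hello" }))
        .send();

    if result.is_err() {
        // Don't blindly retry the POST (that's how the duplicate dad jokes happen);
        // re-read first and only resubmit if the item really is missing.
        let check = client.get("https://lemmy.world/api/v3/post?id=123").send()?;
        println!("verification returned HTTP {}", check.status());
    }
    Ok(())
}
```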
Have you tried enabling the slow query log, @ruud@lemmy.world? I went through that exercise yesterday to try to find the root cause, but my instance doesn't have enough load to reproduce the conditions, and my day job prevents me from devoting much time to writing a load test to simulate it.
I did see several queries taking longer than 500ms (up to 2000ms), but they did not appear related to saving posts or comments.
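For reference, turning the slow query log on is just a couple of standard Postgres settings; nothing here is Lemmy-specific:

```sql
-- Log every statement that takes longer than 500 ms, then reload the config
-- so the change takes effect without a restart.
ALTER SYSTEM SET log_min_duration_statement = 500;
SELECT pg_reload_conf();
```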
Again, thank you for the outstanding work! You are awesome!
Also, the new icon for Lemmy.world is great!
I assume there is something that is O(N), which would explain why the wait time scales with community size (number of posts and comments).
Oh, Big-O notation? I never thought I'd see someone else mention it out in the wild!
:high-five:
You are going to meet a lot of OG Redditors in the next few weeks. Old Reddit had Big O in every post, even posts with cute animals.
That's pretty neat! I've honestly never seen it mentioned on Reddit before, so I got a bit excited to see someone mention it here, admittedly maybe too excited.
There was a time, before the Digg invasion, when someone would post a picture of a woman feeding 30 cats and there would be Big O jokes about how well this would work, plus crazy modifications to the situation to improve it. At one point this happened in almost every thread on the site. I miss it.
That actually sounds like something I would have enjoyed. I joined Reddit around the time it started taking over, I think.
Those years before the Digg invasion are the magic, mythic times OGs often speak of. Lots of amazing things happened afterwards too, but that early culture was almost entirely washed away by Digg and the subsequent mainstreaming of some subs. The move from a focus on quality link aggregation to a points game really helped push things downhill.
At this point I'm not sure meta-moderation is really a workable system and I kind of hate internet points. There are some new tools and techniques we can try in a system like this.
Thanks to you and the other Lemmy devs for your work on this. These growing pains are a good thing, as frustrating as they can be for users and maintainers alike. Lemmy will get bigger, and this optimization treadmill is really just starting.
Does this behaviour appear on other big instances? E.g. lemmy.ml?
Yes it does; I've tried this workaround before.
Yes. Absolutely does happen on other instances that have thousands of users.
Great, so it's reproducible and related to Lemmy itself, not instance-specific. It should be fixable across the board once it's identified and resolved.
Thank you for your hard work and keeping us up to date.
In my case, the page keeps spinning but the post is not submitted, regardless of reloading the page or waiting for a long time. There was one case where I cut down significantly on the number of characters in the post and then it posted, but I have been unable to replicate this.
I have the same issue with image posts. If I submit them through the app the posts counter on my profile goes up, but there's no post. I also can't retrieve any posts for my own account. It says I have 3 but it shows none.
Comments work OK so I'm not sure what the problem is. I was worried I got restricted or something.
Thank you so much
I noticed this, thanks for the clarification
I noticed that too, page keeps spinning but comments are posted immediately anyway.
ok, so it's not just me. Hope it gets resolved soon!
It's definitely more hung up today; oddly, it's only first-level replies for some reason.
Thanks for posting the workaround and for working to resolve the issue. Lemmy is a great place, and a real breath of fresh air after Reddit.
@ruud@lemmy.world Yo dude, first off, huge props and a big thank you for what you have set up. I'll be donating monthly while I am here. I appreciate that we have an alternative to Reddit at this critical moment in time.
I do have a question about your long-term plans: do you want to continue to expand and upgrade the server as funding allows, or is there a cap at which you will close off the server to new members? Or perhaps make joining more of a process?
Well, if all the Reddit users came over to Lemmy, I guess all servers would need to scale up... but I think the server we have now is powerful enough to grow quite a lot, as long as the software gets tuned.
I’ve done this twice in the last 20 minutes and the content is not there. This workaround was working earlier today though.
This is the biggest issue I have run into. Thanks for looking into it.
This issue is affecting other instances, from what I can tell.
Is the slowdown because the instance has to send out updates about the comment to every other instance before returning a successful response? If so, is anyone working on moving this to an async queue?
Sending out updates seems like something that's fine being eventually consistent; something like the sketch below is what I have in mind.
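This is only a rough sketch of the "respond first, federate later" idea using tokio; the handler, helper functions, and types are all invented for illustration and are not Lemmy's actual code:

```rust
// Persist and respond straight away, then fan out to other instances in a
// background task so slow or unreachable peers can't hold up the response.
async fn create_comment(content: String, peer_instances: Vec<String>) -> u64 {
    // 1. Write to the local database and build the response immediately.
    let comment_id = save_comment_locally(&content).await;

    // 2. Deliver the activity to other instances in the background. Delivery
    //    becomes eventually consistent, which is fine for federation updates.
    tokio::spawn(async move {
        for instance in peer_instances {
            if let Err(e) = send_activity(&instance, &content).await {
                eprintln!("delivery to {instance} failed, should be retried later: {e}");
            }
        }
    });

    comment_id
}

// Stand-ins so the sketch compiles on its own.
async fn save_comment_locally(_content: &str) -> u64 { 42 }
async fn send_activity(_instance: &str, _content: &str) -> Result<(), String> { Ok(()) }

#[tokio::main]
async fn main() {
    let id = create_comment("hello".into(), vec!["lemmy.ml".into()]).await;
    println!("responded with comment id {id} before federation finished");
    // Give the spawned task a moment to run in this toy example.
    tokio::time::sleep(std::time::Duration::from_millis(50)).await;
}
```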
Ooh, that's a good remark! I'll see if that's the cause.
Reading more about how this works, sending out updates to each instance shouldn't block the request from returning (source).
It might be due to poorly optimized database queries. Check out this issue for more info. It sounds like there are problems with updating the rank of posts, and probably comments too.
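If anyone wants to dig into this on their own instance, the usual approach is to take a suspect statement from the slow query log and run it through EXPLAIN ANALYZE. The table and column names below are only a guess based on that issue and may not match the current schema:

```sql
-- Show the actual plan and timing for a suspect statement. Wrapping it in a
-- transaction and rolling back keeps the test update from sticking.
BEGIN;
EXPLAIN (ANALYZE, BUFFERS)
UPDATE post_aggregates SET hot_rank = hot_rank - 1 WHERE hot_rank > 0;
ROLLBACK;
```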
So it looks like YOU SOLVED THE ISSUE with this reply! This led me to check the debug mode, and it was on! I turned that on when I had just started the server and federation had issues...
We no longer seem to have the slowness!!
That’s awesome! Thanks for hosting the server!