Status update July 4th

Just wanted to let you know where we are with Lemmy.world.

Issues

As you might have noticed, things still aren't working as desired. We see several issues:

Performance

  • Loading is mostly OK, but sometimes things take forever
  • We (and you) see many 502 errors, resulting in empty pages etc.
  • System load: the server sits at roughly 60% CPU usage and around 25 GB of RAM (that is, if we restart Lemmy every 30 minutes; otherwise memory usage climbs to 100%)

Bugs

  • Replying to a DM doesn't seem to work. When you hit reply, you get a box containing the original message, which you can edit and save (but saving does nothing)
  • 2FA seems to be a problem for many people. It doesn't always work as expected.

Troubleshooting

We have many people helping us with (site) moderation, sysadmin work, troubleshooting, advice, etc. There are currently 25 people in our Discord, including admins of other servers. The Sysadmin channel has 8 people, and we run troubleshooting sessions with them, and sometimes with others. One of the Lemmy devs, @nutomic@lemmy.ml, is also helping with the current issues.

So, not everything is running as smoothly as we hoped yet, but with all this help we'll surely get there! Also, thank you all for the donations; they make it possible to get the hardware and tools needed to keep Lemmy.world running!

[–] Sigmatics@lemmy.ca 59 points 1 year ago (1 children)
[–] SomeOtherUsername@lemmynsfw.com 50 points 1 year ago (4 children)
[–] ObviouslyNotBanana@lemmy.world 92 points 1 year ago (1 children)

Rust makes holes and that's how leaks happen

[–] Sigmatics@lemmy.ca 11 points 1 year ago

Underrated comment

[–] donalonzo@lemmy.world 32 points 1 year ago

Rust protects you from segfaulting and trying to access deallocated memory, but doesn't protect you from just deciding to keep everything in memory. That's a design choice. The original developers probably didn't expect such a deluge of users.

[–] bad3r@lemmy.one 18 points 1 year ago* (last edited 1 year ago)

Leaking memory is safe

[–] SomeOtherUsername@lemmynsfw.com 9 points 1 year ago (1 children)

I'm calling it: if there's actually a memory leak in the Rust code, it's gonna be the in-memory queues, because the DB's IOPS can't cope with the number of users.

[–] SomeOtherUsername@lemmynsfw.com 25 points 1 year ago* (last edited 1 year ago) (2 children)

I think I found what eats the memory. DB IOPS aren't the cause; it looks like the server doesn't reply until all the database operations are done. The problem is the unbounded queue in the activitypub_federation crate, spawned when the ActivityQueue struct is created. This queue holds all the "activities": events to be sent to federated instances. If, for whatever reason, an event isn't delivered to all the federated servers, it is retried with exponential backoff for up to 2.5 days. So if even a single federated instance is unreachable, every event stays in memory. For a large instance, that means memory is consumed for every upvote/downvote, post, and comment.

Lemmy needs to figure out a scalable eventual consistency algorithm. Most importantly, it needs to store the messages in the DB, not in memory.

[–] Coelacanth@lemmy.world 18 points 1 year ago

You should consider bringing this up at !lemmyperformance@lemmy.ml

[–] tool@r.rosettast0ned.com 8 points 1 year ago (1 children)

You should take this entire comment and paste it in as an issue on the Lemmy GitHub Issues page.

I think the devs have, in theory, been aware of the issue for a while. A proper solution requires some significant changes, so it kept being postponed because it wasn't considered urgent.