this post was submitted on 17 Jun 2024
34 points (62.5% liked)

Firefox

A place to discuss the news and latest developments on the open-source browser Firefox
Hi! Once in a while I try to clean up my tabs. The first thing I do is use "merge all windows" to put all tabs into one window.

This often causes a memory clog, and Firefox gets stuck in this state for 10-20 minutes.

I have recorded one such instance.

I have tried using the "discard all tabs" addon; unfortunately, it also gets frozen by the memory clog.
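For reference, the core of such an addon is small. Here is a minimal sketch of the selection logic, assuming it runs as a WebExtension background script with the `tabs` permission; `browser.tabs.query` and `browser.tabs.discard` are the MDN-documented WebExtensions calls, and the helper function is my own illustration:

```javascript
// Pick the tabs that are safe to discard: skip the active tab in each
// window (discarding it would blank the screen) and tabs already discarded.
function pickDiscardable(tabs) {
  return tabs.filter(t => !t.active && !t.discarded).map(t => t.id);
}

// In an actual extension, this would be wired up as:
//   const tabs = await browser.tabs.query({});
//   await browser.tabs.discard(pickDiscardable(tabs));
```

Discarding by ID this way keeps the tab strip intact (title and favicon stay), while the page content is dropped until the tab is next focused.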

Sometimes I will just reboot my PC as that is faster.

Unfortunately, killing Firefox this way does not save the new tab order, so when I start Firefox again it will have 20+ windows open, which I again merge into one, and then it clogs again!

So far the only solution I have found is to just wait out the 20 minutes.

Once the "memory clog" has passed, it runs just fine.

I would like better control over tab discarding, and maybe some way of limiting bloat. For instance, I would rather keep a lower number of undiscarded YouTube tabs, as they seem to be insanely bloated.

In other cases, for most websites, I would like to never discard the contents.

In my ideal world, tabs would get frozen and saved to disk permanently, rather than assuming discarded tabs can always be reloaded. As if the websites were going to exist forever, and discarding a tab were just like cleaning a cache.
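One piece of this is at least recoverable today: the window-and-tab layout can be snapshotted to disk so a forced kill doesn't lose the merged order. A sketch, assuming the tab objects come from `browser.tabs.query({})` (the serialization format and function name are my own):

```javascript
// Serialize the tab layout (window -> URLs in on-screen order) to JSON,
// so it can be restored after a forced kill or reboot.
function snapshotTabs(tabs) {
  const byWindow = {};
  for (const t of tabs) {
    (byWindow[t.windowId] ??= []).push({ index: t.index, url: t.url });
  }
  // sort each window's tabs by their position in the tab strip
  for (const w of Object.values(byWindow)) w.sort((a, b) => a.index - b.index);
  return JSON.stringify(byWindow, null, 2);
}

// In an extension, the JSON could then be written out via
// browser.downloads.download() on a timer, e.g. every few minutes.
```

This only preserves URLs and order, not page state, but it is enough to avoid the "20+ windows reappear" problem after a kill.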

[–] interdimensionalmeme@lemmy.ml 1 points 5 months ago (2 children)

Thanks, I didn't know that one.

I have been experimenting with a transparent proxy like Squid, or something like ArchiveBox, to create static pages on the fly and load those.

But so far I haven't made something seamless and pleasant to use. It would have to be at least as low-friction as using Google.

I am going to try using Mixtral 8x7b to perform natural-language search over my archives and pull tabs from the collection of all pages I have ever seen. But that's still a long way from being operational!

[–] optissima@possumpat.io 3 points 5 months ago (1 children)

...has Google still been giving you good results recently? That seems like an extremely weak link in your setup to me. You'd be better off looking at a locally run search engine like peARs, or something similar with locally downloaded and indexed files, if you insist on using search — and it'll be waaaay more reliable than an LLM here.

[–] interdimensionalmeme@lemmy.ml 2 points 5 months ago

Google is giving me increasingly poor results, so I am looking into deploying SearXNG locally.

I really would like to operate my own local crawler and sorting algorithm.

I will check out the peARs you mentioned!

[–] throws_lemy@lemmy.nz 1 points 5 months ago

If you need an offline version of the sites, you can save them with SingleFileZ.