InnerScientist

joined 1 year ago
[–] InnerScientist@lemmy.world 5 points 5 days ago (1 children)

Hasn't ended yet, as soon as we reach 75% the simulation will end.

[–] InnerScientist@lemmy.world 42 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

....what are your expectations for GTA 6? Mine are pretty low, considering shark cards and enshittification.

[–] InnerScientist@lemmy.world 13 points 2 weeks ago* (last edited 2 weeks ago)

At least Reddit can be searched by Google or on-site^1,2,3^

1 terms and conditions do apply
2 you need an account for some subreddits
3 you also need the app for some subreddits

[–] InnerScientist@lemmy.world 7 points 3 weeks ago

Define "sandboxed"

Application can only access a limited part of the system? = use Flatpak, or build a container/VM image using nixpkgs.

Application can be uninstalled completely and has separate libraries? = I prefer Nix.
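For the first case, a minimal Flatpak sketch (the app ID `org.example.App` is a placeholder):

```shell
# Flatpak apps run sandboxed by default; overrides tighten (or loosen) access per app.
flatpak override --user --nofilesystem=home org.example.App   # deny access to $HOME
flatpak override --user --unshare=network org.example.App     # cut network access
flatpak override --user --show org.example.App                # inspect the result
```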

[–] InnerScientist@lemmy.world 3 points 4 weeks ago

Especially since they don't talk about how they secure the local data.

They don't because they don't

> All the data you import is indexed in a SQLite database and stored on disk organized by date, without obfuscation or anything complicated.

Probably because this is still in early alpha and "the schema is still changing".

[–] InnerScientist@lemmy.world 1 point 1 month ago

How does mergerfs compare to btrfs and bcachefs when using multiple partitions?

[–] InnerScientist@lemmy.world 4 points 1 month ago

Drives connected over USB have an unstable connection in my experience; this is very annoying and gets worse with hubs.


RAID reduces the time a system is offline and reduces data loss when a drive fails. If you can afford to wait for a new disk and for the backup to restore, and you have regular backups that ensure no important data gets lost (though remember: data added between backups may be lost), then you don't need RAID.

I don't use RAID because if my disk fails I can stomach the 2-4 days it takes to buy a new one and restore the backup.

Very important: use S.M.A.R.T. and a filesystem with checksums, so you're not backing up corrupted data and you know when to get a new drive.


For encryption at rest you may want to look at Clevis and Tang, though you need a server in your home network for this to work. The client (running Clevis) decrypts the disk at boot if it can reach the server (running Tang). The server can't decrypt the data without the client's secret, and the client can't decrypt it without the server's key.

I don't know what your server could be, though; maybe a router with custom firmware?
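A minimal sketch of that setup, assuming a Tang server reachable at `tang.local` and a LUKS volume on `/dev/sda2` (both placeholders):

```shell
# On the server: Tang just serves its keys over HTTP.
sudo systemctl enable --now tangd.socket

# On the client: bind the LUKS volume to the Tang server.
sudo clevis luks bind -d /dev/sda2 tang '{"url": "http://tang.local"}'

# Unlock automatically at boot (systemd path unit; exact name varies by distro).
sudo systemctl enable clevis-luks-askpass.path
```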


You should also look into cloud storage/rclone; that way you can automate your backups further and reduce the need for manual intervention.

I use rclone and restic to automatically back up my servers daily, which takes only a few seconds most of the time because the backups are incremental.
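A minimal daily job along those lines (repository name, paths, and password file location are assumptions):

```shell
#!/bin/sh
set -eu
# restic can talk to any configured rclone remote via its rclone backend.
export RESTIC_REPOSITORY="rclone:remote:backups"
export RESTIC_PASSWORD_FILE="/etc/restic/password"

restic backup /etc /home --tag daily                   # incremental after the first run
restic forget --keep-daily 7 --keep-weekly 4 --prune   # thin out old snapshots
```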

[–] InnerScientist@lemmy.world 2 points 1 month ago (1 children)

Something I don't get: why try to make all browsers look the same when you could do the easier thing and give each browser session a new fingerprint?

A unique fingerprint doesn't matter much if it's only valid until I close the website, right? So why not change a lot of variables by some small amount to make the data useless?

[–] InnerScientist@lemmy.world 2 points 1 month ago

As long as you only copy off the disk, you can just reboot: the whole system in RAM vanishes and the normal system boots again for a second try.

[–] InnerScientist@lemmy.world 6 points 1 month ago

FYI you can use kexec and a prepared initrd to do something similar with only one command.
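A sketch of that one-command jump, with placeholder kernel/initrd paths and a dracut-style kernel cmdline (the `rd.live.ram` option and the copy-to-RAM logic inside the initrd are assumptions):

```shell
# Stage the rescue kernel and the prepared initrd.
sudo kexec -l /boot/vmlinuz-rescue \
    --initrd=/boot/initrd-rescue.img \
    --command-line="console=tty0 rd.live.ram=1"

# Jump into it directly, without going through firmware or the boot loader.
sudo systemctl kexec
```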

[–] InnerScientist@lemmy.world 20 points 2 months ago

Or encrypt it before uploading.
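For example, a client-side encryption pass before the upload (the remote name `remote:` is a placeholder; gpg prompts for a passphrase):

```shell
# Encrypt symmetrically so only the ciphertext ever leaves the machine.
tar czf - ~/documents | gpg --symmetric --cipher-algo AES256 -o backup.tar.gz.gpg
rclone copy backup.tar.gz.gpg remote:backups
```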

[–] InnerScientist@lemmy.world 20 points 2 months ago (1 children)

Would this even cause a kernel panic? I think it just causes a userland "panic".
