Massive amounts of telemetry data, plus nearly every app these days being a web app, just chew through your hardware. We use Teams at work and it's god awful. Hell, even Steam is a problem: just having your friends list open can cost you FPS in some games.
"Web app" gets thrown around disparagingly a lot, but they can be done well. I use a couple of PWAs: one for generating flight plans for simming (SimBrief), and the one I'm writing this on right now, wefwef. I think they're fine in the right circumstances, and it's harder for them to collect telemetry than it is for a native app.
I think PWAs use your already-installed browser, whereas apps like Teams use Electron, which bundles its own browser, and a lot of people see that as wasteful.
Given how prevalent web technologies are, I am honestly surprised there isn't a push towards having one common Electron installation per version and having apps share that. Each app bundling its own Electron is just silly.
Teams eats up my macOS memory something awful while just sitting there... and nobody says anything.
wtf I hate it
As someone who has worked in IT for over a decade: yes. We keep making things incredibly fast, and for complex operations that speed gain is realized, but for diverse, simple tasks there's ultimately very little difference between something rather old and something rather new. The most significant uplift in real-world performance has been the SSD. Simply eliminating, or nearly eliminating, the seek-time delay of spinning disks is by far the best thing that's happened for performance. Newer OSes and newer hardware go hand in hand, because with added hardware speed comes software complexity, which is why a late-stage Windows 7 system will typically outperform an early-stage Windows 10 machine. By "stage" I mean the point in the window of time when the OS is considered current: early-stage means it has only recently become the newest OS, late-stage means it's about to be overshadowed by something newer.
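The seek-time point is easy to make concrete with back-of-the-envelope arithmetic. The figures below are typical ballpark numbers, not measurements of any specific drive, and the file count is an assumed example:

```python
# Rough illustration of why eliminating seek time matters so much.
# Ballpark assumptions: a 7200 RPM HDD averages ~9 ms per random seek;
# a SATA SSD responds in ~0.1 ms (NVMe is lower still).
hdd_seek_ms = 9.0
ssd_seek_ms = 0.1
small_files = 1000  # e.g. DLLs/configs touched while launching an app

hdd_total_s = small_files * hdd_seek_ms / 1000
ssd_total_s = small_files * ssd_seek_ms / 1000

print(f"HDD: {hdd_total_s:.1f} s of pure seeking")
print(f"SSD: {ssd_total_s:.1f} s of pure seeking")
print(f"Speedup from seeks alone: {hdd_seek_ms / ssd_seek_ms:.0f}x")
```

With these assumed numbers, the seek overhead alone drops from about nine seconds to a tenth of a second, before any transfer-speed gains are counted.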
Microsoft made great performance gains with Windows over many years. Since migrating to all NT-kernel OSes around Windows XP, things got faster and faster, right up to around Windows 8. Windows 7 was the last version, IMO, designed to be faster than its predecessors, with more speed improvements than losses from the added complexity of the OS; from then on, we've been adding complexity (i.e., slowing things down) faster than we can optimize and speed things up. Vista was a huge leap forward in security, adding code signing, specifically for drivers and such; in that process, MS streamlined drivers to run in a more native way, and old-style kernel-mode drivers became more or less a thing of the past. This, however, caused a lot of issues, as XP-era drivers wouldn't work with Vista very well, if at all. Windows 7 further streamlined this, and as far as I know there have been minimal improvements since.
In all of these cases, being based on the XP codebase (derived from NT4), it is still functionally slower than 9x, since the 9x line was written in x86 assembly rather than C, which is what NT is written in, AFAIK. The migration to C brought two things with it. The first, and most pertinent, is slowdowns in the form of missed compiler optimizations. The second is portability: with the codebase in C, the platform can be recompiled fairly easily for different architectures. This was a long-term play by MS to ensure future compatibility with any architecture that may arise; all MS would need to make Windows work on whatever arch is "next" is a C compiler for it, then they could begin compiling and debugging the code. MS has produced several builds of Windows for ARM and for MIPS specifically, and could likely migrate to RISC-V anytime they want (if they haven't already). This was the most significant slowdown from 9x to XP.
As time went on, security features started being integrated into the OS at the kernel level: everything from driver and application signing to encryption (full-drive, aka BitLocker, and data-in-flight, aka AES or HTTPS) and more. The TPM requirement for Windows 11 is the next basic step in this march forward for security. They're going this way because they have to. To be considered a viable OS for high-security applications, like government use, they must have security features that restrict access and ensure the security of data both in flight and at rest (on disk), and the TPM is the next big step toward that. The random seed from the TPM is far superior to any pseudo-random software seed, and its secured vault ensures that only authorized access is permitted to the security keys it holds, for things like full-disk encryption. The entire industry has been moving in this direction just under the surface, and if you haven't had an eye out for it, it would be completely invisible to you. That describes most consumers, and especially gamers, who just want fast games and reliable access to their computers.
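The difference between a software PRNG and OS/hardware-backed entropy is visible even from userland. A minimal Python sketch (this doesn't talk to the TPM directly; it just contrasts a seeded PRNG, which is reproducible, with the OS entropy pool, which on modern machines is fed by hardware sources):

```python
import random
import secrets

# A seeded software PRNG is deterministic: same seed, same "random" stream.
a = random.Random(42).random()
b = random.Random(42).random()
print(a == b)  # True; fully predictable if the seed leaks

# The OS CSPRNG (secrets / os.urandom) draws on kernel entropy, which
# modern hardware feeds from sources such as the TPM or CPU RNG
# instructions; successive draws are not reproducible.
token1 = secrets.token_hex(16)
token2 = secrets.token_hex(16)
print(token1 == token2)  # False (overwhelmingly likely)
```

The practical upshot: keys derived from a predictable seed can be re-derived by an attacker, which is exactly what hardware entropy sources are meant to prevent.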
Speaking of consumers: at the same time, MS, like almost all software/web/whatever companies, has been moving toward making you, and specifically your data, into a product it can sell. This is the Google approach. As an entity, at least until fairly recently, Google didn't make any money from its users directly; instead it harvested their data, profiled all the users, and sold advertisements based on that information, and it was INCREDIBLY successful at it, making plenty enough to stay afloat. Google has recently gotten into hardware and "as a service" offerings, which has allowed it to grow. Facebook and Amazon have both done the same (among others, but there are too many to list), and many other companies, including MS, are wondering "why not us too," because they see dollar signs down the road as long as they can collect enough information about you to sell. So MS, in its unique position, can basically cram all the data-harvesting malware it wants down your throat, provided it never gets flagged as what it really is: MALWARE.
IMO, since Windows 7 they've been doing recon on all their users to obtain this information, which is part of the reason everyone is being forced into using Microsoft accounts for their PC logins on any non-Pro, non-Enterprise version of Windows: this way they can tie the data they're collecting to you specifically. As of Windows 11, this has ramped up significantly. More and more malware observes you and your behavior, basically building an advertising profile they can sell. They want more information all the time, and the process of collecting it and pushing it back to MS has become more and more invasive as time goes on; these processes take computing power away from you, the consumer, to serve MS's end goal of selling you, their paying customer, to their advertisers. They get paid on both sides (by you for their product, and by their advertisers for your information). The worst part is that they haven't really had any significant push-back on any of it.
If you go back to the Windows 9x or DOS/Windows 3.1 days, none of this was happening, so the performance you got was the performance the hardware could deliver. Now, all of your programs have to go through so many layers before they actually hit the hardware that it's slowed things down to the point of being DRAMATICALLY NOTICEABLE. So yeah, if you're doing something intensive, like running compression, encryption, a benchmark, or similar, you'll get very close to the real performance of the system; but if you're dynamically switching between apps, launching relatively small programs frequently, and generally multitasking, you're going to be hit hard by this. Not only does the OS need to index your actions to build your advertising profile, it also needs to run the antivirus to scan the files you're accessing to make sure nobody else's malware runs, and observe every action you take to report back to the overlords about what you're doing. In this always-on, always-connected world, you're paying for them to spy on you pretty much all the time. It's so DRAMATICALLY WORSE with Windows 11 that it's becoming apparent to everyone that this is happening; as someone who has watched all of this grow from the shadows of IT for a decade, I'm entirely unsurprised. Simply upgrading your computer to a newer OS makes it slower, always. I've never wondered why; I've always known. There are more moving parts being put in the way. It's not that the PC is slower, it's that it has SO MUCH MORE TO DO that it doesn't feel faster, and often it's noticeably slowed down by those processes.
Without jumping ship to Linux or some other FOSS, you're basically SOL. Your phone is spying on you (whether Android or iOS), your PCs are spying on you (whether Chromebook, Windows, or Mac), your "smart" home everything is spying on you, whether you have Amazon Alexa, Google Home, or Apple's equivalent... Now even your car is starting to spy on you. Regardless of what it is, if it's more complex than a toaster, it's probably reporting your information to someone. There are few if any software companies not doing this. Your choice then becomes a pick among equally bad options of who collects your data to sell to whomever wants it, or you go full tinfoil-hat, start expunging everything from your life with a circuit more complicated than a 1980s fridge, and go live in the forest. I'm doomed to sell my data to someone; so far it's mainly been MS and Google. My line of work doesn't really allow me to go "off-grid" and survive in my field; not everyone is in my position. So make your choice. This isn't going to get better anytime soon, and as far as I can see, it will never stop... so choose.
2 decades of IT here. I'm comfortable on Linux but still irritated that there's no phone that won't spy on me. I'm not irritated enough to go through the effort of jailbreaking my Android, though. I own zero smart home products. Firefox and uBlock Origin seem to block 99% of ads.
Not only are we sold as advertising profiles, but we are also getting profiled by our governments. The Chinese identification system is extremely advanced. Your favorite democracy is not far behind. The UK fines you automatically if you drive your car to the wrong place, etc.
Glad I'm going to be old and dead before the worst of it. Third world countries are still relatively untouched.
As much as I don't appreciate the state in which they left the world for the rest of us, the boomer generation really did live in a golden era: a technological boom where technology was improving lives faster than it was doing anything else. The advent of modern refrigeration techniques, consumer vehicles, automatic telephone switching systems... even mobile communications, cellphones, and the internet; almost all of it arrived before privacy became the much larger issue it is now. On top of that, they made more relative to the cost of goods, they had fair wages (thanks to unions), and working conditions, especially safety, were on a steady uphill climb throughout their lifetime. On top of all that, their major assets, like homes, appreciated consistently in value over the years. A $60k home from like 1950, depending on location, now sells for 4x or even 10-20x the price today.
Millennials and Gen Z/Zoomers may live solidly in the information age, with access to a vast wealth of knowledge almost all the time, constantly connected and constantly in touch, but it has come at a cost. We do not enjoy the same privacy and freedom our parents did. Everything is posted online, whether we want it to be or not; private conversations get recorded and analysed automatically by our smart assistants, either from our phones or the various Google Home/Amazon Alexa/whatever devices that litter our households; and at all times we're being monitored in some way, shape, or form. What's happening now is that companies, governments, and the ruling class are starting to use that information against us, driving everyone further into disorder, futile infighting, and dissension that only distracts us from their main plot of essentially robbing us of every dime, nickel, and dollar they can. More for them, less for everyone else. But everyone is so blinded by idiotic notions like race and gender politics, treating each other like hot garbage because they're different, ancient racism and anti-gay propaganda that we fight for or against very passionately, that they can slowly raise prices, lower wages, reduce how much your hard-earned dollar buys, steal our land, force us to rent from them, never own anything, provide everything as a service... all while trying to convince us it's in our best interest. The sad thing is, for most, it's working. The whole "you can trust us" mentality they've pushed on us for decades is starting to crumble, and people are starting to ask WHY we should trust them. Then it's just jazz hands, look at this gay person, isn't he such an evil sinner? Or the alternative: jazz hands, look at these bigots oppressing these gays, aren't they evil? (Which can be substituted with people of color, or people of alternative lifestyles, or hell, even the Amish... IDK, they're not consumers like the rest of us, let's go after them, why not?)
And we keep falling for it, every single fucking time. We're ALL being oppressed. No matter what your personal beliefs are, WE'RE ALL IN THIS. Unless you're part of the 1%, or more accurately the 0.1% of ultra-rich fuckwads, you're a target. You may be privileged, or well-off from family wealth or whatever, but you're still their target. Any wealth you have, they want. Their only real interest is in taking it from you, while giving you just enough distraction to miss the fact that it's happening, so it can continue.
I love this, but my Lemmy client really needs to shorten long messages and add an (expand) button if we're going to have huge character limits like this; scrolling past this took an hour.
Amazing explanation throughout and super insightful. I don't believe Lemmy has gilding type emphasis outside of the upvotes, but you deserve it.
The real sad part for me is the amount of e-waste this produces, especially in devices like laptops.
A clean Linux distro can extend a laptop's life by a decade. I have a laptop from the Core 2 Duo era that I threw an SSD in and put Linux on. Perfectly serviceable as a basic machine.
It's shocking how much faster Linux runs compared to a modern windows installation. I do worry however that as more and more programmers focus on web apps that we will eventually see the same problem on Linux as well. Developing desktop applications for Linux is already a pain and the ease of making modern web apps will amplify the problem. At least Linux won't have all of the awful bloat that Microsoft runs in the background on windows these days, but I don't think we will be able to escape from web app hell on Linux.
I do worry however that as more and more programmers focus on web apps that we will eventually see the same problem on Linux as well.
I wouldn't worry about it. If it becomes a problem in Linux, there would have already been an implosion in Windows land and I am pretty sure people will start to notice and come up with better solutions. I honestly don't understand why web apps have to bundle their own Electron builds instead of just sharing a common installation of specific versions of Electron.
I've got a decade-old laptop running Linux that's still kicking; it outlived my MacBook. Surprisingly usable with only 4 GB of RAM, thanks to Linux and an SSD upgrade. Without those, I would have tossed it long ago.
Hasn’t this always been the case? Software development is a balance between efficiency of code execution and efficiency of code creation. 20 years ago people had to code directly in assembly to make games like RollerCoaster Tycoon, but today they can use C++ (or even more abstract systems like Unity).
We hit the point where hardware is fast enough for most users about 15 years ago, and ever since we’ve been using faster hardware to allow for lazier code creation (which is good, since it means we get more software per man-hour worked)
which is good, since it means we get more software per man-hour worked
In the same way that more slop is good for the hog trough
Human development is the development of labor-saving practices (i.e., development tools and methods) that liberate humans and labor to do other things. In this case, "good software" just means it's efficient enough to run on the target system, do its job, and not slow the whole system down unjustifiably. Why on earth would anybody go full performance-optimization autism mode, spending hours grinding fractions of efficiency out of code, when nobody could even notice the difference between it and less-optimized code running on the target system? That time could go toward something actually productive for the project, like a new feature, or toward something else entirely. Those earlier game and software devs would have killed for hardware that didn't require everything to be custom-built and optimized to a T. Not having to optimize everything to the max doesn't produce "slop"; it produces efficiency.
Your examples are honestly terrible. C++ is a fast language, and it's not easy to write fast x86 assembly, especially assembly faster than what the C++ compiler would spit out by itself. C++ doesn't cause a slowdown on its own.
20 years ago people could code in Python and JavaScript, or just about any high-level language popular today. Most programming languages are fairly old, some were definitely used for game development in the past (like C++), and game engines date back way before 2003, or 1999, when RollerCoaster Tycoon was released.
RCT is an anomaly, not the rule. People who didn't need assembly wouldn't program in it, unless they were crazy and wanted a challenge. You missed the mark by about a decade or so, and even then we're talking about consoles with extremely limited resources like the NES, not PC games like DOOM (1993), which was written in C.
This is why I hate Electron apps on Linux and Windows alike. Sadly, most apps are going Electron, especially popular commercial apps. 😮💨
100%. An example I noticed was Balena Etcher vs. Rufus: they have about the same functionality, but Etcher is 151 MB thanks to Electron while Rufus is 1.1 MB.
I'm extremely thankful to dd
and gparted
Isn't Flutter mostly native? AFAIK most newly created apps are using Flutter, which does not use Electron in its native versions, right?
'Twas always thus: software development is gaseous, in that it expands to fill whatever space it's placed inside. This is both by the nature of software engineering taking the quickest route to solving any problem, and by design, through collusion between operating system makers (read: Microsoft and Apple) and the hardware platform manufacturers they support and promote. It's been happening since the dawn of personal computer systems, where leapfrogging processors, RAM, hard drives, buses, and networks eventually sees hitherto improbably extravagant specs bogged down to uselessness. It's the bane, and the very nature, of the computing ecosphere itself.
"We don't need to optimize because modern hardware!"
-Lazy developers kicking Moore's Law square in the junk
Can confirm 90 percent of modern software is dogshit. Thanks electron for making it worse.
JavaScript was a mistake.
I hadn't considered the latency of abstraction due to non-native development. I just assumed modern apps are loaded with bloatware, made more sophisticated by design, and perhaps less elegantly programmed on average.
My laptop was running slow until I blocked Windows 10 from phoning home 3,000 times per day. Looking at you, browser.pipe.aria.microsoft.com.
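A common way to do this kind of blocking is via the hosts file. A minimal sketch that just builds the entries as text; the first domain is the one named above, the second is an illustrative placeholder (not a vetted blocklist), and actually writing to the real hosts file needs admin rights, so it's deliberately left out:

```python
# Build hosts-file lines that sink telemetry domains to the null address.
# browser.pipe.aria.microsoft.com comes from the comment above; the
# second entry is a placeholder, not a real telemetry host.
telemetry_domains = [
    "browser.pipe.aria.microsoft.com",
    "telemetry.example.invalid",  # placeholder
]

def hosts_entries(domains, sink="0.0.0.0"):
    """Return hosts-file lines mapping each domain to the sink address."""
    return [f"{sink} {d}" for d in domains]

for line in hosts_entries(telemetry_domains):
    print(line)
# On Windows these lines would go in C:\Windows\System32\drivers\etc\hosts;
# on Linux/macOS, in /etc/hosts.
```

Mapping a domain to 0.0.0.0 makes lookups resolve to an unroutable address, so the phoning-home connection fails immediately instead of reaching the server.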
Goddammit. I was watching this going "hey, my system is like that!" Checked, and yes: my Ryzen 5900X (12 cores, 24 threads) with 32 GB of RAM and an NVMe drive is painfully slow opening things like the calculator, terminal, etc. I'm running Fedora 38 with the KDE desktop... What the hell, man.
Single-thread performance matters more when your metric is the speed of opening the calculator, terminal, and other apps. However, the 5900X already has pretty good single-thread performance, roughly 1.5x faster than my old processor (i7-4790), and opening apps is pretty fast in my case (<1 second). Something is probably wrong with your setup. Perhaps you accidentally set your desktop power mode to "power saver" instead of "performance"?
It's not just applications. I recently "upgraded" two of my PCs from Windows 8.1 to Windows 10, and ever since, having the mouse polling rate above about 125 Hz and moving the cursor would cause frame drops in games.
This happened across two machines with different hardware, the only common denominator being the switch in Windows version. I tried a bunch of troubleshooting until, some time later, the RAM on one of the machines became faulty and I upgraded the CPU and RAM. That finally resolved the issue.
So yeah, having to upgrade your hardware not because it's showing its age but rather because the software running on it has become more inefficient is a real problem IMO.
Rabbits
Isn't this equivalent to https://en.wikipedia.org/wiki/Braess%27s_paradox
and
Went down a bit of a rabbit hole here. Very interesting. Would love to see a Vsauce style video going into various real life examples of this
Thanks for sharing.
Cell phone batteries. Phones used to have shit batteries, but they would last 3 days because the software running on them was minimal and the hardware was low-power. Batteries have become way better in the last 20 years, but phones don't even last a day because of all the hardware and software running super demanding, inefficient programs.
The biggest contributor is actually screen size. Your phone screen is about 8x larger and 16x higher resolution than an old dumb phone.
Part of that is just us using our phones more. If I used my smartphone the way I would use an old flip phone, the battery would probably last a few days
A tale as old as computers. Here's a very good talk about it: https://youtu.be/kZRE7HIO3vk
Stop buying games that need 220gb of drive space, an Nvidia gtx 690000 and a 7263641677 core processor then. More than a 60gb download size means I pirate it unless it's a really really damn good game. Games with no drm that can be run without a $20k computer, I buy.
Games are far from the worst examples of this; largely, games are still very high-performance. Some lax policies on download sizes aside, most game data is large simply because it's high-detail assets.
The real losses are simple desktop apps being entire web engines.
So what I want to know is why do we still have programs that run on a single core when nearly every Windows PC out there is running a multi-core processor?
What are we missing to have the OS adapt any program to take advantage of the hardware?
You can't adapt that automatically. Thread-safe programming takes extra work (or particular languages or frameworks that can optimize for it), and even then, not all types of tasks make sense to multithread.
The OS still does take advantage of multiple cores, though. You never have just one process running. The OS will schedule different processes on different cores.
To run something on multiple cores, you need to identify a bunch of different tasks it's doing that don't depend on one another; then you can execute each task in its own thread. The problem is that most often these independent tasks don't exist, or, if they do, figuring them out automatically from the code is likely equivalent to solving the halting problem; that is, it's undecidable, and there can't exist a program that does this in general.
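The independent-vs-dependent distinction can be sketched in a few lines of Python. The function names and numbers are illustrative; note that in CPython, CPU-bound work would want ProcessPoolExecutor because of the GIL, but ThreadPoolExecutor keeps the demo simple:

```python
from concurrent.futures import ThreadPoolExecutor

def expensive(x):
    # Stand-in for an independent unit of work.
    return x * x

# Independent tasks: safe to run concurrently; map keeps input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    squares = list(pool.map(expensive, range(8)))
print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49]

# Dependent tasks: each step needs the previous result, so extra cores
# don't help. This is the kind of chain no OS can auto-parallelize.
value = 2
for _ in range(5):
    value = expensive(value)  # 2 -> 4 -> 16 -> 256 -> 65536 -> 4294967296
print(value)
```

The first loop parallelizes trivially because no call depends on another; the second is a strict dependency chain, so it runs one step at a time no matter how many cores the machine has.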
Time to uninstall Windows 11 and go back to 3.11! Sure, it won't run anything made since the mid 90s at best, but what it does run will surely be lightning fast!!
A lightweight Linux distro can get you the same results with current software. Hell, even Ubuntu will. The deterrent has always been that you have to tinker with it to get it working right, but that's a lot less true now than it was in the past. I recently installed Ubuntu 22.04 on my wife's old iMac and it's lightning fast; it worked straight out of the box with no tinkering whatsoever. It's about 20 times faster than it was running macOS.
For applications developed natively, the response times would be expected to be quite good, but fewer applications are developed natively now including things that might seem like they otherwise would be. Notepad, for example, is now based on UWP.
UWP creates native Apps though?
UWP apps tend to have a lot of overhead compared to regular Win32 apps in my experience on low end machines.