this post was submitted on 09 Apr 2024
Linux
Why does software on Linux depend on a particular version of a library? Why not just say it depends on that library, regardless of version? It becomes a pain when you're using ancient software that requires an old version of a library, and you have to create symlinks for every library to match the old names.
I know that sometimes a newer version of a library is not compatible with the software, but still. And what can we do to fix this problem as software developers? Or as end users?
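The symlink hack I mean looks something like this (libfoo and the /tmp paths are just placeholders for illustration):

```shell
# An old binary asks the loader for libfoo.so.1, but the distro only
# ships libfoo.so.2 (both names are hypothetical).
mkdir -p /tmp/compat-libs
touch /tmp/compat-libs/libfoo.so.2          # stand-in for the distro's current lib
# Create the compatibility symlink the old binary is looking for:
ln -sf libfoo.so.2 /tmp/compat-libs/libfoo.so.1
ls -l /tmp/compat-libs
# The old program can then be pointed at it, e.g.:
#   LD_LIBRARY_PATH=/tmp/compat-libs ./ancient-app
```

Of course this only papers over the name; if the newer library really changed its ABI, the old program can still crash at runtime.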
Software changes. Version 0.5 will not have the same features as Version 0.9 most of the time. Features get added over time, features get removed over time and the interface of a library might change over time too.
As a software dev, the only thing you can do is keep the same API forever, but that is not always feasible.
Hey, thanks! I have one more question: is it possible to ship all the required libraries with the software?
It is, that's what Windows does. It's also possible to compile programs to not need external libraries and instead embed all they need. But both of these are bad ideas.
Imagine you install Dolphin (the KDE file manager); it will need lots of KDE libraries. Then you install Okular (the KDE PDF reader); it will require many of the same libraries. Extend that to the hundreds of programs installed on your computer, and you'll easily double the space used, with no particular benefit, since the package manager already takes care of updating the programs and libraries together. Not just that: if every program came with its own libraries and a bug/security flaw was found in one of them, each program would need to ship an update, and if one didn't, you might be susceptible to bugs/attacks through that program.
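You can actually see that sharing with ldd: two unrelated programs resolve to the exact same library file on disk (ls and cat are just convenient examples; assumes a typical glibc-based distro):

```shell
# Both binaries link against the same single copy of the C library:
ldd /bin/ls  | grep libc
ldd /bin/cat | grep libc
# One file on disk, mapped into memory once and shared by every process,
# and upgraded for everyone at once by the package manager.
```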
Thank you so much for the explanation.
That is possible indeed! For more context, you can look up "static linking vs dynamic linking"
Tl;dr:
- Static linking: all dependencies get baked into the final binary.
- Dynamic linking: the binary's dependencies are looked up in the system's library search paths and loaded dynamically at runtime.
Thanks
Absolutely! That's called static linking, as in the library is included in the executable. Most Rust programs are compiled that way.
Yeah, that's why I'm learning Rust, but I didn't know it was called static linking - I thought it was just how Rust works LMAO. Thanks again!
No problem. Good luck with your rust journey, it's imo the best programming language.
Doesn't that mean that you have a lot of duplicate libraries when using Rust programs, even ones with the same version? That seems very inefficient
It's true that binaries get inflated as a result, but with today's hard drives it's not really a problem.
In addition to static linking, you can also load bundled dynamic libraries via RPATH, which is a section in an ELF binary where you can specify a custom library location. Assuming you're using gcc, you could set the LD_RUN_PATH environment variable to specify the folder containing your libraries. There may be a similar option for other compilers too, because in the end they'd be spitting out an ELF, and RPATH is part of the ELF spec.
BUT I agree with what @Nibodhika@lemmy.world wrote - this is generally a bad idea. In addition to what they stated, a big issue could be licensing: the license of your app may not be compatible with the license of the library. For instance, if the library is licensed under the GPL, then you have to ship your app under the GPL as well - which you may or may not want. And if you're using several different libraries, you'll have to verify each of their licenses and ensure you're not violating or conflicting with any of them.
Another issue is that the libraries you ship with your program may not be optimal for the user's device or use case. For instance, a user may prefer libraries compiled for their particular CPU's microarchitecture for best performance, and by forcing your own libraries, you'd be denying them that. That's why it's best left to the distro/user.
In saying that, you could ship your app as a Flatpak - that way you don't have to worry about the versions of libraries on the user's system or causing conflicts.
Thanks for letting me know about the licensing issue.
AppImage might also be a way.
To add some nuance, all features in v0.5.0 should still exist in v0.9.0 in the modern software landscape.
If v0.5.0 has features A, B, and C, and one of them is later changed, then under semantic versioning (which most software follows these days) that counts as a breaking change, and the version would therefore be promoted to v1.0.0.
If A, B, and C gained a new feature D but didn't themselves change, it would have been v0.6.0 instead. This system, when stuck to, helps immensely when upgrading packages.
When having a breaking change pre 1.0.0, I'd expect a minor version bump instead, as 1.0.0 signals that the project is stable or at least finished enough for use.
Because it's not guaranteed that it'll work. FOSS projects don't run under strict managerial definitions where they have to maintain compatibility in all their APIs etc. They are developed freely. As such, you can't really rely on full compatibility.
That's the same on ANY platform, but Windows is far worse because most apps ship a DLL and -never- update the damn thing. With Linux, it's a little bit more transparent. (edit: unless you do the stupid shit and link statically, but then again, in the brave new world of Rust and Go, having 500 MB binaries for a 5 KB program is apparently acceptable)
Also, applications use the API/ABI of a particular library. Now, if the developers of said library actually change something in the library's behavior with an update, your app won't work anymore unless you go and update your own code, finding and fixing everything that broke.
So as you can understand, this is a maintenance burden. A lot of apps delegate this to a later time, or - something that happens sometimes with FOSS - the app goes somewhat unmaintained, or in some cases the app customizes the library so much that you just can't update it anymore. So you pin a particular version of the library.
You sometimes can build software that will work with more than one version of a C library, but less and less software is being written that binds only to C libraries. The key topic you want to look up is probably "ABI stability".
IMHO the answer is social, not technical:
Backwards compatibility/legacy code is not fun, and so unless you throw a lot of money at the problem (RHEL), people don't do it in their free time.
The best way to distribute a desktop app on Linux is to make it Win32 (and run it with WINE) ... :-P (Perhaps Flatpak will change this.)