Linux packaging headaches and how to solve them?
#1
Linux sucks in the area of packaging standards because it doesn't have any, and that's critical for end users. PCSX2 has a megaton of library dependencies. The Linux binaries of PCSX2 are much smaller than the Windows binaries because the Windows ones either bundle those dependencies or rely on libraries that ship with every version of Windows.

My question: Can't the Linux binaries be packaged with the depended upon libraries so that users don't have to go scouring the net and dealing with all kinds of crazy issues just to run PCSX2?
OR
Can't there be some more focus on standardization, for instance what about using a cross-distro Linux package format like Zero Install?
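For example - just a sketch, and the feed URI here is hypothetical - this is what installing and running a program through Zero Install is supposed to look like, identically on every distro:
Code:
# one command, any distro: 0launch fetches the feed, resolves the
# declared dependencies, downloads what's missing, and runs the program
0launch http://example.com/pcsx2.xml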

Just some ideas. :lol:

Thanks!

#2
Out of curiosity: what is your distro?
#3
It shouldn't matter; that's the point... I use all sorts of distros, though. All Linux software should be easily installable no matter what you currently have installed. Anything less isn't software freedom.
#4
I'm no expert, but I can think of a few objections:

You'd end up with some very large packages if you included all the dependencies in each one, and it would be a very inefficient process: you'd end up duplicating dozens of already-installed components. That would reduce the portability of Linux as an operating system, and it would slow everything down if several different applications were simultaneously running several different versions of the same library.

I also think it would become a very arduous process to build anything for Linux, as you'd need to consider all the supporting packages and build them specific to your needs.

You say you use 'all sorts of distros' but this is really the root of the problem. If you want standard packaging you'd be better off sticking with one distribution (I'd vote Debian as it's the biggest) and one type of package. The 'software freedom' you (and most Linux users) value so highly means that each distribution is built with certain features and user groups in mind; this leads to differences in how software packages are created, which leads to incompatible dependencies (for instance, the different features wxWidgets is compiled with on Debian versus Mandriva or Fedora).

Naturally none of these problems are insurmountable, and it's great you start off your comment by pointing to Zero Install, which looks like an interesting project. Someone with more knowledge than I have might let us know why a Zero Install build of PCSX2 is or isn't possible - or maybe you could build one yourself? (I'm serious :))
#5
IMHO, the issue with PCSX2 is running 32-bit binaries on a 64-bit system. Otherwise it is not difficult to build. It could have more automation to install dependencies automatically (mostly what 0install does), but it is not too difficult. The build process is standard.

Nowadays we have Debian/Ubuntu, Arch Linux and Fedora packages. Other distributions could base their work on those three examples. For me it is not an issue, but improvements are welcome.

Note: there is a package standard, which is RPM (the LSB standard), but more people love deb.
#6
(08-24-2011, 09:50 AM)Gumbo Wrote: I'm no expert, but I can think of a few objections:

You'd end up with some very large packages if you included all the dependencies in each one, and it would be a very inefficient process: you'd end up duplicating dozens of already-installed components. That would reduce the portability of Linux as an operating system, and it would slow everything down if several different applications were simultaneously running several different versions of the same library.

This is both true and untrue. Let's say you were using one package manager, which is the way it should be, and that this package manager did things right: if a particular piece of software needed specific versions of certain libraries, those would be included in that package or, ideally, only downloaded if needed (you didn't have them yet) or wanted (you wanted to group everything needed for some software into one big package for archival or software-sharing purposes).

So let's say you did end up with two versions of the same library because each was needed by a different program. First off: if this is necessary, then so freaking be it. Users would rather have working software than worry about "wasting" (it's not wasted) some disk space, or about loading two different libraries into RAM at the same time because they were using two apps that needed different ones. This is a good thing for multiple reasons. The first I just mentioned: both programs will run. Secondly, it puts pressure on library maintainers to make their libraries more standardized and compatible, and on developers to choose good, standardized, reliable libraries.

If you are a user wanting to run bleeding-edge software - software so far ahead that it uses libraries with a new ABI - then you need to be prepared to either run the new libraries for that new software and keep the old libraries for the older software you're using, or be willing to upgrade your older programs to versions using the newer libraries as well, or compile from source. All those options, both easy (if very slightly "inefficient" in theory because you used a bit more hard drive space, oh noes) and difficult (compiling), should always be available to end users. Users run alpha and beta software on Windows too, after all, and sometimes this is exactly what they have to do: sometimes they have to install Microsoft's C++ Library of Doom alpha 1262 alongside their current stable libraries just because they want to beta test some game.
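To make that concrete - a minimal sketch with made-up paths and version numbers - keeping two versions of one library side by side is already mechanically possible today; a smart package manager would just automate it:
Code:
# each app is launched against the library version it was built for
LD_LIBRARY_PATH=/opt/libs/libfoo-1.5 ./old-app
LD_LIBRARY_PATH=/opt/libs/libfoo-1.6 ./new-app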

Either way, it needs to be a choice. Right now, Linux users have no choice because there are no good, widely used standards allowing for it.

(08-24-2011, 09:50 AM)Gumbo Wrote: I also think it would become a very arduous process to build anything for Linux, as you'd need to consider all the supporting packages and build them specific to your needs.

Not sure what you're talking about here. Build anything? Having a standardized package management system would push developers to define clear standards, and it would let you easily create any kind of Linux software bundle ("distro") you want. For instance, with a good package management system you should be able to easily install any desktop environment or any version of Xorg that you want. You shouldn't be waiting on distro ASDF to release their new version in order to get it; you should have true freedom to download and install anything you want no matter your existing software. And if something needs a choice at install time - say you installed two different versions of Xorg - it should ask you which one to boot into by default (like Debian/Ubuntu do when installing certain things), or the options should be available in the login manager where appropriate.
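Debian's alternatives system already gives a small taste of that kind of prompt; a sketch (the alternative's name can vary by system):
Code:
# list the installed X sessions and pick the default (Debian/Ubuntu)
sudo update-alternatives --config x-session-manager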

Distro companies have Linux users in a vice. They want to be in control, and that means having users build packages only for their OS - packages which aren't compatible with other OSes. Well, they can #!% off. That's not real freedom, and no Linux user who gives a crap about real freedom should stand for that system. Having a system like today's, where the answer is "oh, just go compile everything from source", is telling every normal user out there - 99.9999% of the population of Earth - "Sorry, Linux isn't about freedom; we didn't think about implementing super important software standards. Go $#@! yourself, and go back to Windows!"

Wonderful definition of freedom there.

(08-24-2011, 09:50 AM)Gumbo Wrote: You say you use 'all sorts of distros' but this is really the root of the problem. If you want standard packaging you'd be better off sticking with one distribution (I'd vote Debian as it's the biggest) and one type of package. The 'software freedom' you (and most Linux users) value so highly means that each distribution is built with certain features and user groups in mind; this leads to differences in how software packages are created, which leads to incompatible dependencies (for instance, the different features wxWidgets is compiled with on Debian versus Mandriva or Fedora).

Naturally none of these problems are insurmountable, and it's great you start off your comment by pointing to Zero Install, which looks like an interesting project. Someone with more knowledge than I have might let us know why a Zero Install build of PCSX2 is or isn't possible - or maybe you could build one yourself? (I'm serious :))

The solution isn't "shut up and stick to the biggest distro", because that just continues the problem. The solution is standardized packages and the system I'm proposing. Zero Install can be layered on top of existing distros, operating alongside the existing package managers; the fact that you can install it on any distro makes it one of the first cross-distro packaging solutions (and no, autopackage and the others don't count, because they don't provide any kind of "manager" for things like upgrading or uninstalling the programs they've installed). One problem is that we need a distro based solely on this package manager for both system-wide root apps and userland apps - or, quite simply, we need all critical system apps to be packaged up with Zero Install.

Now, about your comment on wxWidgets being compiled differently. To save users from recompiling, I can think of two solutions: give the different binaries distinct version/revision names reflecting the features they include, or start putting the switches at the binary level as well - basically modules/plug-ins. Many programs already have them, so there's no reason not to expand on that. You want wxWidgets with GTK support? Include the dependency on wxGTK, then. You need the Qt module? Note the dependency for that too, so it gets installed automatically when the user installs your package. Where this can't be done, you either help get modules implemented, move to other libraries, or keep multiple versions of the library.
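wxWidgets itself already hints at this with its wx-config script, which can select among several installed builds; a sketch (flags as I remember them from wx 2.8, so treat the details as an assumption):
Code:
# list the wx builds installed side by side, then ask for a specific one
wx-config --list
wx-config --version=2.8 --toolkit=gtk2 --unicode=yes --libs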

The Linux Standard Base tries to help by specifying several general libraries and programs to include by default on any Linux system, but the key point is that the LSB wouldn't be needed if the package manager were smart enough.

On a side note, one issue that has plagued distros and software, and which is very easy to solve, is programs using static paths. The solution is variable paths: don't make your program search /usr/lib32 for 32-bit libraries, query a variable like $LIB32 for them instead. That way you can easily put directories anywhere you want. If you wanted a Windows-like layout, for example, you could have /Users, /System, /System/lib32, etc.
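To be clear, $LIB32 is my proposed variable, not an existing standard; a minimal launcher sketch of the idea:
Code:
#!/bin/sh
# honor $LIB32 if the system defines it, fall back to the usual default
LIB32="${LIB32:-/usr/lib32}"
LD_LIBRARY_PATH="$LIB32${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" exec ./pcsx2 "$@"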

Thanks for your comments. I'm not trying to rail on you; I'm just trying to be honest about real solutions to the Linux packaging mess - solutions the distro companies aren't interested in, since each is trying to be a monopoly, which is really sad given how tiny the Linux desktop market already is. http://linuxhaters.blogspot.com/2008/06/...ckage.html

I just want progress on the issue and am more than tired of all the cop-outs. I support the Zero Install project in what ways I can, but I don't have time to do much to help them, nor do I know much programming. Linux should be for everyone, not just geeks, or those with tons of patience, or people who are okay with not having much access to software. You'd never see all those "OMG THE WORLD IS OVAR, UBUNTU IS USING UNITY AND IT SUX!!!" articles if users had real software freedom, because users wouldn't be relying on Ubuntu. Just because Debian and Ubuntu have a lot of CPUs cranking away, wasting energy compiling packages specifically for their OS, shouldn't force everyone to install Debian/Ubuntu just to get access to a wider range of Linux software. That just perpetuates proprietary distros. Linux shouldn't be proprietary. "Distros" shouldn't exist; they should be "starter Linux software bundles" or something.

(Of course, it should be called something besides Linux, because systems using the BSD kernels or the Hurd or others should also be able to interoperate with this standardized system. Standardized NIX? SNIX? lol)

(08-25-2011, 08:54 PM)gregory Wrote: IMHO, the issue with PCSX2 is running 32-bit binaries on a 64-bit system. Otherwise it is not difficult to build. It could have more automation to install dependencies automatically (mostly what 0install does), but it is not too difficult. The build process is standard.

Nowadays we have Debian/Ubuntu, Arch Linux and Fedora packages. Other distributions could base their work on those three examples. For me it is not an issue, but improvements are welcome.

Note: there is a package standard, which is RPM (the LSB standard), but more people love deb.

Well, the lack of a 64-bit build isn't really the problem (even though one would be nice, don't get me wrong!), because running a 32-bit program on a 64-bit system is easy on Windows, and it should be equally easy on Linux - and sometimes it is: you get your 32-bit libraries installed and you're done. The problem is that it isn't reliably easy, the way it is on Windows. If you made a properly-done ZI package that depended on certain 32-bit libraries, and those were all noted as dependencies, it should install and run without a hitch.
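On Debian/Ubuntu today, for instance, the whole 32-bit compatibility layer boils down to something like this (package name from memory, so treat it as an assumption):
Code:
# pulls in the common 32-bit libraries on a 64-bit Debian/Ubuntu install
sudo apt-get install ia32-libs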

To solve the problem, instead of making packages for each proprietary package format, the focus should be on cross-distro standards. ZI is the only cross-distro package solution I know of right now, so: a) support ZI and only make a package once, for all distros; b) help support efforts for a better solution if that one isn't good enough - or better yet, help extend ZI until it is; or c) do (a) until (b) is ready.
#7
Interesting post. I agree with you that a lot of work is done countless times for nothing. I do not think it will change, though; standards everywhere freeze evolution. Still, I want to highlight some points.

1/ No distribution was designed to support multiple architectures at the same time; it was broken by design. Now Debian/Ubuntu are in the process of fixing it, so you will be able to install or cross-compile packages of any architecture on those systems.
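The emerging multiarch workflow should look roughly like this (the syntax comes from the in-progress dpkg/apt work, so treat the details as an assumption):
Code:
# declare i386 as an extra architecture, then install a 32-bit library
sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install libglew1.6:i386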

2/ Library paths are not static (except for plugins). You can change them with the LD_LIBRARY_PATH variable; /usr/lib is only the default. It would be possible to provide a big fat package with all the libraries and run it as
Code:
LD_LIBRARY_PATH=`pwd`/pcsx2_lib ./pcsx2
In that case it might be easier to link them in as static libraries, though. Anyway, this is done by several applications (proprietary ones in particular) that need a really specific version of a package - for example ghdl.
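You can check that the bundled copies are really the ones being picked up with ldd, which honors LD_LIBRARY_PATH:
Code:
# every dependency should resolve into pcsx2_lib, not /usr/lib
LD_LIBRARY_PATH=`pwd`/pcsx2_lib ldd ./pcsx2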

3/ Shared libraries are a way to save resources. Yes, resources are cheap on a desktop, but they are expensive on a mobile phone. The main goal, though, is to reduce security flaws and bugs (only one library in one place to update). The drawback is that a new version can introduce a new bug or security flaw across your whole system.

4/ The issue is not a new version with a different ABI, but the same version with a different ABI. A couple of weeks ago I had libglew1.5 for my system and libglew1.6 for PCSX2. Another example: the wx2.8 of Debian is different from the wx2.8 of Fedora, mostly because the Fedora version keeps compatibility with wx2.4. In my opinion, devs must not offer this kind of option.
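A sketch of how to spot that kind of mismatch by hand (the paths are the usual Debian ones, adjust as needed):
Code:
# compare the advertised soname and the set of exported symbols
readelf -d /usr/lib/libGLEW.so.1.5 | grep SONAME
nm -D --defined-only /usr/lib/libGLEW.so.1.5 | wc -l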

5/ Like I said, there is already a standardized package format (RPM, per the LSB), but humans are humans, and ZI won't change anything: "RPM (replace with anything) sucks, my (insert package name here) will be better" - and you have a new package format. In my opinion it would be easier to have tools that support multiple package formats on any distribution. It seems to be supported on Debian, so it should be possible on other distributions.
Code:
19:40 gregory ~/pcsx2% sho rpm
Description: package manager for RPM
The RPM Package Manager (RPM) is a command-line driven package
management system capable of installing, uninstalling, verifying,
querying, and updating computer software packages.
.
On Debian and derived systems it is recommended to use "alien" to
convert RPM packages into .deb format instead of bypassing the Debian
package management system by installing them directly with rpm.
Code:
19:40 gregory ~/pcsx2% sho alien
Description: convert and install rpm and other packages
Alien allows you to convert LSB, Red Hat, Stampede and Slackware Packages
into Debian packages, which can be installed with dpkg.
.
It can also generate packages of any of the other formats.
.
This is a tool only suitable for binary packages.
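For reference, the alien round trip looks like this (file names made up):
Code:
# convert an RPM to a .deb, then install it through dpkg
alien --to-deb pcsx2-0.9.8.rpm
sudo dpkg -i pcsx2_*.deb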
#8
(09-11-2011, 09:36 PM)gregory Wrote: Interesting post. I agree with you that a lot of work is done countless times for nothing. I do not think it will change, though; standards everywhere freeze evolution. Still, I want to highlight some points.

HTML, FTP, TCP/IP, etc. are all standards. Have they hindered or helped communication? Anyone working in IT knows the answer: helped. Do you know about the UNIX wars? A lack of standards, and the resulting lack of interoperability, helped destroy almost all the UNIXes. Standards allow the programs built around them to actually compete; a lack of standards - i.e., proprietary software - means little or no competition. That's a huge part of what made Microsoft such a culprit: they would break standards like HTML to prop up demand for their own software and hurt competition. When a new way of doing things comes along that is easier or better in some way, and there is demand for it, it will get adopted as long as an open, standardized API/ABI accompanies it. No one wants to jump into a situation where they're held captive by proprietary software.

(09-11-2011, 09:36 PM)gregory Wrote: 1/ No distribution was designed to support multiple architectures at the same time; it was broken by design. Now Debian/Ubuntu are in the process of fixing it, so you will be able to install or cross-compile packages of any architecture on those systems.

The reason it wasn't is that developers said, "Why would I ever want that when I can run native-arch programs instead?" They never thought about niche needs, like programs that haven't been ported to other architectures yet. That's understandable - niche needs are just that, less important than widely shared ones. What's sad, though, is that with a proper, good package management system from the beginning, there would have been nothing to add: a program's package would simply request, and be provided with, all the appropriate 32-bit libraries it needed.

(09-11-2011, 09:36 PM)gregory Wrote: 2/ Library paths are not static (except for plugins). You can change them with the LD_LIBRARY_PATH variable; /usr/lib is only the default. It would be possible to provide a big fat package with all the libraries and run it as
Code:
LD_LIBRARY_PATH=`pwd`/pcsx2_lib ./pcsx2
In that case it might be easier to link them in as static libraries, though. Anyway, this is done by several applications (proprietary ones in particular) that need a really specific version of a package - for example ghdl.

All I know is that apps often look in the wrong place for libraries, so apps need to follow some standardized queries when searching for them. That would be an incredibly important standard for freedesktop.org, or whatever other standards body wants to take it on, and it needs to be easily referenced so it actually helps developers creating programs for *nix machines. (I say *nix to include any *nix system that supports running Linux binaries, such as BSD.)

I can't tell you how many times I've run into this problem, and the solution is to manually create a symbolic link. Stupid developers? Stupid standards? Stupid documentation? A mix? In any case, communication apparently broke down somewhere.
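The workaround in question, using the libglew example gregory gave above (and it only works when the two builds really are ABI-compatible, which is exactly the risk he describes):
Code:
# point the missing soname at the installed build, refresh the cache
sudo ln -s /usr/lib/libGLEW.so.1.6 /usr/lib/libGLEW.so.1.5
sudo ldconfig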

(09-11-2011, 09:36 PM)gregory Wrote: 3/ Shared libraries are a way to save resources. Yes, resources are cheap on a desktop, but they are expensive on a mobile phone. The main goal, though, is to reduce security flaws and bugs (only one library in one place to update). The drawback is that a new version can introduce a new bug or security flaw across your whole system.

You could design the packaging system to let a specific company offer support contracts over specific program versions: tell your package manager, "Stick to the programs and versions supported by Red Hat/Canonical/Novell/etc." You should also be able to define many other modes of operation. If you want it to upgrade most software to new stable versions while keeping library versions, or at least ABIs, shared in order to save space, you should be able to set that as your preference. If you prefer the absolute latest "stable" software, you should be able to tell it to upgrade even when that means installing more than one version of some libraries. It could offer adjustable thresholds too - for instance, how many extra libraries/programs/packages you'll tolerate when one old program hasn't been updated for a new library's ABI and is holding the rest of your system back from upgrading.

The package management system should also let you install multiple versions of libraries and programs, including a mix of alpha, beta, and stable versions, if you wish.
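Debian/Ubuntu already expose a narrow slice of this through release targeting (and pinning); a sketch with a placeholder package name:
Code:
# pull one package from a less-stable release while the rest of the
# system stays on your default release
sudo apt-get -t experimental install somepackage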

(09-11-2011, 09:36 PM)gregory Wrote: 4/ The issue is not a new version with a different ABI, but the same version with a different ABI. A couple of weeks ago I had libglew1.5 for my system and libglew1.6 for PCSX2. Another example: the wx2.8 of Debian is different from the wx2.8 of Fedora, mostly because the Fedora version keeps compatibility with wx2.4. In my opinion, devs must not offer this kind of option.

If there are different options that are hard switches - compiled in - for the same version of a library, then those differences must be noted in the binary packages. Compile-time feature toggles probably came about with the whole "show me the source!" push: the focus was on seeing the source code to verify it was open source, and on compiling it. In the early days of Linux, source code was the main form of software distribution - a user-unfriendly, exhausting, time-consuming, electricity-wasting period for Linux (and the other nixes). Computers should be getting smarter; this shouldn't be needed anymore. Because binaries are so much easier to pass around, these options need to move to the binary level as modules, similar to plugins for Firefox or modules for Linux or Apache. If a feature is big and bulky and might slow things down, make it a module; if it's small, make it an option in the core program. With good planning I think this is always possible. If it somehow weren't - and I'd be extremely sceptical of that - and for the programs that are already designed this way, having multiple versions of those programs is the solution for now. So a good package manager would go:

You already have Xorg. You want to install a program requiring that Xorg include, say, MultiPointerX, which your version doesn't. The package manager would then download a version that satisfies this requirement as well as the existing requirements of your other packages.
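Zero Install already has vocabulary for this: feeds can carry version constraints on their dependencies, and from the command line you can constrain the version of the program itself (the feed URI here is hypothetical):
Code:
# ask for an implementation no older than a given version
0launch --not-before=1.6 http://example.com/someapp.xml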

(09-11-2011, 09:36 PM)gregory Wrote: 5/ Like I said, there is already a standardized package format (RPM, per the LSB), but humans are humans, and ZI won't change anything: "RPM (replace with anything) sucks, my (insert package name here) will be better" - and you have a new package format. In my opinion it would be easier to have tools that support multiple package formats on any distribution. It seems to be supported on Debian, so it should be possible on other distributions.
Code:
19:40 gregory ~/pcsx2% sho rpm
Description: package manager for RPM
The RPM Package Manager (RPM) is a command-line driven package
management system capable of installing, uninstalling, verifying,
querying, and updating computer software packages.
.
On Debian and derived systems it is recommended to use "alien" to
convert RPM packages into .deb format instead of bypassing the Debian
package management system by installing them directly with rpm.
Code:
19:40 gregory ~/pcsx2% sho alien
Description: convert and install rpm and other packages
Alien allows you to convert LSB, Red Hat, Stampede and Slackware Packages
into Debian packages, which can be installed with dpkg.
.
It can also generate packages of any of the other formats.
.
This is a tool only suitable for binary packages.

Here we go again, same argument, different day lol... sorry, I've just heard it enough times before.

ZI isn't "just another package manager", and RPM and DEB are both crap. Let me explain each of those statements.

ZI can be installed on any distro. That means that no matter which package management system you have now, you can have access to one that is actually standardized. From a developer's view, making a ZI package once means making a package for all distros, without all that duplicated effort. From a user's perspective, it means needing only one package (plus its dependencies, if any) no matter what distro they're on, instead of being shut out because the developer only packaged for other systems.

In other words, it could be one of the first - and maybe the only - truly standardized cross-distro software distribution platforms which, unlike plain compressed archives (tarballs), offers automatic updates, menu icons, uninstallation, secure sharing with other users on the same machine or over the network, and more.
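The menu/launcher side, for instance, is a one-liner (alias name and feed URI are hypothetical):
Code:
# create a local 'pcsx2' command backed by a Zero Install feed;
# a later run with --refresh re-checks the feed for updates
0alias pcsx2 http://example.com/pcsx2.xml
0launch --refresh http://example.com/pcsx2.xml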

Second, why are RPM and DEB both crap? Because they often break. They are designed only for upgrading, not for installing separate versions of the same program or library side by side. They are confined to specific versions of specific distros and CANNOT and WILL NOT ever be compatible with other distros. They were not designed to be standardized or to address the needs of a truly robust universal packaging system. If those features were added so they COULD become that - if they WOULD let you, say, install Firefox 6 alongside Firefox 5, and, where libraries can't be shared, also install the different versions of Gecko and everything else Firefox relies on - I would jump over to them in a heartbeat.

Can you easily hand some random user - let's say an Ubuntu user - an RPM package and have them install it? No, for many reasons. First, DEB and RPM can't resolve each other's dependencies properly, nor let multiple versions of libraries and programs co-exist. (Some programs genuinely aren't designed to co-exist, but that can be worked around by wrapping them until they get their act together.) Second, the repository won't know which dependencies to fetch to run the RPM package, even if it DID automatically run it through alien to convert it, because the package names differ: there is no standard naming. Part of the solution is to establish clear program names and versions in the metadata of a universal package format - the filename itself wouldn't matter so much as the metadata name and version. Third, the program probably won't look in the right place for its libraries, but let's say alien took care of that. Fourth, the Ubuntu repository won't contain the library versions the program needs anyway.
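The naming problem is easy to see concretely; the same toolkit goes by different names on the two big families (package names as of 2011):
Code:
dpkg -s libwxgtk2.8-0   # Debian/Ubuntu's name for the wxWidgets GTK runtime
rpm -q wxGTK            # Fedora's name for (roughly) the same thing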

Solving the security side means GPG keys for developers and their programs. Other security methods could be layered on to prevent attacks - look at everything web browsers and email systems do, like sharing blacklists or "unsafe site" lists. The distro companies love touting "getting everything from us is the only way to be safe", but that's B.S., per the examples I just gave. You can exchange software with the world and still stay safe; there are many ways to do it. Depending on a single repository for all your programs is not necessary, and no user should be forced into it. Right now they essentially are, because program availability breaks down the moment they wander outside the walled garden known as their repository.
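The verification piece already exists in every distro's toolbox; a sketch (key ID and file names hypothetical):
Code:
# fetch the developer's public key, then check the package's signature
gpg --recv-keys 0xDEADBEEF
gpg --verify pcsx2.tar.gz.sig pcsx2.tar.gz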
#9
In my opinion there are several "levels" of standard. Multiple standards are a good thing (or at least allow evolution); only one standard is bad because it removes competition. Take HTTP: the protocol is very, very old and not efficient. Google tried to create a new, faster one (I do not know the status; maybe they dropped it), but it would take ages to move to something more efficient. I think it is more a matter of openness than of standards. Look at the different standards for cell phones - something like 10, with multiple revisions each. It creates competition, jobs and money ;)

Multiarch is not too small a niche, because it helps a lot with cross-compilation. Moreover, it would have helped to cross-grade from x86 to amd64. Anyway, my point was that it would be better to upgrade existing packaging than to create new stuff.

The ability to install multiple versions of the same binary/library/package is probably a smaller niche than multiarch. I mean, if you need more features from a library, install the new version; other packages might need a rebuild, but it is an open-source OS anyway - Linux distributions were built on this open-source hypothesis. The only issue is proprietary stuff, but that can package everything into a single archive without dependencies. Actually, I do not see why you need a package manager except for automatic updates: a package is only an archive with some metadata for dependencies.

You are right about alien; I did not think about the details ;) I think most of those issues can be fixed, but ZI feels like a better idea.
#10
(09-14-2011, 09:06 PM)gregory Wrote: In my opinion there are several "levels" of standard. Multiple standards are a good thing (or at least allow evolution); only one standard is bad because it removes competition. Take HTTP: the protocol is very, very old and not efficient. Google tried to create a new, faster one (I do not know the status; maybe they dropped it), but it would take ages to move to something more efficient. I think it is more a matter of openness than of standards. Look at the different standards for cell phones - something like 10, with multiple revisions each. It creates competition, jobs and money ;)

Multiarch is not too small a niche, because it helps a lot with cross-compilation. Moreover, it would have helped to cross-grade from x86 to amd64. Anyway, my point was that it would be better to upgrade existing packaging than to create new stuff.

The ability to install multiple versions of the same binary/library/package is probably a smaller niche than multiarch. I mean, if you need more features from a library, install the new version; other packages might need a rebuild, but it is an open-source OS anyway - Linux distributions were built on this open-source hypothesis. The only issue is proprietary stuff, but that can package everything into a single archive without dependencies. Actually, I do not see why you need a package manager except for automatic updates: a package is only an archive with some metadata for dependencies.

You are right about alien; I did not think about the details ;) I think most of those issues can be fixed, but ZI feels like a better idea.

Sure you need competition; that's not the problem. The problem is that instead of making all the package managers compatible with the existing formats, the distro companies have an interest in NOT doing so - and it isn't possible anyway, because the current formats suck. Take DEB managers: why is there no alien integration for seamless RPM support? Because both systems suck, even within the same format. They can't cope with multiple versions of programs and libraries, or with a naming convention for software projects that would make it easy to determine which dependencies are needed AND fetch them from somewhere, all automatically. If another format wants to come out too, that's fine, as long as the existing package managers can also implement it, because the format builds in the ability for everything to be contained and installed anywhere, in whatever way a package manager wants. With enough metadata, and a program made dynamic enough, that is possible.

Then, instead of a billion different repositories of packages around the world for a billion different distros, there should be general Linux software repositories using GPG keys for verification - and ideally P2P systems in place for sharing certain packages, to get away from centralized systems. You should be able to download a basic starter Linux bundle, install it, and from there install whatever the hell you want, easily.

Why do I save files in ODF format? Because I can read and write them with any office software that supports it - ODF is a standard. I want a package standard that can be installed on any system and is, or can be made, compatible with all package managers. Currently the only thing that can do that is ZI, because it can be installed on any system. Until another project delivers a format powerful enough to package up any kind of software and deploy it to systems whose package managers have implemented compatibility for it, everyone should support ZI.

Also, the part you mention about being able to compile open-source software: again, that's great and all, but you're missing the point. You shouldn't have to waste your time and power compiling software when you could install binaries instead. I'm not going to say, "Sorry grandma, you have to wait 5 hours while your Pentium III compiles LibreOffice." It's totally impractical to force everyone to compile all their software because of a lack of packaging standards. And being able to run closed-source programs on Linux - much as I too hate closed source for the lack of freedom - should still be an option for anyone, and it gives Linux more opportunities to spread, which IMO matters a lot more than making sure every piece of software anyone uses is licensed right. Linux spreading means more software of all license types for Linux as a whole, which directly affects all Linux users, even the ones living under a rock.



