(08-24-2011, 09:50 AM)Gumbo Wrote: [ -> ]I'm no expert, but I can think of a few objections:
You'd end up with some very large packages if you included all the dependencies in each one, and it would be a very inefficient process as you'd end up reproducing dozens of already installed components - and that would reduce the portability of linux as an operating system, as well as slowing everything down if you were simultaneously running several different versions of the same process if several different applications used it.
This is both true and untrue. Let's say you were using one package manager, which is the way it should be, and it did things right: if a particular piece of software needed specific versions of certain libraries, those would be included in the package or, ideally, only downloaded if needed (you didn't have them yet) or wanted (you wanted to group everything needed for some software into one big package for archival or software-sharing purposes). So say you did end up with two versions of the same library because each was needed by a different program. First off: if that's necessary, then so freaking be it. Users would rather have working software than worry about "wasting" (it's not wasted) some disk space, or loading two different versions of a library into RAM because two running apps need different ones. This is a good thing for multiple reasons. First, as I just said, both programs will run. Second, it puts pressure on library maintainers to keep their libraries standardized and compatible, and on developers to choose good, standardized, reliable libraries. If you want to run bleeding-edge software, software so far ahead that it uses libraries with a new ABI version, then you need to be prepared to either install the new libraries for that new software while keeping the old ones for your older software, or upgrade your older programs to versions that use the newer libraries too, or compile from source. All of those options, both easy (even if theoretically "inefficient" somehow because you used a little more disk space, oh noes) and difficult (compiling), should always be available to end users. Users run alpha and beta software on Windows too, after all, and sometimes this is exactly what they have to do: install Microsoft's C++ Library of Doom alpha 1262 alongside their current stable libraries just because they want to beta-test some game or whatnot.
Either way, it needs to be a choice. Right now, Linux users have no choice, because there are no good, widely used standards allowing for one.
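A minimal sketch of the side-by-side idea in shell, with made-up paths and library names: each app is launched pointing at the library directory it was built against, so neither version clobbers the other.

```shell
# Two versions of the same (hypothetical) library, kept side by side.
mkdir -p /tmp/demo/libfoo-1.2 /tmp/demo/libfoo-2.0

# Launch a command with a chosen library directory; the dynamic linker
# searches LD_LIBRARY_PATH before the system's default locations.
run_with_libs() {
    libdir="$1"
    shift
    LD_LIBRARY_PATH="$libdir" "$@"
}

# Stand-ins for two real apps built against different library versions:
run_with_libs /tmp/demo/libfoo-1.2 sh -c 'echo "app A sees: $LD_LIBRARY_PATH"'
run_with_libs /tmp/demo/libfoo-2.0 sh -c 'echo "app B sees: $LD_LIBRARY_PATH"'
```

Both "apps" run, each against its own version; the only cost is the duplicated directory on disk.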
(08-24-2011, 09:50 AM)Gumbo Wrote: [ -> ]I also think it would become a very arduous process to build anything for linux as you'd need to consider al the supporting packages and build them specific to your needs.
Not sure what you're talking about here. Build anything? A standardized package management system would push developers to define clear standards, and it would let you easily create any kind of Linux software bundle ("distro") you want. For instance, with a good package management system you should be able to easily install any desktop environment you want, or any version of Xorg, etc. You shouldn't be waiting on distro ASDF to release their new version in order to get it; you should have true freedom to download and install anything you want regardless of your existing software. And if it's something where you need to be prompted, for instance if you installed two different versions of Xorg, then the installer should ask which one you want to boot into by default (like Debian/Ubuntu do when installing certain things), or those options should be available in the login manager or wherever appropriate.
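The "pick a default among several installed versions" part is basically what Debian's alternatives system already does with a symlink. Here's a toy sketch of the mechanism; the paths and version numbers are made up, and the real `update-alternatives` tool manages the link for you.

```shell
# Two installed "versions", each just a tiny script for demonstration.
mkdir -p /tmp/alt/bin
printf '#!/bin/sh\necho "Xorg 1.10"\n' > /tmp/alt/xorg-1.10
printf '#!/bin/sh\necho "Xorg 1.11"\n' > /tmp/alt/xorg-1.11
chmod +x /tmp/alt/xorg-1.10 /tmp/alt/xorg-1.11

# Choosing the default is just re-pointing one symlink (what
# "update-alternatives --config" does interactively on Debian/Ubuntu):
ln -sf /tmp/alt/xorg-1.11 /tmp/alt/bin/Xorg
/tmp/alt/bin/Xorg
```

Flipping the default later is one `ln -sf` away; nothing gets uninstalled.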
Distro companies have Linux users in a vise. They want to be in control, and that means having users build packages only for their OS, packages which aren't compatible with other OSes. Well, they can #!% off. That's not real freedom, and no Linux user who gives a crap about real freedom should stand for that system. Having a system like we do now, where the answer is "oh, just go compile everything from source," is telling every normal user out there, 99.9999% of the population of Earth: "Sorry, Linux isn't about freedom; we didn't think about implementing super important software standards. Go $#@! yourself, and go back to Windows!"
Wonderful definition of freedom there.
(08-24-2011, 09:50 AM)Gumbo Wrote: [ -> ]You say you use 'all sorts of distros' but this is really the root of the problem. If you want standard packaging you'd be better off sticking with one distribution (I'd vote debian as it's biggest) and one type of package. The 'software freedom' you (and most linux users) value so highly means that each distribution is built with certain features and user groups in mind; this leads to differences in the creation of software packages, which leads to the incompatibility of their dependencies (for instance the different features wxWidgets is compiled with between debian and mandriva or fedora).
Naturally none of these problems are insurmountable, and it's great you start off your comment by pointing to zeroinstall which looks like an interesting project. Someone with more knowledge than I might let us know why a zeroinstall build of pcsx2 is / isn't possible - or maybe you could build one yourself? (I'm serious )
The solution isn't "shut up and stick to the biggest distro", because that's a continuation of the problem. The solution is standardized packages and the system I'm proposing. Zero Install can be layered over the top of existing distros, operating alongside the existing package managers, and the fact that you can install it on any distro makes it one of the first ever cross-distro packaging solutions. (And no, autopackage and the others don't count, because they aren't managers: they provide no way to upgrade or uninstall the programs they've installed.) One remaining problem is that we need a distro based solely on this package manager, covering both system-wide root apps and userland apps, or quite simply we need all critical system apps packaged up by Zero Install.
Now, about your comment on wxWidgets being compiled with different features. To spare users from recompiling, I can think of two approaches: give the differently-featured binaries distinct version/revision names, or start putting switches at the binary level as well; by the latter I basically mean modules/plug-ins. Many programs already have them, so there's no reason not to expand on that. You want wxWidgets with GTK support? Include a dependency on wxGTK. You need the Qt module? Note that dependency too, so it gets installed automatically when the user installs your package. Where this can't be done, you either help get modules implemented, move to other libraries, or keep multiple versions of the library.
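For illustration only, here is roughly what declaring such a module dependency looks like in a Zero Install feed. The feed URL, names, and version constraint below are all invented; a real packager would point at wherever the wxGTK feed is actually hosted.

```xml
<?xml version="1.0"?>
<interface xmlns="http://zero-install.sourceforge.net/2004/injector/interface">
  <name>MyWxApp</name>
  <summary>hypothetical app needing wxWidgets built with GTK support</summary>
  <group>
    <!-- hypothetical feed URL; 0install would fetch and cache it on demand -->
    <requires interface="http://example.com/feeds/wxGTK.xml">
      <version not-before="2.8"/>
    </requires>
    <implementation version="1.0" id="."/>
  </group>
</interface>
```

The point is that the dependency (and which build variant of it) is stated in the package itself, not assumed to exist on whatever distro the user happens to run.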
The Linux Standard Base tries to help by specifying several general libraries and programs to include by default on any Linux system, but the key point is that the LSB wouldn't be needed if the package manager were smart enough.
On a side note, one issue that has plagued distros and software, and which is very easy to solve, is programs using static paths. The solution is variable paths instead. Don't make your program search /usr/lib32 for 32-bit libraries; query $LIB32 for them. That way the system can put directories anywhere it wants. If you wanted a Windows-like layout, for example, you could have /Users, /System, /System/lib32, etc.
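A sketch of the variable-path idea in shell; `$LIB32` is a hypothetical variable the system would export, and the fallback value is only a convention for systems that never set it.

```shell
# Resolve the 32-bit library directory from the environment instead of
# hardcoding /usr/lib32. ($LIB32 is hypothetical, not a real standard.)
unset LIB32
libdir="${LIB32:-/usr/lib32}"
echo "default layout -> $libdir"

# A Windows-like layout needs no program changes at all, just a
# different export from the system:
LIB32="/System/lib32"
libdir="${LIB32:-/usr/lib32}"
echo "custom layout  -> $libdir"
```

Programs written this way keep working no matter how the filesystem is laid out.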
Thanks for your comments. I'm not trying to rail on you; I'm just trying to be honest about real solutions to the Linux packaging mess, which distro companies aren't interested in solving since each is trying to be a monopoly. That's really sad, especially since the Linux desktop market is so tiny already.
http://linuxhaters.blogspot.com/2008/06/...ckage.html
I just want progress on the issue and am more than tired of all the cop-outs. I support the Zero Install project in what ways I can, but I don't have time to do much to help them, nor do I know much programming. Linux should be for everyone, not just geeks, or those with tons of patience, or people who are okay with not having much access to software. You'd never see so many articles screaming "OMG THE WORLD IS OVAR, UBUNTU IS USING UNITY AND IT SUX!!!" if all users had real software freedom, because they wouldn't be relying on Ubuntu. Just because Debian and Ubuntu have a lot of CPUs cranking away, wasting energy compiling packages specifically for their OS, shouldn't force everyone to install Debian/Ubuntu just to get a wider range of Linux software. That's just perpetuating proprietary distros, and Linux shouldn't be proprietary. "Distros" shouldn't exist; they should be "starter Linux software bundles" or something.
(Of course, the result should be called something besides Linux, because systems using a BSD kernel or the Hurd or others should also be able to interoperate with this standardized system. Standardized NIX? SNIX? lol)
(08-25-2011, 08:54 PM)gregory Wrote: [ -> ]IMHO, the issue with PCSX2 is 32bits on 64 bits system. Otherwise it is no difficult to build it. It could have more automation to automatically install dependency (mostly what 0install do) but it is not too difficult. The build process is standard.
Nowadays, we have debian/ubuntu, archlinux and finally fedora packages. Others distribution could based their work on those 3 examples. For me it is not issue, but improvements are welcome.
Note: there is a package standard which is rpm (lsb standard) but more people loves deb.
Well, a 64-bit version not existing yet isn't really a problem (even though it would be nice, don't get me wrong!), because running a 32-bit program on a 64-bit system is easy on Windows, and it should be equally easy on Linux too. Sometimes it is: you get your 32-bit libraries installed and you're done. The problem is that it's not always easy like it should be, whereas on Windows it is. If you made a properly done ZI package which depended upon certain 32-bit libraries, and those were noted as dependencies, then as long as all the dependencies were in there, it should install and run without a hitch, easily.
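For reference, on Debian-family systems with multiarch support, the manual version of "get your 32-bit libraries installed" looks roughly like this. The commands are only printed here, since actually installing needs root, and libc6 is just an example package; a dependency-aware packaging system could issue the equivalent for the user automatically.

```shell
# Multiarch steps for getting 32-bit libraries onto a 64-bit Debian/Ubuntu
# system. Echoed rather than executed (root required for the real thing).
cmd_enable="dpkg --add-architecture i386"   # allow i386 packages on amd64
cmd_update="apt-get update"
cmd_install="apt-get install libc6:i386"    # an example 32-bit dependency
printf '%s\n' "$cmd_enable" "$cmd_update" "$cmd_install"
```

The `:i386` suffix is how apt is told to pull the 32-bit build of a package on a 64-bit system.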
To solve the problem, instead of making packages for each proprietary package format, focus should go to supporting cross-distro standards. ZI is the only cross-distro package solution I know of right now, so: a) support ZI and make a package only once, for all distros; b) help efforts toward a better solution if that one isn't good enough, or better yet help extend ZI until it is good enough; or c) do (a) until (b) is ready.