How long would it take for something like Autopackage

SleepWalkerX

Platinum Member
Jun 29, 2004
2,649
0
0
The one thing I like about Windows is that when I want a program, I double-click an .exe, a graphical installer appears, I click Next a couple of times, and I can run the program. I'm not saying that apt-get install is hard, but it revolves around a central server. And as far as dpkg/rpm goes, I don't believe there's a GUI for the installation. Plus there's less interoperability across distros. This seems like a great step forward for the Linux desktop.

So far I'm not sure that I've seen a distro that's even included it. I really hope this becomes adopted.
 

nweaver

Diamond Member
Jan 21, 2001
6,813
1
0
synaptic, Click, search, click add, click install, click done

/thread off
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Me too. The only time I've seen a GUI setup is with a Quake 3 point release, and other games like UT2k4 and Wolf: ET.
 

SleepWalkerX

Platinum Member
Jun 29, 2004
2,649
0
0
Originally posted by: nweaver
synaptic, Click, search, click add, click install, click done

/thread off

But then you rely on repositories. A developer would have to get his package into a lot of repositories, making a lot of different packages for different distros, and then there's the fact that it's centralized, not even counting the chance that some of those packages are buggy (which I'm sure has happened). I'm not saying it's hard for the end user to install a package, just that it could be much simpler for the end user to install a package and for the developer to make and distribute one.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
GUI installers aren't really necessary. What kinds of options would you have for the user?
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: SleepWalkerX
Originally posted by: nweaver
synaptic, Click, search, click add, click install, click done

/thread off

But then you rely on repositories. A developer would have to get his package into a lot of repositories, making a lot of different packages for different distros, and then there's the fact that it's centralized, not even counting the chance that some of those packages are buggy (which I'm sure has happened). I'm not saying it's hard for the end user to install a package, just that it could be much simpler for the end user to install a package and for the developer to make and distribute one.

A centralized location keeps people from having to scour the web for a simple solution. I think that's why download.com was so popular for a while.

If it isn't centralized, you'll have to google for the program, or check freshmeat. Hope the webpage isn't down. Hope the developer made a package for your distro (you know you can't really use the same package on ALL distros, right?). etc. It'd be _more_ of a pain in the butt.
 

SleepWalkerX

Platinum Member
Jun 29, 2004
2,649
0
0
Originally posted by: n0cmonkey
Originally posted by: SleepWalkerX
Originally posted by: nweaver
synaptic, Click, search, click add, click install, click done

/thread off

But then you rely on repositories. A developer would have to get his package into a lot of repositories, making a lot of different packages for different distros, and then there's the fact that it's centralized, not even counting the chance that some of those packages are buggy (which I'm sure has happened). I'm not saying it's hard for the end user to install a package, just that it could be much simpler for the end user to install a package and for the developer to make and distribute one.

A centralized location keeps people from having to scour the web for a simple solution. I think that's why download.com was so popular for a while.

If it isn't centralized, you'll have to google for the program, or check freshmeat. Hope the webpage isn't down. Hope the developer made a package for your distro (you know you can't really use the same package on ALL distros, right?). etc. It'd be _more_ of a pain in the butt.

Well, fine, then we could still have repositories; they can just carry autopackages. The fact that it'll still be just as easy to download one independent of repositories brings the best of both worlds. Plus, those who aren't sure whether an autopackage might be malicious can trust the repository.

There are still a lot of benefits that autopackage brings.

Originally posted by: n0cmonkey
GUI installers aren't really necessary. What kinds of options would you have for the user?

GUI installers are much more convenient for newbies (the installer launches once you download it and walks you through a wizard, instead of leaving you wondering what the hell to do with a .deb file) and for power users (it's quicker to hit Next three times than to type out dpkg -i programx-0.43.3r6beta2etc, though you can still just use the command line with autopackage).
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: SleepWalkerX
Well, fine, then we could still have repositories; they can just carry autopackages. The fact that it'll still be just as easy to download one independent of repositories brings the best of both worlds. Plus, those who aren't sure whether an autopackage might be malicious can trust the repository.

Why autopackages?

There are still a lot of benefits that autopackage brings.

Like?

GUI installers are much more convenient for newbies (the installer launches once you download it and walks you through a wizard, instead of leaving you wondering what the hell to do with a .deb file) and for power users (it's quicker to hit Next three times than to type out dpkg -i programx-0.43.3r6beta2etc, though you can still just use the command line with autopackage).

How is that more convenient for anyone? What's the point of a Next button if no configuration is necessary? Just make the existing tools more verbose.

How does your autopackager handle dependencies? Does it check a file for bad downloads or modification (using common hashes like SHA)? How well does it integrate with EVERY distro of Linux out there? How does it handle multiple versions of libraries, like glibc?
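The download-integrity question above is usually answered with published checksums. A minimal sketch of that workflow, using GNU coreutils' sha256sum and a made-up file (foo.package is hypothetical, standing in for a real download):

```shell
# Stand-in for a real download; the contents here are made up.
printf 'pretend package contents\n' > foo.package

# Publisher side: generate the digest that ships alongside the download.
sha256sum foo.package > foo.package.sha256

# User side: recompute and compare before installing. Prints
# "foo.package: OK" on a match, and fails loudly if the file was
# corrupted in transit or tampered with.
sha256sum -c foo.package.sha256
```

Note this only detects corruption; to detect a malicious mirror you'd also want the digest (or the package) cryptographically signed, as apt repositories do.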
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
I finally found the autopackage site, and I'd link to it if it didn't keep giving me pop-ups and offering to let me install this spyware crap. :|

Yeah, I'd never use this trash.
 

SleepWalkerX

Platinum Member
Jun 29, 2004
2,649
0
0
Not getting anything in Firefox. Unfortunately, if they do have pop-ups, I guess they have to foot the bill for bandwidth somehow. :( But don't worry about the program containing spyware: they talk about the risks of getting spyware from autopackages in their FAQ. It pretty much covers all your questions and can explain them better than I can. I'm not exactly an expert on Linux. :p If I have time later I'll try to go back and answer your questions.

http://autopackage.org/faq.html
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Originally posted by: SleepWalkerX
So far I'm not sure that I've seen a distro that's even included it. I really hope this becomes adopted.

I hope it never gets adopted. Sharing packages across distributions is virtually impossible; hell, you can't even use Ubuntu repositories in Debian safely, and Ubuntu is something like 90% Debian. Packages have to be crafted for a specific release of a specific distribution. That's just how things work, and if you follow the rules it all works fine.

http://www.netsplit.com/blog/tech/autopackage.html
http://www.netsplit.com/blog/tech/autopackage_II.html
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: SleepWalkerX
Not getting anything in Firefox. Unfortunately, if they do have pop-ups, I guess they have to foot the bill for bandwidth somehow. :( But don't worry about the program containing spyware: they talk about the risks of getting spyware from autopackages in their FAQ. It pretty much covers all your questions and can explain them better than I can. I'm not exactly an expert on Linux. :p If I have time later I'll try to go back and answer your questions.

http://autopackage.org/faq.html

They also tell me that /usr is for networked applications. :confused:

Where should I put it?
When installing a package, you can often choose which prefix to use. Most of the time, you should just leave this as the default. However, if you're curious then read on.
The prefix is similar to the installation path on Windows machines, in that it's the location in the filing system where the program files are placed. However, virtually all Linux software shares the same structure, which is standardised in the Filesystem Hierarchy Standard, so often packages are combined together into the same directory. This has a number of advantages, namely a smaller path, and it's also useful in networked scenarios. A quick guide to prefixes:

/ - used for system software
/usr - used for user software that is available via a network
/usr/local - used for user software that is available locally, or installed by yourself

/opt - not used much, in theory meant for "optional" software. Most of the time, /usr/local is used instead.

If you're not on a network, then it doesn't really matter whether you use /usr/local or /usr.

Programs specific to a desktop environment such as KDE/GNOME are usually installed into the KDE/GNOME prefix if that isn't /usr or /usr/local (SuSE for instance places them in /opt/kde and /opt/gnome)

You should normally just accept the default, whatever that is. It's been chosen for you to minimize problems. Usually it'll be /usr or ~/.local.

You choose the prefix by passing the --prefix argument to a package, like this:

./foo.package --prefix=/opt

They also don't use /usr/local despite the fact that /usr/local is made for things like this.

FHS says:
Chapter 4. The /usr Hierarchy
Purpose
/usr is the second major section of the filesystem. /usr is shareable, read-only data. That means that /usr should be shareable between various FHS-compliant hosts and must not be written to. Any information that is host-specific or varies with time is stored elsewhere.

Large software packages must not use a direct subdirectory under the /usr hierarchy.


--------------------------------------------------------------------------------

Requirements
The following directories, or symbolic links to directories, are required in /usr.


Directory Description
bin Most user commands
include Header files included by C programs
lib Libraries
local Local hierarchy (empty after main installation)
sbin Non-vital system binaries
share Architecture-independent data

...

/usr/local : Local hierarchy
Purpose
The /usr/local hierarchy is for use by the system administrator when installing software locally. It needs to be safe from being overwritten when the system software is updated. It may be used for programs and data that are shareable amongst a group of hosts, but not found in /usr.

Locally installed software must be placed within /usr/local rather than /usr unless it is being installed to replace or upgrade software in /usr.

The FHS says that the autopackage people are wrong. Oops.

Autopackage does not cover the topic of library version mismatches.

On central repositories:
# What's wrong with centralized repositories, apt style?

The system of attempting to package everything the user of the distro might ever want is not scalable. By not scalable, we mean the way in which packages are created and stored in a central location, usually by separate people to those who made the software in the first place. There are several problems with this approach:

Centralisation introduces lag between upstream releases and actually being able to install them, sometimes measured in months or years.

Software packages should be tested before they're released. The KDE guys probably don't run _every_ distro out there, so it's up to the distro maintainers to provide that testing.

Packaging as separate from development tends to introduce obscure bugs caused by packagers not always fully understanding what it is they're packaging. It makes no more sense than UI design or artwork being done by the distribution.

And sometimes they fix bugs.

Distro developers end up duplicating effort on a massive scale. 20 distros == the same software packaged 20 times == 20 times the chance a user will receive a buggy package. Broken packages are not rare: see Wine, which has a huge number of incorrect packages in circulation, Mono which suffers from undesired distro packaging, etc

I'm guessing that package maintainers work together better than these people believe.

apt et al require extremely well controlled repos, otherwise they can get confused and ask users to provide solutions manually : this requires an understanding of the technology we can't expect users to have.

First of all, yes, we can expect users to know how to use their software. Second, of course distributing software requires control. Try to manage a project without some kind of control.

Very hard to avoid the "shopping mall" type user interface, at which point choice becomes unmanagably large: see Synaptic for a pathological example of this. Better UIs are possible but you're covering up a programmatic model which doesn't match the user model esp for migrants.

They need to explain the concept of the "shopping mall" interface, I couldn't find a detailed explanation in their FAQ.

Pushes the "appliance" line of thinking, where a distro is not a platform on which third parties can build with a strong commitment to stability but merely an appliance: a collection of bits that happen to work together today but may not tomorrow: you can use what's on the CDs but extend or modify it and you void the warranty. Appliance distros have their place: live demo CDs, router distros, maybe even server distros, but not desktops. To compete with Windows for mindshare and acceptance we must be a platform.

Why must we be their definition of a platform again? I think Debian, for example, provides a very stable system for people to develop on. Hell, it barely changes. Can't get much more stable than that. :p
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Packages have to be crafted for a specific release of a specific distribution, it's just how things work and if you follow the rules it all works fine.
Yeah, I love how unstable the APIs are in the Linux world. It makes releasing products that should work across distributions fun. For SeaMonkey, we needed to link against an old glibc, and for some reason had to use a custom-compiled (statically linked, I think) gcc. I set up an old (Debian 3.0) box to use. It's a pain in the butt. For Windows, you can compile it anywhere and run it anywhere without any trouble.
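The reason linking against an old glibc matters: each binary records the glibc symbol versions it was built against, and a distro whose glibc is older than the newest version listed will refuse to load it. A quick sketch of how to inspect those recorded versions on any system binary (assumes binutils' objdump is installed; /bin/ls is just a convenient example):

```shell
# List the distinct GLIBC_x.y symbol versions a dynamically linked
# binary depends on. The highest version shown is the oldest glibc
# the binary can run against.
objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -u
```

Building on a Debian 3.0 box, as described above, keeps those recorded versions low enough for old distros to satisfy.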
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Yeah, I love how unstable the APIs are in the Linux world

Actually, I've had a lot of luck with mismatched library versions by just creating a symlink to whatever soname is expected. As long as it's not a big change like GTK 1.2 -> GTK 2.0, there's a good chance it'll work. But yeah, in general it's a bit of a pain, mostly because the developers don't care as much as Apple or MS do, since they don't have to: most things can be recompiled for ABI changes and fixed for API changes by just about anyone.
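The soname-symlink trick above can be sketched like this, using a dummy file in a temp directory in place of a real library (libexample is made up):

```shell
# An app linked against libexample.so.3 can sometimes be satisfied by
# pointing that name at the installed libexample.so.2 - only safe when
# the two versions are actually ABI-compatible.
dir=$(mktemp -d)
touch "$dir/libexample.so.2"                   # the version actually installed
ln -s libexample.so.2 "$dir/libexample.so.3"   # the soname the app requests
ls -l "$dir"
```

With a real library you'd make the link in the library directory (e.g. /usr/lib) and run ldconfig afterwards so the dynamic loader's cache picks it up.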

For SeaMonkey, we needed to link against an old glibc, and for some reason had to use a custom-compiled (statically linked, I think) gcc.

IMO SeaMonkey is the exception, though, not the norm. Most apps in the OSS world are a very small fraction of SeaMonkey's size, and most of them are only concerned with compatibility across Unixes, if they think about portability at all.