(This is a long rant; if you have work, simply ignore it.)
To know the basic necessary things about the structure of a .repo file, somebody has to write the docs in a place the installing user can read. At a minimum, it should be included in the installer, because at the time of installing Fedora there is no internet access available for looking things up on Google, and there is no manual to read. This forces at least a second install if you are a newbie with this set of circumstances: low bandwidth, no manual, no printer, a busy programming life (keeping family/wife out), etc.
Never mind, now that it is working, let's leave it at that.
I'll look closely at the .repo file format as soon as I get this setup done.
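For anyone in the same boat, here is a minimal sketch of what a .repo file looks like. The repo id, name, and URLs below are made-up placeholders, not a real Fedora repository:

```ini
# /etc/yum.repos.d/example.repo  (hypothetical file name)
[example-repo]
# human-readable name shown by yum
name=Example Repository
# where the packages and repodata live ($releasever expands to the Fedora version)
baseurl=http://mirror.example.com/fedora/releases/$releasever/
# 1 = yum uses this repo, 0 = repo is ignored
enabled=1
# verify package signatures against the key below
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-example
```

Each `[section]` defines one repository; yum reads every *.repo file in /etc/yum.repos.d/.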
Now for some solutions that might merit some thought:
(a) See Klik (if you haven't already) - I went there from here:
(b) Give this a thought -
Assume that there are at least 10,000 Fedora users who download the ISO every month - that is about 3 GB x 10,000 = 30 TB. Of course, let's say caching, p2p, etc. bring this down to a total strain of about 3 TB on the network. Patches are pushed out daily/hourly to end-user PCs.
At mirrors, ISOs are updated about once in 3 months - and then you inevitably have to download patches that keep you clear of security problems and of crashes introduced by new features.
Practically, it boils down to 10,000 people patching their machines every day/week.
Still, the ISOs themselves are updated only once in 3-6 months.
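To make the back-of-the-envelope numbers explicit (the 10,000-user figure and the ~90% caching assumption are my guesses from above, not measured data):

```python
# Rough bandwidth estimate for monthly ISO downloads.
users = 10_000        # assumed monthly ISO downloaders (a guess)
iso_gb = 3            # size of one DVD ISO, in GB

raw_tb = users * iso_gb / 1000    # if everyone fetched the full ISO: 30.0 TB
cached_tb = raw_tb * 0.10         # assume caching/p2p absorb ~90%: 3.0 TB

print(raw_tb, cached_tb)
```

Even the cached figure is a big number for something that is then half-replaced by updates anyway.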
Now, as in my case, I got the DVD from an external source (like on-disk).
I reduced traffic that wee little bit - and had _expected_ no need to update.
But update it did, and _lots_ of it.
Which means, in effect, I would have downloaded half the distro - again.
So if I'm doing that anyway, why aren't we updating ISOs every week rather than every few months - or pushing out changesets rather than full packages every time?
Pushing the new ISO to a mirror is cheaper than pushing the old ISO 10,000 times to users who then download the patched stuff anyway.
The whole problem is that the sizes are not small - "latest software" updates take anywhere from 100 MB to 1 GB. That's half the size of a huge distro and the full size of a basic desktop distro.
Obviously there are changesets - I've seen some, so I know this hasn't been missed; it must have been obvious to maintainers for decades. But why is that not being applied everywhere?
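For what it's worth, Fedora does have a delta mechanism for packages: deltarpms, served through yum's "presto" plugin, which downloads only the binary difference against the installed rpm and rebuilds the full package locally. A sketch of enabling it, assuming the plugin is installed and that the usual yum plugin config path applies (path and availability may differ by release):

```ini
# /etc/yum/pluginconf.d/presto.conf  (typical yum plugin config location)
[main]
enabled=1
```

With that, `yum update` fetches .drpm deltas where the mirror provides them, which is exactly the "push changesets, not full packages" idea - just not applied to the ISOs themselves.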
Or alternatively, do we really need to download and apply all those patches? Maybe it depends on what you use the system for, I guess.
In which case, I'll ask for myself: what's the best update frequency for a regular run-of-the-mill LAMP developer?
IMO, some words to that effect must be shown before the user presses the install button.
Then there's Klik - I've not tried it out, but it seems really nice:
I have an app. It does what I want it to do. I don't want any more features. Give me a zip file. I'll use it on any recent Linux distro - at least the family I use, rpm/deb. For the next two years: no upgrades, no new features, no bugs. Life is easy.
IMO, this is important enough not to ignore, unless I missed something obvious. In which case, I realize I'm pretty much new to the command line and file formats. I've been spoilt by openSUSE - it just works - and so this looks tough.
Maybe I should sum it up this way:
the install sequence must be managed by a front-end exactly like Synaptic. If I find that something is taking too much time, I cancel it right there. From then on, that package or software does not exist until I redownload it. But I should be allowed to stop at will, with a few standard error or warning messages. And yes, PackageKit so far does not allow editing. That's a temporary problem.
Sorry again for sounding like a crybaby, but 6 hours for a basic install (done twice over) takes some emotional toll.
I looked up these:
So, I think I must change my install strategy:
Only ever use LiveCDs to try out a new distro.
If possible, get KYum. It is awesome.
For my needs, it is better than PackageKit as of now.