Use apt-p2p To Improve Ubuntu 9.04 Upgrade
An anonymous reader writes "With Jaunty Jackalope scheduled for release in 12 days on April 23, this blog posting describes how to switch to apt-p2p in preparation for the upgrade. This should significantly reduce the load on the mirrors, smooth out the upgrade experience for everyone involved, and sidestep the problems that have plagued past Ubuntu release days. Remember to disable all third-party repositories beforehand."
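For reference, the switch described boils down to something like this (port 9977 is apt-p2p's local proxy, as discussed in the comments below):
sudo apt-get install apt-p2p
sudo sed -i 's%//%//localhost:9977/%g' /etc/apt/sources.list   # route every mirror URL through the local apt-p2p proxy
sudo apt-get update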
Website and Warning (Score:4, Informative)
The site [camrdale.org] doesn't have much information, but other sources I have read state that apt-p2p is very experimental. Use at your own peril!
Re:Website and Warning (Score:5, Insightful)
The site looks badly outdated. The caveat I would add to your warnings is that the upload speed is uncapped by default. You'll want to limit this unless you want the world to be able to leech you hard. If I left this unlimited my ISP would fucking kill me.
Re: (Score:3, Informative)
Easily found from apt-p2p's main page: protocol [camrdale.org]... please don't ask me to browse the web for you again, kthxbye.
Re:Website and Warning (Score:4, Funny)
Easily found from apt-p2p's main page: protocol [camrdale.org]... please don't ask me to browse the web for you again, kthxbye.
What are you, a leprechaun? You're bound by ancient laws to comply with any mortal request, the only loophole being that you can bitch all you want :).
Just to be sure, can you please post your credit card number, name, address and CID.... oh and naked pictures of your leprechaun wife too and any daughters over the age of 250 (I'm not falling for pedochaun's trap again).
Re: (Score:3, Informative)
If we're talking about package security, the packages themselves are already signed.
Getting them from a different source shouldn't matter as long as the signing method is secure, and I believe with deb it's GPG, so yeah.
Re: (Score:3, Informative)
I don't know about Ubuntu, but Debian uses GPG to sign all their packages, so I'd guess that Ubuntu does the same.
Re: (Score:3, Informative)
You do realize that there are no extant MD5 or SHA1 attacks that can produce data of a specified length that matches a specified hash, right? (For that matter, there isn't such an attack when the length isn't specified.) You would need such an attack to poison something like BitTorrent with false data.
(This protocol, and BitTorrent, both use SHA1.)
The existence of a type of attack on MD5 doesn't even imply that MD5 is rendered useless, much less SHA1. There's only a risk where that type of attack can be employed.
Re: (Score:3, Interesting)
Good point, I had of course forgotten that the blocks are constant size. That would require a much better attack than those currently available.
You are aware that there is an attack for MD5 when the length isn't specified, though? There is a demo that produces two different PDF documents with the same MD5 hash.
Re: (Score:2)
Yeah, I saw that on the link that drinkypoo provided. That isn't really a complete answer though, as SHA1 should be broken fairly soon. The other reply from strstrep seems more informative, even though he says that he's just guessing. A higher-level signature on each package would provide much better security.
apt-p2p works fine (Score:2)
I set it up about 6 months ago on my girlfriend's computer, and except for when I'm stealing internet and I've got an intermittent connection, it's worked solidly.
Alternate CD (Score:5, Informative)
Re: (Score:3, Insightful)
That will help a lot, but you're still going to have a lot to get from the mirrors on a typical system. Odds are, many of the packages in the ISO will be outdated by the time you get it. :P I'm running apt-get update on my apt-p2p'd system and so far, so good.
Re: (Score:3, Interesting)
No, as he stated, you can get the alternate disc from bittorrent as well. Then use that to upgrade to 9.04. That would DRASTICALLY reduce the load on the mirrors.
Re: (Score:2)
...some people even package up the other repositories as buyware DVDs these days.
So you could buy/torrent pretty much the whole ball of wax if you wanted to.
Re: (Score:3, Insightful)
The person who modded you up obviously misunderstood my comment and/or the situation vis-a-vis updating your system in the same way you did: every package on your system is unlikely to be represented on the alternate install CD, and even some of those which are will be outdated by the time you get the ISO, so you will still be downloading numerous packages from the repositories. I didn't say it wouldn't help. You either didn't read my comment, didn't understand my comment, or don't understand the relationship.
good idea but... (Score:5, Interesting)
So my point is: it sounds like a great idea, but if it is enabled by default it had better have some way to detect bandwidth throttling of p2p traffic and revert to HTTP transfer.
Re:good idea but... (Score:4, Interesting)
I had wondered for a while why yum and apt did not do this by default. *snip*
Because it would be wrong to default to forcing a person to share their limited resources.
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
The initial flood was the BT program searching for sources and announcing itself to the network.
If you had kept it running, it would have gone up again. Or there weren't as many high-bandwidth users, so it couldn't compete with the direct download server.
At least, that is my experience with P2P programs.
Re: (Score:2)
You shouldn't be using BT if you have an alternative.
If you can switch, consider doing so and let market forces punish BT.
If BT is entrenched, then I pity you, and encourage you to use evasive maneuvers, preferably without breaking TOS.
Either way, you may wish to make it a political issue and complain loudly to whatever you Britons have for our Yankee FTC or FCC. I'd guess "Ministry of Communication", but I'm just a Yankee.
Re:good idea but... (Score:4, Informative)
This isn't how it works in the UK. If BT has phone lines going somewhere, then you have dozens of ISPs to choose from.
They can be buying direct from BT Wholesale, or own everything from quite a bit further up the chain. No one should really be touching the BT consumer ISP for any reason.
Re: (Score:3, Insightful)
They are in bed with the RIAA and MPAA, but they give massive bandwidth, and do not block any ports or filter p2p.
So, they're letting you do whatever you want, to make sure they maximize the amount of money they can sue you for?
Re: (Score:2)
Of course, if you're not using p2p to download copyrighted material, that might not be a problem.
And to all of us who are not copyright lawyers, encryption is easier. BTW even Linux is copyrighted.
Re: (Score:3, Informative)
Please undo moderation to parent post. Signed packages anyone?
Re:good idea but... (Score:5, Informative)
Ubuntu packages are signed. The signature certifies that the package was mirrored as-is and not modified in any way.
Re:good idea but... (Score:5, Informative)
All packages are signed; the repository is just a convenient way of getting them. If you add a third-party repository, they usually also ask you to add their public key to the trusted package signers. That's also why you have all the local mirrors - I doubt Canonical operates very many of them. Same thing in companies: set one machine to download and the 100 others to download from that machine. You don't need to put any trust in that machine, as it's just passing along signed packages.
So you download the package from P2P or whatever, apt checks the signature, and if it's Genuine Canonical(tm) it'll install the package; otherwise it'll complain. Didn't you notice the repositories are all http? No certificates or security checks there; anyone can give you any garbage data, but it won't have the right signature.
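For the curious, the chain apt walks for every download looks roughly like this (file names shortened for illustration):
gpgv --keyring /etc/apt/trusted.gpg Release.gpg Release   # the archive key signs the Release index
grep -A3 '^SHA1' Release                                  # Release lists checksums of the Packages indexes
sha1sum hello_2.2-2_i386.deb                              # and Packages lists each .deb's checksum, so any copy can be verified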
Re: (Score:2)
Now, do you trust the key that you downloaded (along with the iso) through plain http?
Slashdotted... (Score:3, Informative)
mirror here: http://74.125.77.132/search?q=cache:3gY3Bq4EKnMJ:blog.chenhow.net/os/linux/ubuntu/using-apt-p2p-for-faster-upgrades-from-intrepid-to-jaunty/+http://blog.chenhow.net/os/linux/ubuntu/using-apt-p2p-for-faster-upgrades-from-intrepid-to-jaunty&cd=1&hl=nl&ct=clnk&gl=nl
Deterrent (Score:5, Funny)
Partitions are your friend (Score:4, Insightful)
Me too. Often it's quicker to do a full install from scratch.
That's why my systems always have at least two different partitions: one for "/" and another for "/home". I can reformat my system partition and still have my data intact.
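A minimal /etc/fstab for that layout looks something like this (device names are just examples):
/dev/sda1  /      ext3  defaults  0  1
/dev/sda2  /home  ext3  defaults  0  2
/dev/sda3  none   swap  sw        0  0
On a reinstall you reformat sda1, mount sda2 as /home without formatting it, and your data survives.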
Re: (Score:2)
I have never understood why Ubuntu doesn't format partitions this way by default when you use the default "automatic" partitioning scheme.
Re: (Score:2)
LVM should help with that, though. Separate /boot, /, and /home partitions, with / and /home resizable.
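Growing a volume online is a one-liner if you went with LVM (volume names invented; note you can grow but not shrink while mounted):
lvextend -L +10G /dev/vg0/home   # add 10 GB to the logical volume
resize2fs /dev/vg0/home          # grow the ext3 filesystem into it, while mounted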
Re: (Score:3, Informative)
Not going to help you - most filesystems are growable but not shrinkable online.
Re: (Score:2)
LVM is a nightmare when you span drives with it - it's like having a striped drive - a problem on either one will destroy the entire volume.
Right now I'm configured with:
The separate /boot partition should tell you how old my system started. I had to rebuild the drive because the 8.10 installer required more space than the /boot partition had. Moving & growing the first partition on the drive wasn't worth the trouble.
Re: (Score:2)
To be honest, me neither; there is always something not quite right with a brand new distro. Intrepid broke sound for a number of people. Was it Hardy that had the Evolution bug that maxed out the CPU? I think it's always going to be that way, but it's usually fixed within a month at most. Die-hard Ubuntu users throw up their hands in horror and say things like "that's it, I'm moving back to Windows", but it's all good fun, and we all get busy fixing the problems, finding cures, and occasionally reverting back to the previous release.
I'm upgrading to 8.10 (Score:2, Insightful)
Currently on 8.04, I'll be upgrading to 8.10 sometime after 9.04 is released.
Staying 6 months behind is a reasonable compromise. Let the lab rats (er, enthusiasts!) debug the new stuff first. Last time I checked 8.10 in a VM there was something like 320MB worth of updated packages.
As for the packages themselves, run a local apt proxy like approx [debian.org], especially if you have more than one Debian or Ubuntu system. It keeps a copy of every .deb you download, and automatically purges the ones that are outdated.
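My approx setup, roughly (hostname and release are examples; approx listens on port 9999 by default):
# /etc/approx/approx.conf -- one line per upstream archive
ubuntu http://archive.ubuntu.com/ubuntu
# and in each client's /etc/apt/sources.list:
deb http://aptbox:9999/ubuntu hardy main restricted universe multiverse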
Re: (Score:2)
My advice if you like the stable releases is to skip the .10 releases altogether. I always like to chase the bleeding edge, although that is kind of a dangerous place to be. My most salient advice is to make a bootable, external-disk full system backup before attempting an Ubuntu upgrade. It's pretty easy once you figure out grub.
Re: (Score:2)
Agreed. 9.04 (Beta) feels way stabler than 8.10 (Beta), just like 8.04 (Beta) felt way stabler than 7.10 (Beta).
Re: (Score:2)
My most salient advice is to make a bootable, external-disk full system backup before attempting an Ubuntu upgrade.
I'm far too lazy for that, so I tend to just back everything up to a folder -- even a folder on the same system. It's much easier to downgrade if the entire thing falls apart, but I've never had an Ubuntu upgrade completely kill my filesystem or hard drive.
Re: (Score:3, Insightful)
I've never had an Ubuntu upgrade completely kill my filesystem or hard drive.
"I know what you're asking yourself..."
8.10 upgrade glitch: downclocking (Score:2)
During the 8.10 upgrade, at some point the CPU frequency selector gets stuck on the "ondemand" setting, which during an OS upgrade pretty much means "use all the speed the CPU can give".
On my computer, that meant having it shut off midway through the upgrade as I raced to downclock it, screaming at the policy tool getting in the way ("I AM &@%!ING ROOT, WHAT DO YOU MEAN I AM NOT ALLOWED"). If you need downclocking too, be wary.
I didn't experience this on previous upgrades.
Re: (Score:3, Informative)
ondemand actually happens to be the best governor.
In theory, "powersave", by keeping the CPU frequency at a minimum, would save some power in comparison. In practice, it doesn't. This is because doing anything at all prevents the CPU from entering the lowest power-usage modes (which go beyond simply dropping the frequency).
So it's more efficient to make the CPU run at full blast, do whatever needs to be done, then go to sleep (C3, not suspend-to-RAM), than to do the same work at a lower clock speed, keeping the CPU out of the deep sleep states for longer.
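You can check what your kernel offers and which governor is active via sysfs:
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_governors   # e.g. "ondemand powersave performance"
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor              # the one in use right now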
Re: (Score:2)
I am sorry I wasn't clear enough. The problem is that my CPU will overheat and the computer will shutoff if the CPU runs at full blast for too long, which is exactly what happens with ondemand during a distro upgrade.
Re: (Score:2)
Sounds like you have a hardware problem, not a software problem.
Can you limit the max bus speed within the BIOS to hard-limit the max speed of the processor?
Re: (Score:2)
Ok, I can suggest some workarounds then.
First, you can manually force the governor with this as root (one cpuN entry per core):
echo powersave > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
If that still doesn't work, you can try upgrading from the command line, with:
sudo do-release-upgrade
Since this is on the commandline, you can press Ctrl+Z at any time to temporarily interrupt the upgrade process. Then use the "fg" command to resume it again.
Still, overheating isn't something that should normally happen. Maybe give the heatsink and fans a good cleaning?
Re: (Score:2)
I guess my fifth attempt should have been the refrigerator. >.<
Slower to start (Score:2, Interesting)
Good citizenship (Score:3, Insightful)
What I like about this is not so much the potentially faster upgrade as the ability to contribute a bit to others. The six-monthly upgrades are rare enough that I don't mind if they are a bit slow - not that they have been. But I am very conscious that I am using other people's freely given bandwidth, and I am pleased to be able to give some back.
Does anybody know if I can force my various machines to cross-peer from each other? If I update one first, I don't want the others searching the Net for peers - they should just copy from the first.
Re: (Score:3, Informative)
You should just set up an apt-cache on one and direct the others to fetch from the first. There are several to choose from. Search for "apt proxy."
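With apt-cacher-ng, for example, the whole thing is about two steps ("firstbox" is a placeholder hostname; 3142 is apt-cacher-ng's default port):
sudo apt-get install apt-cacher-ng   # on the machine that updates first
echo 'Acquire::http::Proxy "http://firstbox:3142";' | sudo tee /etc/apt/apt.conf.d/01proxy   # on each of the others
The first download of any .deb comes from the mirror; every later request is served from the local cache.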
Bandwidth usage (Score:5, Interesting)
Re:Bandwidth usage (Score:4, Informative)
Can't help you with the paying for extra bandwidth, but the wondershaper [lartc.org] has helped my limited speed home network remain responsive during downloads.
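If it helps, the invocation I use is roughly this (interface and rates are examples; the arguments are downlink and uplink in kbit/s):
sudo wondershaper eth0 4096 512
See the man page for clearing the limits again.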
Re:Bandwidth usage (Score:4, Informative)
It will obliterate your monthly use cap.
This mode of distribution only works in a perfect world, which few of us live in now.
Re: (Score:3, Interesting)
Just installed it; there's an option in /etc/apt-p2p.conf to limit the upload bandwidth. I haven't tested it yet, however.
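For reference, the relevant bit looks something like this (option name as I remember it from the sample config; units should be KB/s, and 0 means unlimited):
[DEFAULT]
UPLOAD_LIMIT = 20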
Re: (Score:2)
I'm concerned that, after reading the article and apt-p2p's FAQ page, I can't find any guide to how much upload bandwidth this thing will use. While I'm all for sharing, I find it important to cap my upload speed so my connection performs well on other stuff I'm doing, and also to stop uploading once I'm at a 1:1 share ratio or so. Some of us pay if we use too much bandwidth!
I hope that there is still an option to limit the downloads to mirrors, otherwise at the company I work at I will probably be unable to install/upgrade anything.
Re: (Score:2)
Just keep using apt. There is no need to use this tool; it's voluntary. My ISP caps my uploads and downloads but hosts an uncapped Ubuntu mirror, so sorry to all the potential leechers, but I'm sticking with traditional apt.
You could also explain that p2p is not inherently bad, just another protocol, but it will probably fall on deaf ears.
No thanks, I'm no criminal (Score:5, Funny)
Hey mods... (Score:2)
Whoosh.
Mod this +5 Funny so we can see it for the sarcasm it is.
(If it's not sarcasm, of course, you are a sad little man.)
Re:Hey mods... (Score:4, Funny)
I even actually wrote it:
$ python -c 'print (lambda words, random: " ".join([random.choice(words)[:-1] for i in range(random.randint(1,10))]))(open("/usr/share/dict/words").readlines(), __import__("random"))+"."'
angoras lawgiver's Father's approbations uninteresting inferring Antonio's Clotho's chlorine.
Slashdotted? (Score:5, Informative)
It worked for me. But in case it really is slashdotted here's the story, from memory (let's test those theories eh?)
Re: (Score:2)
I'm sorry, I should not have accidentally used HTML format:
rsync -avH /etc/apt/ /etc/apt.orig/ --delete # Makes sure you have an up-to-date copy /etc/apt/sources.list apt-get update
apt-get install apt-p2p # or use synaptic for a GUI interface
sed -i 's%//%/localhost:9977/%g'
Re: (Score:2)
You still got the apt-get update on the wrong line, and dropped a / from your regexp. But also, we both forgot to update the files in /etc/apt/sources.list.d. With your help: :)
for i in /etc/apt/sources.list.d/*.list
do
sudo sed -i 's%//%//localhost:9977/%g' "$i"
done
And of course, tweak *.list as appropriate. My directory had *.list and *.list.save files... I removed the latter.
More Linux mirrors needed (Score:5, Insightful)
Many primary Linux download sites wind up taking an unreasonable amount of traffic from default setups. If you want to contribute back to the OSes and packages that you find so useful, consider setting up a local mirror to share with the world at large. If you can't justify that, at least consider setting up an internal rsync mirror any time you have a dozen or more boxes, to make updates and downloads much faster for your site, and configure your local machines to point to that local mirror.
This turns out to be especially useful for PXE installers and cluster setups, for any Linux or other OS. There's nothing like having 100 internal Linux machines all trying to update OpenOffice at the same time from an external primary site, through a corporate DSL line, to ensure that many of the updates will fail.
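A dumb-but-effective internal mirror is just a cron'd rsync plus any web server (hostname and paths here are examples -- pick an upstream mirror that actually offers rsync):
rsync -av --delete rsync://mirror.example.com/ubuntu/ /srv/mirror/ubuntu/
Then point the clients' sources.list at http://internalbox/ubuntu instead of the outside world.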
Re: (Score:2)
Just install apt-cacher-ng; WAY easier than a local repository, and it doesn't waste bandwidth on stuff you never install.
Re:More Linux mirrors needed (Score:4, Insightful)
It's reasonable, but yes, you should have one computer set to upgrade an hour before the rest, and large deployments could use a local mirror. It's smart, unlike a regular proxy server.
Why upgrade? (Score:5, Insightful)
If it works, why upgrade at all?
Ubuntu 8.04 is a Long Term Support (LTS) release. It will get security patches until the next LTS release, which typically comes every 18 months. So, why not just wait for 9.10?
Re: (Score:3, Informative)
For the same reason that you'll upgrade to 9.10 instead of waiting for 11.04: Features.
Sure, it'll have all the bugfixes for years, but it won't have any of the new features.
(In case anyone has forgotten, LTS are supported for 3 years on the desktop, so there's no 'need' to upgrade every 18 months.)
Re:Why upgrade? (Score:5, Funny)
AFAIK, women prefer men who have all the latest upgrades
Re: (Score:2)
Certainly, there are systems you'd rather keep up and running in a known-good configuration than try new software that may or may not work as well. I have a MythTV backend / web server like that.
Re: (Score:2)
Mostly because I'm already on 8.10, and I don't like being a full year out of date.
Some of that is actually legitimate -- for instance, various development tools and games would be falling out of date. Some of it's just impulsive -- I occasionally regret updating to 8.10, as KDE4 was really not ready.
Re: (Score:2)
Does anyone know a good, up-to-date, user-friendly binary distro that still uses KDE 3.5?
Debian "Lenny". Though I think I'd recommend staying with 8.04 instead of going to Lenny; having tried both, I'd say many things just work better on 8.04.
Re: (Score:2)
If you have 8.10, you cannot upgrade to 9.10 directly. You must upgrade to 9.04 first. That's why you cannot simply wait for 9.10.
Re: (Score:2)
Better boot performance, new notification system, OpenOffice 3, Firefox 3.1, ext4.
Besides those, it's been my experience with every new version of any Linux distro, and especially Ubuntu, that each release supports hardware that previously didn't work, or that took an act of a command-line god to get working.
My policy is: I keep the LTS release on my server, and upgrade it when a new LTS comes out. However, on my desktops and laptops I always upgrade to the latest release. If you look at the list of new features, it's usually worth it.
Re: (Score:2)
What "latest" stuff?
Don't be vague.
If you want to get the new release when it's hot (Score:3, Interesting)
I honestly suggest upgrading when the RC is out (1). That's one week before the actual release date, or in other words Thursday. FYI, when I upgraded to Alpha 6 I had to download 1.3 GB; torrenting that much is still going to take a lot of time.
The Release Candidate is typically identical to the "gold" release; also, you will help Canonical in testing that everything runs as well as it should. If you install apt-p2p (2) you'll even get the warm fuzzy feeling of being a seed for the new packages. :D
The upgrade process is identical -- the only difference is in starting it. Hit Alt-F2 and use "update-manager -d" then hit "Upgrade".
(1) Or hell, upgrade /right now/. I'm using the beta and it is rather stable and experience tells me the beta is always pretty near to what goes gold.
(2) I wouldn't use apt-p2p to upgrade to a dev version as you will find far less peers. However installing it afterwards should let you act as a seed for those packages.
Re: (Score:2)
Is Alt-F2 "run" in your shiny little world? It never has been for me, and still isn't. On the other hand, I did use Compiz and gmrun to make Super+R run programs.
Re: (Score:2)
Yes. I'm sorry your little world isn't as shiny as mine (despite the fact that my 3-year-old shiny little world has a 1.73 GHz CPU that shuts off when it runs at 1.73 GHz, a non-working firefox-3.0, a malfunctioning usplash, a gedit that doesn't quite like zsh, three failed updates, etc.), but my point is very simple.
If you want to run the latest and greatest software (and you want to run all the risks this entails) you may as well get it a bit sooner than the rest, so that you can report bugs before the actual release.
Re: (Score:2)
Typo: "provided the thing boots after the upgrade."
Irony (Score:5, Informative)
On a side note: web data and pages themselves could be p2p distributed too, no? Say a peer gets from a server the hash of a web page (html plus images) and its expiry date/time. If other peers have that page and it's up to date, you could download their copy; otherwise, the server sends a fresh copy to you, and you seed it for others. Not being in computer science, I'm sure this has been proposed before and that there are glaring shortcomings I have missed.
Re: (Score:3, Informative)
Two projects that do what you say that I know of:
http://flashback.calit2.uci.edu/apache2-default/ [uci.edu]
http://sns.cs.princeton.edu/2009/04/firecoral-iptps/ [princeton.edu]
Re: (Score:2)
Well, first it screws with hit counters and any other type of dynamic content. Then you get quite a few round trips: first one to the server asking for the hash and expiry, then if you want it via p2p you have to find some peers, then you have to request from them; maybe they've left the swarm, are dead slow, feeding you junk data or whatever, until finally you hopefully get a good version. Plus you need people willing to p2p your site, which people don't understand, and suddenly they blame your site for making their connection slow.
Apt-Cacher (Score:2)
Re: (Score:2)
I used that when I was running multiple Kubuntu machines in my house and it was definitely worthwhile, and fairly easy to set up.
I was actually wondering if anyone would recommend this method.
howtoforge (Score:2, Informative)
The original link was dead. This is from howtoforge:
http://www.howtoforge.com/ubuntu-using-apt-p2p-for-faster-upgrades-from-hardy-to-intrepid [howtoforge.com]
Re: (Score:2, Flamebait)
I honestly cannot understand why they don't just release deltas against the old packages.
They're waiting for you to overhaul apt to support delta packages.
Well? Let's see some code!
Re: (Score:2)
Delta packages make little sense in the Debian universe: you'll never find two servers configured the same way. Meaning that for delta packages to be useful, twice as many packages would have to be maintained. Actually more than double: one has to maintain delta packages to migrate from each of several older versions. And for fresh installs and security updates, not only the deltas but the packages themselves would also have to be available.
As *buntus go, that might make sense. After all, most people sit on a particular release and upgrade all at once.
Re: (Score:2)
They already have this; it's called debdelta, and I think there are some Debian mirrors that host them. But it makes more sense to use the full thing and then have a local caching server; diffs would be more bandwidth where there are multiple installs from different starting packages.
Re: (Score:3, Informative)
More promising is some sort of system built on zsync [moria.org.uk] - there are some ideas here [ubuntu.com].
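zsync is essentially rsync over plain HTTP; typical use against an old ISO looks like this (URL invented):
zsync -i ubuntu-8.10-desktop-i386.iso http://releases.example.com/ubuntu-9.04-desktop-i386.iso.zsync
Only the changed blocks get downloaded, and the server side is just a static file.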
Re:Mirror anxiety (Score:5, Interesting)
"Is it just me or is the fun game of "pick your closest mirror" not very fun at all? Just download the damn thing at best possible speed. I don't care where you get it from. "
You are aware that "closest" in this context means "faster", aren't you?
"As if I'm in a position to pick the best site where to download something from. Give me a break. Apologies to the power users who can lick their Ethernet cable and tell which site will have the best download performance and availability."
I suppose it takes too much of a power user to install the package "apt-spy", which will build a sources.list for you based on bandwidth probes, right?
Oh, and please, don't leave the parent post at +Insightful when it's plain -Nonsense.
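For the record, a typical apt-spy run is a single command (flags as I remember them from the man page; note it rewrites /etc/apt/sources.list, so back that up first):
sudo apt-spy -d stable -a Europe
It benchmarks the mirrors for your area and writes out the fastest one it finds.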
Re: (Score:2)
I suppose it takes too much of a power user to install the package "apt-spy", which will build a sources.list for you based on bandwidth probes, right?
Hey, thanks for mentioning the apt-spy trick---this particular power user didn't know about it.
And if it works so great, why don't I get it, and its associated shorter download times, by default? Then I don't have to spend time on that...
Also, if I bask in thy leetness, will you consider the usability benefits of not asking the user too many questions, especially when they can be answered at the cost of a little bandwidth and CPU time?
Re:Mirror anxiety (Score:5, Insightful)
Is your point that a host that's connected via T-1 that's a mile away is faster than a host that's connected on an OC-3 3,000 miles away? That is, based on knowing the geographic location of a host, you're saying it's somehow an indication of how fast my download will complete? That's the only thing that matters to me -- when will I have my completed bits. My only point here is that the information given in mirror selection is not enough to pick the "fastest" way to get what I want. It lists the geographical location and that's it.
Yes. Yes it is. If such a list can be generated, then why not just generate it in normal operation, or list the mirrors based on the output of that tool? Though I do appreciate the tip, and I will try it. Obviously bandwidth and availability vary on a day-by-day basis, so taking a snapshot at one point in time seems like it will get stale.
It's a legitimate end-user concern. "Which mirror should I select" should not be a user problem. The user wants his bits as soon as possible, which is a technical problem that has allegedly been solved with apt-spy. If that's the case, we should probably integrate that with the mirror selection process, and then you don't have to put up with all the "Nonsense".
Re: (Score:3, Insightful)
Upgrades of Linux distributions work much better than Windows upgrades because of the library structure and package dependency system.
If you try to upgrade a library on Linux to a new version that cannot coexist with a previous version that other apps depend on, the new package will be set up so that it tells you it needs to remove the old library and its dependent apps if you really want to proceed.
There's also not going to be a lot of garbage hanging around in a "registry". If a package doesn't work out, removing it removes its files too.
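For instance, apt-get spells out exactly what a conflicting install will cost you before touching anything (package names invented):
$ sudo apt-get install libfoo2
The following packages will be REMOVED:
  libfoo1 oldapp
The following NEW packages will be installed:
  libfoo2
Do you want to continue [Y/n]?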