Use apt-p2p To Improve Ubuntu 9.04 Upgrade

An anonymous reader writes "With Jaunty Jackalope scheduled for release in 12 days, on April 23, this blog posting describes how to switch to apt-p2p in preparation for the upgrade. This should significantly reduce the load on the mirrors, smooth out the upgrade experience for everyone involved, and avoid the numerous problems that have plagued Ubuntu release days in the past. Remember to disable all third-party repositories beforehand."
  • Website and Warning (Score:4, Informative)

    by Daengbo ( 523424 ) <daengbo AT gmail DOT com> on Sunday April 12, 2009 @07:31AM (#27547863) Homepage Journal

    The site [camrdale.org] doesn't have much information, but other sources I have read state that apt-p2p is very experimental. Use at your own peril!

    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Sunday April 12, 2009 @07:36AM (#27547887) Homepage Journal

      The site looks badly outdated. The caveat I would add to your warnings is that the upload speed is uncapped by default. You'll want to limit this unless you want the world to be able to leech you hard. If I left this unlimited my ISP would fucking kill me.
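      For reference, a minimal sketch of what that cap looks like, using the config file and option named further down this thread (the section header and units are assumptions -- check the file's own comments):

          # /etc/apt-p2p/apt-p2p.conf (path per a later comment in this thread)
          [DEFAULT]
          # assumed to be in KiB/s; 0 reportedly means unlimited
          UPLOAD_LIMIT = 100

      Then restart the daemon so the new cap takes effect: sudo /etc/init.d/apt-p2p restart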

    • I set it up about six months ago on my girlfriend's computer, and except for when I'm stealing internet and I've got an intermittent connection, it's worked solidly.

  • Alternate CD (Score:5, Informative)

    by elwin_windleaf ( 643442 ) on Sunday April 12, 2009 @07:32AM (#27547873) Homepage
    You can also upgrade Ubuntu with an alternate install CD. These can be downloaded via bittorrent, and usually trigger an "automatic update" prompt as soon as they are inserted into an existing Ubuntu system.
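    If the automatic prompt doesn't appear, the Ubuntu upgrade notes of this era describe starting it by hand -- a sketch, assuming the disc is mounted at /cdrom:

        gksu "sh /cdrom/cdromupgrade"
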
    • Re: (Score:3, Insightful)

      by drinkypoo ( 153816 )

      That will help a lot, but you're still going to have a lot to get from the mirrors on a typical system. Odds are, many of the packages in the ISO will be outdated by the time you get it :P I'm running apt-get update on my apt-p2p'd system and so far, so good.

      • Re: (Score:3, Interesting)

        by QuantumRiff ( 120817 )

        No, as he stated, you can get the alternate disc from BitTorrent as well, then use that to upgrade to 9.04. That would DRASTICALLY reduce the load on the mirrors.

        • by jedidiah ( 1196 )

          ...some people even package up the other repositories as buyware DVDs these days.
          So you could buy/torrent pretty much the whole ball of wax if you wanted to.

        • Re: (Score:3, Insightful)

          by drinkypoo ( 153816 )

          The person who modded you up obviously misunderstood my comment and/or the situation vis-à-vis updating your system in the same way you did: not every package on your system will be represented on the alternate install CD, and even some of those that are will be outdated by the time you get the ISO, so you will still be downloading numerous packages from the repositories. I didn't say it wouldn't help. You either didn't read my comment, didn't understand my comment, or don't understand the relationsh

  • good idea but... (Score:5, Interesting)

    by mrphoton ( 1349555 ) on Sunday April 12, 2009 @07:36AM (#27547889)
    I had wondered for a while why yum and apt didn't do this by default; it would seem a great idea. However... I recently tried to download the Fedora 11 alpha via BitTorrent using a BT (British Telecom) internet connection in the UK. It worked great for about 10 MB (at 90-100 kB/s), then the download speed gradually ground to a halt (5 kB/s). When I tried a direct download of the same ISO, the speed bumped back up to a steady 100 kB/s. I concluded that BT was throttling my BitTorrent connection -- a legal download -- to a very slow speed.
    So my point is: it sounds like a great idea, but if it's enabled by default it had better have some way to detect bandwidth throttling of P2P traffic and revert to plain HTTP transfer.
    • Re:good idea but... (Score:4, Interesting)

      by nurb432 ( 527695 ) on Sunday April 12, 2009 @08:45AM (#27548207) Homepage Journal

      I had wondered for a while why yum and apt did not do this by default. *snip*

      Because it would be wrong to default to forcing a person to share their limited resources.

    • Re: (Score:3, Insightful)

      by Fruit ( 31966 )
      It could be that your uploading is killing your download speed. See one of the other comments for instructions on how to limit upload speed if you hadn't already.
    • by Teun ( 17872 )
      I assume you reported this issue to BT? What was their response?
      The initial burst was the BitTorrent program searching for sources and announcing itself to the network.

      If you had kept it running, the speed would have gone back up. Or perhaps there were not as many high-bandwidth users, so it couldn't compete with the direct download server.

      At least, that is my experience with P2P programs.

      You shouldn't be using BT unless you have no alternative.

      If you can switch, consider doing so and let market forces punish BT.

      If BT is entrenched, then I pity you, and encourage you to take evasive maneuvers, preferably without breaking your TOS.

      Either way, you may wish to make it a political issue and complain loudly to whatever you Britons have in place of our Yankee FTC or FCC. I'd guess "Ministry of Communication", but I'm just a Yankee.

      • Re: (Score:3, Interesting)

        by RalphSleigh ( 899929 )
        As far as I can tell, British Telecom's retail ISP arm (BT) throttles BitTorrent to around 10 kB/sec down during peak times, but leaving torrents on overnight works well: they unthrottle around midnight, and I can usually max out my 8 Mb/sec ADSL with BitTorrent overnight. This is the only limitation I have come across, so I am pretty happy with them as an ISP.
  • Slashdotted... (Score:3, Informative)

    by Anonymous Coward on Sunday April 12, 2009 @07:44AM (#27547921)

    mirror here: http://74.125.77.132/search?q=cache:3gY3Bq4EKnMJ:blog.chenhow.net/os/linux/ubuntu/using-apt-p2p-for-faster-upgrades-from-intrepid-to-jaunty/+http://blog.chenhow.net/os/linux/ubuntu/using-apt-p2p-for-faster-upgrades-from-intrepid-to-jaunty&cd=1&hl=nl&ct=clnk&gl=nl

  • by senorpoco ( 1396603 ) on Sunday April 12, 2009 @07:51AM (#27547953)
    I have yet to have an Ubuntu distro update smoothly, ever. But that won't stop me; onward I will plunge headlong into it with abandon. I don't like my data anyway.
    • by mangu ( 126918 ) on Sunday April 12, 2009 @08:59AM (#27548275)

      I have yet to have an Ubuntu distro update smoothly, ever.

      Me too. Often it's quicker to do a full install from scratch.

      But that won't stop me; onward I will plunge headlong into it with abandon. I don't like my data anyway.

      That's why my systems always have at least two different partitions: one for "/" and another for "/home". I can reformat my system partition and still have my data intact.
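      A minimal illustration of that layout in /etc/fstab (the devices and filesystem are hypothetical -- yours will differ):

          # system partition: safe to reformat on a reinstall
          /dev/sda1  /      ext3  errors=remount-ro  0  1
          # data partition: survives the reinstall untouched
          /dev/sda2  /home  ext3  defaults           0  2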

      • I have never understood why Ubuntu doesn't set up partitions this way by default when you use the "automatic" partitioning scheme.

    • To be honest, me neither; there is always something not quite right with a brand-new distro. Intrepid broke sound for a number of people. Was it Hardy with the Evolution bug that maxed out the CPU? I think it's always going to be that way, but it's usually fixed within a month at most. Die-hard Ubuntu users throw up their hands in horror and say things like "that's it, I'm moving back to Windows", but it's all good fun and we all get busy fixing the problems and finding cures and occasionally reverting back to th

  • Currently on 8.04, I'll be upgrading to 8.10 sometime after 9.04 is released.

    Staying 6 months behind is a reasonable compromise. Let the lab rats (er, enthusiasts!) debug the new stuff first. Last time I checked 8.10 in a VM there was something like 320MB worth of updated packages.

    As for the packages themselves, run a local apt proxy like approx [debian.org], especially if you have more than one Debian or Ubuntu system. It keeps a copy of every .deb you download, and automatically purges the ones that are outdated.
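    A sketch of the two halves of an approx setup, assuming its stock port 9999 (the mapping name "ubuntu" and the hostname "proxybox" are stand-ins; see approx.conf(5) for the authoritative syntax):

        # /etc/approx/approx.conf on the proxy box: map a name to an upstream mirror
        ubuntu  http://archive.ubuntu.com/ubuntu

        # /etc/apt/sources.list on each client: point at the proxy instead
        deb http://proxybox:9999/ubuntu jaunty main restricted universe multiverse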

    • My advice if you like the stable releases is to skip the .10 releases altogether. I always like to chase the bleeding edge, although that is kind of a dangerous place to be. My most salient advice is to make a bootable, external-disk full system backup before attempting an Ubuntu upgrade. It's pretty easy once you figure out grub.

      • Agreed. 9.04 (Beta) feels way stabler than 8.10 (Beta), just like 8.04 (Beta) felt way stabler than 7.10 (Beta).

      • My most salient advice is to make a bootable, external-disk full system backup before attempting an Ubuntu upgrade.

        I'm far too lazy for that, so I tend to just back everything up to a folder -- even a folder on the same system. It's much easier to downgrade if the entire thing falls apart, but I've never had an Ubuntu upgrade completely kill my filesystem or hard drive.

        • Re: (Score:3, Insightful)

          by drinkypoo ( 153816 )

          I've never had an Ubuntu upgrade completely kill my filesystem or hard drive.

          "I know what you're asking yourself..."

    • <fair-warning><personal-experience><ymmv/>

      During the 8.10 upgrade, at some point the CPU frequency selector may get stuck on the "ondemand" setting, which during an OS upgrade pretty much means "use all the speed the CPU can give".

      On my computer, that meant having it shut off midway through the upgrade as I raced to downclock it, screaming at policy-tool getting in the way ("I AM &@%!ING ROOT, WHAT DO YOU MEAN I AM NOT ALLOWED"). If you need downclocking too, be wary.

      I didn't experience this o
      • Re: (Score:3, Informative)

        by vadim_t ( 324782 )

        ondemand actually happens to be the best governor.

        In theory, "powersave", by keeping the CPU frequency at a minimum, would save some power in comparison. In practice it doesn't, because doing anything at all prevents the CPU from entering its lowest-power modes (which go beyond simply dropping the frequency).

        So it's more efficient to run the CPU at full blast, do whatever needs to be done, then go to sleep (C3, not suspend-to-RAM), than to do the same work at a lower clock speed, keeping t
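        To watch this trade-off on your own machine, the standard cpufreq sysfs files are enough (paths are per-CPU; cpu0 shown):

            # governors the kernel offers, and the one currently active
            cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_governors
            cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
            # current and maximum clock speed, in kHz
            cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq
            cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq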

          • I am sorry, I wasn't clear enough. The problem is that my CPU will overheat and the computer will shut off if the CPU runs at full blast for too long, which is exactly what happens with ondemand during a distro upgrade.

          • by karnal ( 22275 )

            Sounds like you have a hardware problem, not a software problem.

            Can you limit the max bus speed in the BIOS, to hard-limit the top speed of the processor?

          • by vadim_t ( 324782 )

            Ok, I can suggest some workarounds then.

            First, you can manually force the governor with this as root:

            echo "powersave" > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor

            If that still doesn't work, you can try upgrading from the commandline, with:

            $ sudo do-release-upgrade -d

            Since this is on the commandline, you can press Ctrl+Z at any time to temporarily interrupt the upgrade process. Then use the "fg" command to resume it again.

            Still, overheating isn't something that should normally happen. Maybe
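            One caveat on the governor command above: it only works from a genuine root shell, because under sudo the redirection is performed by your unprivileged shell. The usual workaround:

                # run the write itself with root privileges; repeat for cpu1, cpu2, ... on multi-core machines
                echo powersave | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor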

  • Slower to start (Score:2, Interesting)

    by Nomaxxx ( 1136289 )
    I've used apt-p2p as an apt-get replacement for a short time. It often downloads faster than the standard method, but it is slower to start downloading, so it's not great when you have many small packages to install. For a full system upgrade, though, I guess it's a good alternative -- especially on (or close to) launch day, when you can be sure the update manager would otherwise go idle midway through the upgrade. The other alternative is to wait a week or two after release day, when the servers are less busy.
  • Good citizenship (Score:3, Insightful)

    by AlecC ( 512609 ) <aleccawley@gmail.com> on Sunday April 12, 2009 @08:00AM (#27547993)

    What I like about this is not so much the potentially faster upgrade as the ability to contribute a bit to others. The six-monthly upgrades are rare enough that I don't mind if they are a bit slow -- not that they have been. But I am very conscious that I am using other people's freely given bandwidth, and I am pleased to be able to give some back.

    Does anybody know if I can force my various machines to cross-peer with each other? If I update one first, I don't want the others searching the Net for peers - they should just copy from the first.

    • Re: (Score:3, Informative)

      by Daengbo ( 523424 )

      You should just set up an apt-cache on one and direct the others to fetch from the first. There are several to choose from. Search for "apt proxy."
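      Once the cache box is up, pointing the other machines at it is one line of apt configuration -- a sketch, with "cachebox" as a stand-in hostname and 3142 as the port apt-cacher conventionally listens on:

          # /etc/apt/apt.conf.d/01proxy on each client
          Acquire::http::Proxy "http://cachebox:3142";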

  • Bandwidth usage (Score:5, Interesting)

    by Chris_Jefferson ( 581445 ) on Sunday April 12, 2009 @08:04AM (#27548011) Homepage
    I'm concerned, after reading the article and apt-p2p's FAQ page, that I can't find any guide to how much upload bandwidth this thing will use. While I'm all for sharing, I find it important to cap my upload speed so my connection performs well for everything else I'm doing, and also to stop uploading once I'm at a 1:1 share ratio or so. Some of us pay if we use too much bandwidth!
    • Re:Bandwidth usage (Score:4, Informative)

      by Mr_Perl ( 142164 ) on Sunday April 12, 2009 @08:21AM (#27548069) Homepage

      Can't help you with the paying for extra bandwidth, but the wondershaper [lartc.org] has helped my limited speed home network remain responsive during downloads.

    • Re:Bandwidth usage (Score:4, Informative)

      by nurb432 ( 527695 ) on Sunday April 12, 2009 @08:46AM (#27548213) Homepage Journal

      It will obliterate your monthly use cap.

      This mode of distribution only works in a perfect world, which few of us live in now.

    • Re: (Score:3, Interesting)

      by stevied ( 169 ) *

      Just installed it; there's an option in /etc/apt-p2p/apt-p2p.conf to limit the upload bandwidth. I haven't tested it yet, however.

    • I'm concerned, after reading the article and apt-p2p's FAQ page, that I can't find any guide to how much upload bandwidth this thing will use. While I'm all for sharing, I find it important to cap my upload speed so my connection performs well for everything else I'm doing, and also to stop uploading once I'm at a 1:1 share ratio or so. Some of us pay if we use too much bandwidth!

      I hope that there is still an option to limit the downloads to mirrors, otherwise at the company I work at I will probably be unable to install/u

      • by eldepeche ( 854916 ) on Sunday April 12, 2009 @11:56AM (#27549311)
        The option is called "not using apt-p2p." I don't remember the exact syntax, but I think there's a switch in the file /etc/apt/this/is/the/default.behavior
      • Just keep using apt. There is no need to use this tool; it's voluntary. My ISP caps my uploads and downloads, but hosts an uncapped Ubuntu mirror, so sorry to all the potential leechers, but I'm sticking with traditional apt.

        You could also explain that P2P is not inherently bad -- it's just another protocol -- but it will probably fall on deaf ears.

  • by wjh31 ( 1372867 ) on Sunday April 12, 2009 @08:04AM (#27548013) Homepage
    p2p is a method used exclusively by criminals; there's no way I'm going to be using this method.
    • Whoosh.

      Mod this +5 Funny so we can see it for the sarcasm it is.

      (If it's not sarcasm, of course, you are a sad little man.)

      • by harry666t ( 1062422 ) <harry666t AT gmail DOT com> on Sunday April 12, 2009 @11:34AM (#27549205)
        It's not funny and it's not sarcasm. It's the same cliché meme repeated for the thousandth time. "LOL, P2P is helping terrorism, mod me funny." I could write a Python one-liner that would produce more varied content than most of these +5 Funnies all over here on /.

        I even actually wrote it:

        $ python -c 'print (lambda words, random: " ".join([random.choice(words)[:-1] for i in range(random.randint(1,10))]))(open("/usr/share/dict/words").readlines(), __import__("random"))+"."'

        angoras lawgiver's Father's approbations uninteresting inferring Antonio's Clotho's chlorine.
  • Slashdotted? (Score:5, Informative)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Sunday April 12, 2009 @08:09AM (#27548025) Homepage Journal

    It worked for me. But in case it really is slashdotted, here's the story, from memory (let's test those theories, eh?)

    1. apt-get install apt-p2p (Not in Hardy and older repos IIRC... for you late/sporadic upgraders)
    2. Back up your /etc/apt/sources.list and then edit the file, s/\/\//\/\/localhost:9977\// (hope I got that right -- Guess I could have just used # or something eh?)
    3. Not in the guide: edit /etc/apt-p2p/apt-p2p.conf and set UPLOAD_LIMIT ... just in case. :) You probably have to /etc/init.d/apt-p2p restart after that.
    4. apt-get update
    5. Then do the upgrade... but it's not time for that yet.
    • Try this, based on your note. (I don't have an Ubuntu system in hand). rsync -avH /etc/apt/ /etc/apt.orig/ --delete # Makes sure you have an up-to-date copy apt-get install apt-p2p # or use synaptic for a GUI interface sed -i 's%//%/localhost:9977/%g' /etc/apt/sources.list apt-get update
      • I'm sorry, I should not have accidentally used HTML format:

                rsync -avH /etc/apt/ /etc/apt.orig/ --delete # Makes sure you have an up-to-date copy
                apt-get install apt-p2p # or use synaptic for a GUI interface
                sed -i 's%//%/localhost:9977/%g' /etc/apt/sources.list apt-get update

        • You still got the apt-get update on the wrong line, and dropped a / from your regexp. But also we both forgot to update files in /etc/apt/sources.list.d. With your help:
          for i in /etc/apt/sources.list.d/*.list
          do
          sudo sed -i 's%//%//localhost:9977/%g' "$i"
          done

          And of course, tweak *.list as appropriate. My directory had *.list and *.list.save files... I removed the latter :)

          • Thank you: I didn't have an Ubuntu system at hand, but was amused by the attempts to backslash-manage the '/' in the original command, and by the hand-waving of the "sed with this". The 'sed -i' option is one of the more useful feature additions to sed of the last 20 years.
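            Folding the thread's corrections together, the whole switch-over reads roughly like this (port 9977 is apt-p2p's default per the comments above; run it only once, since the sed rewrite is not idempotent):

                # back up the apt configuration first
                sudo rsync -avH --delete /etc/apt/ /etc/apt.orig/
                sudo apt-get install apt-p2p
                # rewrite every repository URL to go through the local apt-p2p proxy
                sudo sed -i 's%//%//localhost:9977/%g' /etc/apt/sources.list
                for i in /etc/apt/sources.list.d/*.list
                do
                    sudo sed -i 's%//%//localhost:9977/%g' "$i"
                done
                sudo apt-get update
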
  • by Antique Geekmeister ( 740220 ) on Sunday April 12, 2009 @08:09AM (#27548027)

    Many primary Linux download sites wind up taking an unreasonable amount of traffic from default setups. If you want to contribute back to the OSes and packages that you find so useful, consider setting up a local mirror to share with the world at large. If you can't justify that, at least consider setting up an internal rsync mirror any time you have a dozen or more boxes, to make updates and downloads much faster for your site, and configure your local machines to point to that local mirror.

    This turns out to be especially useful for PXE installers and cluster setups, for any Linux or other OS. There's nothing like having 100 internal Linux machines all trying to update OpenOffice at the same time from an external primary site, through a corporate DSL line, to ensure that many of the updates will fail.
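    A sketch of the internal-mirror idea, assuming your chosen upstream offers rsync (module names vary by mirror, and a full archive is huge -- most sites pull only the suites and architectures they actually use):

        # nightly cron job on the mirror box; serve /srv/mirror/ubuntu over HTTP internally
        rsync -a --delete rsync://archive.ubuntu.com/ubuntu/ /srv/mirror/ubuntu/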

  • Why upgrade? (Score:5, Insightful)

    by wiresquire ( 457486 ) on Sunday April 12, 2009 @08:11AM (#27548033) Journal

    If it works, why upgrade at all?

    Ubuntu 8.04 is a Long Term Support (LTS) release. It will get security patches until the next LTS release, which typically comes every 18 months. So, why not just wait for 9.10?

    ws

    • Re: (Score:3, Informative)

      by Aladrin ( 926209 )

      For the same reason that you'll upgrade to 9.10 instead of waiting for 11.04: Features.

      Sure, it'll have all the bugfixes for years, but it won't have any of the new features.

      (In case anyone has forgotten, LTS releases are supported for 3 years on the desktop, so there's no 'need' to upgrade every 18 months.)

    • by WormholeFiend ( 674934 ) on Sunday April 12, 2009 @08:36AM (#27548139)

      AFAIK, women prefer men who have all the latest upgrades

    • by AusIV ( 950840 )
      Some people prefer more recent software. On my laptop, I'm looking forward to OpenOffice 3.0, some upgrades to Tomboy, Pidgin, (tentatively) Amarok, and several other packages I use. The Open Source world moves quickly with most software, and running something that's close to two years old may mean you're missing a lot of features.

      Certainly, there are systems you'd rather keep up and running in a known-good configuration than try new software that may or may not work as well. I have a MythTV backend / web

    • Mostly because I'm already on 8.10, and I don't like being a full year out of date.

      Some of that is actually legitimate -- for instance, various development tools and games would be falling out of date. Some of it's just impulsive -- I occasionally regret updating to 8.10, as KDE4 was really not ready.

    • If you have 8.10, you cannot upgrade to 9.10 directly. You must upgrade to 9.04 first. That's why you cannot simply wait for 9.10.

    • Better boot performance, new notification system, OpenOffice 3, Firefox 3.1, ext4.

      Besides those, it's been my experience with every new version of any Linux distro, and especially Ubuntu, that each new version supports hardware that previously didn't work, or that took an act of a command-line god to get working.

      My policy is, I keep the LTS release on my server, and upgrade it when a new LTS comes out. However, on my desktops and laptops, I always upgrade to the latest release. If you look at the list of new fea

  • I honestly suggest upgrading when the RC is out (1). That's one week before the actual release date -- in other words, Thursday. FYI, when I upgraded to Alpha 6 I had to download 1.3 GB; torrenting that much is still going to take a lot of time.

    The Release Candidate is typically identical to the "gold" release; you will also help Canonical test that everything runs as well as it should. If you install apt-p2p (2) you'll even get the warm fuzzy feeling of being a seed for the new packages. :D

    The upgrade process is identical -- the only difference is in how you start it. Hit Alt-F2, run "update-manager -d", then hit "Upgrade".

    (1) Or hell, upgrade /right now/. I'm using the beta and it is rather stable, and experience tells me the beta is always pretty close to what goes gold.
    (2) I wouldn't use apt-p2p to upgrade to a dev version, as you will find far fewer peers. However, installing it afterwards should let you act as a seed for those packages.

    • Is Alt-F2 "run" in your shiny little world? It never has been for me, and still isn't. On the other hand, I did use Compiz and gmrun to make Super+R run programs.

      • Yes. I'm sorry your little world isn't as shiny as mine (despite the fact that my three-year-old shiny little world has a 1.73 GHz CPU that shuts the machine off when it actually runs at 1.73 GHz, a non-working firefox-3.0, a malfunctioning usplash, a gedit that doesn't quite like zsh, three failed updates, etc.), but my point is very simple.

        If you want to run the latest and greatest software (and accept all the risks that entails), you may as well get it a bit sooner than everyone else, so that you can report bugs before the actual release, and

  • Irony (Score:5, Informative)

    by digitalderbs ( 718388 ) on Sunday April 12, 2009 @08:44AM (#27548199)
    that a site advising the use of p2p to prevent the meltdown of servers has itself been slashdotted.

    On a side note: web data and pages themselves could be P2P-distributed too, no? Say a peer gets a web page's hash (covering its HTML and images) and its expiry date/time from the server. If other peers have that page and it's up to date, you download their copy; otherwise, the server sends you a fresh copy and you seed it for others. Not being in computer science, I'm sure this has been proposed before and that there are glaring shortcomings I have missed.
  • I plan to upgrade directly from the Ubuntu servers, but I'm only going to hit their servers once for the three machines I'm upgrading. I use apt-cacher [ubuntu.com], which stores packages on the local network once they've been downloaded by something on the network, then sends out the cached version when it's requested again. It doesn't help much for the odd day-to-day package installation, but it makes significant upgrades much faster after the first system. You have to configure all of the systems to use the proxy, bu
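    With apt-cacher, the per-client change is just a rewritten sources.list line -- a sketch, with "cachebox" as a stand-in hostname and 3142 as apt-cacher's usual port:

        # each client's /etc/apt/sources.list, routed through the cache
        deb http://cachebox:3142/archive.ubuntu.com/ubuntu intrepid main restricted universe multiverse
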
    • by Aladrin ( 926209 )

      I used that when I was running multiple Kubuntu machines in my house and it was definitely worthwhile, and fairly easy to set up.

      I was actually wondering if anyone would recommend this method.

  • howtoforge (Score:2, Informative)

    by lems1 ( 163074 )

    The original link was dead. This is from howtoforge:

    http://www.howtoforge.com/ubuntu-using-apt-p2p-for-faster-upgrades-from-hardy-to-intrepid [howtoforge.com]
