It would be a good alternative to have, although we know various ISPs (e.g. Comcast) interfere with BitTorrent traffic, so it has good and bad sides.

On Mon, Jun 9, 2008 at 5:19 PM, Wade Brainerd <wadetb@gmail.com> wrote:
> Would it be worth setting up a BitTorrent tracker for builds?
>
> BitTorrent is very reliable, excellent at resuming, doesn't require
> command-line use, and can take advantage of local seeds.
>
> Best,
> Wade
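>
> For example, generating a .torrent for a build could be as simple as the
> following (assuming the mktorrent tool is available; the tracker URL and
> file names here are only placeholders):
>
> mktorrent -a http://tracker.example.org/announce -o os-build.torrent os-build.img
>
> Any ordinary BitTorrent client could then seed os-build.torrent.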
>
> 2008/6/9 Carol Lerche <cafl@msbit.com>:
>> Here is what I did when I was fighting battles with an unreliable
>> connection a few months ago. (It doesn't solve the cost of downloading,
>> but it does solve the lost-connection problem.) Use
>>
>> wget -c url-of-the-wanted-file
>>
>> This command-line program lets you resume the download again and again
>> when the inevitable happens and the connection is lost. If you find that
>> it downloads more reliably when you limit the bandwidth, there is another
>> option to set a rate limit, namely:
>>
>> wget --limit-rate=20k -c url-of-the-wanted-file
>>
>> (The rate is given in bytes per second, and the k suffix means kilobytes,
>> so the above example is roughly 20,000 bytes per second.)
>>
>> Living at the far end of the DSL reaches, wget is my favorite program!
>>
>> Regards,
>> Carol Lerche
>>
>> On Mon, Jun 9, 2008 at 12:29 AM, James Cameron <quozl@laptop.org> wrote:
>>> No, FTP is not more reliable; if you have a problem with HTTP downloads
>>> of the image, you will have the same problems with FTP.
>>>
>>> Yes, it can be made available in smaller chunks, but it is best to
>>> download it with a restartable downloader, which recovers from
>>> interruptions. wget on Linux can do this, as can rsync. There are
>>> restartable downloaders available for other operating systems. The
>>> ability to restart at the last known point after an interruption is
>>> inherent in the HTTP, FTP and rsync protocols, but many HTTP clients do
>>> not use the feature.
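>>>
>>> For instance, any of these picks up a partial download where it left
>>> off (the URL here is only a placeholder, not an actual build location):
>>>
>>> wget -c http://example.org/builds/os-build.img
>>> curl -C - -O http://example.org/builds/os-build.img
>>> rsync --partial --progress rsync://example.org/builds/os-build.img .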
>>>
>>> Yes, there is a better way of making new builds and updates available,
>>> and that is the olpc-update mechanism. It restarts from where it was
>>> disconnected as well, since it uses rsync.
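>>>
>>> Run as root on the laptop, it is invoked roughly like this (the build
>>> name below is only an example):
>>>
>>> olpc-update joyride-2128
>>>
>>> and it transfers only the parts of the new build that have changed.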
>>>
>>> --
>>> James Cameron    mailto:quozl@us.netrek.org    http://quozl.netrek.org/
>>
>> --
>> "Always do right," said Mark Twain. "This will gratify some people and
>> astonish the rest."

--
"Always do right," said Mark Twain. "This will gratify some people and astonish the rest."