FATE - openSUSE feature tracking > #120326

Resume download

Feature state

openSUSE-10.2: Rejected
openSUSE-10.3: Rejected
openSUSE-11.0: Rejected
openSUSE-11.1: Rejected
openSUSE-11.2: Rejected

Description

YaST/YOU times out too easily when downloading large packages like kde-base (14 MB) over a single ISDN connection. Please cache the half-downloaded package so I don't have to start from the beginning again.

See
http://bugzilla.novell.com/show_bug.cgi?id=suse9740
http://bugzilla.novell.com/show_bug.cgi?id=278507
https://bugzilla.novell.com/show_bug.cgi?id=545242

Discussion


G. P. wrote: (12 years ago)

Klaus, do you know whether this is still an issue?

K. K. wrote: (12 years ago)

We still have very large packages (kernel, OpenOffice_org) which might not download completely in one go.

S. V. wrote: (10 years ago)

Related to commit-refactoring.

F. L. wrote: (10 years ago)

Klaus, are you still running ISDN? Just kidding :-)

Stano, please give your opinion on the workload - is this easily achievable? If not, there are higher priorities.

S. V. wrote: (9 years ago)

Jiri, could we get an estimate for this?

J. S. wrote: (9 years ago)

Since curl itself supports resuming downloads, this feature should not be hard to implement.
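For reference, a minimal sketch of the invocation a curl-based backend could issue; the URL and filename are illustrative, and the command is only echoed here rather than run:

```shell
#!/bin/sh
# Sketch: resuming an interrupted package download with curl.
# "-C -" makes curl inspect the existing partial file and ask the
# server to continue from that byte offset; "--retry" additionally
# retries transient failures. URL/filename are made-up examples.
url="https://example.org/kde-base.rpm"
out="kde-base.rpm"
cmd="curl -C - --retry 5 -o $out $url"
# Echo the command instead of executing it (dry run).
echo "$cmd"
```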

R. B. wrote: (9 years ago)

I also vote for this feature.

D. M. wrote: (9 years ago)

It is also a big problem when you use a slow mirror. download.opensuse.org redirects me to one of the Yandex mirrors (score 20), and I get timeouts on big packages.

P. J. wrote: (9 years ago)

I'd vote at least for a way to change the timeout settings in YaST - or would that not solve the problem?

A. A. wrote: (9 years ago)

I'd like to see this feature implemented in 11.2. You surely want resume capability if you have an unreliable yet slow internet connection and are, say, updating KDE 4.2 :)

D. M. wrote: (9 years ago)

Please stop these "I vote for this", "+1", or "me too" comments. There is no voting system in FATE yet, and following a discussion full of "I want this too" makes it hard to evaluate features.

J. M. wrote: (9 years ago)

Could this be accomplished using a BitTorrent backend instead of curl?

J. E. wrote: (9 years ago)

ISDN is already slow as it is. I would not want to spend more time downloading just because of the extra metadata traffic, not to mention what happens if there are no peers around, or the peers have capped their upload rates. Besides, most download.opensuse.org downloads are faster than a torrent for me.

P. B. wrote: (9 years ago)

Caching is one thing, but even using retries in curl would help; see "curl --retry".

P. J. wrote: (9 years ago)

Could it be accomplished by using aria for downloading packages? Then there would be no more problems with timeouts and bad checksums.

J. K. wrote: (9 years ago)

Actually, we are already using aria in the current development branch, so this is not so urgent anymore. Still, the download can be interrupted in other ways than a connection timeout, e.g. a user decision, a sudden power outage, etc.

Does anyone know whether aria supports resuming? (Implementing this for the curl backend is not important anymore.)

M. K. wrote: (9 years ago)

Yes, aria supports resuming -- even better than wget (I don't know about curl), because aria uses the file size to check whether the to-be-downloaded file has changed.

P. P. wrote: (9 years ago)

It does. See the section "Resuming Download" in its man page, and also note the -c option.

T. J. wrote: (8 years ago)

Perhaps as an addendum to this feature, I'd like to see an option to set a number of automatic retries.

If YaST tells me it needs to download 3 GB worth of packages, I don't want to watch for one package to time out and hang the whole process up. I'd like to configure it so that it automatically retries the package X times, then skips the package and moves on.
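That retry-then-skip policy can be sketched as a shell loop; fetch_pkg is a hypothetical stand-in for the real backend download call, stubbed here to always fail so the skip path is exercised:

```shell
#!/bin/sh
# Sketch of a "retry X times, then skip and move on" policy.
# fetch_pkg is a hypothetical stand-in for the real download call;
# here it always fails, so the loop gives up after max_retries.
fetch_pkg() { return 1; }

pkg="kde-base"
max_retries=3
attempt=0
until fetch_pkg "$pkg"; do
  attempt=$((attempt + 1))
  if [ "$attempt" -ge "$max_retries" ]; then
    # Give up on this package and let the caller continue with the rest.
    echo "skipping $pkg after $max_retries failed attempts"
    break
  fi
done
```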

M. P. wrote: (8 years ago)

A good addition here would be to watch the network status (maybe by pinging the download server every few minutes) and resume the download once the connection is back.
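A sketch of that polling idea; net_up is a hypothetical probe (a real one might be `ping -c 1 -W 2 download.opensuse.org`), stubbed here to succeed on the third attempt so the sketch runs to completion:

```shell
#!/bin/sh
# Sketch: poll until the mirror answers, then resume the download.
# net_up is a stand-in probe that "succeeds" on the third call;
# a real implementation would ping the download server instead.
tries=0
net_up() { tries=$((tries + 1)); [ "$tries" -ge 3 ]; }

until net_up; do
  sleep 1   # a real poller would sleep a few minutes between probes
done
echo "network back after $tries probes; resuming download"
```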

E. A. wrote: (8 years ago)

In fact, the system should keep trying to download that particular package and in the meantime try to download the next package in the list. I've had cases where a single package would fail to download (for an unknown reason, since the same URL worked fine in Firefox on the same machine) while the rest of the packages went fine. I also find curl less robust and featureful than wget; I wonder why the former was chosen as a backend...

K. K. wrote: (8 years ago)

Yup, this should be there. It's one place where Ubuntu beats openSUSE.

V. P. wrote: (8 years ago)

Here in China I also experience timeouts sometimes, even over a regular ADSL connection (512 kbps). It happens with my home connection as well as at my company's office.

Resuming downloads would definitely be helpful, because sometimes I have to restart the same big package several times before it finishes.

D. P. wrote: (8 years ago)

It must be simple to let aria2c use its resume-download feature; I don't understand why it hasn't been implemented yet. I've just completed a three-day, 2.1 GB zypper dup on a beta-trial ADSL line, which caused many failures. The new download-in-advance feature would have made zypper a superb upgrade tool if it weren't blemished by libzypp's inability to resume interrupted downloads. I had to use aria2c manually to download two large packages directly into libzypp's package cache to complete the upgrade.

See https://bugzilla.novell.com/show_bug.cgi?id=545242

Unfortunately, the bug has been closed in favour of this feature request, but not using aria2c's resume feature spoils an otherwise streamlined zypper.
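The manual workaround described above can be sketched as follows, assuming the default libzypp cache location /var/cache/zypp/packages; the repo alias, architecture and URL are illustrative, and the command is only echoed here rather than run:

```shell
#!/bin/sh
# Sketch: fetch a large package with aria2c straight into libzypp's
# package cache so zypper finds it on the next run. Repo alias
# ("repo-oss"), arch and URL are made-up examples; the default cache
# root /var/cache/zypp/packages is assumed.
cache="/var/cache/zypp/packages/repo-oss/x86_64"
url="https://example.org/big-package.rpm"
# -c resumes a partial file; -d sets the download directory.
cmd="aria2c -c -d $cache $url"
echo "$cmd"
```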

G. D. wrote: (8 years ago)

I see aria2c does have a continue feature. aria2c --help shows the following:

    -c, --continue
        Continue downloading a partially downloaded file. Use this option to resume a download
        started by a web browser or another program which downloads files sequentially from the
        beginning. Currently this option is only applicable to http(s)/ftp downloads.

        Possible Values: true,false. Default: false. Tags: basic,ftp,http

Last change: 7 years ago
Voting
Score: 115
  • Negative: 1
  • Neutral: 4
  • Positive: 116