aria2 can be configured much more flexibly than wget, and I once suggested some options which may help with the download of the virus definition files:
viewtopic.php?f=3&t=5019
But maybe these options should be revised again:
I still think that the basic idea is correct: let aria2 restart each download from scratch, rather than resume a partial download. But using 10 simultaneous connections spoils that approach and makes everything worse: if aria2 starts to download 3 different versions of the same file from 10 different servers, it can easily break things:
- Code:
2015-06-26 21:31:30.901678 ERROR - [AbstractCommand.cc:303]CUID#6 - Download aborted. URI=http://download.microsoft.com/download/DefinitionUpdates/mpas-feX64.exe
Exception: [AbstractCommand.cc:303] errorCode=8 URI=http://download.microsoft.com/download/DefinitionUpdates/mpas-feX64.exe
-> [HttpResponse.cc:109] errorCode=8 Invalid range header. Request: 537049-0/35967248, Response: 537049-35967247/35992344
2015-06-26 21:31:30.902328 NOTICE - [RequestGroupMan.cc:392]Download GID#1 not complete: ../client/wddefs/x64-glb/mpas-feX64.exe
Actually, we should have two sets of options:
- One set of options optimized for speed. This could be used for large files that never change.
- Another set of options for a slower but more reliable download of problematic files. Multiple simultaneous connections should not be used in this case.
This could be arranged as follows:
Common options for timestamping, timeouts and logging:
- Code:
--allow-overwrite=true --auto-file-renaming=false --remote-time=true --conditional-get=true --max-tries=10 --retry-wait=10 --timeout=60 --log=..\download.log --log-level=notice
A logfile should always be used with aria2, because aria2 then writes a status line to the terminal window and more detailed information to the logfile. This means that aria2 can split its output between both places. With wget, the complete output goes either to the terminal window or to the logfile.
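Just to illustrate how these common options look on an aria2 command line, a single download could be started like this (sketch only; the URL is just a placeholder):
- Code:
rem Sketch only: the common options from above, applied to a placeholder URL.
aria2c --allow-overwrite=true --auto-file-renaming=false --remote-time=true ^
  --conditional-get=true --max-tries=10 --retry-wait=10 --timeout=60 ^
  --log=..\download.log --log-level=notice ^
  http://example.com/somefile.exe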
Optimized options for large files that don't change:
- Code:
--max-connection-per-server=5
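Combined with the common options, the download of a large, static file could then use up to five connections to the server, for example (sketch only; the URL is again a placeholder):
- Code:
rem Sketch only: common options plus up to 5 connections per server.
aria2c --allow-overwrite=true --auto-file-renaming=false --remote-time=true ^
  --conditional-get=true --max-tries=10 --retry-wait=10 --timeout=60 ^
  --log=..\download.log --log-level=notice ^
  --max-connection-per-server=5 ^
  http://example.com/large-static-file.iso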
Failsafe options for problematic files:
- Code:
--always-resume=false --max-resume-failure-tries=0
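For a problematic file like the virus definitions from the log excerpt above, the common options would instead be combined with these failsafe options. A sketch (the target directory is taken from the log; the exact relative paths are an assumption):
- Code:
rem Sketch only: one connection, no resuming, restart from scratch after a failure.
aria2c --allow-overwrite=true --auto-file-renaming=false --remote-time=true ^
  --conditional-get=true --max-tries=10 --retry-wait=10 --timeout=60 ^
  --log=..\download.log --log-level=notice ^
  --always-resume=false --max-resume-failure-tries=0 ^
  --dir=..\client\wddefs\x64-glb ^
  http://download.microsoft.com/download/DefinitionUpdates/mpas-feX64.exe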
Summary
Multiple connections may not work well for the virus definition downloads. So far, I suggest:
- Find the file custom\SetAria2EnvVars.cmd, which is created by the script ActivateAria2Downloads.cmd
- Delete the options -x10 -j10 -s10 -k1M -R
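If you are not sure where this file is, or whether it still contains these options, a quick check from the installation directory could look like this (sketch only; the relative path is an assumption):
- Code:
rem Locate the file somewhere below the current directory:
dir /S /B SetAria2EnvVars.cmd
rem Check whether it still contains the options in question:
findstr /C:"-x10 -j10 -s10 -k1M -R" custom\SetAria2EnvVars.cmd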
The meaning of these options is:
- Code:
-x, --max-connection-per-server=<NUM>
The maximum number of connections to one server for each download.
Default: 1
-j, --max-concurrent-downloads=<N>
Set maximum number of parallel downloads for every static
(HTTP/FTP) URI, torrent and metalink. See also --split option.
Default: 5
-s, --split=<N>
Download a file using N connections. If more than N URIs are
given, first N URIs are used and remaining URIs are used for
backup. If less than N URIs are given, those URIs are used more
than once so that N connections total are made simultaneously.
The number of connections to the same host is restricted by
--max-connection-per-server option. See also --min-split-size
option. Default: 5
-k, --min-split-size=<SIZE>
aria2 does not split less than 2*SIZE byte range. For example,
let's consider downloading 20MiB file. If SIZE is 10M, aria2 can
split file into 2 range [0-10MiB) and [10MiB-20MiB) and download
it using 2 sources(if --split >= 2, of course). If SIZE is 15M,
since 2*15M > 20MiB, aria2 does not split file and download it
using 1 source. You can append K or M (1K = 1024, 1M = 1024K).
Possible Values: 1M -1024M Default: 20M
-R, --remote-time[=true|false]
Retrieve timestamp of the remote file from the remote HTTP/FTP
server and if it is available, apply it to the local file.
Default: false
The option -R is actually the same as --remote-time=true, but I prefer the long options, because they are more self-explanatory.