🚀 Fast, cross-platform and reliable multipart downloader with .NET Core support 🚀
Downloader is a modern, fluent, asynchronous, testable and portable library for .NET. This is a multipart downloader with asynchronous progress events.
This library can be added to your .NET Core v2 (and later) or .NET Framework v4.5 (and later) projects.
Downloader is compatible with .NET Standard 2.0 and above, running on Windows, Linux, and macOS, in full .NET Framework or .NET Core.
For a complete example, see the Downloader.Sample project in this repository.
Features include:
- Provide `ChunkCount` to define the parts count of the download file.
- Download file chunks in `in-memory` or `on-disk` mode.
- Serialize the download package to `JSON` or `Binary`.
)PM> Install-Package Downloader
dotnet add package Downloader
var downloadOpt = new DownloadConfiguration()
{
ChunkCount = 8, // file parts to download, default value is 1
ParallelDownload = true // download parts of the file in parallel or not; default value is false
};
Note: You do not need to use all of the options below; add only the ones you need.
var downloadOpt = new DownloadConfiguration()
{
// usually, hosts support a maximum of 8000 bytes; the default value is 8000
BufferBlockSize = 10240,
// file parts to download, default value is 1
ChunkCount = 8,
// limit download speed to 2 MB/s; the default value is zero (unlimited)
MaximumBytesPerSecond = 1024*1024*2,
// the maximum number of retry attempts after a download failure
MaxTryAgainOnFailover = 5,
// release memory buffer after each 50 MB
MaximumMemoryBufferBytes = 1024 * 1024 * 50,
// download parts of the file in parallel or not; default value is false
ParallelDownload = true,
// number of parallel downloads. The default value is the same as the chunk count
ParallelCount = 4,
// timeout (in milliseconds) per stream block reader; the default value is 1000
Timeout = 1000,
// set true if you want to download just a specific range of bytes of a large file
RangeDownload = false,
// floor offset of download range of a large file
RangeLow = 0,
// ceiling offset of download range of a large file
RangeHigh = 0,
// clear the package's chunk data when the download completes with a failure; default value is false
ClearPackageOnCompletionWithFailure = true,
// minimum size of chunking to download a file in multiple parts, default value is 512
MinimumSizeOfChunking = 1024,
// before starting the download, reserve the storage space of the file (as the file size); default value is false
ReserveStorageSpaceBeforeStartingDownload = true,
// config and customize request headers
RequestConfiguration =
{
Accept = "*/*",
CookieContainer = new CookieContainer(), // add any cookies you need here
Headers = new WebHeaderCollection(), // { your custom headers }
KeepAlive = true, // default value is false
ProtocolVersion = HttpVersion.Version11, // default value is HTTP 1.1
UseDefaultCredentials = false,
// your custom user agent or your_app_name/app_version.
UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
Proxy = new WebProxy() {
Address = new Uri("http://YourProxyServer/proxy.pac"),
UseDefaultCredentials = false,
Credentials = System.Net.CredentialCache.DefaultNetworkCredentials,
BypassProxyOnLocal = true
}
}
};
var downloader = new DownloadService(downloadOpt);
// Provide `FileName` and `TotalBytesToReceive` at the start of each download
downloader.DownloadStarted += OnDownloadStarted;
// Provide information about each chunk's download,
// like the progress percentage per chunk, speed,
// total received bytes and the received bytes array for live streaming.
downloader.ChunkDownloadProgressChanged += OnChunkDownloadProgressChanged;
// Provide information about overall download progress,
// like the progress percentage summed over all chunks, total speed,
// average speed, total received bytes and the received bytes array
// for live streaming.
downloader.DownloadProgressChanged += OnDownloadProgressChanged;
// Raised when the download completes, whether it finished successfully,
// was canceled, or failed with an error.
downloader.DownloadFileCompleted += OnDownloadFileCompleted;
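The handlers wired up above could look like the minimal sketch below. The event-argument property names used here (FileName, TotalBytesToReceive, ProgressPercentage, BytesPerSecondSpeed, ProgressId) and the use of System.ComponentModel.AsyncCompletedEventArgs for completion are assumptions based on the events described above, so check them against the library version you use.

private static void OnDownloadStarted(object sender, DownloadStartedEventArgs e)
{
    // assumed properties: FileName and TotalBytesToReceive
    Console.WriteLine($"Downloading {e.FileName} ({e.TotalBytesToReceive:N0} bytes)...");
}

private static void OnChunkDownloadProgressChanged(object sender, DownloadProgressChangedEventArgs e)
{
    // assumed property: ProgressId identifies the chunk
    Console.WriteLine($"Chunk {e.ProgressId}: {e.ProgressPercentage:F1}%");
}

private static void OnDownloadProgressChanged(object sender, DownloadProgressChangedEventArgs e)
{
    // assumed properties: ProgressPercentage and BytesPerSecondSpeed
    Console.WriteLine($"Total: {e.ProgressPercentage:F1}% at {e.BytesPerSecondSpeed:N0} bytes/s");
}

private static void OnDownloadFileCompleted(object sender, System.ComponentModel.AsyncCompletedEventArgs e)
{
    if (e.Cancelled)
        Console.WriteLine("Download canceled.");
    else if (e.Error != null)
        Console.WriteLine($"Download failed: {e.Error.Message}");
    else
        Console.WriteLine("Download completed successfully.");
}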
string file = @"Your_Path\fileName.zip";
string url = @"https://file-examples.com/fileName.zip";
await downloader.DownloadFileTaskAsync(url, file);
DirectoryInfo path = new DirectoryInfo("Your_Path");
string url = @"https://file-examples.com/fileName.zip";
// download into "Your_Path\fileName.zip"
await downloader.DownloadFileTaskAsync(url, path);
// When no file path is provided, the completed download is returned as a stream (a MemoryStream)
Stream destinationStream = await downloader.DownloadFileTaskAsync(url);
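If you want to persist that in-memory result yourself, a small usage sketch follows (the target path is a placeholder):

destinationStream.Position = 0; // rewind in case the stream is left at its end
using (var fileStream = File.Create(@"Your_Path\fileName.zip"))
{
    await destinationStream.CopyToAsync(fileStream);
}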
When you want to resume a download quickly after pausing it for a few seconds, you can call the `Pause` function of the downloader service. This way, the streams stay alive and are only suspended by a lock, to be released and resumed whenever you want.
// Pause the download
downloader.Pause();
// Resume the download
downloader.Resume();
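As a minimal sketch of how pause and resume fit around an in-flight download (reusing the url and file values from the earlier example):

// start the download without awaiting it, so it runs in the background
Task downloadTask = downloader.DownloadFileTaskAsync(url, file);

await Task.Delay(TimeSpan.FromSeconds(2)); // let some data arrive
downloader.Pause();                        // streams stay open; the transfer is suspended

await Task.Delay(TimeSpan.FromSeconds(5)); // ... do something else ...
downloader.Resume();                       // continue from where it was suspended

await downloadTask; // wait for the download to finish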
The `DownloadService` class has a property called `Package` that stores each step of the download. To stop the download, you must call the `CancelAsync` method. If you want to continue later, call the same `DownloadFileTaskAsync` function with the `Package` parameter to resume your download. For example:
// At first, keep and store the Package file to resume
// your download from the last download position:
DownloadPackage pack = downloader.Package;
Stop or cancel download:
// This function breaks your stream and cancels progress.
downloader.CancelAsync();
Resuming the download after cancellation:
await downloader.DownloadFileTaskAsync(pack);
This way you can persist even large downloads with a very small amount of data in the `Package`, and after restarting the program you can restore it and continue your download. The package is a snapshot of the download instance; only the downloaded file addresses are included in the package, and you can resume it whenever you want. For more detail, see the StopResumeDownloadTest method.
Note: Sometimes a server does not support downloading in a specific range. In that case, we can't resume the download after canceling, so the downloader starts again from the beginning.
For easy and fluent use of the downloader, you can use the `DownloadBuilder` class. Consider the following examples:
Simple usage:
await DownloadBuilder.New()
.WithUrl(@"https://host.com/test-file.zip")
.WithDirectory(@"C:\temp")
.Build()
.StartAsync();
Complex usage:
IDownload download = DownloadBuilder.New()
.WithUrl(@"https://host.com/test-file.zip")
.WithDirectory(@"C:\temp")
.WithFileName("test-file.zip")
.WithConfiguration(new DownloadConfiguration())
.Build();
download.DownloadProgressChanged += DownloadProgressChanged;
download.DownloadFileCompleted += DownloadFileCompleted;
download.DownloadStarted += DownloadStarted;
download.ChunkDownloadProgressChanged += ChunkDownloadProgressChanged;
await download.StartAsync();
download.Stop(); // cancel current download
Resume the existing download package:
await DownloadBuilder.Build(package).StartAsync();
Resume the existing download package with a new configuration:
await DownloadBuilder.Build(package, config).StartAsync();
var download = DownloadBuilder.New()
.WithUrl(url)
.WithFileLocation(path)
.Build();
await download.StartAsync();
download.Pause(); // pause current download quickly
download.Resume(); // continue current download quickly
If the server does not provide the file size in the response headers (`Content-Length`), the Downloader cannot split the file into multiple parts and continues its work with a single chunk.
If the server returns `Accept-Ranges: none` in the response headers, it does not support ranged downloads, so the Downloader cannot use multiple chunks and continues its work with a single chunk.
At first, the Downloader sends a GET request to the server to fetch the file's size within a range. If the server does not provide `Content-Range` in the response headers, it does not support ranged downloads, and the Downloader has to continue its work with a single chunk.
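The check itself is plain HTTP. As an illustration of the idea (this helper is not part of the Downloader API), the sketch below asks the server for the first byte and looks for a 206 Partial Content response with a Content-Range header:

using System.Net;
using System.Net.Http;

async Task<bool> SupportsRangeRequestsAsync(string url)
{
    using var client = new HttpClient();
    using var request = new HttpRequestMessage(HttpMethod.Get, url);
    request.Headers.Range = new System.Net.Http.Headers.RangeHeaderValue(0, 0); // request only the first byte

    using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);

    // 206 plus a Content-Range header means the server can serve byte ranges,
    // so the file can be downloaded in multiple chunks.
    return response.StatusCode == HttpStatusCode.PartialContent
           && response.Content.Headers.ContentRange != null;
}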
Serialization is the process of converting an object's state into information that can be stored for later retrieval or that can be sent to another system. For example, you may have an object that represents a document that you wish to save. This object could be serialized to a stream of binary information and stored as a file on disk. Later the binary data can be retrieved from the file and deserialized into objects that are exact copies of the original information. As a second example, you may have an object containing the details of a transaction that you wish to send to a non-.NET system. This information could be serialised to XML before being transmitted. The receiving system would convert the XML into a format that it could understand.
In this section, we want to show how to serialize download packages to `JSON` text or `Binary` after stopping a download, so you can keep the download data and resume it whenever you want. You can serialize packages even when using memory storage (a `MemoryStream`) for caching the download data.
Serializing the package to `JSON` is very simple, like this:
var packageJson = JsonConvert.SerializeObject(package);
Deserializing into the new package:
var newPack = JsonConvert.DeserializeObject<DownloadPackage>(packageJson);
For more detail, see the PackageSerializationTest method.
To serialize or deserialize the package into a binary file, first serialize it to JSON and then save it with a BinaryWriter.
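A minimal sketch of that approach, assuming Newtonsoft.Json as in the examples above and a placeholder file path:

using System.IO;
using Newtonsoft.Json;

// write the package as JSON text inside a binary file
using (var writer = new BinaryWriter(File.Open(@"Your_Path\package.bin", FileMode.Create)))
{
    writer.Write(JsonConvert.SerializeObject(package));
}

// read it back and restore the package
DownloadPackage restoredPack;
using (var reader = new BinaryReader(File.Open(@"Your_Path\package.bin", FileMode.Open)))
{
    restoredPack = JsonConvert.DeserializeObject<DownloadPackage>(reader.ReadString());
}

// resume the download from the restored package
await downloader.DownloadFileTaskAsync(restoredPack);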
NOTE: The BinaryFormatter type is dangerous and is not recommended for data processing. Applications should stop using BinaryFormatter as soon as possible, even if they believe the data they're processing to be trustworthy. BinaryFormatter is insecure and can't be made secure. So, BinaryFormatter is deprecated and we can no longer support it. Reference
Contributions are welcome: feel free to make changes and open a pull request against the develop branch. You can use either the latest version of Visual Studio or Visual Studio Code with the .NET CLI on Windows, Mac, and Linux.
For GitHub workflow, check out our Git workflow below this paragraph. We are following the excellent GitHub Flow process, and would like to make sure you have all of the information needed to be a world-class contributor!
The general process for working with Downloader is:
1. Fork the repository and add the upstream remote (`git remote add upstream git://github.com/bezzad/downloader`).
2. Check out the latest development branch (`git checkout vX.Y.Z`).
3. Create a new branch for your work (`git checkout -b myBranch`).
4. Make your changes, then push your branch to your fork (`git push origin myBranch`).
5. Open a pull request against the development branch (`vX.Y.Z`) rather than `master`.
We accept pull requests from the community, but you should never work on a clone of `master`, and you should never send a pull request from `master`; always send it from a branch. Please be sure to branch from the head of the latest `vX.Y.Z` development branch (rather than `master`) when developing contributions.
Licensed under the terms of the MIT License
Thanks go to these wonderful people (List made with contrib.rocks):