
Google Drive upload large files

I'm using the Google Drive API for uploading large disk image files (>100 GB). My code uses this wrapper to upload these large files in chunks of 40 MB. The code works fine; however, I'm losing a lot of performance due to the fact that each chunk has to be read off the drive before being uploaded.

See this graph of download (fetching a chunk off the NAS) and upload (sending it to Google Drive). Notice that the upload pauses for the duration of the fetch from disk, which is understandable. However, I'm wondering if there is any way to fetch the next chunk off the disk while the previous chunk is uploading, as this could significantly improve my upload times?

The upload method:

    /// Indicates the maximum transfer rate for the upload, in MB/s.
    public async Task Upload(FileInfo fileInfo, int maximumTransferRate = 0, int chunkSizeMb = 10)
    {
        var uploadStream = new FileStream(fileInfo.FullName, FileMode.Open, FileAccess.Read);
        var uploadArgs = await GetUploadType(fileInfo);
        ...
        insertRequest.ChunkSize = chunkSizeMb * 1024 * 1024;
        insertRequest.ProgressChanged += Upload_ProgressChanged;
        insertRequest.ResponseReceived += Upload_ResponseReceived;
        var createFileTask = insertRequest.UploadAsync();
        ...
        updateRequest.ChunkSize = chunkSizeMb * 1024 * 1024;
        updateRequest.ProgressChanged += Upload_ProgressChanged;
        updateRequest.ResponseReceived += Upload_ResponseReceived;
        var updateFileTask = updateRequest.UploadAsync();
        ...
    }

GetUploadType looks the file up on Drive, which decides between the create and the update request (the query string is truncated here):

    private async Task GetUploadType(FileSystemInfo fileOnNas)
    {
        ...
        files.Fields = "files(id, name, md5Checksum, mimeType, kind)";
        files.Q = $"name=..."; // truncated in the original
        ...
    }

When the upload throws, the wrapper logs "Exception occurred while uploading media" together with the UploadUri, and reports the failure through its progress callback:

    UpdateProgress(new ResumableUploadProgress(ex, BytesServerReceived));

And the chunk-preparation routine:

    /// Prepares the given request with the next chunk in case the stream length is known.
    private void PrepareNextChunkKnownSizeCustom(Stream stream, CancellationToken cancellationToken, long bytesSent, out byte[] chunk)
    {
        // Because we fetch the next chunk and upload at the same time, this can fail when
        // the last chunk has uploaded and it tries to fetch the next chunk, which doesn't
        // exist. In this case, return empty initialized values.
        var chunkSize = (int) Math.Min(StreamLength - bytesSent, (long) ChunkSize);
        ...
        // Stream length is known and it supports seek and position operations, so we can
        // change the stream position and read bytes from the last point: if the number of
        // bytes received by the server isn't equal to the amount of bytes the client sent,
        // we need to change the position of the input stream; otherwise we can continue
        // from the current position.
        ...
        cancellationToken.ThrowIfCancellationRequested();
    }
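The comments in PrepareNextChunkKnownSizeCustom describe the whole body, so here is a minimal sketch of what they imply. StreamLength and ChunkSize are the wrapper's own members; everything else is illustrative and assumes the stream is seekable:

    chunk = new byte[chunkSize];
    if (chunkSize == 0)
        return; // the last chunk is already uploaded; nothing left to fetch

    // Re-position only when the server confirmed fewer bytes than the client sent.
    if (stream.Position != bytesSent)
        stream.Seek(bytesSent, SeekOrigin.Begin);

    // Stream.Read may return fewer bytes than requested, so fill the buffer in a loop.
    int total = 0;
    while (total < chunkSize)
    {
        int read = stream.Read(chunk, total, chunkSize - total);
        if (read == 0) break;
        total += read;
    }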

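As for the actual question, a simple way to fetch the next chunk off the disk while the previous one uploads is to double-buffer the reads: start the upload of chunk N, then read chunk N+1 from the NAS while that request is in flight. The sketch below shows only the pattern, not the Google client's API; uploadChunkAsync is a hypothetical stand-in for whatever sends a single chunk to Drive:

    using System;
    using System.IO;
    using System.Threading.Tasks;

    static class PrefetchingUploader
    {
        // Upload chunks sequentially, but overlap each upload with the read of the next chunk.
        public static async Task UploadWithPrefetchAsync(
            string path, int chunkSize, Func<byte[], int, Task> uploadChunkAsync)
        {
            using var stream = new FileStream(path, FileMode.Open, FileAccess.Read,
                                              FileShare.Read, 1 << 20, useAsync: true);
            var current = new byte[chunkSize];
            int currentLen = await ReadChunkAsync(stream, current);

            while (currentLen > 0)
            {
                // Start uploading the current chunk without awaiting it yet...
                Task upload = uploadChunkAsync(current, currentLen);

                // ...and read the next chunk off the disk while the upload is in flight.
                var next = new byte[chunkSize];
                int nextLen = await ReadChunkAsync(stream, next);

                await upload; // chunk N must finish before chunk N+1 is sent
                current = next;
                currentLen = nextLen;
            }
        }

        // FileStream.ReadAsync may return fewer bytes than requested; fill the whole buffer.
        private static async Task<int> ReadChunkAsync(Stream stream, byte[] buffer)
        {
            int total = 0;
            while (total < buffer.Length)
            {
                int n = await stream.ReadAsync(buffer, total, buffer.Length - total);
                if (n == 0) break; // end of file
                total += n;
            }
            return total;
        }
    }

Note that this only overlaps the disk read with the upload; the chunks themselves are still sent one at a time, in order, which is what the resumable-upload protocol requires.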

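GetUploadType isn't shown in full, but its Fields/Q pair suggests a files.list lookup whose result decides between insertRequest and updateRequest. A sketch of that lookup, assuming a Drive v3 DriveService stored in a field called service; the name filter is hypothetical since the real query is cut off above, and FirstOrDefault needs System.Linq:

    private async Task<Google.Apis.Drive.v3.Data.File> FindOnDrive(string name)
    {
        var files = service.Files.List();
        files.Q = $"name='{name}' and trashed=false"; // illustrative; the original query is truncated
        files.Fields = "files(id, name, md5Checksum, mimeType, kind)";
        var result = await files.ExecuteAsync();
        // null means the file doesn't exist yet -> create; otherwise -> update.
        return result.Files.FirstOrDefault();
    }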

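Finally, the Upload_ProgressChanged handler wired up above is not shown either. A typical handler over the Google client's IUploadProgress looks something like this; the console output is illustrative:

    private void Upload_ProgressChanged(Google.Apis.Upload.IUploadProgress progress)
    {
        switch (progress.Status)
        {
            case Google.Apis.Upload.UploadStatus.Uploading:
                Console.WriteLine($"{progress.BytesSent} bytes sent");
                break;
            case Google.Apis.Upload.UploadStatus.Failed:
                Console.WriteLine($"Upload failed: {progress.Exception}");
                break;
            case Google.Apis.Upload.UploadStatus.Completed:
                Console.WriteLine("Upload completed");
                break;
        }
    }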