How to upload a file with AJAX in small chunks, check for failures, and re-upload the parts that failed

I have a file that a user has uploaded, and I want to achieve the following:

  1. Split the file into smaller chunks of about 1 megabyte each
  2. Upload each chunk and wait for it to finish before starting the next one
  3. Get a success or failure report for each chunk
  4. Re-upload the chunks that failed
  5. Get the overall progress as a percentage
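The steps above mostly come down to arithmetic over byte offsets. As a sketch (the names `chunkRanges` and `progressPercent` are my own, not from any library), the chunk boundaries and the progress figure can be computed like this:

```javascript
// Compute [start, end) byte ranges suitable for File.prototype.slice.
function chunkRanges(totalSize, chunkSize) {
  var ranges = [];
  for (var start = 0; start < totalSize; start += chunkSize) {
    // The last chunk is clamped to the file size, so it may be shorter.
    ranges.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return ranges;
}

// Progress as a percentage, clamped so the short last chunk never reports > 100%.
function progressPercent(loadedBytes, totalSize) {
  return Math.min(100, (loadedBytes / totalSize) * 100);
}
```

For a 2,500,000-byte file and 1 MB chunks this yields three ranges, the last one ending exactly at the file size.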

Here is some rough JavaScript. I'm really lost here; I grabbed some code online and tried to modify it.

$.chunky = function(file, name){
    var loaded = 0;
    var step = 1048576; // 1024 * 1024
    var total = file.size;
    var start = 0;
    var reader = new FileReader();
    reader.onload = function(e){
        var d = {file: reader.result};
        $.ajax({
            url: "../record/c/index.php",
            type: "POST",
            data: d
        }).done(function(r){
            $('.record_reply_g').html(r);
            loaded += step;
            $('.upload_rpogress').html((loaded/total) * 100);
            if(loaded <= total){
                blob = file.slice(loaded, loaded + step);
                reader.readAsBinaryString(blob);
            } else {
                loaded = total;
            }
        });
    };
    var blob = file.slice(start, step);
    reader.readAsBinaryString(blob);
}

How can I achieve the above? If there is a working solution, please explain what is going on.

You are not doing anything to handle a chunk failing to upload.

$.chunky = function(file, name){
    var loaded = 0;
    var step = 1048576;     // 1024 * 1024, the size of one chunk
    var total = file.size;  // total size of the file
    var start = 0;          // starting position
    var reader = new FileReader();
    var blob = file.slice(start, step); // the first chunk, of step size
    reader.readAsBinaryString(blob);    // read that chunk; when it has been read, onload is invoked
    reader.onload = function(e){
        var d = {file: reader.result};
        $.ajax({
            url: "../record/c/index.php",
            type: "POST",
            data: d                 // d is the chunk obtained by readAsBinaryString(...)
        }).done(function(r){        // if d was uploaded successfully ->
            $('.record_reply_g').html(r);   // update the status in the HTML view
            loaded += step;                 // advance loaded, which is the start position of the next chunk
            $('.upload_rpogress').html((loaded/total) * 100);
            if(loaded < total){             // strict <, so no empty trailing chunk is sent
                blob = file.slice(loaded, loaded + step);   // get the next chunk
                reader.readAsBinaryString(blob);            // read it with the FileReader, which fires onload again, so this repeats until the whole file is uploaded
            } else {                        // the file is completely uploaded
                loaded = total;             // clamp loaded so it can be used to show 100% status
            }
        });
    };
}
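The control flow above (slice, upload, advance, repeat until done) can be illustrated with a synchronous stand-in, where `upload` is an injected placeholder for the `$.ajax` POST of one chunk; the function name and shape are my own, for illustration only:

```javascript
// Synchronous model of the recursive FileReader loop: one chunk at a time,
// each starting where the previous one ended.
function uploadSequentially(totalSize, chunkSize, upload) {
  var loaded = 0;
  while (loaded < totalSize) {               // strict <, so no empty trailing chunk
    var end = Math.min(loaded + chunkSize, totalSize);
    upload(loaded, end);                     // stands in for POSTing file.slice(loaded, end)
    loaded = end;
  }
  return loaded;                             // equals totalSize when every chunk is sent
}
```

In the real code the "loop" is driven by `onload` firing after each read, but the bookkeeping is the same.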

EDIT

To upload the failed chunks again, you can do something like this:

var totalFailures = 0;
reader.onload = function(e) {
    ....
}).done(function(r){
    totalFailures = 0;           // reset the counter once a chunk goes through
    ....
}).fail(function(r){             // the chunk failed to upload
   if((totalFailures++) < 3) {   // retry the same chunk up to 3 times
     reader.readAsBinaryString(blob);
   } else {                      // it has now failed 4 times in a row
      // show a message to the user that the file upload failed
   }
});
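The failure bookkeeping above (reset on success, allow a fixed number of retries) can be factored into a small helper. This is my own sketch of the same logic, not code from the answer; `makeRetryTracker` is an illustrative name:

```javascript
// Tracks consecutive failures for the chunk currently being uploaded.
function makeRetryTracker(maxRetries) {
  var failures = 0;
  return {
    recordSuccess: function () { failures = 0; },  // a successful chunk resets the count
    shouldRetry: function () {                      // true for the first maxRetries failures
      return (failures++) < maxRetries;
    }
  };
}
```

With `maxRetries = 3`, the tracker allows three retries of a chunk and refuses the fourth, matching the `(totalFailures++) < 3` check above.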

I modified afzalex's answer to use readAsArrayBuffer() and to upload each chunk as a file.

    // file, url and max_chunk_size are assumed to be defined in the surrounding scope
    var loaded = 0;
    var reader = new FileReader();
    var blob = file.slice(loaded, max_chunk_size);
    reader.readAsArrayBuffer(blob);
    reader.onload = function(e) {
      var fd = new FormData();
      fd.append('filedata', new File([reader.result], 'filechunk')); // wrap the chunk bytes as a File
      fd.append('loaded', loaded);  // tell the server where this chunk starts
      $.ajax(url, {
        type: "POST",
        contentType: false,         // let the browser set the multipart boundary itself
        data: fd,
        processData: false          // do not serialize the FormData into a query string
      }).done(function(r) {
        loaded += max_chunk_size;
        if (loaded < file.size) {
          blob = file.slice(loaded, loaded + max_chunk_size);
          reader.readAsArrayBuffer(blob);  // onload fires again for the next chunk
        }
      });
    };
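Putting the pieces together, here is a synchronous sketch of the whole flow: sequential chunks, a per-chunk retry budget, and an overall success/failure result. `uploadFn` is an injected placeholder that returns true on success, so the logic runs without jQuery or a server; all names here are illustrative, not from the answers above:

```javascript
// Uploads byte ranges [start, end) in order; retries each failed chunk up to
// maxRetries times before giving up on the whole transfer.
function uploadFileInChunks(totalSize, chunkSize, maxRetries, uploadFn) {
  for (var start = 0; start < totalSize; start += chunkSize) {
    var end = Math.min(start + chunkSize, totalSize);
    var attempts = 0, ok = false;
    while (!ok && attempts <= maxRetries) {  // first try plus up to maxRetries retries
      ok = uploadFn(start, end);             // stands in for the chunk POST
      attempts++;
    }
    if (!ok) return false;                   // give up: report failure to the caller
  }
  return true;                               // every chunk was confirmed by uploadFn
}
```

In a real implementation `uploadFn` would be asynchronous (the `$.ajax` promise), but the chunking and retry structure is the same.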