
    file chunking and uploading/reassembling

    sophiakitty

      i'm working on a chunked file uploader that talks to an nginx module that handles resumable/chunked uploads.
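
      each chunk gets POSTed to the module with headers describing which byte range of the file it covers. here's a rough sketch of a single chunk request from the AIR side (the endpoint URL is a placeholder, and the X-Content-Range / Session-ID header names are just my understanding of how these resumable-upload modules usually work, not the module's confirmed protocol -- check its docs for the exact names):

              // posts one chunk; the URL and the header names are assumptions,
              // not the nginx module's confirmed api
              import flash.net.URLLoader;
              import flash.net.URLRequest;
              import flash.net.URLRequestHeader;
              import flash.net.URLRequestMethod;
              import flash.utils.ByteArray;

              function uploadChunk(chunk:ByteArray,offset:Number,total:Number,sessionId:String):void{
                  var req:URLRequest = new URLRequest("http://localhost/upload"); // placeholder endpoint
                  req.method = URLRequestMethod.POST;
                  req.contentType = "application/octet-stream";
                  // which byte range of the whole file this chunk covers
                  req.requestHeaders.push(new URLRequestHeader("X-Content-Range",
                      "bytes "+offset+"-"+(offset+chunk.length-1)+"/"+total));
                  req.requestHeaders.push(new URLRequestHeader("Session-ID",sessionId));
                  req.data = chunk;
                  var loader:URLLoader = new URLLoader();
                  loader.load(req);
              }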

       

      right now i'm trying to verify that the file chunking process is working correctly, but i've run into some strange behavior in my local tests. for starters, it works perfectly on all of the smaller files i've tried; they've all passed the md5 checksum comparisons.
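
      for reference, the comparison itself is just: hash the original, hash the rebuilt copy, see if they match. a minimal version of that check (i'm assuming as3corelib's com.adobe.crypto.MD5 here; any md5 implementation would do):

              // md5 of a whole file, read synchronously -- fine for a local test,
              // though it pulls the entire file into memory at once
              import com.adobe.crypto.MD5; // as3corelib, an assumed dependency
              import flash.filesystem.File;
              import flash.filesystem.FileMode;
              import flash.filesystem.FileStream;
              import flash.utils.ByteArray;

              function md5OfFile(f:File):String{
                  var fs:FileStream = new FileStream();
                  fs.open(f,FileMode.READ);
                  var bytes:ByteArray = new ByteArray();
                  fs.readBytes(bytes);
                  fs.close();
                  return MD5.hashBytes(bytes);
              }

              // original and rebuilt are File references to the two copies;
              // the chunk/rebuild round trip is lossless if the hashes match
              trace(md5OfFile(original) == md5OfFile(rebuilt));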

       

      however, all of the larger files i've tried fail the md5 checksum comparison, and a lot of the time the videos won't play. i've noticed that upping the chunk size from 5MB to 100MB fixed the playback issue for at least one of the videos i've tested with.

       

      as far as i can tell, it's loading the original file in chunks and storing those chunks into separate files. i can watch its progress through the file, and the numbers all match up as expected.

       

      here's my source code for the chunking and rebuilding:

       

              public function makeChunks():void{
                  status = "Preparing";
                  trace("VideoData::makeChunks");
                  fileStream = new FileStream();
                  fileStream.readAhead = chunkSize;
                  fileStream.addEventListener(ProgressEvent.PROGRESS,onOpenProgress);
                  fileStream.addEventListener(Event.COMPLETE,onOpenComplete);
                  fileStream.openAsync(file,FileMode.READ);
              }
              private function onOpenProgress(e:ProgressEvent):void{
                  // once at least a full chunk's worth of data has buffered, write it out
                  if(fileStream.bytesAvailable >= fileStream.readAhead){
                      trace("onOpenProgress |",fileStream.position,"|",e.bytesLoaded,"/",e.bytesTotal,"(",file.size,")|----|",fileStream.bytesAvailable,"/",fileStream.readAhead);
                      var cChunk:ByteArray = new ByteArray();
                      fileStream.readBytes(cChunk); // no length given, so this reads everything buffered
                      trace("--",chunkIndex,"*",cChunk.length,chunkIndex*cChunk.length);
                      trace("--",fileStream.bytesAvailable,"/",fileStream.readAhead);
                     
                      var tFile:File = File.applicationStorageDirectory.resolvePath(file.name+"_chunks/"+chunkIndex+".part");
                      var wStream:FileStream = new FileStream();
                     
                      wStream.open(tFile,FileMode.WRITE);
                      wStream.writeBytes(cChunk);
                      wStream.close();
                     
                      chunkIndex++;
                      //fileStream.position += currentChunk.length;
                      trace("---------------",chunkIndex,"",cChunk.bytesAvailable);
                      //dispatchEvent(new MediohEvent("LocalVideosChanged",true,true));
                      current_progress = e.bytesLoaded / e.bytesTotal;
                  }
                 
              }
              private function onOpenComplete(e:Event):void{
                  trace("onOpenComplete |",fileStream.position,"/",file.size,"|",file.size-fileStream.position,"|----|",fileStream.bytesAvailable,"/",fileStream.readAhead);
                  if(fileStream.bytesAvailable > 0){
                      // write whatever is left over as the final (possibly short) chunk
                      var cChunk:ByteArray = new ByteArray();
                      fileStream.readBytes(cChunk);
                     
                      var tFile:File = File.applicationStorageDirectory.resolvePath(file.name+"_chunks/"+chunkIndex+".part");
                      var wStream:FileStream = new FileStream();
                     
                      wStream.open(tFile,FileMode.WRITE);
                      wStream.writeBytes(cChunk);
                      wStream.close();
                      trace("chunking complete---------------",chunkIndex,"bytes length",cChunk.length);
                      trace("--",chunkIndex,"*",cChunk.length,chunkIndex*cChunk.length);
                      trace("--",fileStream.bytesAvailable,"/",fileStream.readAhead);
                  }
                  fileStream.close(); // release the async stream before dropping the reference
                  fileStream = null;
                  chunk_path = file.name+"_chunks/";
                  needs_chunks = false;
                  status = "Uploading";
                  current_progress = 0;
                  dispatchEvent(new MediohEvent("LocalVideosChanged",true,true));
                  dispatchEvent(new Event("saveVideos",true,true));
                  rebuild();
              }

       

              private function rebuild():void{
                  var target_file:String = "C:\\Users\\sophia\\Videos\\backtogether\\"+file.name;
                  var folder:File =  File.applicationStorageDirectory.resolvePath(chunk_path);
                  var filesFound:Array = folder.getDirectoryListing();
                  trace("blah",filesFound);
                  var bigFile:File = new File(target_file);
                  var wStream:FileStream = new FileStream();
                  wStream.open(bigFile,FileMode.WRITE);
                  for(var i:int = 0; i < filesFound.length; i++){
                      var fStream:FileStream = new FileStream();
                      fStream.open(filesFound[i],FileMode.READ);
                      var bytes:ByteArray = new ByteArray();
                      fStream.readBytes(bytes);
                      fStream.close();
                      wStream.writeBytes(bytes);
                  }
                  wStream.close();
                  status = "Complete";
                  current_progress = 1;
                  date_uploaded = new Date();
                  clearChunks();
                  dispatchEvent(new Event("uploadComplete",true,true));
                  dispatchEvent(new Event("saveVideos",true,true));
              }

        • 1. Re: file chunking and uploading/reassembling
          sophiakitty Level 1

          i've found a workaround. instead of splitting the file up into a bunch of smaller files that i can then read the bytes from to upload, i just upload the bytes as i read them from the file, and have it wait to process the next block of loaded bytes until the previous bytes finish uploading.
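
          roughly, the pattern looks like this (a sketch only -- buildChunkRequest is a stand-in for however the chunk POST gets built, not my exact code):

              // only one chunk is in flight at a time: read a chunk, upload it,
              // and read the next one from the buffer once the upload completes
              private var uploading:Boolean = false;

              private function onOpenProgress(e:ProgressEvent):void{
                  tryNextChunk();
              }
              private function tryNextChunk():void{
                  if(uploading || fileStream.bytesAvailable < fileStream.readAhead) return;
                  var chunk:ByteArray = new ByteArray();
                  fileStream.readBytes(chunk);
                  uploading = true;
                  var loader:URLLoader = new URLLoader();
                  loader.addEventListener(Event.COMPLETE,onChunkUploaded);
                  loader.load(buildChunkRequest(chunk,chunkIndex)); // stand-in helper
              }
              private function onChunkUploaded(e:Event):void{
                  uploading = false;
                  chunkIndex++;
                  tryNextChunk(); // pick up bytes that buffered during the upload
              }
              // (the final short chunk still needs the same handling in the
              // Event.COMPLETE handler as in the chunk-to-disk version above)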

           

          i also determined that the number of chunks affects my ability to put them back together again locally.

           

          i tested with a file i had successfully uploaded before, using a much smaller chunk size (1MB instead of 5MB). it uploaded successfully, but my local attempt to rebuild it failed.

           

          i'm still kinda curious as to why i can put together a file broken into a couple chunks but not one broken into many.
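
          my best guess so far: getDirectoryListing() doesn't guarantee any particular order, and a plain alphabetical listing sorts "10.part" between "1.part" and "2.part", so once there are ten or more chunks the rebuild concatenates the parts out of order. that would also explain why bigger chunk sizes (fewer chunks) survive while 1MB chunks don't. if that's the cause, sorting the listing by the numeric chunk index before concatenating should fix the local rebuild:

              // sort the .part files by numeric index before stitching them
              // together; parseInt("10.part") is 10, so this gives 0,1,2,...,9,10,11
              filesFound.sort(function(a:File,b:File):int{
                  return parseInt(a.name) - parseInt(b.name);
              });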