Ref T7149. We can't compute hashes of large files efficiently, but we can resume partially completed uploads by the same author with the same name and file size. This seems like a reasonable heuristic that is unlikely to ever misfire, even if it's a little magical.
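As a rough illustration of that heuristic, here is a minimal sketch in TypeScript. The type and field names (`PartialUpload`, `authorPHID`, `byteSize`, `isPartial`) are assumptions for illustration only, not the actual storage schema.

```typescript
// Sketch of the resume heuristic: with no cheap hash available for large
// files, match on the metadata we do have.
interface PartialUpload {
  authorPHID: string;  // user who started the upload (assumed field name)
  name: string;        // original filename
  byteSize: number;    // total expected size in bytes
  isPartial: boolean;  // true while one or more chunks are still missing
}

function findResumableUpload(
  candidates: PartialUpload[],
  authorPHID: string,
  name: string,
  byteSize: number,
): PartialUpload | null {
  // Resume only uploads by the same author, with the same name and size,
  // which are not yet complete.
  for (const file of candidates) {
    if (
      file.isPartial &&
      file.authorPHID === authorPHID &&
      file.name === name &&
      file.byteSize === byteSize
    ) {
      return file;
    }
  }
  return null;
}
```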
Details
- Reviewers: btrahan
- Maniphest Tasks: T7149: Allow users to import data into a new Phacility instance
- Commits: Restricted Diffusion Commit rP32d8d675357c: Support resuming JS uploads of chunked files
Test Plan
- Forced chunking on.
- Started uploading a chunked file.
- Closed the browser window.
- Dropped it into a new window.
- Upload resumed (!!!)
- Did this again, repeating the close-and-resume cycle.
- Downloaded the final file, which successfully reconstructed the original file. (A sketch of the client-side resume flow follows this list.)
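The steps above exercise roughly the flow sketched below: on resume, the client asks the server which chunks it already holds and re-sends only the missing ones. The method names (`file.querychunks`, `file.uploadchunk`) and payload shapes are loosely modeled on Phabricator's chunked-upload Conduit API but should be read as assumptions, as should the `conduitCall` and `blobToBase64` helpers.

```typescript
// Shape of a chunk record as assumed for this sketch.
interface Chunk {
  byteStart: number;
  byteEnd: number;
  complete: boolean;
}

// Hypothetical transport helper; stands in for a real Conduit client.
// The endpoint path and JSON payload are assumptions for illustration.
async function conduitCall(method: string, params: object): Promise<any> {
  const response = await fetch(`/api/${method}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(params),
  });
  return response.json();
}

// Resume an interrupted chunked upload: re-send only missing chunks.
async function resumeChunkedUpload(filePHID: string, blob: Blob): Promise<void> {
  // Ask the server which chunks it already has from the earlier session.
  const chunks: Chunk[] = await conduitCall('file.querychunks', { filePHID });

  for (const chunk of chunks) {
    if (chunk.complete) {
      continue; // already uploaded before the browser window was closed
    }
    const data = blob.slice(chunk.byteStart, chunk.byteEnd);
    await conduitCall('file.uploadchunk', {
      filePHID,
      byteStart: chunk.byteStart,
      data: await blobToBase64(data),
      dataEncoding: 'base64',
    });
  }
}

// Encode a chunk for transport; helper is an assumption for this sketch.
async function blobToBase64(blob: Blob): Promise<string> {
  const buffer = await blob.arrayBuffer();
  const bytes = new Uint8Array(buffer);
  let binary = '';
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return btoa(binary);
}
```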
Diff Detail
- Repository: rP Phabricator
- Branch: chunk5
- Lint: Lint Passed
- Unit: Tests Passed
- Build Status: Buildable 4876 / Build 4894: [Placeholder Plan] Wait for 30 Seconds