Depends on D18828. Ref T7789. See https://discourse.phabricator-community.org/t/git-lfs-fails-with-large-images/584.
Currently, when you upload a large (>4MB) image, we may try to assess dimensions for both the image itself and each of its individual storage chunks.
At best, this is slow and not useful. At worst, it fatals or consumes a ton of memory and I/O that we don't need to be using.
Instead:
- Don't try to assess dimensions for chunked files.
- Don't try to assess dimensions for the chunks themselves.
- Squelch errors from bad data, unsupported formats, and other inputs that gd can't actually read, since we recover sensibly (see the sketch below).
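
Taken together, these guards might look roughly like the PHP sketch below. The `$file` accessors (`getIsChunked()`, `getIsChunk()`, `loadFileData()`) are illustrative assumptions about the file object's API, not necessarily the exact upstream methods; the gd calls (`imagecreatefromstring()`, `imagesx()`, `imagesy()`) are real PHP functions.

```php
/**
 * Minimal sketch of the dimension-assessment guards. The $file accessors
 * used here (getIsChunked(), getIsChunk(), loadFileData()) are assumed
 * names, not necessarily the actual PhabricatorFile API.
 */
function assessImageDimensions($file) {
  // Never assess chunked files: loading every chunk just to measure the
  // image would cost the memory and I/O we're trying to avoid.
  if ($file->getIsChunked()) {
    return null;
  }

  // Never assess individual chunks: they're arbitrary byte ranges, not
  // valid images, so gd could never read them anyway.
  if ($file->getIsChunk()) {
    return null;
  }

  $data = $file->loadFileData();

  // Squelch gd's warnings about data it can't read; returning null is a
  // sensible recovery, so the warning noise isn't useful.
  $image = @imagecreatefromstring($data);
  if ($image === false) {
    return null;
  }

  $dimensions = array(
    'width' => imagesx($image),
    'height' => imagesy($image),
  );
  imagedestroy($image);

  return $dimensions;
}
```

The `@` suppression operator is deliberate here: a failed read is an expected, handled case, so letting gd's warnings surface would only produce log noise.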