
Stop trying to assess the image dimensions of large files and file chunks

Authored by epriestley on Dec 13 2017, 2:48 PM.



Depends on D18828. Ref T7789. See

Currently, when you upload a large (>4MB) image, we may try to assess the dimensions for the image and for each individual chunk.

At best, this is slow and not useful. At worst, it fatals or consumes a ton of memory and I/O we don't need to be using.


  • Don't try to assess dimensions for chunked files.
  • Don't try to assess dimensions for the chunks themselves.
  • Squelch errors from data (bad uploads, etc.) that gd can't actually read, since we recover sensibly.
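The gating described above can be sketched roughly as follows. This is an illustrative Python sketch, not Phabricator's actual PHP code: the function names, the `is_chunk`/`is_chunked_file` flags, and the PNG-only size reader are all hypothetical, with only the 4MB chunk threshold taken from the text above.

```python
import struct

# 4MB: the chunking cutoff mentioned above (assumed constant name).
CHUNK_THRESHOLD = 4 * 1024 * 1024

PNG_MAGIC = b"\x89PNG\r\n\x1a\n"


def should_assess_dimensions(byte_size, is_chunk, is_chunked_file):
    """Only attempt dimension detection for small, standalone files."""
    if is_chunk or is_chunked_file:
        # Never assess individual chunks, or the parent chunked file.
        return False
    return byte_size <= CHUNK_THRESHOLD


def read_image_size(data):
    """Minimal PNG-only parser for illustration: IHDR width/height are
    big-endian uint32 values at byte offsets 16 and 20."""
    if not data.startswith(PNG_MAGIC) or len(data) < 24:
        raise ValueError("not a readable PNG")
    return struct.unpack(">II", data[16:24])


def assess_dimensions(data):
    """Return (width, height), or None for data we can't read.

    Mirrors the "squelch errors" point: unreadable data is not an
    error condition, we just record no dimensions.
    """
    try:
        return read_image_size(data)
    except Exception:
        return None
```

For example, a 10MB upload or any chunk returns `False` from `should_assess_dimensions`, so the (potentially expensive) decode is never attempted, and feeding garbage bytes to `assess_dimensions` yields `None` rather than a logged fatal.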
Test Plan
  • Created a 2048x2048 PNG in Photoshop using the "Random Noise" filter, which weighs in at 8.5MB.
  • Uploaded it.
  • Before patch: got complaints in the log about imagecreatefromstring() failing, although the actual upload went OK in my environment.
  • After patch: clean log, no attempt to detect the size of a big image.
  • Also uploaded a small image, got dimensions detected properly still.

Diff Detail

rP Phabricator
Lint Not Applicable
Tests Not Applicable

Event Timeline

amckinley added inline comments.

chunk.jpg (417×467 px, 46 KB)

This revision is now accepted and ready to land. Dec 14 2017, 5:54 PM
This revision was automatically updated to reflect the committed changes.