
Enormous change detection considers commits individually and can fail when many large commits are pushed simultaneously
Open, Normal, Public


See PHI657. Perhaps see also T8951 for some context.

When you push changes, we normally load them into memory so that we can run Herald content rules against them. When changes are too large to hold in memory (today, larger than 1GB) we fall back to an "enormous change" mode where the change is flagged specially, not held in memory, and rejected by default.

This mode is evaluated for each commit individually, so you can push two 750MB changes together for a total of 1.5GB. We don't require the push as a whole to stay under the limit, and can end up trying to hold an arbitrarily large amount of content in memory as long as no individual change is larger than 1GB.
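The gap can be sketched with the numbers above (a minimal illustration; the function names and structure are assumptions, not Phabricator's actual implementation):

```python
ENORMOUS_LIMIT = 1024 ** 3  # 1GB, the per-change threshold described above

def is_enormous_per_commit(commit_sizes):
    # Current behavior: each commit is evaluated individually, so a push
    # made of several sub-limit commits is never flagged.
    return any(size > ENORMOUS_LIMIT for size in commit_sizes)

def is_enormous_per_push(commit_sizes):
    # Proposed behavior: the push as a whole must fit under the limit.
    return sum(commit_sizes) > ENORMOUS_LIMIT

push = [750 * 1024 ** 2, 750 * 1024 ** 2]  # two 750MB commits, 1.5GB total

print(is_enormous_per_commit(push))  # False: neither commit exceeds 1GB
print(is_enormous_per_push(push))    # True: the push totals 1.5GB
```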

Instead, we should apply the limits to the push as a whole, and issue guidance about breaking pushes apart. (Or perhaps we can just purge the changeset cache as we go, since we don't have any actual need to hold everything in memory.)
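The purge-as-we-go alternative could look roughly like this (a sketch under assumptions; `load_content` and `run_rules` are hypothetical stand-ins, not real Phabricator APIs):

```python
ENORMOUS_LIMIT = 1024 ** 3  # 1GB, the existing per-change threshold

def apply_herald_rules(changes, load_content, run_rules):
    # Sketch: load one change's content at a time and discard it before
    # loading the next, so peak memory is bounded by the largest single
    # change rather than by the sum of all changes in the push.
    for change in changes:
        content = load_content(change)
        if len(content) > ENORMOUS_LIMIT:
            raise ValueError('Enormous change; rejecting by default.')
        run_rules(content)
        del content  # purge the cached content before the next change
```

With this shape, the existing 1GB limit keeps its current meaning per change, and a push of many sub-limit changes no longer accumulates everything in memory at once.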