See PHI1114. An install encountered a multi-megabyte document with approximately 11,000 replacement tokens (complex remarkup rules that store text and evaluate it at display time) which required 17s to render.
Onsite investigation narrowed this down to a large amount of time spent in `restore()`, here.
Before this change, rendering a document like this required calling `str_replace()` on the full document once per token, so roughly `O(size of the document * number of tokens)` bytes were shuffled around.
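For illustration, the old pattern has roughly this shape (names are hypothetical, not the actual `restore()` implementation):

```php
<?php
// Sketch of the old pattern: each str_replace() call copies the entire
// document, so total work grows with (document size x token count).
function restoreTokensSlow($document, array $tokens) {
  foreach ($tokens as $token => $replacement) {
    // Copies the whole multi-megabyte string once per token.
    $document = str_replace($token, $replacement, $document);
  }
  return $document;
}
```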
We can improve this dramatically by:
- incrementally expanding tokens, so most operations are on one token instead of the entire document (and the total document size has a much smaller effect on performance); and
- replacing tokens in a single pass with `preg_match()` + append + `implode()` instead of running `str_replace()` in a loop (see the sketch below).
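A minimal sketch of the single-pass approach, assuming a hypothetical token pattern and function names (not the actual implementation): scan forward with `preg_match()`, append the literal text and each expansion to a list, then `implode()` once at the end.

```php
<?php
// Sketch of single-pass replacement: each token is visited once and the
// document is concatenated once, instead of being copied per token.
function restoreTokensFast($document, array $tokens, $pattern) {
  $parts = array();
  $offset = 0;
  $matches = null;
  while (preg_match($pattern, $document, $matches, PREG_OFFSET_CAPTURE, $offset)) {
    $token = $matches[0][0];
    $at = $matches[0][1];
    // Append the literal text before the token, then its expansion.
    $parts[] = substr($document, $offset, $at - $offset);
    $parts[] = isset($tokens[$token]) ? $tokens[$token] : $token;
    $offset = $at + strlen($token);
  }
  // Append any trailing literal text after the last token.
  $parts[] = substr($document, $offset);
  // One final concatenation instead of one full-document copy per token.
  return implode('', $parts);
}
```

With a token syntax like `@123@` (again, hypothetical), `$pattern` would be something like `'/@\d+@/'`; the work per token is now proportional to the token itself rather than to the whole document.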