
Use repository identities, not denormalized strings, to identify authors for "Revision closed by commit X" stories
ClosedPublic

Authored by epriestley on Apr 14 2019, 5:51 PM.
Tags
None
Subscribers
None

Details

Summary

Ref T12164. Ref T13276. Currently, the parsing pipeline copies the author and committer names and PHIDs into the transaction record as metadata, and stories are then rendered directly from that metadata.

This makes planned changes to the parsing pipeline (to prevent races when multiple commits matching a single revision are pushed simultaneously) more difficult, and generally won't work with repository identities.

Instead, load the commit and use its identities.
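
The heart of the change is rendering from a live identity lookup rather than from strings frozen into the transaction at parse time. A minimal sketch of the difference (Phabricator itself is PHP; every class, field, and function name below is hypothetical and illustrative, not Phabricator's actual API):

```python
# Hypothetical data model contrasting the two approaches.
from dataclasses import dataclass

@dataclass
class Identity:
    name: str   # current display name for this author identity
    phid: str

@dataclass
class Commit:
    phid: str
    author_identity: Identity

# Before: the story renders from name strings copied into the
# transaction metadata at parse time, which can go stale.
def render_from_metadata(metadata: dict) -> str:
    return f"Closed by commit authored by {metadata['authorName']}."

# After: the story loads the commit and resolves its repository
# identity, so renames and identity merges show up automatically.
def render_from_identity(commit: Commit) -> str:
    return f"Closed by commit authored by {commit.author_identity.name}."

alice = Identity(name="Alice Example", phid="PHID-IDNT-alice")
commit = Commit(phid="PHID-CMIT-abc", author_identity=alice)
metadata = {"authorName": "Alice Example"}

# The author is later renamed; only the identity record updates.
alice.name = "Alice Q. Example"

print(render_from_metadata(metadata))  # still shows the stale copy
print(render_from_identity(commit))    # reflects the current identity
```

The same reasoning explains why the denormalized copy complicates the planned race-prevention work: two workers writing metadata concurrently can disagree, whereas an identity lookup at render time has a single source of truth.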

Test Plan

Loaded a revision; the "Closed by commit" story rendered the same as before.

Diff Detail

Repository
rP Phabricator
Lint
Lint Not Applicable
Unit
Tests Not Applicable

Event Timeline

epriestley retitled this revision from Use repository identities, not denoralized strings, to identify authors for "Revision closed by commit X" stories to Use repository identities, not denormalized strings, to identify authors for "Revision closed by commit X" stories.Apr 14 2019, 6:38 PM
amckinley added inline comments.
src/applications/repository/worker/commitmessageparser/PhabricatorRepositoryCommitMessageParserWorker.php
253–264

I was wondering a while back if we'll ever want to go back and remove this metadata in a migration. It's probably not hurting anybody, but it might actually recover some meaningful space on installs with millions of commits.

This revision is now accepted and ready to land.Apr 17 2019, 6:29 PM

We could, possibly. There are some old "action" transactions we should also clean up at some point.

I suspect we may do a round of transaction data cleaning once Facts comes online, since Facts depends on doing ETL on the transaction record, and at least some issues are probably easier to fix by rewriting the transaction record than by adding hacky code to the Facts analyzers to handle old transactions.

This revision was automatically updated to reflect the committed changes.