We're currently doing technical interviews by launching a Phacility instance with a prebuilt interview scenario (a skeleton third-party application and a set of tasks describing bugs to fix and features to implement), inviting candidates to the instance, and then having them work on tasks using the full Phabricator stack. For context, T12004 is an early version of this scenario.
Today, I'm just manually building these instances and copy/pasting task descriptions into them. Instead, we could build them in a more structured way by running a script to populate the scenario onto an instance over Conduit. It's fine if the interview scenario is public (a slightly outdated version is: rLOCATIONS). Everyone starts at a different point anyway (e.g., different prior knowledge of PHP and Phabricator application structure) so we can't directly compare one candidate's performance to another candidate's performance even if we tried to keep the scenario secret.
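A populate script could be fairly small. As a sketch: Conduit methods like `maniphest.edit` accept `api.token` plus nested parameters flattened into HTTP form encoding (`transactions[0][type]`, etc.). The helper names, instance URI, and task content below are illustrative, not part of any existing script:

```python
import json
import urllib.parse
import urllib.request

def flatten(params, prefix=""):
    """Flatten nested dicts/lists into Conduit's HTTP form encoding,
    e.g. {"transactions": [{"type": "title"}]} becomes
    {"transactions[0][type]": "title"}."""
    flat = {}
    items = params.items() if isinstance(params, dict) else enumerate(params)
    for key, value in items:
        name = f"{prefix}[{key}]" if prefix else str(key)
        if isinstance(value, (dict, list)):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

def conduit_call(base_uri, api_token, method, params):
    """POST one Conduit method call and return the decoded result."""
    fields = flatten(params)
    fields["api.token"] = api_token
    data = urllib.parse.urlencode(fields).encode()
    with urllib.request.urlopen(f"{base_uri}/api/{method}", data) as resp:
        body = json.load(resp)
    if body.get("error_code"):
        raise RuntimeError(body["error_info"])
    return body["result"]

# Example: file one scenario task onto a fresh instance. The URI,
# token, and task text are placeholders; real task descriptions would
# come from the scenario definition.
# conduit_call(
#     "https://interview.phacility.com", "api-xxxx", "maniphest.edit",
#     {"transactions": [
#         {"type": "title", "value": "Fix the login redirect bug"},
#         {"type": "description", "value": "Steps to reproduce: ..."},
#     ]})
```

Looping that over a list of task definitions (and a `diffusion.repository.edit` call or two for the skeleton application) would turn instance setup into a one-command operation.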
If a candidate walks in with a stack of local diffs that solve all of the interview problems, that's a perfectly fine signal, and I can just scramble to generate more scenario ahead of them. If they can solve it faster than I can generate it, more power to them.
If a candidate walks in with a stack of plagiarized local diffs that solve all of the interview problems, sincerely expecting to bamboozle their way into a cushy engineering position at Phacility, that's also a strong signal. I believe I would be able to see through this ruse by observing that they cannot solve new scenario problems I generate on the fly.
(In an extreme case, because we're so remote-friendly, a candidate could theoretically sub-contract their interview and job to someone else, taking a cut of the salary, I guess? This seems like a movie plot, not a real scenario, and there is no reason a qualified candidate who was capable of passing the interview on their own couldn't do the same thing -- if anything, they'd be better positioned to do it.)