D7776.id17592.diff
D7776: Limit memory usage of `ssh-exec` during large pull operations

Index: src/infrastructure/ssh/PhabricatorSSHPassthruCommand.php
===================================================================
--- src/infrastructure/ssh/PhabricatorSSHPassthruCommand.php
+++ src/infrastructure/ssh/PhabricatorSSHPassthruCommand.php
@@ -97,6 +97,18 @@
 
     $channels = array($command_channel, $io_channel, $error_channel);
 
+    // We want to limit the amount of data we'll hold in memory for this
+    // process. See T4241 for a discussion of this issue in general.
+
+    $buffer_size = (1024 * 1024); // 1MB
+    $io_channel->setReadBufferSize($buffer_size);
+    $command_channel->setReadBufferSize($buffer_size);
+
+    // TODO: This just makes us throw away stderr after the first 1MB, but we
+    // don't currently have the support infrastructure to buffer it correctly.
+    // It's difficult to imagine this causing problems in practice, though.
+    $this->execFuture->setStderrSizeLimit($buffer_size);
+
     while (true) {
       PhutilChannel::waitForAny($channels);
 
@@ -104,7 +116,29 @@
       $command_channel->update();
       $error_channel->update();
 
-      $done = !$command_channel->isOpen();
+      // If any channel is blocked on the other end, wait for it to flush
+      // before we continue reading. For example, if a user is running
+      // `git clone` on a 1GB repository, the underlying `git-upload-pack`
+      // may be able to produce data much more quickly than we can send it
+      // over the network. If we don't throttle the reads, we may only send
+      // a few MB over the I/O channel in the time it takes to read the
+      // entire 1GB off the command channel. That leaves us with 1GB of
+      // data in memory.
+
+      while ($command_channel->isOpen() &&
+             $io_channel->isOpenForWriting() &&
+             ($command_channel->getWriteBufferSize() >= $buffer_size ||
+              $io_channel->getWriteBufferSize() >= $buffer_size ||
+              $error_channel->getWriteBufferSize() >= $buffer_size)) {
+        PhutilChannel::waitForActivity(array(), $channels);
+        $io_channel->update();
+        $command_channel->update();
+        $error_channel->update();
+      }
+
+      // If the subprocess has exited and we've read everything from it,
+      // we're all done.
+      $done = !$command_channel->isOpenForReading() &&
+              $command_channel->isReadBufferEmpty();
 
       $in_message = $io_channel->read();
       if ($in_message !== null) {
@@ -133,12 +167,20 @@
 
       // If the client has disconnected, kill the subprocess and bail.
       if (!$io_channel->isOpenForWriting()) {
-        $this->execFuture->resolveKill();
+        $this->execFuture
+          ->setStdoutSizeLimit(0)
+          ->setStderrSizeLimit(0)
+          ->setReadBufferSize(null)
+          ->resolveKill();
         break;
       }
     }
 
-    list($err) = $this->execFuture->resolve();
+    list($err) = $this->execFuture
+      ->setStdoutSizeLimit(0)
+      ->setStderrSizeLimit(0)
+      ->setReadBufferSize(null)
+      ->resolve();
 
     return $err;
   }
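
The heart of this change is the backpressure rule in the second hunk: stop pulling data off the subprocess while more than `$buffer_size` bytes are already queued for a slower peer. The sketch below illustrates that rule in isolation. It is not Phabricator code: `ToyChannel` is a hypothetical stand-in for libphutil's `PhutilChannel`, and the chunk sizes and drain rate are arbitrary; only the shape of the relay loop mirrors the diff.

<?php
// Illustration only, not Phabricator code. ToyChannel is a made-up stand-in
// for PhutilChannel: read() hands back queued chunks, write() appends to an
// in-memory outbound buffer, and update() simulates the event loop flushing
// part of that buffer to a slow consumer.

final class ToyChannel {
  private $readQueue;
  private $writeBuffer = '';
  private $drainPerTick;

  public function __construct(array $chunks, $drain_per_tick) {
    $this->readQueue = $chunks;
    $this->drainPerTick = $drain_per_tick;
  }

  public function read() {
    return array_shift($this->readQueue);
  }

  public function isOpenForReading() {
    return (bool)count($this->readQueue);
  }

  public function write($bytes) {
    $this->writeBuffer .= $bytes;
  }

  public function getWriteBufferSize() {
    return strlen($this->writeBuffer);
  }

  public function update() {
    // Pretend the consumer accepted $drainPerTick bytes this tick.
    $this->writeBuffer = (string)substr($this->writeBuffer, $this->drainPerTick);
  }
}

$buffer_size = (1024 * 1024); // 1MB, as in the diff

// A fast producer (think `git-upload-pack`) and a slow consumer (think the
// SSH client's network connection).
$command_channel = new ToyChannel(
  array_fill(0, 64, str_repeat('x', 256 * 1024)), // 16MB in 256KB chunks
  0);
$io_channel = new ToyChannel(array(), 128 * 1024); // drains 128KB per tick

$peak = 0;
while ($command_channel->isOpenForReading()) {
  // The backpressure rule from the diff: while too much data is queued for
  // the slow side, keep flushing it and do not read more from the fast side.
  while ($io_channel->getWriteBufferSize() >= $buffer_size) {
    $io_channel->update();
  }

  $io_channel->write($command_channel->read());
  $peak = max($peak, $io_channel->getWriteBufferSize());
}

// Drain whatever is left once the producer is exhausted.
while ($io_channel->getWriteBufferSize()) {
  $io_channel->update();
}

// Peak buffering stays near the 1MB cap instead of growing toward 16MB.
printf("peak buffered bytes: %d\n", $peak);

One note on the last hunk: both `resolveKill()` and `resolve()` are preceded by `setStdoutSizeLimit(0)`, `setStderrSizeLimit(0)`, and `setReadBufferSize(null)`, presumably so that once relaying stops the future can drain and discard whatever the subprocess still produces rather than throttling it or holding its remaining output in memory.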