
Conversation

@valga (Contributor) commented Sep 1, 2017

First, some background. I have a long-running script that maintains persistent connections: if a connection is reset, it retries with an exponential backoff. Sometimes it runs without issues, and sometimes it consumes a lot of memory, becomes unresponsive, and gets killed by the watchdog. Apart from maintaining connections, my script just reads incoming data and writes it to a file, so there are no memory leaks in my own code.
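For context, the retry policy described above could look like the following minimal sketch. The function name and parameters (`backoffDelay`, `$base`, `$cap`) are hypothetical, chosen for illustration; they are not from the actual script.

```php
<?php
// Hypothetical exponential-backoff helper: the delay doubles on every
// failed attempt (1s, 2s, 4s, 8s, ...) and is capped at $cap seconds.
function backoffDelay(int $attempt, float $base = 1.0, float $cap = 60.0): float
{
    return min($cap, $base * (2 ** $attempt));
}

foreach ([0, 1, 2, 3, 10] as $attempt) {
    printf("attempt %d => %.1fs\n", $attempt, backoffDelay($attempt));
}
```

The cap keeps a long outage from pushing the delay to absurd values while still backing off quickly under repeated failures.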

After some research, I found that timed-out connections produce a lot of garbage that can only be cleaned up by the cycle collector. Here is a test scenario:

```php
$loop = React\EventLoop\Factory::create();

$loop->addPeriodicTimer(1, function () use ($loop) {
    $connector = new \React\Socket\Connector($loop, array(
        'timeout' => 0.5,
        'dns'     => false,
        'tls'     => false,
        'unix'    => false,
    ));

    // Use a host and port combination that results in the connection timing out.
    $connector->connect('89.123.45.67:3128')
        ->then(function (\React\Socket\ConnectionInterface $stream) {
            $stream->close();
            printf("[+] %.3fMB\n", memory_get_usage() / 1024 / 1024);
        })
        ->otherwise(function (\Exception $e) {
            printf("[-] %.3fMB (%s)\n", memory_get_usage() / 1024 / 1024, $e->getMessage());
        });
});

$loop->addTimer(60, function () use ($loop) {
    printf(
        "%.3fMB => %d => %.3fMB\n",
        memory_get_usage() / 1024 / 1024,
        gc_collect_cycles(),
        memory_get_usage() / 1024 / 1024
    );
    $loop->stop();
});

$loop->run();
```

Result:

```
[-] 0.755MB (Timed out after 0.5 seconds)
[-] 0.786MB (Timed out after 0.5 seconds)
[-] 0.810MB (Timed out after 0.5 seconds)
...
[-] 2.114MB (Timed out after 0.5 seconds)
[-] 2.138MB (Timed out after 0.5 seconds)
[-] 2.162MB (Timed out after 0.5 seconds)
2.187MB => 9145 => 0.775MB
```

After doing more research, I found that this is caused by promise cancellation: the more ->then() calls you have on a cancellable promise, the more garbage it leaves behind after cancellation. So I tweaked the code a bit and got this result:

```
[-] 0.749MB (Timed out after 0.5 seconds)
[-] 0.771MB (Timed out after 0.5 seconds)
[-] 0.787MB (Timed out after 0.5 seconds)
...
[-] 1.636MB (Timed out after 0.5 seconds)
[-] 1.651MB (Timed out after 0.5 seconds)
[-] 1.667MB (Timed out after 0.5 seconds)
1.684MB => 6195 => 0.761MB
```

Now we have 30% less garbage to collect. There is still room for improvement, but that needs to happen in the Promise component.
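To make the mechanism concrete, here is a minimal plain-PHP sketch (no react/* dependency) of why each extra ->then() handler can leave cyclic garbage: the handler closure keeps the promise alive while the promise keeps the handler alive. `FakePromise` is a made-up illustration, not the real react/promise implementation.

```php
<?php
final class FakePromise
{
    /** @var callable[] */
    public array $handlers = [];

    public function then(callable $onFulfilled): self
    {
        // Each then() allocates a child promise and a wrapper closure;
        // both stay referenced from the parent's handler list.
        $child = new self();
        $this->handlers[] = function ($value) use ($child, $onFulfilled) {
            $onFulfilled($value);
        };
        return $child;
    }
}

gc_collect_cycles(); // start from a clean slate

$p = new FakePromise();
$p->then(function ($v) use ($p) {
    // The callback captures $p, closing the reference cycle:
    // $p -> handlers -> wrapper -> callback -> $p
    echo get_class($p), "\n";
});

unset($p); // the refcount never reaches zero; only the cycle collector helps
$collected = gc_collect_cycles();
printf("collected %d cyclic refs\n", $collected);
```

Until `gc_collect_cycles()` runs, every such cycle stays in memory, which matches the steadily growing numbers in the output above.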

@kelunik (Contributor) commented:

This should rather be fixed in react/promise, but the PR might be fine either way.

@valga (Contributor, Author) commented Sep 1, 2017

@kelunik Even if it gets fixed in react/promise, I see no point in creating these two promises; it is just a waste of resources.

@valga (Contributor, Author) commented Sep 2, 2017

OK, I found a problem with react/promise, and it is bigger than I thought: basically, you can't use exceptions as rejection reasons. See reactphp/promise#46 (comment) for more details.
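A plausible plain-PHP illustration of the problem referenced above: an Exception records a stack trace that references the arguments of every calling frame, so storing that exception as a rejection reason on an object that appears in one of those frames creates a reference cycle. `FakeDeferred` and `reject()` are made-up names for illustration, not react/promise API.

```php
<?php
// Ensure frame arguments are captured in traces (the default; the
// zend.exception_ignore_args ini setting exists on PHP >= 7.4).
ini_set('zend.exception_ignore_args', '0');

final class FakeDeferred
{
    public ?Throwable $reason = null;
}

function reject(FakeDeferred $deferred): void
{
    // The exception's trace contains $deferred as a frame argument...
    $deferred->reason = new RuntimeException('Timed out after 0.5 seconds');
    // ...and $deferred now holds the exception: a cycle only GC can break.
}

gc_collect_cycles(); // clean slate

$d = new FakeDeferred();
reject($d);
unset($d); // the cyclic garbage remains until the collector runs

$collected = gc_collect_cycles();
printf("collected %d cyclic refs\n", $collected);
```

This is why merely dropping all user-visible references to a rejected promise is not enough to free its memory when the rejection reason is an exception.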

@clue changed the title from "Reduce the amount of garbage produced by TcpConnector on cancellation" to "Reduce memory consumption for failed connections" on Sep 4, 2017
@clue (Member) commented Sep 4, 2017

Thanks for filing this ticket and the elaborate discussion in the related tickets! 👍

I've updated the title to reflect the current state of the discussion, and IMO it makes sense to get this in as-is. To sum up: this component does not "leak" any memory, but reducing memory consumption by reducing the number of circular references is certainly a good thing 👍

@clue approved these changes Sep 4, 2017

@WyriHaximus (Member) left a comment:


LGTM :shipit:

@jsor approved these changes Sep 4, 2017
@clue (Member) commented May 5, 2018

@valga Thank you for raising this discussion! This obscure memory behavior should be addressed as part of #161, once these issues have been fixed in the upstream components :shipit:


5 participants: @valga, @kelunik, @clue, @jsor, @WyriHaximus