
ErrorException: Caught ErrorException (HTTP/1.0 500 Internal Server Error): fwrite(): write of X bytes failed with errno=32 Broken pipe #1634

Closed
ehginanjar opened this issue Feb 11, 2022 · 25 comments

Comments

@ehginanjar

ehginanjar commented Feb 11, 2022

Monolog version 2

I got a lot of these errors when a sudden burst of traffic hit the service. Even though the requests end up succeeding with HTTP 200, this unfortunately adds at least ~2s of latency to every processed request, slowing down the service.

Stacktrace:

ErrorException: Caught ErrorException (HTTP/1.0 500 Internal Server Error): fwrite(): write of 111 bytes failed with errno=32 Broken pipe in /var/www/app/vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php:178
#0 [internal function]: Laravel\Lumen\Application->Laravel\Lumen\Concerns\{closure}()
#1 /var/www/app/vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php(178): fwrite()
#2 /var/www/app/vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php(162): Monolog\Handler\StreamHandler->streamWrite()
#3 /var/www/app/vendor/monolog/monolog/src/Monolog/Handler/AbstractProcessingHandler.php(48): Monolog\Handler\StreamHandler->write()
#4 /var/www/app/vendor/monolog/monolog/src/Monolog/Logger.php(327): Monolog\Handler\AbstractProcessingHandler->handle()
#5 /var/www/app/vendor/monolog/monolog/src/Monolog/Logger.php(565): Monolog\Logger->addRecord()
#6 /var/www/app/vendor/illuminate/log/Logger.php(174): Monolog\Logger->error()
#7 /var/www/app/vendor/illuminate/log/Logger.php(87): Illuminate\Log\Logger->writeLog()
#8 /var/www/app/vendor/illuminate/log/LogManager.php(543): Illuminate\Log\Logger->error()
#9 /var/www/app/vendor/laravel/lumen-framework/src/Exceptions/Handler.php(56): Illuminate\Log\LogManager->error()
#10 /var/www/app/app/Exceptions/Handler.php(51): Laravel\Lumen\Exceptions\Handler->report()
#11 /var/www/app/vendor/laravel/lumen-framework/src/Concerns/RegistersExceptionHandlers.php(104): App\Exceptions\Handler->report()
#12 /var/www/app/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php(174): Laravel\Lumen\Application->sendExceptionToHandler()
#13 /var/www/app/vendor/laravel/lumen-framework/src/Concerns/RoutesRequests.php(109): Laravel\Lumen\Application->dispatch()
#14 /var/www/app/public/index.php(28): Laravel\Lumen\Application->run()
#15 {main}

I use stdout and stderr as channels. The output is collected by Filebeat and sent to Elasticsearch; this is a standard EFK stack in a Kubernetes cluster.

I thought the disk was full, but I was wrong: it is still far from full. I now suspect one of two things:

  • the logs may be too big to handle (2.2 million log lines per day), so adding some filters may become necessary, or
  • there is a misconfiguration in my config:
        'stderr' => [
            'driver' => 'monolog',
            'handler' => StreamHandler::class,
            'with' => [
                'stream' => 'php://stderr',
            ],
        ],

        'stdout' => [
            'driver' => 'monolog',
            'handler' => StreamHandler::class,
            'formatter' => Monolog\Formatter\LineFormatter::class,
            'formatter_with' => [
                'format' => "[%datetime%] ".env('APP_NAME', 'app-name').".%level_name%: %message%\n",
                'dateFormat' => 'Y-m-d H:i:s',
            ],
            'with' => [
                'stream' => 'php://stdout',
            ],
        ],

I was wondering why it says HTTP 500 Internal Server Error. That comes from the PHP error message, right? 🤔

I would have thought this kind of Monolog error wouldn't affect the application.

Wondering if anyone else faces the same issue.
Any help is appreciated.

@ehginanjar ehginanjar added the Bug label Feb 11, 2022
@ehginanjar ehginanjar changed the title ErrorException: Caught ErrorException (HTTP/1.0 500 Internal Server Error): fwrite(): write of 111 bytes failed with errno=32 Broken pipe ErrorException: Caught ErrorException (HTTP/1.0 500 Internal Server Error): fwrite(): write of X bytes failed with errno=32 Broken pipe Feb 11, 2022
@Seldaek
Owner

Seldaek commented Mar 7, 2022

I suspect the 500 internal server error part might be from Laravel\Lumen\Application->Laravel\Lumen\Concerns\{closure}().

I guess it does that whenever it catches an exception.

You could try to wrap the StreamHandler inside a WhatFailureGroupHandler to suppress these errors if stdout/stderr cannot be written to for some reason.

I am not sure why it breaks, though; a broken pipe could mean the stream isn't open anymore when it tries to write. If Lumen is processing more than one request per process, you could also try calling $handler->close() on those StreamHandlers between requests to make sure the stream gets reopened every time.
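For reference, a minimal sketch of the WhatFailureGroupHandler suggestion, assuming a plain Monolog setup outside of Laravel's config (channel name and level are illustrative):

use Monolog\Logger;
use Monolog\Handler\StreamHandler;
use Monolog\Handler\WhatFailureGroupHandler;

$stream = new StreamHandler('php://stderr', Logger::DEBUG);

// WhatFailureGroupHandler forwards records to its nested handlers and silently
// swallows any exception they throw, so a broken pipe no longer bubbles up
// into the application as an ErrorException.
$logger = new Logger('app');
$logger->pushHandler(new WhatFailureGroupHandler([$stream]));

$logger->error('something went wrong'); // a failed write is suppressed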

@Seldaek Seldaek added the Support label Mar 7, 2022
@Atantares

Atantares commented Aug 18, 2022

@Seldaek
I have the same issue with Symfony, running threads with requests. Each thread writes its logs to one file (one thread = one log file).
$handler->close() or $handler->reset() doesn't help here. Part of the code:

public function wait(string $message, int $retry, int $waitTimeout = null): void
{
    $waitTimeout = $this->calculateWaitTimeout($retry, $waitTimeout);
    // close() or reset() can be called here, but it doesn't help; the error is the same
    $this->getLogger()->critical('Wait ' . $waitTimeout . ' seconds and try again' . PHP_EOL . $message);
    Functions::sleep($waitTimeout);
}

This function can be executed only once. After that it throws exceptions like this:

fwrite(): Write of 8678 bytes failed with errno=32 Broken pipe ["/development/application/vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php",178,"
#0 /development/application/vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php(162): Monolog\\Handler\\StreamHandler->streamWrite(Resource id #514, Array)
#1 /development/application/vendor/monolog/monolog/src/Monolog/Handler/AbstractProcessingHandler.php(48): Monolog\\Handler\\StreamHandler->write(Array)
#2 /development/application/vendor/monolog/monolog/src/Monolog/Logger.php(327): Monolog\\Handler\\AbstractProcessingHandler->handle(Array)
#3 /development/application/vendor/monolog/monolog/src/Monolog/Logger.php(578): Monolog\\Logger->addRecord(500, 'Wait 2 seconds ...', Array)
#4 /development/application/src/API/V1/Rest/AbstractConnector.php(230): Monolog\\Logger->critical('Wait 2 seconds ...')\n

The stream setup is the same as in the issue description.
Monolog 2.x and 3.x behave the same.

@Seldaek
Owner

Seldaek commented Aug 20, 2022

@Atantares do you have any details on the config? What is the StreamHandler configured to log to? Overall, sharing the whole config would probably help.

@Seldaek Seldaek closed this as not planned Feb 4, 2023
@MMSs

MMSs commented Dec 5, 2023

I ran into this issue this week upon upgrading Monolog from 2.9 to 3.5, which is part of upgrading Lumen from 8 to 10.

This issue only showed up in production, not in the local dev environment nor on staging.

In all environments the log channel is configured to go to stderr (LOG_CHANNEL=stderr), and I confirmed the log really is going to stderr and not to a file.

What's also frightening is that the same error @ehginanjar and @Atantares reported was returned in the endpoint's response to end users with a 200 HTTP status, and it wasn't recorded in the application's log, apparently because the stderr pipe was broken and fwrite couldn't write anything.

@xuandung38

xuandung38 commented Dec 8, 2023

@MMSs I encountered the same thing with Laravel 10. Have you found any solution?

@MMSs

MMSs commented Dec 13, 2023

Unfortunately not yet, @xuandung38.

I upgraded from PHP 8.0 to 8.2 along with Laravel from 8 to 10, which in turn upgraded the Monolog library from 2.9 to 3.5.

I tried a couple of things to fix the issue, but none of them worked:

  1. Reducing the log volume by removing the trace log. The log is now approximately 2k-4k lines/minute; before this upgrade the application was comfortably logging up to 30k lines/minute.
  2. Downgrading Monolog to 3.0, the minimum version required by Laravel 10 (specifically illuminate/log:10).
  3. Downgrading PHP to 8.1.

@ls-tyler-roach

I've run into this issue as well, as of this week. We deployed a PHP 8.1 upgrade to a Laravel 8 project in production and immediately ran into this error. It did not appear at all in the local or staging environments.

@Seldaek
Owner

Seldaek commented Dec 15, 2023

Sorry but until someone gets to the bottom of this, I doubt I can do much. I'm not sure why this seems to be an issue with Laravel and not other projects.

@ls-tyler-roach

This started happening to us after we upgraded to PHP 8.1 and bumped Monolog to 2.9.2, while staying on Lumen 8. I'm at a loss, though, to figure out what suddenly triggered the issue.

@Seldaek
Owner

Seldaek commented Dec 15, 2023

You bumped Monolog to 2.9.2 from which version? If you can downgrade it again and that fixes it, it would at least give us a range of changes to look through for a regression, because 2.x to 3.x (as in the other reports above) is quite a large range.

@ls-tyler-roach

We were on 2.9.1. I'll try downgrading and see if I can replicate the issue.

@Seldaek
Owner

Seldaek commented Dec 15, 2023

Ok, that might be due to this change, which would have made it swallow errors in some cases before: 2.9.1...2.9.2#diff-a1ddc5c4ead6773b8670f9be5007cbe0239638aedbcf12166acf075a9b8742a2 (see also #1815).

@Seldaek
Owner

Seldaek commented Dec 15, 2023

Seems unlikely to me that this is it, but who knows. Anyway, it would be good to know whether it was 2.9.2 that caused it or your upgrade to PHP 8.1.

@ls-tyler-roach

I'll downgrade and test it out and let you know. Does this mean that on 2.9.1 we were potentially missing some logs that were being swallowed?

@Seldaek
Owner

Seldaek commented Dec 15, 2023

I'm not entirely sure, but it might be if that's indeed what caused it. My gut feeling says no, though.

@MMSs

MMSs commented Dec 18, 2023

What I found while debugging this is that the is_resource check here:

    if (!is_resource($this->stream)) {

doesn't detect whether the STDERR/STDOUT resource, previously opened on this line:

    $stream = fopen($url, 'a');

is still writable. So although is_resource($this->stream) is true, fwrite still fails to write to it.
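To illustrate the point with an assumed standalone snippet (not Monolog code): is_resource() only reports that the variable still holds a resource handle, so the only reliable signal that the pipe is gone is the return value of fwrite() itself.

$stream = fopen('php://stderr', 'a');

// Still true even when the reader at the other end of the pipe has gone away.
var_dump(is_resource($stream));

// A failed write returns false (or reports fewer bytes written than requested),
// which is where the "Broken pipe" warning originates.
$written = @fwrite($stream, "probe\n");
if ($written === false) {
    // reopen the stream or fall back to another handler here
    $stream = fopen('php://stderr', 'a');
}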

I don't know what causes the pipe to break only in production, or why only with the latest version of illuminate/log, but my suspicion is that in production multiple php-fpm processes are constantly trying to open new handlers, leading the system to overload and break previously opened ones.

So the solution we went with for now is replacing StreamHandler with ErrorLogHandler in Laravel's bootstrap; it uses error_log() instead of fwrite():

$app->extend(\Psr\Log\LoggerInterface::class, function ($logger) {
    $logger->popHandler();
    $logger->pushHandler(new \Monolog\Handler\ErrorLogHandler());
    return $logger;
});

@msobkowiak-olx

I don't know what causes the pipe to break only in production, or why only with the latest version of illuminate/log, but my suspicion is that in production multiple php-fpm processes are constantly trying to open new handlers, leading the system to overload and break previously opened ones.

That is likely what happens. To prove it either way, you can try playing with sysctl and the pipe/open-file limits:

sysctl -w fs.file-max=xxx

If the library does not recycle its stream handlers, at large scale the process can easily run into the open-files limit locally. The OS can really close the pipe at any time (even if it's local) due to kernel limits.
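A minimal sketch of that recycling idea for a long-running worker, assuming Monolog is used directly (the job loop is illustrative): closing the handler after each unit of work releases the file descriptor, and StreamHandler reopens the stream lazily on the next write.

use Monolog\Logger;
use Monolog\Handler\StreamHandler;

$handler = new StreamHandler('php://stderr', Logger::DEBUG);
$logger  = new Logger('worker', [$handler]);

foreach (range(1, 3) as $job) {
    $logger->info('processing job ' . $job);

    // Release the underlying file descriptor; the next write reopens the
    // stream, so descriptors do not accumulate across iterations.
    $handler->close();
}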

@ls-tyler-roach

We downgraded the package to 2.9.1, which is what it was on before the PHP 8.1 upgrade, and we're still seeing this error for some reason. Could it be tied to something that changed in 8.1?

@Seldaek
Owner

Seldaek commented Dec 20, 2023

Could be.. but the only way to be sure is for you to revert that, I guess. Not sure how easy it'd be for you. I don't see anything that seems relevant in the 8.1 changelog, but who knows, small things are sometimes not mentioned.

@ls-tyler-roach

We did roll back our PHP 8.1 upgrade and went back to 8.0 along with Monolog 2.9.1 and the issue stopped occurring.

@Seldaek
Owner

Seldaek commented Dec 20, 2023

Damn.. ok, this just got way more interesting. Without a clear repro case, though, this is gonna be hard to report to the PHP project.

@Seldaek
Owner

Seldaek commented Dec 20, 2023

What stream URL do you use btw? php://stdout or a file? What is it writing into?

@ls-tyler-roach

We stream it via stdout and stderr, which then goes out to Datadog.

@norud

norud commented Dec 20, 2023

It's weird: locally, using Laragon (domino.test/endpoint), I get that error, but going through php artisan serve (http://127.0.0.1:8000/endpoint) the error does not occur.
You can get the logs using php artisan serve and it doesn't give the error; in my case it was fwrite(): Write of 3619 bytes failed with errno=9 Bad file descriptor.

@TheLevti

TheLevti commented Mar 6, 2024

We stream it via stdout and stderr, which then goes out to Datadog.

Same issue and same use case.

Seldaek added a commit that referenced this issue Nov 11, 2024
* Close file handle after each write, refs #1862, refs #1860, refs #1634

* Modify patch to retry writes once if they fail, then throw if not

* Fix php7.2 build
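
For context, a rough sketch of that retry-once behaviour (illustrative only, not the actual Monolog patch; the helper name writeWithRetry() and the reopen callback are assumptions):

// If the first fwrite() fails (e.g. broken pipe), reopen the stream and retry
// the write once; only throw if the retry fails as well.
function writeWithRetry($stream, string $data, callable $reopen)
{
    if (is_resource($stream) && @fwrite($stream, $data) !== false) {
        return $stream;
    }

    $stream = $reopen();
    if (!is_resource($stream) || @fwrite($stream, $data) === false) {
        throw new \RuntimeException('Stream write failed after one retry');
    }

    return $stream;
}

// Usage sketch: reopen php://stderr when the first write fails.
$stream = fopen('php://stderr', 'a');
$stream = writeWithRetry($stream, "log line\n", function () {
    return fopen('php://stderr', 'a');
});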