Node.js, Pipes, and Disappearing Bytes

102 points | posted 9 months ago by mooreds

9 Comments

jitl

9 months ago

Here's how I solved this problem in Notion's internal command line tools:

    // handleFlushError is an internal error handler in Notion's codebase (not shown here).
    function flushWritableStream(stream: NodeJS.WritableStream) {
      // Reject on write errors so the .catch below actually sees them.
      return new Promise<void>((resolve, reject) =>
        stream.write("", error => (error ? reject(error) : resolve())),
      ).catch(handleFlushError)
    }
    
    /**
     * In NodeJS, process.stdout and process.stderr behave inconsistently depending on the type
     * of file they are connected to.
     *
     * When connected to unix pipes, these streams are *async*, so we need to wait for them to be flushed
     * before we exit, otherwise we get truncated results when using a Unix pipe.
     *
     * @see https://nodejs.org/api/process.html#process_a_note_on_process_i_o
     */
    export async function flushStdoutAndStderr() {
      await Promise.all([
        flushWritableStream(process.stdout),
        flushWritableStream(process.stderr),
      ])
    }

    /**
     * If `module` is the NodeJS entrypoint:
     *
     * Wait for `main` to finish, then exit 0.
     * Note that this does not wait for the event loop to drain;
     * it is suited to commands that run to completion.
     *
     * For processes that must outlive `main`, see `startIfMain`.
     */
    if (require.main === module) {
      // Wrap in an async IIFE: top-level await is not valid in a CommonJS entrypoint.
      void (async () => {
        await main(argv)
        await flushStdoutAndStderr()
        setTimeout(() => process.exit(0))
      })()
    }

userbinator

9 months ago

When I read the title and the first sentence I immediately thought of partial writes, one of those things that a lot of people seem to ignore until things stop working, often intermittently. I don't work with Node.js, but I've had to fix plenty of network code that had the same bug of not handling partial writes correctly.

> I thought gpt-4o and o1-preview would be able to do this pretty easily, but surprisingly not.

You're surprised that AI doesn't work? I'm not.
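
On the partial-writes bug mentioned above, the canonical handling looks roughly like this sketch (writeFully is an illustrative name; it uses Node's fs.writeSync since this thread is about Node):

    import * as fs from "node:fs"

    // Sketch: a write(2)-style call may consume fewer bytes than requested,
    // so keep writing from the current offset until the whole buffer is gone.
    function writeFully(fd: number, buf: Buffer): void {
      let offset = 0
      while (offset < buf.length) {
        offset += fs.writeSync(fd, buf, offset, buf.length - offset)
      }
    }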

fovc

9 months ago

POSIX is weird, but NodeJS streams are designed to be misused

hipadev23

9 months ago

I’m confused. If process.stdout.write() returns false when the pipe is full, do you not need to loop and call it again or something analogous? Or does it continue operating on the write in the background and that’s why waiting for the .finished() event works?

Is there a reason it doesn't use standard Node.js promise semantics (await process.stdout.write)? So is the best solution probably util.promisify()?
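
For context, a rough sketch of the contract being asked about: a false return from write() doesn't mean the chunk was dropped, it's only a backpressure hint to pause new writes until 'drain' (writeAll below is an illustrative helper, not a Node API):

    import { once } from "node:events"

    // Sketch: write() still queues the chunk even when it returns false, so
    // nothing needs re-sending; we just pause until 'drain' before writing more.
    async function writeAll(stream: NodeJS.WritableStream, chunks: string[]) {
      for (const chunk of chunks) {
        if (!stream.write(chunk)) {
          await once(stream, "drain")
        }
      }
    }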

pdr94

9 months ago

Great investigation! This highlights a crucial aspect of Node.js's asynchronous nature when dealing with pipes. It's a reminder that we should always be mindful of how Node handles I/O operations differently based on the output destination.

The key takeaway is the behavior difference between synchronous (files/TTYs) and asynchronous (pipes) writes in Node.js. This explains why `process.exit()` can lead to truncated output when piping.

For those facing similar issues, remember to handle the `drain` event or use a more robust streaming approach to ensure all data is written before exiting. This post is a valuable resource for debugging similar "mysterious" pipe behavior in Node.js applications.
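
One hedged way to read that advice as code (exitAfterFlush is an illustrative name, not something from the post):

    // Sketch: the write callback fires once this chunk has been flushed, and a
    // stream flushes queued writes in order, so earlier output goes out first.
    function exitAfterFlush(text: string, code = 0): void {
      process.stdout.write(text, () => process.exit(code))
    }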

arctek

9 months ago

fsync doesn't work here, right, because Unix pipes are in memory? Elsewhere I've had luck with Node.js WritableStreams that refuse to flush their buffers before a process.exit() by calling fsync on the underlying file descriptors.
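
For the file-backed case described here, the pattern is roughly this sketch (syncFdBeforeExit is an illustrative name; fsync only helps when the fd points at a real file, which is consistent with it not helping for pipes):

    import * as fs from "node:fs"

    // Sketch: flush kernel buffers for a file-backed fd before exiting.
    // Pipes live in memory and typically reject fsync with EINVAL.
    function syncFdBeforeExit(fd: number): void {
      try {
        fs.fsyncSync(fd)
      } catch (err: any) {
        if (err.code !== "EINVAL") throw err
      }
    }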

molsson

9 months ago

    process.stdout._handle.setBlocking(true)

...is a bit brutal but works. Draining the stream before exiting also kind of works but there are cases where drain will just permanently block.

    async function drain(stream) {
      return new Promise((resolve) => stream.on('drain', resolve))
    }
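
On the permanent-block case: 'drain' is only emitted after a write() has returned false, so awaiting it on an idle stream never resolves. A guarded variant might look like this sketch (drainIfNeeded is an illustrative name; writableNeedDrain requires Node 15.2+):

    // Sketch: only wait for 'drain' when the stream is actually backed up;
    // otherwise resolve immediately so the promise cannot hang forever.
    function drainIfNeeded(stream: NodeJS.WriteStream): Promise<void> {
      return new Promise(resolve => {
        if (!stream.writableNeedDrain) return resolve()
        stream.once("drain", () => resolve())
      })
    }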

benatkin

9 months ago

This is clickbait. The process exiting without flushing its output doesn't mean disappearing bytes. The bytes were there but the program left without them.