My Node & Python backend is running just fine, but I've now encountered an issue: if the JSON I'm sending from Python back to Node is too long, it gets split into two or more chunks.
The emitted data is chunked, so if you want to parse JSON you will need to join all the chunks and perform JSON.parse on the 'end' event.
By default, pipes for stdin, stdout, and stderr are established between the parent Node.js process and the spawned child. These pipes have limited (and platform-specific) capacity. If the child process writes to stdout in excess of that limit without the output being captured, the child process will block waiting for the pipe buffer to accept more data.
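For reference, here is a minimal sketch of such a spawn; the python3 binary and the script.py path are placeholders for your own setup. It defines the pythonProcess variable used in the code further down:

const { spawn } = require('child_process');

// stdio defaults to 'pipe' for stdin, stdout, and stderr,
// so pythonProcess.stdout is a readable stream in the parent.
const pythonProcess = spawn('python3', ['script.py']);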
On Linux, each chunk is limited to 65536 bytes:
In Linux versions before 2.6.11, the capacity of a pipe was the same as the system page size (e.g., 4096 bytes on i386). Since Linux 2.6.11, the pipe capacity is 65536 bytes.
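You can observe the chunking by logging the size of each chunk as it arrives; a throwaway sketch you can remove once verified:

pythonProcess.stdout.on('data', chunk => {
// On Linux you will typically see chunks of at most 65536 bytes here.
console.log('chunk size:', chunk.length);
});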
let result = '';

pythonProcess.stdout.on('data', data => {
  // Accumulate every chunk; a single 'data' event may hold only part of the JSON.
  result += data.toString();
  // Or collect Buffers and Buffer.concat them if you prefer.
});

pythonProcess.stdout.on('end', () => {
  try {
    // If the accumulated output is valid JSON, handle the parsed data.
    console.log(JSON.parse(result));
  } catch (e) {
    // Otherwise treat it as a plain log entry.
    console.log(result);
  }
});
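One caveat with string concatenation: data.toString() on an arbitrary chunk boundary can split a multi-byte UTF-8 character in half. If your payload may contain such characters, the Buffer.concat variant mentioned in the comment above is safer, because it decodes only once at the end:

const chunks = [];

pythonProcess.stdout.on('data', chunk => {
  // Keep the raw Buffers; don't decode partial chunks.
  chunks.push(chunk);
});

pythonProcess.stdout.on('end', () => {
  // Decode the whole payload in one pass, then parse.
  const result = Buffer.concat(chunks).toString('utf8');
  try {
    console.log(JSON.parse(result));
  } catch (e) {
    console.log(result);
  }
});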