What if? Say you want to execute something in a Python subprocess and you want to read and resend all of its output, no matter how big it is?
The simplest solution, a pipe, is a BAD idea: a pipe has its own buffer, and what's worse, the size of that buffer is fixed, with no portable fcntl trick to grow it. Let's run a simple test:
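Something along these lines (a minimal sketch of such a test, not the original snippet; it fills a bare os.pipe that nobody reads):

    import os

    # Keep writing to a pipe with no reader on the other end and watch
    # where it stops: the write at ~64 kB never returns, because the
    # kernel buffer is full and the call blocks waiting for a reader.
    # (Ctrl-C to get out.)
    r, w = os.pipe()

    chunk = b"x" * 1024
    total = 0
    while True:
        os.write(w, chunk)
        total += len(chunk)
        print(f"sent {total // 1024} kB")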
On my laptop I can't send more than 64 kB at a time; worse, the write just hangs.
Hmmm... So how can you send more than that? Try this way:
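A sketch of the idea (my reconstruction, not the original snippet; the worker function and the 10 MB payload are made-up examples):

    import multiprocessing as mp

    def worker(shared):
        # 10 MB -- far beyond the 64 kB pipe limit. The Manager's server
        # process keeps reading on its end, so nothing ever blocks.
        shared.append(b"x" * (10 * 1024 * 1024))

    if __name__ == "__main__":
        with mp.Manager() as manager:
            shared = manager.list()
            p = mp.Process(target=worker, args=(shared,))
            p.start()
            p.join()
            print(f"got {len(shared[0]) // 1024} kB back")

The Manager proxies a regular list through a dedicated server process, so the payload size is bounded by RAM, not by the pipe buffer.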
Use a simple multiprocessing.Manager, man! More info here.
It is also possible to use shared memory objects (read this), but those are better suited to smaller amounts of data.
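For completeness, a sketch of that route (assuming multiprocessing.Value and Array are the shared memory objects meant here; the field names are made up):

    import multiprocessing as mp

    def worker(counter, buf):
        # Both objects live in shared memory: no pickling, no pipe.
        with counter.get_lock():
            counter.value += 1
        buf[:5] = b"hello"

    if __name__ == "__main__":
        counter = mp.Value("i", 0)   # a shared C int
        buf = mp.Array("c", 16)      # a fixed-size shared byte buffer
        p = mp.Process(target=worker, args=(counter, buf))
        p.start()
        p.join()
        print(counter.value, bytes(buf[:5]))

Note that the buffer size is declared up front and cannot grow, which is exactly why these objects suit small payloads rather than huge ones.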
Disclaimer: Before use, read the package insert or consult your physician or pharmacist, because any drug used improperly threatens your life or health.