stdout of child process as an input of another thread in Python?

+2 votes
994 views

I want to use the stdout of a child process as the input of another thread, but somehow I am not able to read the stdout. Using Popen I have created a child process and redirected its stdout to PIPE (because I don't want it printed along with my main process's output). Now, using the statement "Line = Proc.stdout.readline()", I want to read the child process's stdout and do an operation on it, but somehow I am not able to read from the child process's stdout.

Following is the code snippet:

from subprocess import Popen, PIPE

Proc = Popen(["python.exe", "filename.py"], stdout=PIPE)

while True:
    Line = Proc.stdout.readline()
    print Line

Here, filename.py (the child process) continuously prints "Hello World!!", but instead of reading the child's stdout, the loop above gets nothing.

posted May 29, 2015 by Chirag Gangdev

You have a loop called while True, which is an infinite loop. You have to break the loop to avoid it running forever. For example:
while True:
    n = raw_input("Please enter 'hello':")
    if n.strip() == 'hello':
        break
Yeah, that I know, sir.
I guess you didn't understand my question.
I want to use the stdout of the child process as the input of another thread,
but somehow I am not able to read the stdout.
Using Popen I have created a child process and redirected its stdout to PIPE (because I don't want it printed along with my main process's output).
Now, using the statement "Line = Proc.stdout.readline()", I want to read the child process's stdout and do an operation on it, but somehow I am not able to read from the child process's stdout.

2 Answers

+1 vote
 
Best answer

Er, yes, well that happens. In all likelihood, what's happening is that your OS has recognized that your subprocess's output isn't connected to a terminal. And therefore, for the sake of efficiency, it is buffering its output until either the program ends or the buffers are full. The buffers can be very big.

Arranging for the subprocess to end will likely help. If that's not practical and you have any control over it, arranging for it to flush its stdout or run unbuffered or the like may help, but not always.

The pty module:
https://docs.python.org/2/library/pty.htm

can also help to persuade your OS not to buffer a subprocess's output.
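
For illustration, a minimal Python 2 sketch of that idea, assuming the child is itself a Python script so it can be started with the interpreter's -u switch to run unbuffered; the reader thread, queue, and sentinel are just placeholder choices, not part of the original question:

from subprocess import Popen, PIPE
from threading import Thread
from Queue import Queue

def reader(pipe, queue):
    # Runs in a separate thread: forward each line of the child's stdout.
    for line in iter(pipe.readline, ''):
        queue.put(line)
    queue.put(None)              # sentinel: the child closed its stdout
    pipe.close()

# -u asks the child interpreter not to buffer its output, so lines
# reach the parent as soon as they are printed.
Proc = Popen(["python.exe", "-u", "filename.py"], stdout=PIPE)
lines = Queue()
Thread(target=reader, args=(Proc.stdout, lines)).start()

while True:
    Line = lines.get()           # blocks until the reader thread delivers a line
    if Line is None:
        break
    print Line.rstrip()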

answer May 29, 2015 by anonymous
+1 vote

Your description is not sufficiently clear:

If "filename.py" is your child process, why should it read the stdout of the child process (i.e. itself).

I suggest you describe what "filename.py" (your child) does and what behavior you observe in the parent.

answer May 30, 2015 by anonymous
Similar Questions
0 votes

I want to create a random float array of size 100, with the values in the array ranging from 0 to 5. I have tried random.sample(range(5), 100) but that does not work. How can I get what I want to achieve?
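
One simple way to get that, sketched with the standard random module (random.sample cannot do it, since it only picks distinct whole numbers from the given range):

import random

# 100 floats drawn uniformly from the interval [0, 5)
values = [random.uniform(0, 5) for _ in range(100)]
print values[:10]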

+2 votes

How can we send mail with an attachment in Python? Are there any prerequisites for it?
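
A minimal sketch with the standard smtplib and email packages; the addresses, file name, and SMTP server below are made up, and the only real prerequisite is an SMTP server you are allowed to send through:

import smtplib
from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

msg = MIMEMultipart()
msg["From"] = "me@example.com"
msg["To"] = "you@example.com"
msg["Subject"] = "Report attached"
msg.attach(MIMEText("Please see the attached file."))

# Attach an arbitrary file as a base64-encoded part.
part = MIMEBase("application", "octet-stream")
part.set_payload(open("report.pdf", "rb").read())
encoders.encode_base64(part)
part.add_header("Content-Disposition", "attachment", filename="report.pdf")
msg.attach(part)

server = smtplib.SMTP("smtp.example.com", 587)
server.starttls()
server.login("me@example.com", "password")
server.sendmail(msg["From"], [msg["To"]], msg.as_string())
server.quit()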

+2 votes

The problem description:

There is a set of processes running on my system, say process_set_1. Build a process agent which runs an asynchronous socket listening for incoming requests from process_set_1. Each process sends an id. The agent stores these ids in a dictionary and sends a response that the id has been accepted. Now the agent process receives some data from another process (which is a sort of sister process of the agent). This data contains an id along with a command which is to be sent by the agent to process_set_1 through an HTTP client over an AF_UNIX socket, since each process in process_set_1 has an HTTP-listening CLI. The process agent sends an HTTP request, by looking up the id stored in the dictionary, to process_set_1. A service running in process_set_1 routes this HTTP command to the respective process.

Now my problem is that the HTTP request to be sent must go through an AF_UNIX socket. I got this solution:

import httplib
import socket

class UnixStreamHTTPConnection(httplib.HTTPConnection):

    def __init__(self, path, host='localhost/rg_cli', port=None, strict=None,
                 timeout=None):
        httplib.HTTPConnection.__init__(self, host, port=port, strict=strict,
                                        timeout=timeout)
        self.path = path

    def connect(self):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self.path)

But this cannot work, since a normal blocking socket will not do, and thus I thought of the asyncore module in Python. To use the asyncore module I will again have to subclass asyncore.dispatcher. That class also contains a connect() method.

Another problem is that I don't know how the asyncore module works, and thus I am not able to find a way to combine the work of 1) listening forever, accepting connections, and storing the id and the sock_fd, and
2) accepting data from the agent's sister process, retrieving the sock_fd by matching the id in the dictionary, and sending the command through the AF_UNIX socket.

Please help, since I have already spent 2 days digging into it. Sorry if I could not explain my problem very well.
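
For what it is worth, a minimal sketch of the listening side with asyncore, just to show the shape of a dispatcher subclass; the socket path and the id protocol (each client sends its id as a single short message) are assumptions for illustration, not part of the original setup:

import asyncore
import socket

class AgentHandler(asyncore.dispatcher_with_send):
    # One handler per connected process; remembers the id it announces.
    def __init__(self, sock, registry):
        asyncore.dispatcher_with_send.__init__(self, sock)
        self.registry = registry

    def handle_read(self):
        proc_id = self.recv(1024).strip()
        if proc_id:
            self.registry[proc_id] = self          # keep the handler so a command can be sent later
            self.send("id %s accepted\n" % proc_id)

class AgentServer(asyncore.dispatcher):
    # Listens on an AF_UNIX socket and hands each connection to AgentHandler.
    def __init__(self, path, registry):
        asyncore.dispatcher.__init__(self)
        self.registry = registry
        self.create_socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.bind(path)                            # the path must not already exist
        self.listen(5)

    def handle_accept(self):
        pair = self.accept()
        if pair is not None:
            sock, _addr = pair
            AgentHandler(sock, self.registry)

registry = {}
AgentServer("/tmp/agent.sock", registry)           # made-up socket path
asyncore.loop()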

0 votes

Hi,

I have a list of arbitrary length, and I need to split it up into equal-size chunks. There are some obvious ways to do this, like keeping a counter and two lists, and when the second list fills up, adding it to the first list and emptying the second list for the next round of data, but this is potentially extremely expensive.

I was wondering if anyone had a good solution to this for lists of any length.

This should work:

l = range(1, 1000)
print chunks(l, 10) -> [ [ 1..10 ], [ 11..20 ], .., [ 991..999 ] ]

I was looking for something useful in itertools but I couldn't find anything obviously useful.

Appreciate your help.
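
One common sketch for such a chunks helper (the name simply matches the call above), using plain slicing:

def chunks(seq, size):
    # Split seq into consecutive pieces of at most size elements each.
    return [seq[i:i + size] for i in range(0, len(seq), size)]

l = range(1, 1000)
print chunks(l, 10)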

+1 vote

For example:

a=[-15,-30,-10,1,3,5]

I want to find a negative and a positive minimum.

Example:
negative minimum -> -30
positive minimum -> 1
...
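
One way to get both values, sketched with generator expressions (this assumes the list contains at least one negative and at least one positive number):

a = [-15, -30, -10, 1, 3, 5]

negative_min = min(x for x in a if x < 0)   # -30
positive_min = min(x for x in a if x > 0)   # 1
print(negative_min, positive_min)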