Python: How can I run python functions in parallel?


I researched first and couldn't find an answer to my question. I am trying to run multiple functions in parallel in Python. I have something like this:

files.py:

import time
import common  # common is a util class that handles all the IO stuff

dir1 = 'C:\\folder1'
dir2 = 'C:\\folder2'
filename = 'test.txt'
addFiles = [25, 5, 15, 35, 45, 25, 5, 15, 35, 45]

def func1():
    c = common.Common()
    for i in range(len(addFiles)):
        c.createFiles(addFiles[i], filename, dir1)
        c.getFiles(dir1)
        time.sleep(10)
        c.removeFiles(addFiles[i], dir1)
        c.getFiles(dir1)

def func2():
    c = common.Common()
    for i in range(len(addFiles)):
        c.createFiles(addFiles[i], filename, dir2)
        c.getFiles(dir2)
        time.sleep(10)
        c.removeFiles(addFiles[i], dir2)
        c.getFiles(dir2)

The functions do not interact with each other or operate on the same object. Right now I have to wait for func1 to finish before func2 starts. How do I do something like the following:

process.py:

from files import func1, func2

runBothFunc(func1(), func2())

I want to be able to create both directories pretty close to the same time, because every minute I am counting how many files are being created. If a directory isn't there yet, it will throw off my timing.

asked Aug 26 '11 at 15:46 – Lamar McAdory

docs.python.org – wiso Aug 26 '11 at 16:58

You might want to re-architect this; if you are counting the number of files/folders every minute, you are creating a race condition. What about having each function update a counter, or using a lockfile to ensure that the periodic process doesn't update the count until both functions have finished executing? – Phoenix Aug 26 '11 at 17:02
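One way to implement the counter suggestion from the comment above is a value in shared memory that both workers update under a lock. A rough sketch, assuming the real work happens inside the loop body (worker and n_files are illustrative names, not from the question):

from multiprocessing import Process, Value, Lock

def worker(counter, lock, n_files):
    # illustrative stand-in for func1/func2: bump the shared count per file created
    for _ in range(n_files):
        with lock:
            counter.value += 1

if __name__ == '__main__':
    counter = Value('i', 0)  # an integer living in shared memory
    lock = Lock()
    procs = [Process(target=worker, args=(counter, lock, 10)) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    # the periodic reporter would read counter.value instead of listing directories
    print('files created: %d' % counter.value)

With this, the count no longer depends on a directory existing at the moment the periodic check runs, which removes the race the commenter describes.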

You could use threading or multiprocessing. Because of CPython's Global Interpreter Lock (GIL), threading is unlikely to achieve true parallelism for CPU-bound work. For this reason, multiprocessing is generally a better bet.
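(For completeness: the asker's functions spend most of their time on file I/O and time.sleep, during which the GIL is released, so a threading version of the same idea would also work. This is a sketch, not part of the original answer; runBothFunc is the asker's own hypothetical name.)

import threading

def runBothFunc(*fns):
    # threading variant: adequate for I/O-bound work such as creating and
    # deleting files, but the GIL prevents true parallelism for CPU-bound code
    threads = [threading.Thread(target=fn) for fn in fns]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

# usage: runBothFunc(func1, func2)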

Here is a complete example:

from multiprocessing import Process

def func1():
    print 'func1: starting'
    for i in xrange(10000000):
        pass
    print 'func1: finishing'

def func2():
    print 'func2: starting'
    for i in xrange(10000000):
        pass
    print 'func2: finishing'

if __name__ == '__main__':
    p1 = Process(target=func1)
    p1.start()
    p2 = Process(target=func2)
    p2.start()
    p1.join()
    p2.join()

The mechanics of starting/joining child processes can easily be encapsulated into a function along the lines of your runBothFunc:

def runInParallel(*fns):
    proc = []
    for fn in fns:
        p = Process(target=fn)
        p.start()
        proc.append(p)
    for p in proc:
        p.join()

runInParallel(func1, func2)
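Applied to the asker's layout, a minimal process.py built on this helper might look like the sketch below. It assumes files.py from the question is importable; the __main__ guard matters on Windows, where multiprocessing starts each child by re-importing the main module.

# process.py -- a sketch, reusing the runInParallel helper from the answer above
from multiprocessing import Process
from files import func1, func2

def runInParallel(*fns):
    # start each function in its own process, then wait for all of them
    procs = []
    for fn in fns:
        p = Process(target=fn)
        p.start()
        procs.append(p)
    for p in procs:
        p.join()

if __name__ == '__main__':
    # note: pass the functions themselves, not func1()/func2() as written in the
    # question -- calling them here would run them sequentially before anything starts
    runInParallel(func1, func2)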

I used your code but the functions still didn't start at the same time. – Lamar McAdory Aug 26 '11 at 16:26

@Lamar McAdory: Please explain what exactly you mean by "at the same time", perhaps giving a concrete example of what you did, what you were expecting to happen, and what actually happened. – aix Aug 26 '11 at 16:30

@Lamar: You can never have any guarantee of "exactly the same time", and thinking you can is just plain wrong. How many CPUs you have, the load on the machine, and the timing of many other things happening on the computer all influence when the threads/processes start. Also, since the processes are started right after creation, the overhead of creating a process has to be counted in the time difference you see. – Martin Aug 26 '11 at 16:41

Ok, I will edit my question and use the actual code. – Lamar McAdory Aug 26 '11 at 16:46

@Lamar McAdory: There is no way to ensure perfect synchronicity of the execution of two functions. Perhaps it is worth re-evaluating the overall approach to see if there's a better way to achieve what you're trying to do. – aix Aug 26 '11 at 20:02

There's no way to guarantee that two functions will execute in sync with each other, which seems to be what you want to do. The best you can do is to split the functions into several steps, then wait for both to finish at critical synchronization points using Process.join(), as @aix's answer mentions.

This is better than time.sleep(10) because you can't guarantee exact timings. With explicit waiting, you're saying that the functions must be done executing a step before either moves on to the next, instead of assuming a step will be done within 10 seconds, which isn't guaranteed given whatever else is going on on the machine.
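A minimal sketch of that idea, assuming the asker's functions can be split per directory into a "create" step and a "remove" step (step_one, step_two and run_step are illustrative names, not code from either answer):

from multiprocessing import Process

def step_one(d):
    # hypothetical first half of the work, e.g. creating files in directory d
    print('step one done in %s' % d)

def step_two(d):
    # hypothetical second half, e.g. removing the files again
    print('step two done in %s' % d)

def run_step(fn):
    # run fn for both directories at once and wait for both to finish
    procs = [Process(target=fn, args=(d,)) for d in ('C:\\folder1', 'C:\\folder2')]
    for p in procs:
        p.start()
    for p in procs:
        p.join()  # synchronization point: neither directory races ahead

if __name__ == '__main__':
    run_step(step_one)  # both directories finish step one...
    run_step(step_two)  # ...before either starts step two

Each call to run_step only returns once both processes have joined, so the two directories stay in lockstep at every step boundary.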
