Does multiprocessing.pool.imap have a variant (like starmap) that allows for multiple arguments?


I am doing some calculations on large collections of bytes. The process runs on chunks of bytes, and I am trying to use multiprocessing for a performance boost. Initially I tried pool.map, but that only allows a single argument; then I found pool.starmap. But pool.starmap only returns results once all the processes have finished, and I want results as they come in (sort of). pool.imap does provide results as processes finish, but it does not allow multiple arguments (my function requires two). Also, the order of the results is important.

Some sample code below:

import multiprocessing as mp
from itertools import repeat

pool = mp.Pool(processes=4)
y = []
for x in pool.starmap(f, zip(da, repeat(db))):
    y.append(x)

The above code works, but it only yields results once all the processes have completed, so I cannot see any progress. This is why I tried pool.imap, which works well but only with a single argument:

pool = mp.Pool(processes=4)
y = []
for x in pool.imap(f, da):
    y.append(x)

With multiple arguments, it raises the following exception:

TypeError: f() missing 1 required positional argument: 'd'

I am looking for a simple way to achieve all 3 requirements:

  1. parallel processing using multiple parameters/arguments
  2. the ability to see progress while the processes are running
  3. ordered results.



I can answer the first two questions pretty quickly. I think you should be able to handle the third after understanding the first two.

1. Parallel Processing with Multiple Arguments

I’m not sure about an exact “starmap” equivalent, but here’s an alternative. What I’ve done in the past is condense my arguments into a single data object, like a list. For example, if you want to pass three arguments to your map_function, you can pack them into a list and then pass that list to the .map() or .imap() function.

from multiprocessing import Pool

def map_function(combo):
    # unpack the packed arguments
    a = combo[0]
    b = combo[1]
    c = combo[2]
    return a + b + c

if __name__ == '__main__':
    # pack the three arguments into a single list
    combo = [arg_1, arg_2, arg_3]

    pool = Pool(processes=4)
    results = pool.map(map_function, [combo])

2. Tracking Progress

A good way to do this is using multiprocessing's shared Value. I actually asked almost exactly this question about a month ago. This allows you to manipulate the same variable from the different processes created by your map function. For the sake of learning, I'm going to let you read and figure out the shared-state solution on your own. If you're still having trouble after a few attempts, I'll be more than happy to help you, but I believe that teaching yourself how to understand something is much more valuable than me giving you the answer.

Hope this helps!!

Answered By – Austin A

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
