Sharing a single MongoDB connection between all Celery workers

Issue

I have a series of Celery workers which carry out tasks, with Redis as the broker; in particular, I have three categories of workers which respectively carry out three categories of tasks. The tasks in each category require access to a MongoDB database. For efficiency, I would like a single connection to the database to be shared by all workers. So far I have tried passing the connection as an argument to "send_task":

myclient = MongoClient('localhost:27017')
celeryWorker.send_task('tasks.beampolyline', args=[myclient])

but this fails because the MongoClient object is not JSON serializable. I have also tried sharing the myclient object among all the workers directly, with poor results. Any ideas? I feel I am very close to the solution but I am stuck on both approaches.
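The serialization failure can be reproduced without Celery at all: the default task serializer is JSON, which only encodes plain data types. A minimal sketch, using a hypothetical FakeClient class as a stand-in for MongoClient (an object wrapping a live resource):

```python
import json

# Celery's default serializer is JSON, so task arguments must be plain
# data (dicts, lists, strings, numbers). Plain arguments encode fine:
task_args = {"collection": "polylines", "doc_id": 42}
print(json.dumps(task_args))  # OK

class FakeClient:
    """Hypothetical stand-in for MongoClient: wraps a live resource."""
    def __init__(self):
        self.sock = object()  # json has no encoding for arbitrary objects

try:
    json.dumps(FakeClient())
except TypeError as exc:
    # Same class of error Celery reports when a client object is passed
    # as a task argument.
    print("rejected:", exc)
```

This is why a connection object cannot travel through the broker: only the data needed to locate documents should be sent, and the connection must live on the worker side.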

Solution

One solution is to define the connection at module level. Suppose worker.py is your worker module:

from pymongo import MongoClient

# Created once, when the worker module is imported
shared_connection = MongoClient('localhost:27017')

@app.task(bind=True)
def task(self, a, b):
    ...  # use shared_connection inside the task
    return

In this solution, the connection object is shared between the threads of a worker process (with the prefork pool, each worker process gets its own copy when the module is imported).
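The sharing behaviour can be sketched with the standard library alone. FakeMongoClient below is a hypothetical stand-in for pymongo's MongoClient; the point is that a module-level object is constructed exactly once at import time and every task thread then reuses it:

```python
import threading

class FakeMongoClient:
    """Hypothetical stand-in for pymongo.MongoClient; counts
    constructions so we can verify only one client is ever made."""
    instances = 0
    def __init__(self):
        FakeMongoClient.instances += 1

# Module-level client: constructed exactly once, when the worker
# module is imported, no matter how many task threads run later.
shared_connection = FakeMongoClient()

seen = []
def task():
    # Every "task" sees the very same object.
    seen.append(id(shared_connection))

threads = [threading.Thread(target=task) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(FakeMongoClient.instances)  # 1: only one client was created
print(len(set(seen)))             # 1: all threads used the same object
```

With the real MongoClient the same pattern applies, and pymongo's client additionally maintains its own internal connection pool, so concurrent tasks in one process can safely share it.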

Answered By – Pouya Esmaeili

This answer, collected from Stack Overflow, is licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
