Setting up a large Flask application using the factory pattern is very convenient, because it prevents code from being run at import time and provides a more flexible way to set up the application.
Celery is a great, must-have tool for running asynchronous tasks, but it can be a little tricky to configure in a large application.
Problem
Imagine the following project layout.
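One possible layout along these lines (all file and module names here are assumed for illustration):

```
project/
├── controllers.py
├── extensions.py
├── tasks.py
├── worker.py
└── factories/
    ├── __init__.py
    ├── application.py
    └── celery.py
```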
A simple Flask app is configured in factories.application:
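A minimal sketch of such a factory, assuming the module path `factories/application.py` and the function name `create_app`:

```python
# factories/application.py -- hypothetical module path and names
from flask import Flask


def create_app(config=None):
    """Application factory: nothing runs at import time."""
    app = Flask(__name__)
    app.config.update(config or {})
    # Blueprints, extensions, etc. would be registered here,
    # inside the factory, rather than at import time.
    return app
```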
In factories.celery, please note that we need to override TaskBase so Celery can access the Flask application context, such as current_app:
tasks.py just contains a basic task. Please note that the task is bound to the Celery instance. @shared_task could have been used, but this approach is less magical:
If controllers import tasks.py, there is a circular import.
Solutions
1. Import tasks inside view functions. This violates PEP 8.
2. Import tasks using importlib.
3. Change factories.celery and tasks to point to a Celery instance placeholder, which will be configured by the factory.
Personally, I prefer the last option because it is clean, flexible, and comes at little cost.
Introducing Celery placeholder
Let’s add a file called extensions, where the Celery instance will be initialized at import time:
This instance is a placeholder: it doesn’t know which broker to use yet, but it will be configured in the factories:
tasks.py:
And in controllers:
And a file with the Celery worker:
Conclusion
The only disadvantage of this approach is that the Celery instance is a singleton created at import time, but this should not be a problem for most applications.