Setting up a large Flask application using the factory pattern is very convenient, because it prevents code from being run at import time and provides a more flexible way to set up the application.
Celery is a great, must-have tool for running asynchronous tasks, but it can be a little tricky to configure in a large application.
Imagine the following project layout:
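The original layout isn't reproduced here; a plausible structure for such a project (file and module names are illustrative) might be:

```
project/
├── extensions.py        # Celery instance placeholder
├── factories/
│   ├── __init__.py
│   └── celery.py        # application factory, configures the Celery instance
├── controllers.py       # Flask views
├── tasks.py             # Celery tasks
└── worker.py            # Celery worker entry point
```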
A simple Flask app is configured in `factories.celery`. Please note that we need to override `TaskBase` so that Celery tasks can access the Flask app context.
`tasks.py` just contains a basic task. Please note that the task is bound to the Celery instance. `@shared_task` could’ve been used, but this approach is less magical:
If controllers import `tasks.py`, there’s a circular import. There are a few ways around it:
- Import tasks inside a view function. This violates PEP 8.
- Import tasks using `importlib`.
- Make `tasks` point to a Celery instance placeholder, which will be configured by the factory.
Personally, I prefer the last option because it is clean, flexible, and comes at little cost.
Introducing the Celery placeholder
Let’s add a file called `extensions`, where the Celery instance is initialized at import time:
This instance is a placeholder: it doesn’t know which broker to use yet, but it will be configured in the factories:
And in controllers:
And a file with the Celery worker:
The only disadvantage of this approach is that the Celery instance is a singleton created at import time, but that should not be a problem for most applications.
A working proof of concept is available on GitHub.