app.celery
Description
Runs Celery and registers Celery tasks.
- class app.celery.ContextTask
- AsyncResult(task_id, **kwargs)
Get AsyncResult instance for the specified task.
- Parameters
task_id (str) – Task id to get result for.
- exception MaxRetriesExceededError(*args, **kwargs)
The task's max retry limit has been exceeded.
- args
- with_traceback()
Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.
- exception OperationalError
Recoverable message transport connection error.
- args
- with_traceback()
Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.
- Request = 'celery.worker.request:Request'
Request class used, or the qualified name of one.
- Strategy = 'celery.worker.strategy:default'
Execution strategy used, or the qualified name of one.
- _app = <MyCelery __main__>
The application instance associated with this task class.
- _backend = None
- _default_request = None
Some may expect a request to exist even if the task hasn’t been called. This should probably be deprecated.
- _exec_options = None
- classmethod _get_app()
- _get_exec_options()
- _get_request()
Get current request object.
- abstract = True
Deprecated attribute abstract; kept here for compatibility.
- acks_late = False
When enabled messages for this task will be acknowledged after the task has been executed, and not just before (the default behavior).
Please note that this means the task may be executed twice if the worker crashes mid execution.
The application default can be overridden with the :setting:`task_acks_late` setting.
- acks_on_failure_or_timeout = True
When enabled messages for this task will be acknowledged even if it fails or times out.
Configuring this setting only applies to tasks that are acknowledged after they have been executed and only if :setting:`task_acks_late` is enabled.
The application default can be overridden with the :setting:`task_acks_on_failure_or_timeout` setting.
- classmethod add_around(attr, around)
- add_to_chord(sig, lazy=False)
Add signature to the chord the current task is a member of.
New in version 4.0.
Currently only supported by the Redis result backend.
- Parameters
sig (Signature) – Signature to extend chord with.
lazy (bool) – If enabled the new task won't actually be called, and sig.delay() must be called manually.
- add_trail(result)
- after_return(status, retval, task_id, args, kwargs, einfo)
Handler called after the task returns.
- Parameters
status (str) – Current task state.
retval (Any) – Task return value/exception.
task_id (str) – Unique id of the task.
args (Tuple) – Original arguments for the task.
kwargs (Dict) – Original keyword arguments for the task.
einfo (ExceptionInfo) – Exception information.
- Returns
The return value of this handler is ignored.
- Return type
None
- classmethod annotate()
- app = <MyCelery __main__>
- apply(args=None, kwargs=None, link=None, link_error=None, task_id=None, retries=None, throw=None, logfile=None, loglevel=None, headers=None, **options)
Execute this task locally, by blocking until the task returns.
- Parameters
args (Tuple) – positional arguments passed on to the task.
kwargs (Dict) – keyword arguments passed on to the task.
throw (bool) – Re-raise task exceptions. Defaults to the :setting:`task_eager_propagates` setting.
- Returns
pre-evaluated result.
- Return type
celery.result.EagerResult
- apply_async(args=None, kwargs=None, task_id=None, producer=None, link=None, link_error=None, shadow=None, **options)
Apply tasks asynchronously by sending a message.
- Parameters
args (Tuple) – The positional arguments to pass on to the task.
kwargs (Dict) – The keyword arguments to pass on to the task.
countdown (float) – Number of seconds into the future that the task should execute. Defaults to immediate execution.
eta (datetime) – Absolute time and date of when the task should be executed. May not be specified if countdown is also supplied.
expires (float, datetime) – Datetime or seconds in the future when the task should expire. The task won't be executed after the expiration time.
shadow (str) – Override task name used in logs/monitoring. Default is retrieved from shadow_name().
connection (kombu.Connection) – Re-use existing broker connection instead of acquiring one from the connection pool.
retry (bool) – If enabled sending of the task message will be retried in the event of connection loss or failure. Default is taken from the :setting:`task_publish_retry` setting. Note that you need to handle the producer/connection manually for this to work.
retry_policy (Mapping) – Override the retry policy used. See the :setting:`task_publish_retry_policy` setting.
time_limit (int) – If set, overrides the default time limit.
soft_time_limit (int) – If set, overrides the default soft time limit.
queue (str, kombu.Queue) – The queue to route the task to. This must be a key present in :setting:`task_queues`, or :setting:`task_create_missing_queues` must be enabled. See guide-routing for more information.
exchange (str, kombu.Exchange) – Named custom exchange to send the task to. Usually not used in combination with the queue argument.
routing_key (str) – Custom routing key used to route the task to a worker server. If used in combination with a queue argument, only used to specify custom routing keys to topic exchanges.
priority (int) – The task priority, a number between 0 and 9. Defaults to the priority attribute.
serializer (str) – Serialization method to use. Can be pickle, json, yaml, msgpack or any custom serialization method that's been registered with kombu.serialization.registry. Defaults to the serializer attribute.
compression (str) – Optional compression method to use. Can be one of zlib, bzip2, or any custom compression methods registered with kombu.compression.register(). Defaults to the :setting:`task_compression` setting.
link (Signature) – A single, or a list of task signatures to apply if the task returns successfully.
link_error (Signature) – A single, or a list of task signatures to apply if an error occurs while executing the task.
producer (kombu.Producer) – Custom producer to use when publishing the task.
add_to_parent (bool) – If set to True (default) and the task is applied while executing another task, then the result will be appended to the parent task's request.children attribute. Trailing can also be disabled by default using the trail attribute.
ignore_result (bool) – If set to False (default) the result of a task will be stored in the backend. If set to True the result will not be stored. This can also be set using the ignore_result option in the app.task decorator.
publisher (kombu.Producer) – Deprecated alias to producer.
headers (Dict) – Message headers to be included in the message.
- Returns
Promise of future evaluation.
- Return type
celery.result.AsyncResult
- Raises
TypeError – If not enough arguments are passed, or too many arguments are passed. Note that signature checks may be disabled by specifying @task(typing=False).
kombu.exceptions.OperationalError – If a connection to the transport cannot be made, or if the connection is lost.
Note
Also supports all keyword arguments supported by kombu.Producer.publish().
- property backend
- before_start(task_id, args, kwargs)
Handler called before the task starts.
New in version 5.2.
- Parameters
task_id (str) – Unique id of the task to execute.
args (Tuple) – Original arguments for the task to execute.
kwargs (Dict) – Original keyword arguments for the task to execute.
- Returns
The return value of this handler is ignored.
- Return type
None
- classmethod bind(app)
- chunks(it, n)
Create a chunks task for this task.
- default_retry_delay = 180
Default time in seconds before a retry of the task should be executed. 3 minutes by default.
- delay(*args, **kwargs)
Star argument version of apply_async(). Does not support the extra options enabled by apply_async().
- Parameters
*args (Any) – Positional arguments passed on to the task.
**kwargs (Any) – Keyword arguments passed on to the task.
- Returns
Future promise.
- Return type
celery.result.AsyncResult
- expires = None
Default task expiry time.
- from_config = (('serializer', 'task_serializer'), ('rate_limit', 'task_default_rate_limit'), ('priority', 'task_default_priority'), ('track_started', 'task_track_started'), ('acks_late', 'task_acks_late'), ('acks_on_failure_or_timeout', 'task_acks_on_failure_or_timeout'), ('reject_on_worker_lost', 'task_reject_on_worker_lost'), ('ignore_result', 'task_ignore_result'), ('store_eager_result', 'task_store_eager_result'), ('store_errors_even_if_ignored', 'task_store_errors_even_if_ignored'))
- ignore_result = False
If enabled the worker won’t store task state and return values for this task. Defaults to the :setting:`task_ignore_result` setting.
- map(it)
Create an xmap task from it.
- max_retries = 3
Maximum number of retries before giving up. If set to None, it will never stop retrying.
- name = None
Name of the task.
- classmethod on_bound(app)
Called when the task is bound to an app.
Note
This class method can be defined to do additional actions when the task class is bound to an app.
- on_failure(exc, task_id, args, kwargs, einfo) → None
Error handler.
This is run by the worker when the task fails.
- Parameters
exc (Exception) – The exception raised by the task.
task_id (str) – Unique id of the failed task.
args (Tuple) – Original arguments for the task that failed.
kwargs (Dict) – Original keyword arguments for the task that failed.
einfo (ExceptionInfo) – Exception information.
- Returns
The return value of this handler is ignored.
- Return type
None
- on_retry(exc, task_id, args, kwargs, einfo)
Retry handler.
This is run by the worker when the task is to be retried.
- Parameters
exc (Exception) – The exception sent to retry().
task_id (str) – Unique id of the retried task.
args (Tuple) – Original arguments for the retried task.
kwargs (Dict) – Original keyword arguments for the retried task.
einfo (ExceptionInfo) – Exception information.
- Returns
The return value of this handler is ignored.
- Return type
None
- on_success(retval, task_id, args, kwargs)
Success handler.
Run by the worker if the task executes successfully.
- Parameters
retval (Any) – The return value of the task.
task_id (str) – Unique id of the executed task.
args (Tuple) – Original arguments for the executed task.
kwargs (Dict) – Original keyword arguments for the executed task.
- Returns
The return value of this handler is ignored.
- Return type
None
- pop_request()
- priority = None
Default task priority.
- push_request(*args, **kwargs)
- rate_limit = None
Rate limit for this task type. Examples: None (no rate limit), '100/s' (hundred tasks a second), '100/m' (hundred tasks a minute), '100/h' (hundred tasks an hour).
- reject_on_worker_lost = None
Even if acks_late is enabled, the worker will acknowledge tasks when the worker process executing them abruptly exits or is signaled (e.g., :sig:`KILL`/:sig:`INT`, etc.).
Setting this to true allows the message to be re-queued instead, so that the task will execute again by the same worker, or another worker.
Warning: Enabling this can cause message loops; make sure you know what you’re doing.
- replace(sig)
Replace this task, with a new task inheriting the task id.
Execution of the host task ends immediately and no subsequent statements will be run.
New in version 4.0.
- Parameters
sig (Signature) – signature to replace with.
- Raises
~@Ignore – This is always raised when called in asynchronous context. It is best to always use return self.replace(...) to convey to the reader that the task won't continue after being replaced.
- property request
Get current request object.
- request_stack = <celery.utils.threads._LocalStack object>
Task request stack, the current request will be the topmost.
- resultrepr_maxsize = 1024
Max length of result representation used in logs and events.
- retry(args=None, kwargs=None, exc=None, throw=True, eta=None, countdown=None, max_retries=None, **options)
Retry the task, adding it to the back of the queue.
Example
>>> from imaginary_twitter_lib import Twitter
>>> from proj.celery import app
>>> @app.task(bind=True)
... def tweet(self, auth, message):
...     twitter = Twitter(oauth=auth)
...     try:
...         twitter.post_status_update(message)
...     except twitter.FailWhale as exc:
...         # Retry in 5 minutes.
...         raise self.retry(countdown=60 * 5, exc=exc)
Note
Although the task will never return above as retry raises an exception to notify the worker, we use raise in front of the retry to convey that the rest of the block won’t be executed.
- Parameters
args (Tuple) – Positional arguments to retry with.
kwargs (Dict) – Keyword arguments to retry with.
exc (Exception) – Custom exception to report when the max retry limit has been exceeded (default: @MaxRetriesExceededError).
If this argument is set and retry is called while an exception was raised (sys.exc_info() is set) it will attempt to re-raise the current exception.
If no exception was raised it will raise the exc argument provided.
countdown (float) – Time in seconds to delay the retry for.
eta (datetime) – Explicit time and date to run the retry at.
max_retries (int) – If set, overrides the default retry limit for this execution. Changes to this parameter don't propagate to subsequent task retry attempts. A value of None means "use the default", so if you want infinite retries you'd have to set the max_retries attribute of the task to None first.
time_limit (int) – If set, overrides the default time limit.
soft_time_limit (int) – If set, overrides the default soft time limit.
throw (bool) – If this is False, don't raise the @Retry exception, that tells the worker to mark the task as being retried. Note that this means the task will be marked as failed if the task raises an exception, or successful if it returns after the retry call.
**options (Any) – Extra options to pass on to apply_async().
- Raises
celery.exceptions.Retry – To tell the worker that the task has been re-sent for retry. This always happens, unless the throw keyword argument has been explicitly set to False, and is considered normal operation.
- run(*args, **kwargs)
The body of the task executed by workers.
- s(*args, **kwargs)
Create signature.
Shortcut for .s(*a, **k) -> .signature(a, k).
- send_event(type_, retry=True, retry_policy=None, **fields)
Send monitoring event message.
This can be used to add custom event types in :pypi:`Flower` and other monitors.
- Parameters
type_ (str) – Type of event, e.g. "task-failed".
- Keyword Arguments
retry (bool) – Retry sending the message if the connection is lost. Default is taken from the :setting:`task_publish_retry` setting.
retry_policy (Mapping) – Retry settings. Default is taken from the :setting:`task_publish_retry_policy` setting.
**fields (Any) – Map containing information about the event. Must be JSON serializable.
- send_events = True
If enabled the worker will send monitoring events related to this task (but only if the worker is configured to send task related events). Note that this has no effect on the task-failure event case where a task is not registered (as it will have no task class to check this flag).
- serializer = 'json'
The name of a serializer that is registered with kombu.serialization.registry. Default is 'json'.
- shadow_name(args, kwargs, options)
Override for custom task name in worker logs/monitoring.
Example
from celery.utils.imports import qualname

def shadow_name(task, args, kwargs, options):
    return qualname(args[0])

@app.task(shadow_name=shadow_name, serializer='pickle')
def apply_function_async(fun, *args, **kwargs):
    return fun(*args, **kwargs)
- Parameters
args (Tuple) – Task positional arguments.
kwargs (Dict) – Task keyword arguments.
options (Dict) – Task execution options.
- si(*args, **kwargs)
Create immutable signature.
Shortcut for .si(*a, **k) -> .signature(a, k, immutable=True).
- signature(args=None, *starargs, **starkwargs)
Create signature.
- Returns
signature object for this task, wrapping arguments and execution options for a single task invocation.
- Return type
signature
- signature_from_request(request=None, args=None, kwargs=None, queue=None, **extra_options)
- soft_time_limit = None
Soft time limit. Defaults to the :setting:`task_soft_time_limit` setting.
- starmap(it)
Create an xstarmap task from it.
- start_strategy(app, consumer, **kwargs)
- store_eager_result = False
- store_errors_even_if_ignored = False
When enabled errors will be stored even if the task is otherwise configured to ignore results.
- subtask(args=None, *starargs, **starkwargs)
Create signature.
- Returns
signature object for this task, wrapping arguments and execution options for a single task invocation.
- Return type
signature
- subtask_from_request(request=None, args=None, kwargs=None, queue=None, **extra_options)
- throws = ()
Tuple of expected exceptions.
These are errors that are expected in normal operation and that shouldn’t be regarded as a real error by the worker. Currently this means that the state will be updated to an error state, but the worker won’t log the event as an error.
- time_limit = None
Hard time limit. Defaults to the :setting:`task_time_limit` setting.
- track_started = False
If enabled the task will report its status as ‘started’ when the task is executed by a worker. Disabled by default as the normal behavior is to not report that level of granularity. Tasks are either pending, finished, or waiting to be retried.
Having a ‘started’ status can be useful for when there are long running tasks and there’s a need to report what task is currently running.
The application default can be overridden using the :setting:`task_track_started` setting.
- trail = True
If enabled the request will keep track of subtasks started by this task, and this information will be sent with the result (result.children).
- typing = True
Enable argument checking. You can set this to false if you don't want the signature to be checked when calling the task. Defaults to app.strict_typing.
- update_state(task_id=None, state=None, meta=None, **kwargs)
Update task state.
- Parameters
task_id (str) – Id of the task to update. Defaults to the id of the current task.
state (str) – New state.
meta (Dict) – State meta-data.
- class app.celery.MyCelery(main=None, loader=None, backend=None, amqp=None, events=None, log=None, control=None, set_as_current=True, tasks=None, broker=None, include=None, changes=None, config_source=None, fixups=None, task_cls=None, autofinalize=True, namespace=None, strict_typing=True, **kwargs)
- AsyncResult
Create new result instance.
See also
celery.result.AsyncResult
.
- Beat
celery beat scheduler application.
See also
@Beat
.
- GroupResult
Create new group result instance.
See also
celery.result.GroupResult
.
- IS_WINDOWS = False
- IS_macOS = False
- Pickler
alias of
celery.app.utils.AppPickler
- ResultSet
- SYSTEM = 'Linux'
- Task
Base task class for this app.
- WorkController
Embeddable worker.
See also
@WorkController
.
- Worker
Worker application.
See also
@Worker
.
- _acquire_connection(pool=True)
Helper for connection_or_acquire().
- _add_periodic_task(key, entry)
- _after_fork()
- _after_fork_registered = False
- _autodiscover_tasks(packages, related_name, **kwargs)
- _autodiscover_tasks_from_fixups(related_name)
- _autodiscover_tasks_from_names(packages, related_name)
- _canvas
- _conf = None
- _connection(url, userid=None, password=None, virtual_host=None, port=None, ssl=None, connect_timeout=None, transport=None, transport_options=None, heartbeat=None, login_method=None, failover_strategy=None, **kwargs)
- _ensure_after_fork()
- _finalize_pending_conf()
Get config value by key and finalize loading the configuration.
Note
This is used by PendingConfiguration: as soon as you access a key the configuration is read.
- _fixups = None
- _get_backend()
- _get_default_loader()
- _load_config()
- _local = None
Thread local storage.
- _pool = None
- _rgetattr(path)
- _sig_to_periodic_task_entry(schedule, sig, args=(), kwargs=None, name=None, **opts)
- _task_from_fun(fun, name=None, base=None, bind=False, **options)
- add_defaults(fun)
Add default configuration from dict d.
If the argument is a callable function then it will be regarded as a promise, and it won't be loaded until the configuration is actually needed.
This method can be compared to:
>>> celery.conf.update(d)
with a difference that 1) no copy will be made and 2) the dict will not be transferred when the worker spawns child processes, so it’s important that the same configuration happens at import time when pickle restores the object on the other side.
- add_periodic_task(schedule, sig, args=(), kwargs=(), name=None, **opts)
- amqp
AMQP related functionality.
- Type
@amqp
- amqp_cls = 'celery.app.amqp:AMQP'
- annotations
- autodiscover_tasks(packages=None, related_name='tasks', force=False)
Auto-discover task modules.
Searches a list of packages for a "tasks.py" module (or the module named by the related_name argument).
If the name is empty, this will be delegated to fix-ups (e.g., Django).
For example if you have a directory layout like this:
foo/
    __init__.py
    tasks.py
    models.py
bar/
    __init__.py
    tasks.py
    models.py
baz/
    __init__.py
    models.py
Then calling app.autodiscover_tasks(['foo', 'bar', 'baz']) will result in the modules foo.tasks and bar.tasks being imported.
- Parameters
packages (List[str]) – List of packages to search. This argument may also be a callable, in which case the value returned is used (for lazy evaluation).
related_name (Optional[str]) – The name of the module to find. Defaults to "tasks": meaning "look for 'module.tasks' for every module in packages". If None, will only try to import the package, i.e. "look for 'module'".
force (bool) – By default this call is lazy so that the actual auto-discovery won't happen until an application imports the default modules. Forcing will cause the auto-discovery to happen immediately.
- property backend
Current backend instance.
- backend_cls = None
- broker_connection(hostname=None, userid=None, password=None, virtual_host=None, port=None, ssl=None, connect_timeout=None, transport=None, transport_options=None, heartbeat=None, login_method=None, failover_strategy=None, **kwargs)
Establish a connection to the message broker.
Please use connection_for_read() and connection_for_write() instead, to convey the intent of use for this connection.
- Parameters
url – Either the URL or the hostname of the broker to use.
hostname (str) – URL, Hostname/IP-address of the broker. If a URL is used, then the other argument below will be taken from the URL instead.
userid (str) – Username to authenticate as.
password (str) – Password to authenticate with.
virtual_host (str) – Virtual host to use (domain).
port (int) – Port to connect to.
ssl (bool, Dict) – Defaults to the :setting:`broker_use_ssl` setting.
transport (str) – defaults to the :setting:`broker_transport` setting.
transport_options (Dict) – Dictionary of transport specific options.
heartbeat (int) – AMQP Heartbeat in seconds (pyamqp only).
login_method (str) – Custom login method to use (AMQP only).
failover_strategy (str, Callable) – Custom failover strategy.
**kwargs – Additional arguments to kombu.Connection.
- Returns
the lazy connection instance.
- Return type
kombu.Connection
- bugreport()
Return information useful in bug reports.
- builtin_fixups = {'celery.fixups.django:fixup'}
- close()
Clean up after the application.
Only necessary for dynamically created apps, and you should probably use the with statement instead.
Example
>>> with Celery(set_as_current=False) as app:
...     with app.connection_for_write() as conn:
...         pass
- property conf
Current configuration.
- config_from_cmdline(argv, namespace='celery')
- config_from_envvar(variable_name, silent=False, force=False)
Read configuration from environment variable.
The value of the environment variable must be the name of a module to import.
Example
>>> os.environ['CELERY_CONFIG_MODULE'] = 'myapp.celeryconfig'
>>> celery.config_from_envvar('CELERY_CONFIG_MODULE')
- config_from_object(obj, silent=False, force=False, namespace=None)
Read configuration from object.
Object is either an actual object or the name of a module to import.
Example
>>> celery.config_from_object('myapp.celeryconfig')
>>> from myapp import celeryconfig
>>> celery.config_from_object(celeryconfig)
- Parameters
silent (bool) – If true then import errors will be ignored.
force (bool) – Force reading configuration immediately. By default the configuration will be read only when required.
- connection(hostname=None, userid=None, password=None, virtual_host=None, port=None, ssl=None, connect_timeout=None, transport=None, transport_options=None, heartbeat=None, login_method=None, failover_strategy=None, **kwargs)
Establish a connection to the message broker.
Please use connection_for_read() and connection_for_write() instead, to convey the intent of use for this connection.
- Parameters
url – Either the URL or the hostname of the broker to use.
hostname (str) – URL, Hostname/IP-address of the broker. If a URL is used, then the other argument below will be taken from the URL instead.
userid (str) – Username to authenticate as.
password (str) – Password to authenticate with.
virtual_host (str) – Virtual host to use (domain).
port (int) – Port to connect to.
ssl (bool, Dict) – Defaults to the :setting:`broker_use_ssl` setting.
transport (str) – defaults to the :setting:`broker_transport` setting.
transport_options (Dict) – Dictionary of transport specific options.
heartbeat (int) – AMQP Heartbeat in seconds (pyamqp only).
login_method (str) – Custom login method to use (AMQP only).
failover_strategy (str, Callable) – Custom failover strategy.
**kwargs – Additional arguments to kombu.Connection.
- Returns
the lazy connection instance.
- Return type
kombu.Connection
- connection_for_read(url=None, **kwargs)
Establish connection used for consuming.
See also
connection() for supported arguments.
- connection_for_write(url=None, **kwargs)
Establish connection used for producing.
See also
connection() for supported arguments.
- connection_or_acquire(connection=None, pool=True, *_, **__)
Context used to acquire a connection from the pool.
For use within a with statement to get a connection from the pool if one is not already provided.
- Parameters
connection (kombu.Connection) – If not provided, a connection will be acquired from the connection pool.
- control
Remote control.
- Type
@control
- control_cls = 'celery.app.control:Control'
- create_task_cls()
Create a base task class bound to this app.
- property current_task
Instance of the task being executed, or None.
- property current_worker_task
The task currently being executed by a worker or None.
Differs from current_task in that it's not affected by tasks calling other tasks directly, or eagerly.
- default_connection(connection=None, pool=True, *_, **__)
Context used to acquire a connection from the pool.
For use within a with statement to get a connection from the pool if one is not already provided.
- Parameters
connection (kombu.Connection) – If not provided, a connection will be acquired from the connection pool.
- default_producer(producer=None)
Context used to acquire a producer from the pool.
For use within a with statement to get a producer from the pool if one is not already provided.
- Parameters
producer (kombu.Producer) – If not provided, a producer will be acquired from the producer pool.
- either(default_key, *defaults)
Get key from configuration or use default values.
Falls back to the value of the configuration key if none of the *defaults are true.
- events
Consuming and sending events.
- Type
@events
- events_cls = 'celery.app.events:Events'
- finalize(auto=False)
Finalize the app.
This loads built-in tasks, evaluates pending task decorators, reads configuration, etc.
- gen_task_name(name, module)
New task default automatic naming.
The default gen_task_name method builds task names based on absolute imports. For example, given the layout:
project/
    __init__.py
    moduleA/
        __init__.py
        tasks.py
    moduleB/
        __init__.py
        tasks.py
the default automatic naming gives "project.moduleA.tasks.taskA", "project.moduleA.tasks.taskB", etc. This new automatic naming drops "tasks" from all task names:
DEFAULT WAY                      NEW WAY
project.moduleA.tasks.taskA      project.moduleA.taskA
project.moduleA.tasks.taskB      project.moduleA.taskB
project.moduleB.tasks.taskA      project.moduleB.taskA
This method is only used when the tasks don't have a name attribute defined; otherwise, the task name will be respected.
- loader
Current loader instance.
- loader_cls = None
- log
Logging.
- Type
@log
- log_cls = 'celery.app.log:Logging'
- main = None
Name of the __main__ module. Required for standalone scripts.
If set this will be used instead of __main__ when automatically generating task names.
- now()
Return the current time and date as a datetime.
- oid
Universally unique identifier for this app.
- on_after_configure = None
Signal sent after app has prepared the configuration.
- on_after_finalize = None
Signal sent after app has been finalized.
- on_after_fork = None
Signal sent by every new process after fork.
- on_configure = None
Signal sent when app is loading configuration.
- on_init()
Optional callback called at init.
- property pool
Broker connection pool.
Note
This attribute is not related to the worker's concurrency pool.
- Type
@pool
- prepare_config(c)
Prepare configuration before it is merged with the defaults.
- producer_or_acquire(producer=None)
Context used to acquire a producer from the pool.
For use within a with statement to get a producer from the pool if one is not already provided.
- Parameters
producer (kombu.Producer) – If not provided, a producer will be acquired from the producer pool.
- property producer_pool
- register_task(task, **options)
Utility for registering a task-based class.
Note
This is here for compatibility with old Celery 1.0 style task classes, you should not need to use this for new projects.
- registry_cls = 'celery.app.registry:TaskRegistry'
- select_queues(queues=None)
Select subset of queues.
- Parameters
queues (Sequence[str]) – a list of queue names to keep.
- send_task(name, args=None, kwargs=None, countdown=None, eta=None, task_id=None, producer=None, connection=None, router=None, result_cls=None, expires=None, publisher=None, link=None, link_error=None, add_to_parent=True, group_id=None, group_index=None, retries=0, chord=None, reply_to=None, time_limit=None, soft_time_limit=None, root_id=None, parent_id=None, route_name=None, shadow=None, chain=None, task_type=None, **options)
Send task by name.
Supports the same arguments as Task.apply_async().
- Parameters
name (str) – Name of task to call (e.g., “tasks.add”).
result_cls (AsyncResult) – Specify custom result class.
- set_current()
Make this the current app for this thread.
- set_default()
Make this the default app for all threads.
- setup_security(allowed_serializers=None, key=None, cert=None, store=None, digest='sha256', serializer='json')
Setup the message-signing serializer.
This will affect all application instances (a global operation).
Disables untrusted serializers and if configured to use the auth serializer will register the auth serializer with the provided settings into the Kombu serializer registry.
- Parameters
allowed_serializers (Set[str]) – List of serializer names, or content_types that should be exempt from being disabled.
key (str) – Name of private key file to use. Defaults to the :setting:`security_key` setting.
cert (str) – Name of certificate file to use. Defaults to the :setting:`security_certificate` setting.
store (str) – Directory containing certificates. Defaults to the :setting:`security_cert_store` setting.
digest (str) – Digest algorithm used when signing messages. Default is sha256.
serializer (str) – Serializer used to encode messages after they've been signed. See :setting:`task_serializer` for the serializers supported. Default is json.
- signature(*args, **kwargs)
Return a new Signature bound to this app.
- start(argv=None)
Run celery using argv.
Uses sys.argv if argv is not specified.
- steps = None
Custom bootsteps to extend and modify the worker. See extending-bootsteps.
- subclass_with_self(Class, name=None, attribute='app', reverse=None, keep_reduce=False, **kw)
Subclass an app-compatible class.
App-compatible means that the class has a class attribute that provides the default app it should use, for example: class Foo: app = None.
- Parameters
Class (type) – The app-compatible class to subclass.
name (str) – Custom name for the target class.
attribute (str) – Name of the attribute holding the app, Default is ‘app’.
reverse (str) – Reverse path to this object used for pickling purposes. For example, to get app.AsyncResult, use "AsyncResult".
keep_reduce (bool) – If enabled a custom __reduce__ implementation won't be provided.
- task(*args, **opts)
Decorator to create a task class out of any callable.
See Task options for a list of the arguments that can be passed to this decorator.
Examples
@app.task
def refresh_feed(url):
    store_feed(feedparser.parse(url))
with setting extra options:
@app.task(exchange='feeds')
def refresh_feed(url):
    return store_feed(feedparser.parse(url))
Note
App Binding: For custom apps the task decorator will return a proxy object, so that the act of creating the task is not performed until the task is used or the task registry is accessed.
If you’re depending on binding to be deferred, then you must not access any attributes on the returned object until the application is fully set up (finalized).
- task_cls = 'celery.app.task:Task'
- tasks
Task registry.
Warning
Accessing this attribute will also auto-finalize the app.
- property thread_oid
Per-thread unique identifier for this app.
- timezone
Current timezone for this app.
This is a cached property taking the time zone from the :setting:`timezone` setting.
- user_options = None
Custom options for command-line programs. See extending-commandoptions
- uses_utc_timezone()
Check if the application uses the UTC timezone.
- worker_main(argv=None)
Run celery worker using argv.
Uses sys.argv if argv is not specified.
- app.celery.make_celery(app: flask.app.Flask) → celery.app.base.Celery