
Tasks are not getting added to database #19

Closed
sohaibfarooqi opened this issue Apr 29, 2017 · 44 comments

Comments

@sohaibfarooqi

Hi,
I have followed this link: http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html#django-celery-results-using-the-django-orm-cache-as-a-result-backend
to set up django-celery-results for my project. Migrations run fine and I can see that the django_celery_results_taskresult table was created.
The problem is that when I execute a task, it runs fine but the table mentioned above doesn't get populated.
I have set CELERY_RESULT_BACKEND = 'django-db' and placed django_celery_results in my INSTALLED_APPS. There is no error in the logs either. Can anyone give me a hint about what I'm doing wrong, or guide me on how to debug this issue?

@steppo40

Hi,
maybe this isn't your issue, but when following the docs I missed that a worker process is needed anyway for tasks to run. Perhaps you're running the scheduler only.

celery -A <your project> worker
celery -A <your project> beat -S django

@rh0dium

rh0dium commented May 20, 2017

I'm seeing this as well.

@andreypanin

andreypanin commented May 27, 2017

I'm new to Celery and I think there's a lack of proper documentation on how to use this package. I also can't get my task results stored.

When I do

app = Celery('myapp',
             broker='amqp://guest:guest@localhost:5672//',
             result_backend='django-db',
             include=['apps.xxx.tasks',])

I get

ipdb> app.conf['broker_url']
'amqp://guest:guest@localhost:5672//'
ipdb> app.conf['result_backend'] is None
True

Update: I got it working this way:

app = Celery('myapp', broker='amqp://guest:guest@localhost:5672//')

app.conf.update(
    result_backend='django-db',
    include=['apps.xxx.tasks',],
)

Then in a shell:

In [1]: from apps.xxx.tasks import sayhello

In [2]: t = sayhello.delay()

In [3]: t.status
Out[3]: 'PENDING'

In [4]: t.status
Out[4]: 'SUCCESS'

In [6]: t.result
Out[6]: 'helloworld'
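
A likely explanation (an assumption on my part, judging from the Celery app constructor's signature): the keyword Celery() recognizes is backend=, not result_backend=, so the first form silently dropped the setting. Passing it as backend= should also work in one step:

app = Celery('myapp',
             broker='amqp://guest:guest@localhost:5672//',
             backend='django-db',
             include=['apps.xxx.tasks'])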

@sohaibfarooqi @rh0dium

@steppo40

steppo40 commented May 27, 2017 via email

@sohaibfarooqi
Author

The docs are a bit confusing for beginners; I also got stuck.
The commands that @steppo40 mentioned in his first comment achieve the desired behavior.

@andreypanin

@anapana My guess is that you're looking at the wrong DB or at the wrong table. Check TaskResult._meta.db_table.

@AnaPana-zz

AnaPana-zz commented Jun 1, 2017

@andreypanin, thank you! Yes, I just figured that out; I was looking at the wrong table, so I deleted my comment. I'm having the same issue as others here, where the django_celery_results_taskresult table stays empty.

@AnaPana-zz

I tried all of the proposed solutions and nothing worked. I'm also assuming that I'm using newer versions of the packages, since the docs are outdated.
According to http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html#django-celery-results-using-the-django-orm-cache-as-a-result-backend

CELERY_RESULT_BACKEND = 'django-db'

is wrong and it has to be

CELERY_RESULT_BACKEND = 'db+sqlite:///django-db'

since Celery looks at the prefix to find the right backend class: https://github.com/celery/celery/blob/master/celery/app/backends.py

My requirements:

celery==4.0.2
django-celery-beat==1.0.1
django-celery-results==1.0.1

Now I made it work, but it's ugly: if I explicitly pass the backend class to the task decorator (for writing) and to celery.result.AsyncResult (for reading), I'm able to access the tasks stored in the database using the Django ORM:

# tasks.py
from project.celery import app
from django_celery_results.backends import DatabaseBackend

@app.task(name="task_name", backend=DatabaseBackend(app, url='sqlite:///django-db.sqlite3'))
def task():
    return 2 + 2

# views.py
from celery.result import AsyncResult
from django.http import HttpResponse
from django.views import View
...
from django_celery_results.models import TaskResult
from django_celery_results.backends import DatabaseBackend

from project.celery import app
from project.main.tasks import task

class CeleryTestPage(View):
    def get(self, request):
        t = task.delay()
        t_result = AsyncResult(t.task_id, backend=DatabaseBackend(app, url='sqlite:///db.sqlite3'))
        return HttpResponse(t_result.status)

class CeleryResults(View):
    def get(self, request):
        tasks = TaskResult.objects.all()  # this now works
        return HttpResponse(tasks)

So I'm assuming django-celery-results doesn't give celery its database backend.

Now my question is: if Celery itself already writes data to the celery_taskmeta table, why would django-celery-results add redundancy with its own table and model, as opposed to providing a Django model for the celery_taskmeta table?

The schemas of the two tables are very similar:

sqlite> .schema django_celery_results_taskresult                                                                   
CREATE TABLE "django_celery_results_taskresult" ("id" integer NOT NULL PRIMARY KEY AUTOINCREMENT, "task_id" varchar(255) NOT NULL UNIQUE, "status" varchar(50) NOT NULL, "content_type" varchar(128) NOT NULL, "content_encoding" varchar(64) NOT NULL, "result" text NULL, "date_done" datetime NOT NULL, "traceback" text NULL, "hidden" bool NOT NULL, "meta" text NULL);
CREATE INDEX "django_celery_results_taskresult_662f707d" ON "django_celery_results_taskresult" ("hidden");
sqlite> .schema celery_taskmeta                                                                                    
CREATE TABLE celery_taskmeta (
        id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT, 
        task_id VARCHAR(155), 
        status VARCHAR(50), 
        result BLOB, 
        date_done DATETIME, 
        traceback TEXT, 
        UNIQUE (task_id)
);

Am I missing something?

@sohaibfarooqi sohaibfarooqi changed the title Taks are not getting added to database Tasks are not getting added to database Jun 8, 2017
@mheppner

mheppner commented Jul 5, 2017

@anapana Hmm, maybe this is being loaded somewhere else, but if CELERY_RESULT_BACKEND = 'django-db' was set and Celery didn't know about it, it should have raised this error message. However, it doesn't seem to be in core Celery anywhere. Maybe it gets registered on its own through django-celery-results?
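
If I'm reading the 4.x source correctly, the alias is in fact shipped with core Celery; celery/app/backends.py defines (abridged):

BACKEND_ALIASES = {
    ...
    'django-db': 'django_celery_results.backends:DatabaseBackend',
    'django-cache': 'django_celery_results.backends:CacheBackend',
}

so core Celery resolves 'django-db' by itself, provided django-celery-results is importable.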

@Niyojan

Niyojan commented Aug 3, 2017

Same issue here: django_celery_results_taskresult is empty.

@neilfrndes

Had the same issue; what worked for me is: CELERY_RESULT_BACKEND = 'db+sqlite:///django-db'

thanks @anapana

Linux-4.10.0-32-generic-x86_64-with-Ubuntu-16.04-xenial 2017-08-22 12:06:18

@jameslao

I'm using MariaDB and CELERY_RESULT_BACKEND = 'django-db' worked for me. However, the entry is only put in the table AFTER the job has finished, whether it succeeded or not. Would it make sense to update the table right after the job is scheduled? Otherwise the "pending" status doesn't seem very useful...

@wardal
Contributor

wardal commented Oct 21, 2017

Generally, everything works fine with SQLite and 'django-db'. All you need is to set it in your settings:

CELERY_RESULT_BACKEND = 'django-db'

And as was mentioned above, you need to run a celery worker to make it work:
celery -A <your project name> worker -l info

For test purposes, you can run any task from a Python shell; you will see the task in the worker log and a new entry in the django_celery_results_taskresult table in your database, or in the django-admin panel.
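
For example (a sketch; myapp and my_task are hypothetical placeholders for one of your own apps and tasks):

>>> from myapp.tasks import my_task
>>> r = my_task.delay()
>>> from django_celery_results.models import TaskResult
>>> TaskResult.objects.filter(task_id=r.id).exists()
True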

PS: I used SQLite and the 'django-db' setting, so all of the above is from my experience.

@XaviTorello

It works as expected with CELERY_RESULT_BACKEND = 'django-db'

Tested on a devel box with sqlite backend

@saini

saini commented Mar 16, 2018

I am facing a similar issue. I am trying to get the task status after submitting a task.
Here is my code:

submitted_task = process_input.delay(filepath)
print("submitted task id: {i}".format(i=submitted_task.task_id))
result = process_input.AsyncResult(submitted_task.task_id)
print("task state: {state}".format(state=result.get()))

My Django app crashes here with an error msg:

Django Version: 1.11.7
Exception Type: NotImplementedError
Exception Value: No result backend is configured. Please see the documentation for more information.

Does anyone here know what I am doing wrong? And how to fix it?

In settings.py I have the following:

INSTALLED_APPS = [
    ....
    ...
    'django_celery_results',
]
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'amqp://localhost'

I have run the command python manage.py migrate django_celery_results.
I can see a table in my database (MySQL) named django_celery_results_taskresult.
I can also see the tasks and their statuses there, but every status is SUCCESS. I never get to see the PENDING status, even for a newly submitted task; a task only gets inserted into the table once it is completed.

@bobonthetop

I am also facing the same issue: the django_celery_results_taskresult table is created after migrate but always empty.
I did put django_celery_results in my INSTALLED_APPS, and my requirements are:
django==1.11.1 celery==4.1.0 django-celery-results==1.0.1
The CELERY_RESULT_BACKEND URL looks like this in the celery debug info: mysql://user:pass@localhost/table.

This is the only way I found to make the results backend not empty ('django-db' didn't work).
Am I missing something?

PS: I have no errors or crashes

@andrewuscf

andrewuscf commented Apr 9, 2018

Did anyone figure out a solution? I am facing a similar issue.

@mrname

mrname commented Apr 10, 2018

May or may not be similar to the issues of others, but my mistake was that my app was not loading the celery settings from my settings.py. This was due to the way I was creating the celery app in tasks.py. I was being lazy and passing configuration directly to the app constructor like:

app = Celery('stuff', broker_url='whatev')

and then decorating my tasks with @app.task.

In order to load the celery settings from settings.py I added the following line:

app.config_from_object('django.conf:settings', namespace='CELERY')

After doing this, task results are now being stored in the database.
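
For reference, a minimal celery.py along the lines of the official first-steps-with-django guide ('proj' is a placeholder for your project name):

import os

from celery import Celery

# Set the default Django settings module before the app is created.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
# Load every CELERY_-prefixed setting from Django's settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Discover tasks.py modules in all INSTALLED_APPS.
app.autodiscover_tasks()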

@riparian-imakedonsky

Experiencing the same issue, Django 2.0.6.

@ghost

ghost commented Aug 7, 2018

Hi,

I'm experiencing exactly the same issue. Tasks only get added to the DB table once they are completed.

Django 1.11.x.

@orzel

orzel commented Oct 3, 2018

Same here, using old Celery (3.1.25) and django-celery 3.2.1. Tasks are added, but only once finished (success or failure). As a result, I can't use the 'PENDING' and 'STARTED' states as I did previously (it was working before, but I can't say which versions were used).

@meetbinoy

> Hi,
> I have followed this link: http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html#django-celery-results-using-the-django-orm-cache-as-a-result-backend
> to set up django-celery-results for my project. Migrations run fine and I can see that the django_celery_results_taskresult table was created.
> The problem is that when I execute a task, it runs fine but the table mentioned above doesn't get populated.
> I have set CELERY_RESULT_BACKEND = 'django-db' and placed django_celery_results in my INSTALLED_APPS. There is no error in the logs either. Can anyone give me a hint about what I'm doing wrong, or guide me on how to debug this issue?

Remove or comment out these lines if you have them in your settings file:

# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-always-eager
CELERY_TASK_ALWAYS_EAGER = True
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-eager-propagates
CELERY_TASK_EAGER_PROPAGATES = True

@findsomeoneyys

I also had the same problem. My database is PostgreSQL, and I finally found that when I added CELERY_RESULT_BACKEND_DB = 'postgresql://...', everything worked well.
My settings look like this:

CELERY_BROKER_URL = '..'
CELERY_RESULT_BACKEND = 'django-db'
CELERY_RESULT_BACKEND_DB = 'postgresql://..'

@flatcoke

Same issue here, did anyone figure out a solution?

@pziarsolo

In my case, with a PostgreSQL database and @findsomeoneyys's configuration tips, results are added to the database, but only when the processes are finished.
Does anybody know how to solve this?
Thanks in advance!

celery-4.2.1
django_celery_results-1.0.4
Django-2.1.4

@qcaron

qcaron commented Jan 29, 2019

+1
Using PostgreSQL, I got it to work at some point but not anymore, and I cannot find the reason why. I will try not to use django-celery-results for the moment 😞

I am pretty sure I was using django-db as a backend at first.

celery==4.2.1
django-celery-results==1.0.4
Django==2.0.10

@nbeuchat

nbeuchat commented Feb 16, 2019

I have the same issue with PostgreSQL and a Redis broker. My tasks are added to the Task result table only on success or failure. Any update?

broker_url = 'redis://:*****@*******:6379/0'
result_backend = 'django-db'
result_backend_db = 'postgresql://..'

django-celery-results==1.0.4
celery==4.2.1
Django==2.1.3

The only other status I can get is STARTED with the setting task_track_started = True

@lorinkoz

After facing the same problem and inspecting this project's code, it looks like the fault is in the Celery configuration, as the backend method that stores a task's result is not called until the end of the task.

@auvipy auvipy closed this as completed May 6, 2019
@SamCreamer

> May or may not be similar to the issues of others, but my mistake was that my app was not loading the celery settings from my settings.py. This was due to the way I was creating the celery app in tasks.py. I was being lazy and passing configuration directly to the app constructor like:
>
> app = Celery('stuff', broker_url='whatev')
>
> and then decorating my tasks with @app.task.
>
> In order to load the celery settings from settings.py I added the following line:
>
> app.config_from_object('django.conf:settings', namespace='CELERY')
>
> After doing this, task results are now being stored in the database.

This is likely the solution for many of the people in this thread. Even if you explicitly write the arguments in the constructor, you still need to call app.config_from_object('django.conf:settings', namespace='CELERY') on a subsequent line to load the rest of the config from the settings.py file.

@auvipy
Member

auvipy commented Nov 14, 2019

Which version are you using?

@AWinterman

Why was this issue closed? Is the problem fixed?

@dangelsaurus

dangelsaurus commented Jan 1, 2020

> Why was this issue closed? Is the problem fixed?

If I had to guess, it's because all the issues were related to specific implementations and not to django-celery-results itself. I just installed it by following the docs and had no issue. Just make sure you restart your celery workers afterwards.

You might try posting on Stack Overflow.

@havanhuy1997

Did someone find the cause of this issue?

@Vibrat

Vibrat commented Jan 31, 2020

> Why was this issue closed? Is the problem fixed?
>
> If I had to guess, it's because all the issues were related to specific implementations and not to django-celery-results itself. I just installed it by following the docs and had no issue. Just make sure you restart your celery workers afterwards.
>
> You might try posting on Stack Overflow.

@dangelsaurus
Can you show me the list of libraries and the database, with the versions you used for your setup?

@kiwipedro

I had the problem where the entries were not appearing at all in the django_celery_results_taskresult table.

The problem was in the Django settings file. In VSCode's launch.json I have a Postgres DB set up for my local dev env; however, the fallback settings in Django were set to a local SQLite DB.

DATABASES = {
    'default': {
        'ENGINE': os.environ.get('APP_DB_ENGINE', 'django.db.backends.sqlite3'),
        'NAME': os.environ.get('APP_DB_NAME', os.path.join(BASE_DIR, 'db.sqlite3')),
        'USER': os.environ.get('APP_DB_USER', ''),
        'PASSWORD': os.environ.get('APP_DB_PASSWORD', ''),
        'HOST': os.environ.get('APP_DB_HOST', None),
        'PORT': os.environ.get('APP_DB_PORT', None),
    }
}

This does make sense if you think about it: you have to make the Celery worker aware of your local environment variables (this is my next challenge... presumably doing something with the args in launch.json).

What is unfortunate is that it fails silently: my SQLite file doesn't exist, so it would be awesome for debugging if the django-celery-results package could report this in the terminal window.

@auvipy auvipy reopened this Dec 30, 2020
@AllexVeldman
Contributor

> you have to make the Celery worker aware of your local environment variables (this is my next challenge... presumably doing something with the args in launch.json)

I think VSCode supports .env files through the envFile setting; I use this in my Django projects together with django-environ.
Bonus points for docker-compose also supporting it, so my local Postgres server stays in sync.
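
For example (a sketch; the variable names match the settings.py posted above, the values are hypothetical):

# .env
APP_DB_ENGINE=django.db.backends.postgresql
APP_DB_NAME=mydb
APP_DB_USER=myuser
APP_DB_PASSWORD=secret
APP_DB_HOST=localhost
APP_DB_PORT=5432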

@gmagno

gmagno commented Apr 18, 2021

> Remove or comment out these lines if you have them in your settings file:
>
> # http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-always-eager
> CELERY_TASK_ALWAYS_EAGER = True
> # http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-eager-propagates
> CELERY_TASK_EAGER_PROPAGATES = True

As @plachira mentions here, try to either set both CELERY_TASK_ALWAYS_EAGER and CELERY_TASK_EAGER_PROPAGATES to False or just comment them out (they are False by default).
If you leave them as True, tasks are executed locally (synchronously) instead of being sent to the queue and run by a worker. That, I believe, is the reason why django_celery_results is not able to pick up the results and create the models.TaskResult entries, even though the tasks do run.

This is a bit unfortunate, though: "eager mode" is very useful when debugging issues. Perhaps someone else could shed some light on how to run tasks synchronously and still be able to use django_celery_results(?)
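
One possible answer, if I'm not mistaken: Celery 5.1+ adds a task_store_eager_result setting that persists eagerly-executed results to the result backend:

# settings.py; requires Celery >= 5.1, if I recall correctly
CELERY_TASK_ALWAYS_EAGER = True        # run tasks synchronously, for debugging
CELERY_TASK_EAGER_PROPAGATES = True    # re-raise task exceptions locally
CELERY_TASK_STORE_EAGER_RESULT = True  # store eager results via the result backend

With that, eager mode should stay usable for debugging while still producing models.TaskResult entries.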

@h4k1m13or

I managed to get it working on Windows by adding --pool=solo:
celery -A <name> worker -l info --pool=solo

@auvipy auvipy closed this as completed Sep 20, 2021
@serozhenka

Adding CELERY_RESULT_EXTENDED = True to the conf file and running the worker with the -E flag solved it for me.
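
That is, roughly (with <your project> as a placeholder, as elsewhere in this thread):

# settings.py
CELERY_RESULT_EXTENDED = True

celery -A <your project> worker -l info -E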

@auvipy auvipy pinned this issue Nov 18, 2022
@iMakedonsky

@auvipy Have you considered making CELERY_TASK_EAGER_PROPAGATES default to the value of CELERY_TASK_ALWAYS_EAGER, or at least making it true by default? If not, why not?

Without it, on local with eager mode you get no results in the DB table and no output in the console, unless you read task_result.result directly for the traceback/output.

Super confusing.

@auvipy
Member

auvipy commented Mar 6, 2023

Would you mind sharing the improvement suggestion in the form of a PR? It will be easier for me to reason about after seeing the proposed code change.

@batzkass

batzkass commented Dec 19, 2023

Not directly linked to django-celery-results, but this might help...

For those having an issue with tasks being added to the database (PostgreSQL or other) only after they finish: this is normal behavior. According to the Celery documentation, a task's initial "STARTED" state is not tracked by default. This behavior can be changed:
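
# settings.py; the CELERY_-namespaced variant, as confirmed by @MQ-xz below
CELERY_TASK_TRACK_STARTED = True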

It took me 2 hrs to figure this out 😝

@MQ-xz

MQ-xz commented Mar 10, 2024

> Not directly linked to django-celery-results, but this might help...
>
> For those having an issue with tasks being added to the database (PostgreSQL or other) only after they finish: this is normal behavior. According to the Celery documentation, a task's initial "STARTED" state is not tracked by default. This behavior can be changed.
>
> It took me 2 hrs to figure this out 😝

As @batzkass mentioned here, adding CELERY_TASK_TRACK_STARTED = True in settings.py solved my issue; now task results are recorded at the start as well.

@batzkass thanks, brother

@stalkerxxl

CELERY_RESULT_BACKEND_DB = 'postgresql://..'
Adding this line to the config helped; the results are now written to the database.
