import falcon
import json
from tasks import add
from waitress import serve


class tasksresource:
    def on_get(self, req, resp):
        """Handles GET requests"""
        self.result = add.delay(1, 2)
        self.context = {'ID': self.result.id, 'final result': self.result.ready()}
        resp.body = json.dumps(self.context)



api = falcon.API()
api.add_route('/result', tasksresource())
# api.add_route('/result/task', taskresult())
if __name__ == '__main__':
    serve(api, host='127.0.0.1', port=5555)

How do I get the task ID from the JSON payload (POST data) and add a route for it?
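For reference, a minimal sketch of the kind of POST handler being asked about, assuming the client sends a JSON body such as {"task_id": "..."} and a Falcon version that provides req.media; the TaskResult class and the /result/task route are only illustrative:

import json

import falcon
from tasks import add


class TaskResult(object):
    """Hypothetical resource: look up a task by the id sent in the JSON payload."""

    def on_post(self, req, resp):
        # e.g. the client POSTs {"task_id": "..."}; req.media parses the JSON body
        task_id = req.media['task_id']

        # add.AsyncResult uses the result backend configured on the task's Celery app
        task_result = add.AsyncResult(task_id)
        resp.body = json.dumps({'task_id': task_id,
                                'status': task_result.status,
                                'result': task_result.result})


api = falcon.API()
api.add_route('/result/task', TaskResult())

With such a route registered, a client could POST back the id it received when the task was started and get the current state of that task.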


2 Answers


Here is a small example. The file structure:

/project
      __init__.py
      app.py   # routes, Falcon resources, etc.
      tasks.py # the Celery app and tasks
example.py     # a small client script that demonstrates how it works

app.py:

import json

import falcon
from tasks import add
from celery.result import AsyncResult


class StartTask(object):

    def on_get(self, req, resp):
        # start task
        task = add.delay(4, 4)
        resp.status = falcon.HTTP_200
        # return task_id to client
        result = {'task_id': task.id}
        resp.body = json.dumps(result)


class TaskStatus(object):

    def on_get(self, req, resp, task_id):
        # get result of task by task_id and generate content to client
        task_result = AsyncResult(task_id)
        result = {'status': task_result.status, 'result': task_result.result}
        resp.status = falcon.HTTP_200
        resp.body = json.dumps(result)


app = falcon.API()

# registration of routes
app.add_route('/start_task', StartTask())
app.add_route('/task_status/{task_id}', TaskStatus())

tasks.py:

from time import sleep

import celery


app = celery.Celery('tasks', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0')


@app.task
def add(x, y):
    """
    :param int x:
    :param int y:
    :return: int
    """
    # sleep just for demonstration
    sleep(5)

    return x + y

Now we need to start the Celery worker. Go to the project folder and run:

celery -A tasks worker --loglevel=info

After that, we need to start the Falcon application. Go to the project folder and run:

gunicorn app:app

OK. Everything is ready.

example.py is a small client script that helps to show how it all works:

from time import sleep

import requests
# start new task
task_info = requests.get('http://127.0.0.1:8000/start_task')
task_info = task_info.json()

while True:
    # check status of task by task_id while task is working
    result = requests.get('http://127.0.0.1:8000/task_status/' + task_info['task_id'])
    task_status = result.json()

    print(task_status)

    if task_status['status'] == 'SUCCESS' and task_status['result']:
        print('Task with id = %s is finished' % task_info['task_id'])
        print('Result: %s' % task_status['result'])
        break
    # sleep and check status one more time
    sleep(1)

Just run python ./example.py and you should see something like this:

{u'status': u'PENDING', u'result': None}
{u'status': u'PENDING', u'result': None}
{u'status': u'PENDING', u'result': None}
{u'status': u'PENDING', u'result': None}
{u'status': u'PENDING', u'result': None}
{u'status': u'SUCCESS', u'result': 8}
Task with id = 76542904-6c22-4536-99d9-87efd66d9fe7 is finished
Result: 8

Hope this helps.

answered 2017-02-01T21:02:07.863

The example above from Danila Ganchar is great and very helpful. I am using Celery version 4.3.0 with Python 3, and one of the errors I received when using the example above comes from this line:

task_result = AsyncResult(task_id)

The error I received was:

AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

This may be a recent change, but result.AsyncResult (or just AsyncResult in this example, since he imported it from celery.result) does not know which backend you are using. There are two ways to solve this:

1) You can take the AsyncResult of the actual task itself, add.AsyncResult(task_id), because the add task already has the backend defined through the @app.task decorator. The downside in this example is that you want to be able to get the result of any task by passing its task_id in through the Falcon endpoint, so this option is limited.

2) The preferred method is to simply pass the app argument to AsyncResult (a fuller sketch follows the snippet below):

task = result.AsyncResult(id, app=app)
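Applied to the TaskStatus resource from the first answer, a minimal sketch might look like this (assuming the Celery app defined in tasks.py is imported as celery_app; both options from above are shown):

import json

import falcon
from celery.result import AsyncResult

from tasks import add, app as celery_app  # the Celery app defined in tasks.py


class TaskStatus(object):

    def on_get(self, req, resp, task_id):
        # Option 1: the task object already knows its backend
        # task_result = add.AsyncResult(task_id)

        # Option 2 (preferred): tell AsyncResult which Celery app (and backend) to use
        task_result = AsyncResult(task_id, app=celery_app)

        resp.status = falcon.HTTP_200
        resp.body = json.dumps({'status': task_result.status, 'result': task_result.result})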

Hope this helps!

answered 2019-10-15T18:17:54.313