Airflow Celery worker: command returned non-zero exit status 2
0 votes
/ 29 February 2020

I hope you are doing well.

I am getting the error below and have been trying to fix it for several hours with no luck. Below are the Celery logs from the Airflow worker.

    airflow command error: argument subcommand: invalid choice: 'tasks' (choose from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 'next_execution', 'rotate_fernet_key'), see help above.
usage: airflow [-h]
               {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
               ...

positional arguments:
  {backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key}
                        sub-command help
    backfill            Run subsections of a DAG for a specified date range.
                        If reset_dag_run option is used, backfill will first
                        prompt users whether airflow should clear all the
                        previous dag_run and task_instances within the
                        backfill date range. If rerun_failed_tasks is used,
                        backfill will auto re-run the previous failed task
                        instances within the backfill date range.
    list_dag_runs       List dag runs given a DAG id. If state option is
                        given, it will onlysearch for all the dagruns with the
                        given state. If no_backfill option is given, it will
                        filter outall backfill dagruns for given dag id.
    list_tasks          List the tasks within a DAG
    clear               Clear a set of task instance, as if they never ran
    pause               Pause a DAG
    unpause             Resume a paused DAG
    trigger_dag         Trigger a DAG run
    delete_dag          Delete all DB records related to the specified DAG
    show_dag            Displays DAG's tasks with their dependencies
    pool                CRUD operations on pools
    variables           CRUD operations on variables
    kerberos            Start a kerberos ticket renewer
    render              Render a task instance's template(s)
    run                 Run a single task instance
    initdb              Initialize the metadata database
    list_dags           List all the DAGs
    dag_state           Get the status of a dag run
    task_failed_deps    Returns the unmet dependencies for a task instance
                        from the perspective of the scheduler. In other words,
                        why a task instance doesn't get scheduled and then
                        queued by the scheduler, and then run by an executor).
    task_state          Get the status of a task instance
    serve_logs          Serve logs generate by worker
    test                Test a task instance. This will run a task without
                        checking for dependencies or recording its state in
                        the database.
    webserver           Start a Airflow webserver instance
    resetdb             Burn down and rebuild the metadata database
    upgradedb           Upgrade the metadata database to latest version
    checkdb             Check if the database can be reached.
    shell               Runs a shell to access the database
    scheduler           Start a scheduler instance
    worker              Start a Celery worker node
    flower              Start a Celery Flower
    version             Show the version
    connections         List/Add/Delete connections
    create_user         Create an account for the Web UI (FAB-based)
    delete_user         Delete an account for the Web UI
    list_users          List accounts for the Web UI
    sync_perm           Update permissions for existing roles and DAGs.
    next_execution      Get the next execution datetime of a DAG.
    rotate_fernet_key   Rotate all encrypted connection credentials and
                        variables; see
                        https://airflow.readthedocs.io/en/stable/howto/secure-
                        connections.html#rotating-encryption-keys.

optional arguments:
  -h, --help            show this help message and exit

airflow command error: argument subcommand: invalid choice: 'tasks' (choose from 'backfill', 'list_dag_runs', 'list_tasks', 'clear', 'pause', 'unpause', 'trigger_dag', 'delete_dag', 'show_dag', 'pool', 'variables', 'kerberos', 'render', 'run', 'initdb', 'list_dags', 'dag_state', 'task_failed_deps', 'task_state', 'serve_logs', 'test', 'webserver', 'resetdb', 'upgradedb', 'checkdb', 'shell', 'scheduler', 'worker', 'flower', 'version', 'connections', 'create_user', 'delete_user', 'list_users', 'sync_perm', 'next_execution', 'rotate_fernet_key'), see help above.
[2020-03-01 00:11:41,941: ERROR/ForkPoolWorker-8] execute_command encountered a CalledProcessError
Traceback (most recent call last):
  File "/opt/rh/rh-python36/root/usr/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 69, in execute_command
    close_fds=True, env=env)
  File "/opt/rh/rh-python36/root/usr/lib64/python3.6/subprocess.py", line 311, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'tasks', 'run', 'airflow_worker_check_pipeline', 'dev_couchbase_backup', '2020-02-29T14:47:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/airflow/dags/project1/airflow_worker_check_pipeline.py']' returned non-zero exit status 2.
[2020-03-01 00:11:41,941: ERROR/ForkPoolWorker-8] None
[2020-03-01 00:11:41,996: ERROR/ForkPoolWorker-8] Task airflow.executors.celery_executor.execute_command[0e0c3d02-bdb3-4d16-a863-cbb3bb7a7137] raised unexpected: AirflowException('Celery command failed',)
Traceback (most recent call last):
  File "/opt/rh/rh-python36/root/usr/lib/python3.6/site-packages/airflow/executors/celery_executor.py", line 69, in execute_command
    close_fds=True, env=env)
  File "/opt/rh/rh-python36/root/usr/lib64/python3.6/subprocess.py", line 311, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['airflow', 'tasks', 'run', 'airflow_worker_check_pipeline', 'dev_couchbase_backup', '2020-02-29T14:47:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/airflow/dags/project1/airflow_worker_check_pipeline.py']' returned non-zero exit status 2.

During handling of the above exception, another exception occurred:

Airflow scheduler and master versions: v2.0.0.dev0, Docker platform (image -> apache/airflow master-ci). Airflow worker version: v1.10.9 (manual installation, not Docker).

I suspect this may be caused by the version mismatch, so I tried to upgrade the Airflow version on the worker, but unfortunately could not find that version:

ERROR: Could not find a version that satisfies the requirement apache-airflow[celery]=={v2.0.0} (from versions: 1.10.9-bin, 1.8.1, 1.8.2rc1, 1.8.2, 1.9.0, 1.10.0, 1.10.1b1, 1.10.1rc2, 1.10.1, 1.10.2b2, 1.10.2rc1, 1.10.2rc2, 1.10.2rc3, 1.10.2, 1.10.3b1, 1.10.3b2, 1.10.3rc1, 1.10.3rc2, 1.10.3, 1.10.4b2, 1.10.4rc1, 1.10.4rc2, 1.10.4rc3, 1.10.4rc4, 1.10.4rc5, 1.10.4, 1.10.5rc1, 1.10.5, 1.10.6rc1, 1.10.6rc2, 1.10.6, 1.10.7rc1, 1.10.7rc2, 1.10.7rc3, 1.10.7, 1.10.8rc1, 1.10.8, 1.10.9rc1, 1.10.9)
ERROR: No matching distribution found for apache-airflow[celery]=={v2.0.0}
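For reference, the install attempt was presumably along these lines (the requirement string is the one echoed back in the pip error above; only versions from the list pip printed can actually be installed):

    # Hypothetical reconstruction of the failing install command on the worker host;
    # the requirement string is taken from the pip error above.
    pip install "apache-airflow[celery]=={v2.0.0}"

    # pip can only resolve one of the versions it listed, e.g. the latest released one:
    # pip install "apache-airflow[celery]==1.10.9"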

Is there any way to get past this? Please help.

1 Answer

0 votes
/ 02 March 2020

It looks like you are trying to execute something like airflow tasks run .... Airflow does not have that positional argument. The valid Airflow "commands" are: backfill,list_dag_runs,list_tasks,clear,pause,unpause,trigger_dag,delete_dag,show_dag,pool,variables,kerberos,render,run,initdb,list_dags,dag_state,task_failed_deps,task_state,serve_logs,test,webserver,resetdb,upgradedb,checkdb,shell,scheduler,worker,flower,version,connections,create_user,delete_user,list_users,sync_perm,next_execution,rotate_fernet_key.
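Concretely, the command that the v2.0.0.dev0 scheduler hands to the worker (copied from the traceback above) uses the new grouped CLI with a tasks group, while the 1.10.9 CLI only understands the flat run form. Roughly:

    # What the 2.0 dev scheduler queues: the 'tasks' group does not exist in 1.10.x,
    # so argparse rejects it and exits with status 2.
    airflow tasks run airflow_worker_check_pipeline dev_couchbase_backup 2020-02-29T14:47:00+00:00 \
        --local --pool default_pool -sd /root/airflow/dags/project1/airflow_worker_check_pipeline.py

    # The form a 1.10.x CLI would accept: the flat 'run' subcommand with the same arguments.
    airflow run airflow_worker_check_pipeline dev_couchbase_backup 2020-02-29T14:47:00+00:00 \
        --local --pool default_pool -sd /root/airflow/dags/project1/airflow_worker_check_pipeline.py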

As far as I know (I just double-checked), at the time of writing Apache Airflow v2.0.0 has not been released; the latest release is 1.10.9, which came out last week.
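So the practical way past this is to run the same released version on both sides: either keep the worker on 1.10.9 and point the scheduler/webserver at a released 1.10.9 image instead of the master-ci dev build, or wait for a 2.0.0 release and upgrade everything together. A minimal sketch of the first option, assuming the apache/airflow:1.10.9 tag published on Docker Hub is the one you want:

    # Scheduler / webserver host: swap the master-ci dev image for a released tag.
    # (The exact tag name is an assumption; use whichever released 1.10.9 image fits your setup.)
    docker pull apache/airflow:1.10.9

    # Worker host: install (or keep) the matching released version.
    pip install "apache-airflow[celery]==1.10.9"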
