I have the following code:
file_name = gcs_export_uri_template + '/' + TABLE_PREFIX + '_' + TABLE_NAME + '{}.json'  # {} is required by the operator; if the file is big it splits it into multiple files, e.g. 1.json, 2.json, etc.
import_orders_op = MySqlToGoogleCloudStorageOperator(
    task_id='import_orders',
    mysql_conn_id='sqlcon',
    google_cloud_storage_conn_id='gcpcon',
    provide_context=True,
    sql=""" SELECT * FROM {{ params.table_name }} WHERE orders_id > {{ params.last_imported_id }} AND orders_id < {{ ti.xcom_pull('get_max_order_id') }} limit 10 """,
    params={'last_imported_id': LAST_IMPORTED_ORDER_ID, 'table_name': TABLE_NAME},
    bucket=GCS_BUCKET_ID,
    filename=file_name,
    dag=dag)
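For context, the '{}' in the filename template is what the operator fills in with a chunk index when a large export is split into several files. A minimal sketch of how that expansion would look (the path, prefix, and table values below are made-up placeholders, not my real configuration):

# Minimal sketch, not the operator's actual code: how a '{}' placeholder in the
# filename template turns into numbered chunk files. All values are illustrative.
gcs_export_uri_template = 'exports/orders'   # assumed example path
TABLE_PREFIX = 'shop'                        # assumed example prefix
TABLE_NAME = 'orders'

file_name = gcs_export_uri_template + '/' + TABLE_PREFIX + '_' + TABLE_NAME + '{}.json'

# One file name per chunk, so a big result set ends up as
# shop_orders0.json, shop_orders1.json, ... in the bucket.
for chunk_no in range(3):
    print(file_name.format(chunk_no))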
This works fine. However, note that the query has limit 10.
When I remove it, as in:
sql=""" SELECT * FROM {{ params.table_name }} WHERE orders_id > {{ params.last_imported_id }} AND orders_id < {{ ti.xcom_pull('get_max_order_id') }} """,
it fails:
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask: Traceback (most recent call last):
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/bin/airflow", line 27, in <module>
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask: args.func(args)
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 392, in run
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask: pool=args.pool,
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/dist-packages/airflow/utils/db.py", line 50, in wrapper
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask: result = func(*args, **kwargs)
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 1493, in _run_raw_task
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask: result = task_copy.execute(context=context)
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/mysql_to_gcs.py", line 89, in execute
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask: files_to_upload = self._write_local_data_files(cursor)
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask: File "/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/mysql_to_gcs.py", line 134, in _write_local_data_files
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask: json.dump(row_dict, tmp_file_handle)
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask: File "/usr/lib/python2.7/json/__init__.py", line 189, in dump
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask: for chunk in iterable:
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask: File "/usr/lib/python2.7/json/encoder.py", line 434, in _iterencode
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask: for chunk in _iterencode_dict(o, _current_indent_level):
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask: File "/usr/lib/python2.7/json/encoder.py", line 390, in _iterencode_dict
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask: yield _encoder(value)
[2018-10-08 09:09:38,833] {base_task_runner.py:98} INFO - Subtask: UnicodeDecodeError: 'utf8' codec can't decode byte 0xa0 in position 5: invalid start byte
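For reference, the error in the last frames can be reproduced outside Airflow: under Python 2 (the interpreter shown in the traceback), json.dump raises this UnicodeDecodeError as soon as a value contains bytes that are not valid UTF-8; 0xa0 is, for instance, a latin-1 non-breaking space. A minimal sketch with a made-up row value:

# Minimal reproduction sketch, assuming Python 2 as in the traceback above.
# The row value below is invented purely to trigger the same error.
import json
import tempfile

row_dict = {'customers_name': 'abcde\xa0xyz'}  # stray 0xa0 byte at position 5

with tempfile.TemporaryFile() as tmp_file_handle:
    try:
        json.dump(row_dict, tmp_file_handle)
    except UnicodeDecodeError as exc:
        print(exc)  # 'utf8' codec can't decode byte 0xa0 in position 5: invalid start byte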
I can only guess that the cause is the file_name with {}.json.
Maybe when there are too many rows and the operator needs to split the file, it can't?
I'm using Airflow 1.9.0.
What is the problem here?