I'm running a Dataflow pipeline from Composer -> Airflow. When I run the following code:
(table_output | beam.io.Write(
    beam.io.BigQuerySink(
        <dataset.table>,
        <schema>,
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition='TRUNCATE')))
it fails with: AttributeError: 'module' object has no attribute 'storage'
What am I missing? Here is the stack trace:
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 766, in run
self._load_main_session(self.local_staging_directory)
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 482, in _load_main_session
pickler.load_session(session_file)
File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 254, in load_session
return dill.load_session(file_path)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
module = unpickler.load()
File "/usr/lib/python2.7/pickle.py", line 864, in load
dispatch[key](self)
File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
value = func(*args)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
return getattr(__import__(module, None, None, [obj]), obj)
AttributeError: 'module' object has no attribute 'storage'
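For context, the final frame of the traceback (`dill`'s `_import_module`) re-imports a module and then fetches the submodule as an attribute via `getattr`. If the submodule cannot be imported on the worker (e.g. `google.cloud.storage` is not installed there), `__import__` silently skips it and `getattr` raises exactly this `AttributeError`. Here is a minimal stdlib-only sketch of that mechanism; `no_such_submodule` is a made-up name used purely to trigger the failure mode:

```python
def import_attr(module, obj):
    # Mirrors the last line of dill's _import_module from the traceback:
    #   return getattr(__import__(module, None, None, [obj]), obj)
    return getattr(__import__(module, None, None, [obj]), obj)

# Succeeds when the submodule exists and is importable:
path_mod = import_attr("os", "path")

# Fails the same way the worker does for google.cloud.storage:
# __import__ swallows the failed submodule import (it's in the fromlist),
# so getattr raises AttributeError, not ImportError.
try:
    import_attr("xml", "no_such_submodule")
except AttributeError as err:
    failure = str(err)
```

So the error suggests the pickled main session references `google.cloud.storage`, but that package is missing from the Dataflow worker's environment.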