In Apache Airflow I wrote a PythonOperator that uses pyspark to launch a job in yarn cluster mode. I initialize the SparkSession object as follows:
spark = SparkSession \
    .builder \
    .appName("test python operator") \
    .master("yarn") \
    .config("spark.submit.deployMode", "cluster") \
    .getOrCreate()
However, when I run my DAG, I get an exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 983, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.8/dist-packages/airflow/operators/python_operator.py", line 113, in execute
    return_value = self.execute_callable()
  File "/usr/local/lib/python3.8/dist-packages/airflow/operators/python_operator.py", line 118, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/catfish/dags/dags_dag_test_python_operator.py", line 39, in print_count
    spark = SparkSession \
  File "/usr/local/lib/python3.8/dist-packages/pyspark/sql/session.py", line 186, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/usr/local/lib/python3.8/dist-packages/pyspark/context.py", line 371, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/local/lib/python3.8/dist-packages/pyspark/context.py", line 128, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "/usr/local/lib/python3.8/dist-packages/pyspark/context.py", line 320, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/usr/local/lib/python3.8/dist-packages/pyspark/java_gateway.py", line 105, in launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
I also set PYSPARK_SUBMIT_ARGS, but it doesn't work for me either!
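For context, the way I understand PYSPARK_SUBMIT_ARGS is supposed to be used looks roughly like the sketch below; the exact option values here are illustrative, not my real configuration. The key point is that the variable has to be set in the environment of the process before SparkSession/SparkContext is created, because that is when launch_gateway spawns the JVM:

```python
import os

# Illustrative sketch: PYSPARK_SUBMIT_ARGS must end with "pyspark-shell"
# and must be set BEFORE the first SparkSession/SparkContext is created,
# since launch_gateway reads it when it starts the Java gateway process.
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--master yarn --deploy-mode cluster pyspark-shell"
)
```

Setting it after getOrCreate() has already run has no effect, because the JVM gateway is only launched once.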