Add Spark graphframes in PyCharm
0 votes
/ April 20, 2020

I want to add graphframes 0.8.0 to PyCharm. I added:

(Run -> Edit Configurations -> select the configuration -> Configuration tab -> Environment variables -> add PYSPARK_SUBMIT_ARGS)
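For reference, a minimal sketch of what that environment variable typically contains: the `--packages` flag followed by `pyspark-shell`. The coordinate below is an assumption for Spark 2.4 with Scala 2.11; note that published graphframes artifacts use a two-part Spark version ("spark2.4", not "spark2.4.0"). It must be set before the JVM gateway is launched, i.e. before `SparkContext`/`SparkSession` is created:

```python
import os

# PYSPARK_SUBMIT_ARGS is read when PySpark launches the Java gateway,
# so set it before creating the SparkSession. Coordinate assumed for
# Spark 2.4 / Scala 2.11 ("spark2.4", not "spark2.4.0").
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages graphframes:graphframes:0.8.0-spark2.4-s_2.11 pyspark-shell"
)
```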

and

sc = SparkSession.builder \
    .appName("Hello") \
    .master("local[*]") \
    .config('spark.jars.packages', 'graphframes:graphframes:0.6.0-spark2.3.0-s_2.11') \
    .getOrCreate()
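Two things stand out here. First, the builder requests `0.6.0-spark2.3.0-s_2.11` while the error below complains about `0.8.0-spark2.4.0-s_2.11`, which suggests the PYSPARK_SUBMIT_ARGS setting is taking precedence over the builder config. Second, both coordinates end in a three-part Spark version, whereas artifacts published on the Spark Packages repository appear to use a two-part one (e.g. "spark2.4"), which would explain the "not found". A hypothetical helper, purely for illustration, showing the coordinate format as I understand it:

```python
# Hypothetical helper (not part of graphframes): builds a Maven coordinate
# in the format used on the Spark Packages repository:
#   graphframes:graphframes:<gf_version>-spark<X.Y>-s_<scala_version>
# A three-part Spark version like "2.4.0" produces a coordinate that the
# resolver cannot find.
def graphframes_coordinate(gf_version: str, spark_version: str, scala_version: str) -> str:
    return f"graphframes:graphframes:{gf_version}-spark{spark_version}-s_{scala_version}"

print(graphframes_coordinate("0.8.0", "2.4", "2.11"))
```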

in PyCharm, but I get an error message saying that graphframes was not added:

::::::::::::::::::::::::::::::::::::::::::::::
:: graphframes#graphframes;0.8.0-spark2.4.0-s_2.11: not found
::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS 
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: graphframes#graphframes;0.8.0-spark2.4.0-s_2.11: not found]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1302)
    at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:304)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
   File "/home/spark/PycharmProjects/test/Hello.py", line 13, in <module>
     spark = SparkSession.builder.getOrCreate()
   File "/home/spark/PycharmProjects/test/venv/lib/python3.7/site-packages/pyspark/sql/session.py", line 173, in getOrCreate
     sc = SparkContext.getOrCreate(sparkConf)
   File "/home/spark/PycharmProjects/test/venv/lib/python3.7/site-packages/pyspark/context.py", line 367, in getOrCreate
     SparkContext(conf=conf or SparkConf())
   File "/home/spark/PycharmProjects/test/venv/lib/python3.7/site-packages/pyspark/context.py", line 133, in __init__
     SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
   File "/home/spark/PycharmProjects/test/venv/lib/python3.7/site-packages/pyspark/context.py", line 316, in _ensure_initialized
     SparkContext._gateway = gateway or launch_gateway(conf)
   File "/home/spark/PycharmProjects/test/venv/lib/python3.7/site-packages/pyspark/java_gateway.py", line 46, in launch_gateway
     return _launch_gateway(conf)
   File "/home/spark/PycharmProjects/test/venv/lib/python3.7/site-packages/pyspark/java_gateway.py", line 108, in _launch_gateway
     raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
...