Connecting to Azure Databricks with databricks-connect
0 votes
/ 18 June 2020

I am following https://docs.databricks.com/dev-tools/databricks-connect.html to connect to Azure Databricks.

 # create and activate the conda environment dbconnect
 (base) C:\>conda create --name dbconnect python=3.7
 (base) C:\>conda activate dbconnect

 (dbconnect) C:\>pip install -U databricks-connect==6.5
 (dbconnect) C:\>databricks-connect configure

[screenshot of the databricks-connect configure prompts]
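For reference, databricks-connect configure stores the answers in a JSON file at %USERPROFILE%\.databricks-connect. A minimal sketch with placeholder values (the host, token, cluster_id, and org_id below are illustrative, not real credentials); note that the client version (6.5 above) should match the cluster's Databricks Runtime version:

 {
   "host": "https://adb-1234567890123456.7.azuredatabricks.net",
   "token": "dapiXXXXXXXXXXXXXXXXXXXXXXXX",
   "cluster_id": "0618-123456-abcde123",
   "org_id": "1234567890123456",
   "port": "8787"
 }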

After completing the configuration, I run the databricks-connect test and get the following exception: Exception("Java gateway process exited before sending its port number").

How do I resolve this problem?

 (dbconnect) C:\>databricks-connect test
 * PySpark is installed at c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark
 * Checking SPARK_HOME
 * Checking java version
 Picked up _JAVA_OPTIONS: -Djavax.net.ssl.trustStore=C:\Windows\Sun\Java\Deployment\trusted.certs
 openjdk version "11" 2018-09-25  
 OpenJDK Runtime Environment 18.9 (build 11+28)
 OpenJDK 64-Bit Server VM 18.9 (build 11+28, mixed mode)
 WARNING: Java versions >8 are not supported by this SDK
 * Skipping scala command test on Windows
 * Testing python command
 Picked up _JAVA_OPTIONS: -Djavax.net.ssl.trustStore=C:\Windows\Sun\Java\Deployment\trusted.certs
 Picked up _JAVA_OPTIONS: -Djavax.net.ssl.trustStore=C:\Windows\Sun\Java\Deployment\trusted.certs
 WARNING: An illegal reflective access operation has occurred
 WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Anaconda3/envs/dbconnect/Lib/site-packages/pyspark/jars/spark-unsafe_2.11-2.4.6-SNAPSHOT.jar) to method java.nio.Bits.unaligned()
 WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
 WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
 WARNING: All illegal access operations will be denied in a future release
 Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2666)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2666)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2666)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:79)
    at org.apache.spark.deploy.SparkSubmit.secMgr$lzycompute$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:355)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
    at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3319)
    at java.base/java.lang.String.substring(String.java:1874)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
    ... 25 more
 Traceback (most recent call last):
   File "c:\anaconda3\envs\dbconnect\lib\runpy.py", line 193, in _run_module_as_main
     "__main__", mod_spec)
   File "c:\anaconda3\envs\dbconnect\lib\runpy.py", line 85, in _run_code
     exec(code, run_globals)
   File "C:\Anaconda3\envs\dbconnect\Scripts\databricks-connect.exe\__main__.py", line 7, in <module>
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\databricks_connect.py", line 262, in main
     test()
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\databricks_connect.py", line 231, in test
     spark = SparkSession.builder.getOrCreate()
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\sql\session.py", line 185, in getOrCreate
     sc = SparkContext.getOrCreate(sparkConf)
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\context.py", line 372, in getOrCreate
     SparkContext(conf=conf or SparkConf())
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\context.py", line 133, in __init__
     SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\context.py", line 321, in _ensure_initialized
     SparkContext._gateway = gateway or launch_gateway(conf)
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\java_gateway.py", line 46, in launch_gateway
     return _launch_gateway(conf)
   File "c:\anaconda3\envs\dbconnect\lib\site-packages\pyspark\java_gateway.py", line 108, in _launch_gateway
     raise Exception("Java gateway process exited before sending its port number")
 Exception: Java gateway process exited before sending its port number

1 Answer

1 vote
/ 18 June 2020

 openjdk version "11" 2018-09-25
 OpenJDK Runtime Environment 18.9 (build 11+28)
 OpenJDK 64-Bit Server VM 18.9 (build 11+28, mixed mode)

Install Java 8. Java 11 is not supported.
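One way to do that, as a sketch (the JDK install path and the conda-forge package pin are assumptions; use whatever JDK 8 distribution you prefer, e.g. AdoptOpenJDK):

 # option A: install OpenJDK 8 into the same conda environment
 (dbconnect) C:\>conda install -c conda-forge openjdk=8

 # option B: point the session at an existing JDK 8 install (path illustrative)
 (dbconnect) C:\>set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_251
 (dbconnect) C:\>set PATH=%JAVA_HOME%\bin;%PATH%

 # verify that java 1.8 is now the one picked up
 (dbconnect) C:\>java -version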

Also check the port number. On Azure it should probably be 8787; see the check below.
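A quick way to check, assuming the config file is in its default location:

 # show the settings written by databricks-connect configure
 (dbconnect) C:\>type %USERPROFILE%\.databricks-connect

 # re-run configure if the port is not 8787
 (dbconnect) C:\>databricks-connect configure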

There may be other problems, but I would fix these first.
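Once databricks-connect test passes, a minimal smoke test from Python (a sketch; run it inside the dbconnect environment) confirms the session actually reaches the remote cluster:

 from pyspark.sql import SparkSession

 # with databricks-connect configured, getOrCreate() returns a session
 # backed by the remote Azure Databricks cluster
 spark = SparkSession.builder.getOrCreate()

 # a trivial job that must execute on the cluster if the connection works
 print(spark.range(100).count())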
