I have a standalone Spark cluster set up on two Windows machines.
To run a JAR program on the cluster:
E:\spark-2.3.0\bin>spark-submit --class scala.Property1 --master spark://19.1.12.123:6066 --deploy-mode cluster "C:\Users\singh\Desktop\321\prop1.jar" "C:\Users\singh\Desktop\321\sample.properties"
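For context, a minimal sketch of the same submission with the JAR and properties file placed on a path that exists on every machine in the cluster (the E:\shared folder is only an assumption and stands in for any location, shared or copied locally, that both workers can see):

E:\spark-2.3.0\bin>spark-submit --class scala.Property1 --master spark://19.1.12.123:6066 --deploy-mode cluster "E:\shared\prop1.jar" "E:\shared\sample.properties"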
Worker node console:
2020-01-22 12:36:54 INFO Worker:54 - Successfully registered with master spark://19.1.12.123:7077
2020-01-22 12:38:12 INFO Worker:54 - Asked to launch driver driver-20200122123250-0001
2020-01-22 12:38:12 INFO DriverRunner:54 - Copying user jar file:/C:/Users/singh/Desktop/321/prop1.jar to D:\spark-2.3.0\work\driver-20200122123250-0001\prop1.jar
2020-01-22 12:38:12 INFO Utils:54 - Copying C:\Users\singh\Desktop\321\prop1.jar to D:\spark-2.3.0\work\driver-20200122123250-0001\prop1.jar
2020-01-22 12:38:12 INFO DriverRunner:54 - Killing driver process!
2020-01-22 12:38:12 WARN Worker:66 - Driver driver-20200122123250-0001 failed with unrecoverable exception: java.nio.file.NoSuchFileException: C:\Users\singh\Desktop\321\prop1.jar
This is the program:
import java.util.Properties

import scala.io.Source

object Property1 {
  def main(args: Array[String]): Unit = {
    val myProp = readProperties(args(0))
  }

  def readProperties(propertiesPath: String): Properties = {
    System.out.println("propertiesPath: " + propertiesPath)
    // (unused) attempt to resolve the file as a classpath resource
    val url = getClass.getResource("/" + propertiesPath)
    // read the properties file from the local path passed in args(0)
    val source = Source.fromFile(propertiesPath)
    val properties = new Properties
    properties.load(source.bufferedReader())
    val name = properties.getProperty("name")
    System.out.println("name: " + name)
    properties
  }
}
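For illustration only, here is a sketch of a property-loading helper that first tries the local file system and then falls back to the classpath resource that the unused getResource call in my code seems to aim at. The object name PropertyLoaderSketch is hypothetical, and it assumes sample.properties may also be packed inside prop1.jar, which is not necessarily the case:

import java.io.File
import java.util.Properties

import scala.io.Source

object PropertyLoaderSketch {
  // Hypothetical helper: load a .properties file from a local path if it exists,
  // otherwise from the classpath (assumes the file was bundled into the JAR).
  def loadProps(propertiesPath: String): Properties = {
    val properties = new Properties
    val source =
      if (new File(propertiesPath).exists()) Source.fromFile(propertiesPath)
      else Source.fromInputStream(getClass.getResourceAsStream("/" + propertiesPath))
    try properties.load(source.bufferedReader()) finally source.close()
    properties
  }
}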
sample.properties:
name=singh
Please give the correct suggestion ... thanks. :)