It seems I am running into memory problems when using the PySpark ML package. I am trying to run ALS.fit on 40 million rows of data. Using JDK-11 produced the error:
"java.lang.NoSuchMethodError: sun.nio.ch.DirectBuffer.cleaner()Lsun/misc/Cleaner"
It worked with 13 million rows, so I suspect it is a memory-cleanup problem.
I tried using Java JDK-8, as suggested here: Apache Spark method not found sun.nio.ch.DirectBuffer.cleaner()Lsun/misc/Cleaner;
but I still run into an error because there is not enough heap memory. I get this error message:
"... java.lang.OutOfMemoryError: Java heap space ..."
Does anyone have an idea how to work around this?
I am using Ubuntu 18.04 LTS with Python 3.6 and PySpark 2.4.2.
Edit:
This is how I adjusted my Spark context configuration:
conf = spark.sparkContext._conf.setAll([
    ("spark.driver.extraJavaOptions", "-Xss800M"),
    ("spark.memory.offHeap.enabled", "true"),
    ("spark.memory.offHeap.size", "4g"),
    ("spark.executor.memory", "4g"),
    ("spark.app.name", "Spark Updated Conf"),
    ("spark.executor.cores", "2"),
    ("spark.cores.max", "2"),
    ("spark.driver.memory", "6g"),
])
I am not sure whether this makes sense!
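For reference, this is roughly how I understand such settings are meant to be applied — a sketch only, since calling setAll on the conf of an already-running context does not reconfigure that session; the session apparently has to be stopped and rebuilt with the new settings (and I have not verified this resolves the OOM):

```python
from pyspark.sql import SparkSession

# Stop the existing session first; conf changes on a live
# SparkContext do not take effect retroactively.
spark.stop()

# Rebuild the session with the memory settings baked in.
# Note: spark.driver.memory may still be ignored here in local/client
# mode, because the driver JVM is already running; in that case it has
# to be passed on the spark-submit / pyspark command line instead.
spark = (
    SparkSession.builder
    .appName("Spark Updated Conf")
    .config("spark.driver.memory", "6g")
    .config("spark.executor.memory", "4g")
    .config("spark.executor.cores", "2")
    .config("spark.cores.max", "2")
    .config("spark.memory.offHeap.enabled", "true")
    .config("spark.memory.offHeap.size", "4g")
    .getOrCreate()
)
```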
These are the first lines of the error message:
[Stage 8:==================================================> (186 + 12) / 200]19/07/02 14:43:29 WARN MemoryStore: Not enough space to cache rdd_37_196 in memory! (computed 3.6 MB so far)
19/07/02 14:43:29 WARN MemoryStore: Not enough space to cache rdd_37_192 in memory! (computed 5.8 MB so far)
19/07/02 14:43:29 WARN BlockManager: Persisting block rdd_37_192 to disk instead.
19/07/02 14:43:29 WARN BlockManager: Persisting block rdd_37_196 to disk instead.
19/07/02 14:43:29 WARN MemoryStore: Not enough space to cache rdd_37_197 in memory! (computed 3.7 MB so far)
19/07/02 14:43:29 WARN BlockManager: Persisting block rdd_37_197 to disk instead.
19/07/02 14:43:29 WARN MemoryStore: Not enough space to cache rdd_37_196 in memory! (computed 3.6 MB so far)
[Stage 8:======================================================>(197 + 3) / 200]19/07/02 14:43:29 WARN MemoryStore: Not enough space to cache rdd_37_192 in memory! (computed 5.8 MB so far)
[Stage 9:> (0 + 10) / 10]19/07/02 14:43:37 WARN BlockManager: Block rdd_40_3 could not be removed as it was not found on disk or in memory
19/07/02 14:43:37 WARN BlockManager: Block rdd_40_4 could not be removed as it was not found on disk or in memory
19/07/02 14:43:37 WARN BlockManager: Block rdd_40_7 could not be removed as it was not found on disk or in memory
19/07/02 14:43:37 WARN BlockManager: Block rdd_41_3 could not be removed as it was not found on disk or in memory
19/07/02 14:43:37 WARN BlockManager: Block rdd_41_4 could not be removed as it was not found on disk or in memory
19/07/02 14:43:37 WARN BlockManager: Block rdd_41_7 could not be removed as it was not found on disk or in memory
19/07/02 14:43:38 ERROR Executor: Exception in task 7.0 in stage 9.0 (TID 435)
java.lang.OutOfMemoryError: Java heap space
19/07/02 14:43:39 WARN BlockManager: Block rdd_40_5 could not be removed as it was not found on disk or in memory
19/07/02 14:43:38 ERROR Executor: Exception in task 4.0 in stage 9.0 (TID 432)
java.lang.OutOfMemoryError: Java heap space
at scala.collection.mutable.ArrayBuilder$ofInt.mkArray(ArrayBuilder.scala:327)
[...]