From Spark 2.4 onward, you can use the higher-order functions available in Spark SQL, such as `transform`:
scala> val df = Seq((1,Seq(2,4,5)),(5,Seq(6,5,3))).toDF("a","b")
df: org.apache.spark.sql.DataFrame = [a: int, b: array<int>]
scala> df.createOrReplaceTempView("ashima")
scala> spark.sql(""" select a, b, transform(b, x -> x * a) as result from ashima """).show(false)
+---+---------+------------+
|a |b |result |
+---+---------+------------+
|1 |[2, 4, 5]|[2, 4, 5] |
|5 |[6, 5, 3]|[30, 25, 15]|
+---+---------+------------+
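If you are on Spark 3.0 or later, the same higher-order function is also exposed in the DataFrame API, so the temp view and SQL string are not needed. This is a sketch assuming the same `df` as above and a Spark 3.0+ runtime:

```scala
import org.apache.spark.sql.functions.{col, transform}

// transform(col, f) applies f to each element of the array column,
// here multiplying every element of "b" by the value of "a" in the same row.
val result = df.withColumn("result", transform(col("b"), x => x * col("a")))
result.show(false)
```

Both forms compile down to the same `transform` expression; the SQL version works from Spark 2.4, while the `functions.transform` overload taking a Scala lambda was added in Spark 3.0.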