You can achieve this with the approach below:
import spark.implicits._

val input_df = spark.sparkContext.parallelize(List(1, 2, 3, 4, 5)).toDF("col1")
input_df.show(false)
Input:
+----+
|col1|
+----+
|1 |
|2 |
|3 |
|4 |
|5 |
+----+
import org.apache.spark.sql.functions.split

val output_df = input_df.rdd
  .map(x => x(0).toString)
  .map(x => (x, Range(0, x.toInt + 1).mkString(",")))  // e.g. 3 -> "0,1,2,3"
  .toDF("col1", "col2")
output_df.withColumn("col2", split($"col2", ",")).show(false)
Output:
+----+------------------+
|col1|col2 |
+----+------------------+
|1 |[0, 1] |
|2 |[0, 1, 2] |
|3 |[0, 1, 2, 3] |
|4 |[0, 1, 2, 3, 4] |
|5 |[0, 1, 2, 3, 4, 5]|
+----+------------------+
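Note that `split` yields an array of strings. On Spark 2.4+, you can get the same ranges as integer arrays without dropping to the RDD API, using the built-in `sequence` function; a minimal sketch (assuming `input_df` as defined above):

```scala
import org.apache.spark.sql.functions.{lit, sequence}

// sequence(start, stop) builds an array from start to stop inclusive,
// so sequence(lit(0), $"col1") yields [0, 1, ..., col1] for each row
val output_df2 = input_df.withColumn("col2", sequence(lit(0), $"col1"))
output_df2.show(false)
```

This stays entirely in the DataFrame API, so Catalyst can optimize it and no string round-trip is needed.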
Hope this helps!