Using the PySpark API:
>>> df = spark.createDataFrame([("10.0.0.1", "session1,session2"), ("10.0.0.2", "session1,session3,session4")], ["ip", "session"])
>>> df.show(100, False)
+--------+--------------------------+
|ip |session |
+--------+--------------------------+
|10.0.0.1|session1,session2 |
|10.0.0.2|session1,session3,session4|
+--------+--------------------------+
>>> from pyspark.sql.functions import col, size, split
>>> df = df.withColumn("count", size(split(col("session"), ",")))
>>> df.show(100, False)
+--------+--------------------------+-----+
|ip |session |count|
+--------+--------------------------+-----+
|10.0.0.1|session1,session2 |2 |
|10.0.0.2|session1,session3,session4|3 |
+--------+--------------------------+-----+
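Note that split returns a one-element array for an empty string, so size(split(...)) reports 1 rather than 0 for an empty session value. A minimal sketch guarding against that case (the null/empty check is an illustrative addition, not part of the original example):
>>> from pyspark.sql.functions import when
>>> # treat null or empty session strings as zero sessions
>>> df = df.withColumn(
...     "count",
...     when(col("session").isNull() | (col("session") == ""), 0)
...     .otherwise(size(split(col("session"), ","))),
... )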
More details on the PySpark API are available here: https://spark.apache.org/docs/latest/api/python/pyspark.sql.html