Pyspark: flatten an Array column - PullRequest
1 vote
/ January 30, 2020

I'm parsing an Azure Event Hub avro message. The last column is an array, and I'm trying to flatten it.

Before:

{"records":[{"time":"2020-01-28T04:50:20.0975886Z","resourceId":"/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME","operationName":"MICROSOFT.COMPUTE/DISKS/DELETE","category":"Administrative","resultType":"Start","resultSignature":"Started.","durationMs":"0","callerIpAddress":"43.121.152.99","correlationId":"xxxxxxx"},{"time":"2020-01-28T04:50:20.1122888Z","resourceId":"/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME","operationName":"MICROSOFT.COMPUTE/DISKS/DELETE","category":"Administrative","resultType":"Success","resultSignature":"Succeeded.NoContent","durationMs":"14","callerIpAddress":"43.121.152.99","correlationId":"xxxxxxx"}]}

This is what I've come up with, and I think I'm very close. I've got the struct and can strip off the outer "records" key, but I can't handle the array inside it.

from pyspark.sql.types import StringType, StructField, StructType, ArrayType
from pyspark.sql.functions import from_json, col, explode, flatten

# Create a DataFrame from an avro file
df = spark.read.format("avro").load("/mnt/test/xxxxxx/xxxxxxxx/31.avro")

# cast the binary column (Body) to string
df = df.withColumn("Body", col("Body").cast("string"))

sourceSchema = StructType([
        StructField("records", ArrayType(
            StructType([
                StructField("time", StringType(), True),
                StructField("resourceId", StringType(), True),
                StructField("operationName", StringType(), True),
                StructField("category", StringType(), True),
                StructField("resultType", StringType(), True),
                StructField("resultSignature", StringType(), True),
                StructField("durationMs", StringType(), True),
                StructField("callerIpAddress", StringType(), True),
                StructField("correlationId", StringType(), True)
            ])
        ), True)
    ])

df = df.withColumn("Body", from_json(df.Body, sourceSchema))

# Flatten Body (accumulate on df2 — reassigning from df each iteration would keep only the last column)
df2 = df
for c in df.schema['Body'].dataType:
    df2 = df2.withColumn(c.name, col("Body." + c.name))
display(df2)

After:

[{"time":"2020-01-28T04:50:20.0975886Z","resourceId":"/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME","operationName":"MICROSOFT.COMPUTE/DISKS/DELETE","category":"Administrative","resultType":"Start","resultSignature":"Started.","durationMs":"0","callerIpAddress":"43.121.152.99","correlationId":"xxxxxxx"},{"time":"2020-01-28T04:50:20.1122888Z","resourceId":"/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME","operationName":"MICROSOFT.COMPUTE/DISKS/DELETE","category":"Administrative","resultType":"Success","resultSignature":"Succeeded.NoContent","durationMs":"14","callerIpAddress":"43.121.152.99","correlationId":"xxxxxxx"}]

Answers [ 2 ]

0 votes
/ February 12, 2020

I see many people have this question; hopefully this helps.

# Read Event Hub's stream
# if reading from file: Supported file formats are text, csv, json, orc, parquet

import pyspark.sql.functions as F
from pyspark.sql.types import StringType, StructField, StructType, ArrayType

conf = {}
conf["eventhubs.connectionString"] = "Endpoint=sb://xxxxxxxxxxxx.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=jxxxxxxxxxxxx/xxxxxxxxxxxx=;EntityPath=eventhub"

# define dataframe for reading stream
read_df = (
  spark
    .readStream
    .format("eventhubs")
    .options(**conf)
    .option('multiLine', True)
    .option('mode', 'PERMISSIVE')
    .load()
)

# define the schema for parsing the JSON payload
sourceSchema = StructType([
        StructField("records", ArrayType(
            StructType([
                StructField("time", StringType(), True),
                StructField("resourceId", StringType(), True),
                StructField("operationName", StringType(), True),
                StructField("category", StringType(), True),
                StructField("resultType", StringType(), True),
                StructField("resultSignature", StringType(), True),
                StructField("durationMs", StringType(), True),
                StructField("callerIpAddress", StringType(), True),
                StructField("correlationId", StringType(), True)
            ])
        ), True)
    ])

# cast the binary body to string and parse the JSON into the schema
decoded_df = read_df.select(F.from_json(F.col("body").cast("string"), sourceSchema).alias("payload"))

# write to memory
query1 = (
  decoded_df
    .writeStream
    .format("memory")
    .queryName("read_hub")
    .start()
)
0 votes
/ January 30, 2020

Perhaps try this:

import pandas as pd
from pandas import json_normalize  # pandas >= 1.0; on older versions: from pandas.io.json import json_normalize
s = {"records":[{"time":"2020-01-28T04:50:20.0975886Z","resourceId":"/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME","operationName":"MICROSOFT.COMPUTE/DISKS/DELETE","category":"Administrative","resultType":"Start","resultSignature":"Started.","durationMs":"0","callerIpAddress":"43.121.152.99","correlationId":"xxxxxxx"},{"time":"2020-01-28T04:50:20.1122888Z","resourceId":"/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME","operationName":"MICROSOFT.COMPUTE/DISKS/DELETE","category":"Administrative","resultType":"Success","resultSignature":"Succeeded.NoContent","durationMs":"14","callerIpAddress":"43.121.152.99","correlationId":"xxxxxxx"}]}
json_normalize(s).values

The result you will get:

array([[list([{'time': '2020-01-28T04:50:20.0975886Z', 'resourceId': '/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME', 'operationName': 'MICROSOFT.COMPUTE/DISKS/DELETE', 'category': 'Administrative', 'resultType': 'Start', 'resultSignature': 'Started.', 'durationMs': '0', 'callerIpAddress': '43.121.152.99', 'correlationId': 'xxxxxxx'}, {'time': '2020-01-28T04:50:20.1122888Z', 'resourceId': '/SUBSCRIPTIONS/xxxxxxxxxxxx/RESOURCEGROUPS/xxxxx-xxxxxxxI/PROVIDERS/MICROSOFT.COMPUTE/DISKS/7C3E07DE8xxxxxxx-0-SCRATCHVOLUME', 'operationName': 'MICROSOFT.COMPUTE/DISKS/DELETE', 'category': 'Administrative', 'resultType': 'Success', 'resultSignature': 'Succeeded.NoContent', 'durationMs': '14', 'callerIpAddress': '43.121.152.99', 'correlationId': 'xxxxxxx'}])]],
  dtype=object)
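Note that `json_normalize(s)` on its own leaves the array intact as a single list cell, as the output above shows. Passing `record_path="records"` is what actually flattens each array element into its own row; a small sketch with dummy values (assuming pandas >= 1.0, where `json_normalize` is a top-level function):

```python
import pandas as pd

s = {"records": [
    {"time": "t1", "operationName": "op", "resultType": "Start"},
    {"time": "t2", "operationName": "op", "resultType": "Success"},
]}

# One row per element of the "records" array, one column per field
flat = pd.json_normalize(s, record_path="records")
print(flat)
```

This yields a two-row DataFrame with `time`, `operationName`, and `resultType` columns, matching the flattened shape the question asks for.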