I'm trying to rewrite a SQL query in PySpark. Here is the SQL query:
SELECT
    cs.Environment,
    cs.AccountCode,
    MIN(cs.StartDate) AS StartDate,
    MIN(cs.FinalDate) AS FinalDate,
    (
        SELECT TOP 1 ItemCode
        FROM [dbo].[Contracts]
        WHERE
            Environment = cs.Environment
            AND AccountCode = cs.AccountCode
            AND ContractType = 'C'
            AND LinePackage = 1
            AND InflowOutflow = 'Inflow'
            AND EventDate <= GETDATE()
        ORDER BY EventDate
    ) AS Package
FROM [dbo].[Contracts] cs
WHERE
    cs.ContractType = 'C'
    AND cs.LinePackage = 1
GROUP BY
    cs.Environment,
    cs.AccountCode
My PySpark code looks like this:
df = spark.sql("""
    select cs.environment, cs.accountcode,
           min(cs.startdatets) as startdate, min(cs.finaldatets) as finaldate,
           (select a.itemcode
            from firstcomm as a
            where a.environment = cs.environment
              and a.accountcode = cs.accountcode
              and a.contracttype = 'c'
              and a.eventdate <= current_date()
            order by a.eventdate limit 1) as package
    from firstcomm cs
    where cs.contracttype = 'c' and cs.linepackage = 1
    group by cs.environment, cs.accountcode
""")
but I keep getting this error:
AnalysisException: Accessing outer query column is not allowed in:
LocalLimit 1
+- Project [itemcode#3641]
+- Sort [eventdate#3629 ASC NULLS FIRST], true
+- Project [itemcode#3641, eventdate#3629]
+- Filter ((((environment#3628 = outer(environment#3628)) && (accountcode#3622 = outer(accountcode#3622))) && (contracttype#3626 = c)) && (((linepackage#3644 = 1) && (inflowoutflow#3637 = inflow)) && (eventdate#3629 <= current_date(Some(Zulu)))))
+- SubqueryAlias a
By the way, I'm using Spark 2.2.1, which I believe supports subqueries.
Any ideas how to solve this problem? Or how should I rewrite the query to get the desired result?
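For context on the error: as far as I can tell, Spark's analyzer (including 2.2.x) only accepts correlated scalar subqueries when an aggregate guarantees a single-row result; a correlated subquery built on order by ... limit 1 is rejected, which is what the LocalLimit 1 node in the AnalysisException points at. A minimal sketch of one possible rewrite, staying with spark.sql, replaces the subquery with a row_number() window plus a join. It assumes firstcomm is the same registered view used above, re-adds the linepackage = 1 filter that the original SQL applies inside the subquery, and, like the PySpark version above, omits the inflowoutflow = 'Inflow' condition:

df = spark.sql("""
    with agg as (
        select environment, accountcode,
               min(startdatets) as startdate,
               min(finaldatets) as finaldate
        from firstcomm
        where contracttype = 'c' and linepackage = 1
        group by environment, accountcode
    ),
    ranked as (
        -- rank candidate rows per key by eventdate; rn = 1 plays the
        -- role of 'order by eventdate limit 1' in the correlated subquery
        select environment, accountcode, itemcode,
               row_number() over (partition by environment, accountcode
                                  order by eventdate) as rn
        from firstcomm
        where contracttype = 'c' and linepackage = 1
          and eventdate <= current_date()
    )
    select a.environment, a.accountcode, a.startdate, a.finaldate,
           r.itemcode as package
    from agg a
    left join ranked r
      on a.environment = r.environment
     and a.accountcode = r.accountcode
     and r.rn = 1
""")

The left join preserves the scalar-subquery semantics: keys with no qualifying row still appear in the result, with package as null.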