from pyspark.sql.window import Window
from pyspark.sql import functions as F

maxcol = F.udf(lambda row: F.max(row))
temp = [("ID1", '2019-01-01', '2019-02-01'), ("ID2", '2018-01-01', '2019-05-01'), ("ID3", '2019-06-01', '2019-04-01')]
t1 = spark.createDataFrame(temp, ["ID", "colA", "colB"])
maxDF = t1.withColumn("maxval", maxcol(F.struct([t1[x] for x in t1.columns[1:]])))

All I want is a new column containing the later of the two dates in colA and colB. When I run this code and call maxDF.show(), I get the following error:

 'NoneType' object has no attribute '_jvm'
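
The error comes from calling F.max inside the udf. The pyspark.sql functions are thin wrappers that build Column expressions through the JVM gateway; a udf body runs on the executors, where there is no active SparkContext, so the wrapper's internal sc._jvm lookup hits None and raises 'NoneType' object has no attribute '_jvm'. If you want to keep the udf approach, a minimal sketch (assuming the same t1, and relying on the fact that lexicographic order of yyyy-MM-dd strings matches chronological order) is to use Python's builtin max instead:

from pyspark.sql import functions as F
from pyspark.sql.types import StringType

# Python's builtin max, not F.max, so nothing touches the JVM inside the udf
maxcol = F.udf(lambda row: max(row), StringType())
maxDF = t1.withColumn("maxval", maxcol(F.struct([t1[x] for x in t1.columns[1:]])))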

2 Answers


Try something similar to this code:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("Python Spark").getOrCreate()

temp = [("ID1", '2019-01-01', '2019-02-01'), ("ID2", '2018-01-01', '2019-05-01'),
        ("ID3", '2019-06-01', '2019-04-01')]

t1 = spark.createDataFrame(temp, ["ID", "colA", "colB"])

maxDF = t1.withColumn("maxval", F.greatest(t1["colA"], t1["colB"]))
maxDF.show()

Output:

+---+----------+----------+----------+
| ID|      colA|      colB|    maxval|
+---+----------+----------+----------+
|ID1|2019-01-01|2019-02-01|2019-02-01|
|ID2|2018-01-01|2019-05-01|2019-05-01|
|ID3|2019-06-01|2019-04-01|2019-06-01|
+---+----------+----------+----------+
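
Worth noting: greatest skips NULLs and only returns NULL when every input is NULL, and the comparison above is lexicographic on strings, which happens to coincide with chronological order for zero-padded yyyy-MM-dd values. A sketch that makes the date semantics explicit by casting first (assuming the same t1 as above):

# cast to DateType so the comparison is on dates rather than strings
maxDF = t1.withColumn("maxval", F.greatest(F.to_date("colA"), F.to_date("colB")))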
Answered on 2019-09-26T19:14:22.213

You could also try something like this: first convert to Date objects with to_date(), then compare:

from pyspark.sql.functions import *

temp = [("ID1", '2019-01-01', '2019-02-01'), ("ID2", '2018-01-01', '2019-05-01'), ("ID3", '2019-06-01', '2019-04-01')]
t1 = spark.createDataFrame(temp, ["ID", "colA", "colB"])
t2 = t1.select("ID", to_date(t1.colA).alias('colADate'), to_date(t1.colB).alias('colBDate'))
t3 = t2.withColumn('maxDateFromRow', when(t2.colADate > t2.colBDate, t2.colADate).otherwise(t2.colBDate))

t3.show()

Returns:

+---+----------+----------+--------------+
| ID|  colADate|  colBDate|maxDateFromRow|
+---+----------+----------+--------------+
|ID1|2019-01-01|2019-02-01|    2019-02-01|
|ID2|2018-01-01|2019-05-01|    2019-05-01|
|ID3|2019-06-01|2019-04-01|    2019-06-01|
+---+----------+----------+--------------+
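
One caveat with the when/otherwise version: if colADate is NULL, the comparison colADate > colBDate evaluates to NULL, which when() treats as false, so the otherwise branch returns colBDate even when colA was simply missing; if both are NULL the result is NULL. As a sketch on the same t2, greatest collapses this into one null-skipping call:

# greatest returns the largest non-null value, so a missing date falls back to the other column
t3 = t2.withColumn('maxDateFromRow', greatest(t2.colADate, t2.colBDate))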
Answered on 2019-09-26T21:21:45.627